Sample records for autonomous space software

  1. Development of Autonomous Aerobraking - Phase 2

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.

    2013-01-01

    Phase 1 of the Development of Autonomous Aerobraking (AA) Assessment investigated the technical capability of transferring the processes of aerobraking maneuver (ABM) decision-making (currently performed on the ground by an extensive workforce and communicated to the spacecraft via the Deep Space Network) to an efficient flight software algorithm onboard the spacecraft. This document describes Phase 2 of this study, which was a 12-month effort to improve and rigorously test the AA Development Software developed in Phase 1.
    Keywords: Aerobraking maneuver; Autonomous Aerobraking; Autonomous Aerobraking Development Software; Deep Space Network; NASA Engineering and Safety Center

  2. Incorporating Manual and Autonomous Code Generation

    NASA Technical Reports Server (NTRS)

    McComas, David

    1998-01-01

    Code can be generated manually or with code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an autonomous code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) Defining the manual-to-autonomous code interface; (2) Applying object-oriented design to the manual flight code; (3) Implementing the object-oriented design in C.

  3. Advanced Autonomous Systems for Space Operations

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Muscettola, N.; Barrett, A.; Mjolsness, E.; Clancy, D. J.

    2002-01-01

    New missions of exploration and space operations will require unprecedented levels of autonomy to successfully accomplish their objectives. Inherently high levels of complexity, cost, and communication distances will preclude the degree of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of not only meeting the greatly increased space exploration requirements, but simultaneously dramatically reducing the design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health management capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of advanced space operations, since the science and operational requirements specified by such missions, as well as the budgetary constraints, will limit the current practice of monitoring and controlling missions by a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such on-board systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having to have such commands transmitted from Earth. This enables missions of such complexity and communication distances as are not otherwise possible, as well as many more efficient and low cost applications. In addition, utilizing component and system modeling and reasoning capabilities, autonomous systems will play an increasing role in ground operations for space missions, where they will both reduce the human workload as well as provide greater levels of monitoring and system safety. This paper will focus specifically on new and innovative software for remote, autonomous, space systems flight operations. Topics to be presented will include a brief description of key autonomous control concepts, the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New on-board software for autonomous science data acquisition for planetary exploration will be described, as well as advanced systems for safe planetary landings. A new multi-agent architecture that addresses some of the challenges of autonomous systems will be presented. Autonomous operation of ground systems will also be considered, including software for autonomous in-situ propellant production and management, and closed-loop ecological life support systems (CELSS). Finally, plans and directions for the future will be discussed.

  4. Autonomous Science on the EO-1 Mission

    NASA Technical Reports Server (NTRS)

    Chien, S.; Sherwood, R.; Tran, D.; Castano, R.; Cichy, B.; Davies, A.; Rabideau, G.; Tang, N.; Burl, M.; Mandl, D.

    2003-01-01

    In mid-2003, we will fly software to detect science events that will drive autonomous scene selection on board the New Millennium Earth Observing 1 (EO-1) spacecraft. This software will demonstrate the potential for future space missions to use onboard decision-making to detect science events and respond autonomously to capture short-lived science events and to downlink only the highest value science data.

  5. The Use of Software Agents for Autonomous Control of a DC Space Power System

    NASA Technical Reports Server (NTRS)

    May, Ryan D.; Loparo, Kenneth A.

    2014-01-01

    In order to enable manned deep-space missions, the spacecraft must be controlled autonomously using on-board algorithms. A control architecture is proposed to enable this autonomous operation for a spacecraft electric power system and then implemented using a highly distributed network of software agents. These agents collaborate and compete with each other in order to implement each of the control functions. A subset of this control architecture is tested against a steady-state power system simulation and found to be able to solve a constrained optimization problem with competing objectives using only local information.
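
    The record above describes agents that solve a constrained power-allocation problem using only local information. The hypothetical Python sketch below illustrates only the general flavor of such an approach; the agent names, priorities, and greedy allocation rule are invented for illustration and are not taken from the cited work.

      # Hypothetical sketch: load agents with only local knowledge (their own
      # demand and priority) compete for a limited power bus. Illustrative only.
      from dataclasses import dataclass

      @dataclass
      class LoadAgent:
          name: str
          priority: int     # higher value = more critical load
          demand_w: float   # power this load would like to draw

      def allocate(agents, bus_capacity_w):
          """Grant power to agents in priority order until the bus is exhausted."""
          granted, remaining = {}, bus_capacity_w
          for agent in sorted(agents, key=lambda a: a.priority, reverse=True):
              share = min(agent.demand_w, remaining)
              granted[agent.name] = share
              remaining -= share
          return granted

      if __name__ == "__main__":
          loads = [LoadAgent("life_support", 10, 400.0),
                   LoadAgent("science_payload", 5, 300.0),
                   LoadAgent("heaters", 7, 250.0)]
          print(allocate(loads, bus_capacity_w=600.0))
          # life_support gets 400 W, heaters 200 W, science_payload 0 W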

  6. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
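
    The abstract explains that LPF explores the combined state space of a controller and a simulated environment. As a rough, hypothetical analogue of that idea (not the Livingstone system itself), the sketch below enumerates command/fault scenarios for a toy valve and flags cases where a naive diagnoser disagrees with the injected fault.

      # Toy exploration of a (simulator, diagnoser) state space. Illustrative only.
      from itertools import product

      COMMANDS = ["open_valve", "close_valve"]
      FAULTS = [None, "valve_stuck_closed"]

      def simulate(cmd, fault):
          """Observed valve position for a command under a given fault."""
          if fault == "valve_stuck_closed":
              return "closed"
          return "open" if cmd == "open_valve" else "closed"

      def diagnose(cmd, observation):
          """Naive diagnoser: suspect a stuck valve only when an open command fails."""
          if cmd == "open_valve" and observation == "closed":
              return "valve_stuck_closed"
          return None

      def explore():
          """Enumerate every scenario and report diagnoser/fault mismatches."""
          return [(cmd, fault) for cmd, fault in product(COMMANDS, FAULTS)
                  if diagnose(cmd, simulate(cmd, fault)) != fault]

      if __name__ == "__main__":
          # The (close_valve, valve_stuck_closed) case is undetectable from this
          # observation alone, which is the kind of scenario such exploration finds.
          print(explore())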

  7. Range Safety for an Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Lanzi, Raymond J.; Simpson, James C.

    2010-01-01

    The Range Safety Algorithm software encapsulates the various constructs and algorithms required to accomplish Time Space Position Information (TSPI) data management from multiple tracking sources, autonomous mission mode detection and management, and flight-termination mission rule evaluation. The software evaluates various user-configurable rule sets that govern the qualification of TSPI data sources, provides a prelaunch autonomous hold-launch function, performs the flight-monitoring-and-termination functions, and performs end-of-mission safing.
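
    As a purely illustrative sketch of the kind of rule evaluation described above (the source names, thresholds, and verdicts below are invented, not the flight algorithm), a highly simplified version might look like this:

      # Hypothetical rule-driven evaluation over multiple tracking (TSPI) sources.
      def qualified_sources(sources, max_age_s=1.0):
          """Keep only tracking sources whose data is fresh enough to trust."""
          return [s for s in sources if s["age_s"] <= max_age_s]

      def evaluate(sources, boundary_km=10.0):
          """HOLD with no qualified source, TERMINATE on a boundary violation."""
          good = qualified_sources(sources)
          if not good:
              return "HOLD"
          if any(s["cross_range_km"] > boundary_km for s in good):
              return "TERMINATE"
          return "NOMINAL"

      if __name__ == "__main__":
          tracks = [{"name": "gps", "age_s": 0.2, "cross_range_km": 3.1},
                    {"name": "radar", "age_s": 0.4, "cross_range_km": 12.6}]
          print(evaluate(tracks))  # -> TERMINATE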

  8. Solving Autonomy Technology Gaps through Wireless Technology and Orion Avionics Architectural Principles

    NASA Astrophysics Data System (ADS)

    Black, Randy; Bai, Haowei; Michalicek, Andrew; Shelton, Blaine; Villela, Mark

    2008-01-01

    Currently, autonomy in space applications is limited by a variety of technology gaps. Innovative application of wireless technology and avionics architectural principles drawn from the Orion crew exploration vehicle provides solutions for several of these gaps. The Vision for Space Exploration envisions extensive use of autonomous systems. Economic realities preclude continuing the level of operator support currently required of autonomous systems in space. In order to decrease the number of operators, more autonomy must be afforded to automated systems. However, certification authorities have been notoriously reluctant to certify autonomous software in the presence of humans or when costly missions may be jeopardized. The Orion avionics architecture, drawn from advanced commercial aircraft avionics, is based upon several architectural principles including partitioning in software. Robust software partitioning provides "brick wall" separation between software applications executing on a single processor, along with controlled data movement between applications. Taking advantage of these attributes, non-deterministic applications can be placed in one partition and a "Safety" application created in a separate partition. This "Safety" partition can track the position of astronauts or critical equipment and prevent any unsafe command from executing. Only the Safety partition need be certified to a human rated level. As a proof-of-concept demonstration, Honeywell has teamed with the Ultra WideBand (UWB) Working Group at NASA Johnson Space Center to provide tracking of humans, autonomous systems, and critical equipment. Using UWB, the NASA team can determine positioning to a resolution of less than one inch, allowing a Safety partition to halt operation of autonomous systems in the event that an unplanned collision is imminent. Another challenge facing autonomous systems is the coordination of multiple autonomous agents. Current approaches address the issue as one of networking and coordination of multiple independent units, each with its own mission. As a proof of concept, Honeywell is developing and testing various algorithms that lead to a deterministic, fault tolerant, reliable wireless backplane. Just as advanced avionics systems control several subsystems, actuators, sensors, displays, etc., a single "master" autonomous agent (or base station computer) could control multiple autonomous systems. The problem is simplified to controlling a flexible body consisting of several sensors and actuators, rather than one of coordinating multiple independent units. By filling technology gaps associated with space-based autonomous systems, wireless technology and Orion architectural principles provide the means for decreasing operational costs and simplifying problems associated with collaboration of multiple autonomous systems.
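
    The safety-partition concept above (a separately certified partition that halts an autonomous system when tracking predicts an unsafe approach) can be illustrated with the hypothetical sketch below; the two-dimensional positions, velocities, and thresholds are invented for illustration and do not reflect the Honeywell/JSC implementation.

      # Hypothetical safety-partition check using predicted separation.
      import math

      def too_close(pos_a, vel_a, pos_b, vel_b, horizon_s=2.0, limit_m=1.0):
          """Propagate both tracks a short time ahead and compare separation to a limit."""
          ax, ay = pos_a[0] + vel_a[0] * horizon_s, pos_a[1] + vel_a[1] * horizon_s
          bx, by = pos_b[0] + vel_b[0] * horizon_s, pos_b[1] + vel_b[1] * horizon_s
          return math.hypot(ax - bx, ay - by) < limit_m

      def safety_partition(rover, astronaut):
          """Command issued by the safety partition for one control cycle."""
          if too_close(rover["pos"], rover["vel"], astronaut["pos"], astronaut["vel"]):
              return "HALT"
          return "PROCEED"

      if __name__ == "__main__":
          rover = {"pos": (0.0, 0.0), "vel": (0.5, 0.0)}
          astronaut = {"pos": (1.5, 0.0), "vel": (0.0, 0.0)}
          print(safety_partition(rover, astronaut))  # -> HALT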

  9. Capturing Requirements for Autonomous Spacecraft with Autonomy Requirements Engineering

    NASA Astrophysics Data System (ADS)

    Vassev, Emil; Hinchey, Mike

    2014-08-01

    The Autonomy Requirements Engineering (ARE) approach has been developed by Lero - the Irish Software Engineering Research Center within the mandate of a joint project with ESA, the European Space Agency. The approach is intended to help engineers develop missions for unmanned exploration, often with limited or no human control. Such robotics space missions rely on the most recent advances in automation and robotic technologies where autonomy and autonomic computing principles drive the design and implementation of unmanned spacecraft [1]. To tackle the integration and promotion of autonomy in software-intensive systems, ARE combines generic autonomy requirements (GAR) with goal-oriented requirements engineering (GORE). Using this approach, software engineers can determine what autonomic features to develop for a particular system (e.g., a space mission) as well as what artifacts that process might generate (e.g., goals models, requirements specification, etc.). The inputs required by this approach are the mission goals and the domain-specific GAR reflecting specifics of the mission class (e.g., interplanetary missions).

  10. A Robust Compositional Architecture for Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Denney, Ewen; Farrell, Kimberley; Giannakopoulou, Dimitra; Jonsson, Ari; Frank, Jeremy; Boddy, Mark; Carpenter, Todd; Estlin, Tara

    2006-01-01

    Space exploration applications can benefit greatly from autonomous systems. Great distances, limited communications and high costs make direct operations impossible while mandating operations reliability and efficiency beyond what traditional commanding can provide. Autonomous systems can improve reliability and enhance spacecraft capability significantly. However, there is reluctance to utilize autonomous systems. In part this is due to general hesitation about new technologies, but a more tangible concern is that of the reliability and predictability of autonomous software. In this paper, we describe ongoing work aimed at increasing robustness and predictability of autonomous software, with the ultimate goal of building trust in such systems. The work combines state-of-the-art technologies and capabilities in autonomous systems with advanced validation and synthesis techniques. The focus of this paper is on the autonomous system architecture that has been defined, and on how it enables the application of validation techniques for resulting autonomous systems.

  11. PHM Enabled Autonomous Propellant Loading Operations

    NASA Technical Reports Server (NTRS)

    Walker, Mark; Figueroa, Fernando

    2017-01-01

    The utility of Prognostics and Health Management (PHM) software capability applied to Autonomous Operations (AO) remains an active research area within aerospace applications. The ability to gain insight into which assets and subsystems are functioning properly, along with the derivation of confident predictions concerning future ability, reliability, and availability, are important enablers for making sound mission planning decisions. When coupled with software that fully supports mission planning and execution, an integrated solution can be developed that leverages state assessment and estimation for the purposes of delivering autonomous operations. The authors have been applying this integrated, model-based approach to the autonomous loading of cryogenic spacecraft propellants at Kennedy Space Center.

  12. Autonomous Power System intelligent diagnosis and control

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.; Merolla, Anthony

    1991-01-01

    The Autonomous Power System (APS) project at NASA Lewis Research Center is designed to demonstrate the abilities of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution hardware. Knowledge-based software provides a robust method of control for highly complex space-based power systems that conventional methods do not allow. The project consists of three elements: the Autonomous Power Expert System (APEX) for fault diagnosis and control, the Autonomous Intelligent Power Scheduler (AIPS) to determine system configuration, and power hardware (Brassboard) to simulate a space based power system. The operation of the Autonomous Power System as a whole is described and the responsibilities of the three elements - APEX, AIPS, and Brassboard - are characterized. A discussion of the methodologies used in each element is provided. Future plans are discussed for the growth of the Autonomous Power System.

  13. Autonomous power system intelligent diagnosis and control

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.; Quinn, Todd M.; Merolla, Anthony

    1991-01-01

    The Autonomous Power System (APS) project at NASA Lewis Research Center is designed to demonstrate the abilities of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution hardware. Knowledge-based software provides a robust method of control for highly complex space-based power systems that conventional methods do not allow. The project consists of three elements: the Autonomous Power Expert System (APEX) for fault diagnosis and control, the Autonomous Intelligent Power Scheduler (AIPS) to determine system configuration, and power hardware (Brassboard) to simulate a space based power system. The operation of the Autonomous Power System as a whole is described and the responsibilities of the three elements - APEX, AIPS, and Brassboard - are characterized. A discussion of the methodologies used in each element is provided. Future plans are discussed for the growth of the Autonomous Power System.

  14. Integrated System for Autonomous Science

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Sherwood, Robert; Tran, Daniel; Cichy, Benjamin; Davies, Ashley; Castano, Rebecca; Rabideau, Gregg; Frye, Stuart; Trout, Bruce; Shulman, Seth

    2006-01-01

    The New Millennium Program Space Technology 6 Project Autonomous Sciencecraft software implements an integrated system for autonomous planning and execution of scientific, engineering, and spacecraft-coordination actions. A prior version of this software was reported in "The TechSat 21 Autonomous Sciencecraft Experiment" (NPO-30784), NASA Tech Briefs, Vol. 28, No. 3 (March 2004), page 33. This software is now in continuous use aboard the Earth Observing 1 (EO-1) spacecraft and is being adapted for use in the Mars Odyssey and Mars Exploration Rovers missions. This software enables EO-1 to detect and respond to such events of scientific interest as volcanic activity, flooding, and freezing and thawing of water. It uses classification algorithms to analyze imagery onboard to detect changes, including events of scientific interest. Detection of such events triggers acquisition of follow-up imagery. The mission-planning component of the software develops a response plan that accounts for visibility of targets and operational constraints. The plan is then executed under the control of a task-execution component of the software that is capable of responding to anomalies.
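
    The detect-then-respond loop described in this record (onboard classification triggers follow-up imagery via the mission planner) can be caricatured with the hypothetical sketch below; the thermal-event classifier and pass list are invented stand-ins, not the Autonomous Sciencecraft algorithms.

      # Illustrative event-detection and follow-up planning loop.
      def detect_event(image_pixels, hot_threshold=0.8, hot_fraction=0.05):
          """Toy classifier: flag a thermal event if enough pixels exceed a threshold."""
          hot = sum(1 for p in image_pixels if p > hot_threshold)
          return hot / len(image_pixels) > hot_fraction

      def plan_followup(target, upcoming_passes):
          """Pick the earliest upcoming pass over the target for a follow-up image."""
          candidates = [p for p in upcoming_passes if p["target"] == target]
          return min(candidates, key=lambda p: p["time_s"]) if candidates else None

      if __name__ == "__main__":
          image = [0.1] * 90 + [0.95] * 10   # 10% hot pixels -> event detected
          passes = [{"target": "volcano_A", "time_s": 7200},
                    {"target": "volcano_A", "time_s": 3600}]
          if detect_event(image):
              print("follow-up scheduled:", plan_followup("volcano_A", passes))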

  15. Autonomous Agents and Intelligent Assistants for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2000-01-01

    Human exploration of space will involve remote autonomous crew and systems in long missions. Data to Earth will be delayed and limited. Earth control centers will not receive continuous real-time telemetry data, and there will be communication round trips of up to one hour. There will be reduced human monitoring on the planet and on Earth. When crews are present on the planet, they will be occupied with other activities, and system management will be a low priority task. Earth control centers will use multi-tasking "night shift" and on-call specialists. A new project at Johnson Space Center is developing software to support teamwork between distributed human and software agents in future interplanetary work environments. The Engineering and Mission Operations Directorates at Johnson Space Center (JSC) are combining laboratories and expertise to carry out this project, by establishing a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems. Intelligent autonomous systems for managing systems on planetary bases will communicate their knowledge to support distributed multi-agent mixed-initiative operations. Intelligent assistant agents will respond to events by developing briefings and responses according to instructions from human agents on Earth and in space.

  16. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  17. Advances in Autonomous Systems for Missions of Space Exploration

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Briggs, G. A.; Hieronymus, J.; Clancy, D. J.

    New missions of space exploration will require unprecedented levels of autonomy to successfully accomplish their objectives. Both inherent complexity and communication distances will preclude levels of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of meeting the greatly increased space exploration requirements, along with dramatically reduced design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health monitoring and maintenance capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of space exploration, since the science and operational requirements specified by such missions, as well as the budgetary constraints, will limit the ability to monitor and control these missions by a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having to have such commands transmitted from Earth. This enables missions of such complexity and communications distance as are not otherwise possible, as well as many more efficient and low cost applications. One notable example of such missions is the exploration for the existence of water on planets such as Mars and the moons of Jupiter. It is clear that water does not exist on the surfaces of such bodies, but it may well be located at some considerable depth below the surface, thus requiring a subsurface drilling capability. Subsurface drilling on planetary surfaces will require a robust autonomous control and analysis system, currently a major challenge, but within conceivable reach of planned technology developments. This paper will focus on new and innovative software for remote, autonomous, space systems flight operations, including flight test results, lessons learned, and implications for the future. An additional focus will be on technologies for planetary exploration using autonomous systems and astronaut-assistance systems that employ new spoken language technology. Topics to be presented will include a description of key autonomous control concepts, illustrated by the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New software for autonomous science data acquisition for planetary exploration will also be described, as well as advanced systems for safe planetary landings.
Current results of autonomous planetary drilling system research will be presented. A key thrust within NASA is to develop technologies that will leverage the capabilities of human astronauts during planetary surface explorations. One such technology is spoken dialogue interfaces, which would allow collaboration with semi-autonomous agents that are engaged in activities that are normally accomplished using language, e.g., astronauts in space suits interacting with groups of semi-autonomous rovers and other astronauts. This technology will be described and discussed in the context of future exploration missions and the major new capabilities enabled by such systems. Finally, plans and directions for the future of autonomous systems will be presented.

  18. Automated Operations Development for Advanced Exploration Systems

    NASA Technical Reports Server (NTRS)

    Haddock, Angie; Stetson, Howard K.

    2012-01-01

    Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide single button intelligent functions for the crew. Development, operations, and safety approval experience with the Timeliner system on-board the International Space Station (ISS), which provided autonomous monitoring with response and single-command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system to be in continuous execution (6 years), 24 hours a day, 7 days a week, within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System [1], along with the execution component design from within the HAL 9000 Space Operating System [2], this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed and is the first step in verifying the HAL 9000 Integrated Test-Bed Component [2] design's effectiveness. This design paper will conclude with a summary of the current development status and future development goals as it pertains to automated command and control for the HDU.

  19. Automated Operations Development for Advanced Exploration Systems

    NASA Technical Reports Server (NTRS)

    Haddock, Angie T.; Stetson, Howard

    2012-01-01

    Automated space operations command and control software development and its implementation must be an integral part of the vehicle design effort. The software design must encompass autonomous fault detection, isolation, and recovery capabilities and also provide "single button" intelligent functions for the crew. Development, operations, and safety approval experience with the Timeliner system onboard the International Space Station (ISS), which provided autonomous monitoring with response and single-command functionality of payload systems, can be built upon for future automated operations, as the ISS Payload effort was the first and only autonomous command and control system to be in continuous execution (6 years), 24 hours a day, 7 days a week, within a crewed spacecraft environment. Utilizing proven capabilities from the ISS Higher Active Logic (HAL) System, along with the execution component design from within the HAL 9000 Space Operating System, this design paper will detail the initial HAL System software architecture and interfaces as applied to NASA's Habitat Demonstration Unit (HDU) in support of the Advanced Exploration Systems, Autonomous Mission Operations project. The development and implementation of integrated simulators within this development effort will also be detailed and is the first step in verifying the HAL 9000 Integrated Test-Bed Component [2] design's effectiveness. This design paper will conclude with a summary of the current development status and future development goals as it pertains to automated command and control for the HDU.

  20. Agent Based Software for the Autonomous Control of Formation Flying Spacecraft

    NASA Technical Reports Server (NTRS)

    How, Jonathan P.; Campbell, Mark; Dennehy, Neil (Technical Monitor)

    2003-01-01

    Distributed satellite systems are an enabling technology for many future NASA/DoD Earth and space science missions, such as MMS, MAXIM, Leonardo, and LISA [1, 2, 3]. While formation flying offers significant science benefits, to reduce the operating costs for these missions it will be essential that these multiple vehicles effectively act as a single spacecraft by performing coordinated observations. Autonomous guidance, navigation, and control as part of coordinated fleet autonomy is a key technology that will help accomplish this complex goal. This is no small task, as most current space missions require significant input from the ground for even relatively simple decisions such as thruster burns. Work for the NMP DS1 mission focused on the development of the New Millennium Remote Agent (NMRA) architecture for autonomous spacecraft control systems. NMRA integrates traditional real-time monitoring and control with components for constraint-based planning, robust multi-threaded execution, and model-based diagnosis and reconfiguration. The complexity of using an autonomous approach for space flight software was evident when most of its capabilities were stripped off prior to launch (although more capability was uplinked subsequently, and the resulting demonstration was very successful).

  1. An autonomous fault detection, isolation, and recovery system for a 20-kHz electric power distribution test bed

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Walters, Jerry L.

    1991-01-01

    Future space explorations will require long term human presence in space. Space environments that provide working and living quarters for manned missions are becoming increasingly larger and more sophisticated. Monitor and control of the space environment subsystems by expert system software, which emulate human reasoning processes, could maintain the health of the subsystems and help reduce the human workload. The autonomous power expert (APEX) system was developed to emulate a human expert's reasoning processes used to diagnose fault conditions in the domain of space power distribution. APEX is a fault detection, isolation, and recovery (FDIR) system, capable of autonomous monitoring and control of the power distribution system. APEX consists of a knowledge base, a data base, an inference engine, and various support and interface software. APEX provides the user with an easy-to-use interactive interface. When a fault is detected, APEX will inform the user of the detection. The user can direct APEX to isolate the probable cause of the fault. Once a fault has been isolated, the user can ask APEX to justify its fault isolation and to recommend actions to correct the fault. APEX implementation and capabilities are discussed.

  2. Collaboration Between NASA Centers of Excellence on Autonomous System Software Development

    NASA Technical Reports Server (NTRS)

    Goodrich, Charles H.; Larson, William E.; Delgado, H. (Technical Monitor)

    2001-01-01

    Software for space systems flight operations has its roots in the early days of the space program, when computer systems were incapable of supporting highly complex and flexible control logic. Control systems relied on fast data acquisition and supervisory control from a roomful of systems engineers on the ground. Even though computer hardware and software have become many orders of magnitude more capable, space systems have largely adhered to this original paradigm. In an effort to break this mold, Kennedy Space Center (KSC) has invested in the development of model-based diagnosis and control applications for ten years, gaining broad experience in both ground and spacecraft systems and software. KSC has now partnered with Ames Research Center (ARC), NASA's Center of Excellence in Information Technology, to create a new paradigm for the control of dynamic space systems. ARC has developed model-based diagnosis and intelligent planning software that enables spacecraft to handle most routine problems automatically and allocate resources in a flexible way to realize mission objectives. ARC demonstrated the utility of onboard diagnosis and planning with an experiment aboard Deep Space 1 in 1999. This paper highlights the software control system collaboration between KSC and ARC. KSC has developed a Mars In-situ Resource Utilization testbed based on the Reverse Water Gas Shift (RWGS) reaction. This plant, built in KSC's Applied Chemistry Laboratory, is capable of producing the large amount of oxygen that would be needed to support a Human Mars Mission. KSC and ARC are cooperating to develop an autonomous, fault-tolerant control system for RWGS to meet the need for autonomy on deep space missions. The paper will also describe how the new system software paradigm will be applied to Vehicle Health Monitoring, tested on the new X vehicles, and integrated into future launch processing systems.

  3. Autonomic Computing for Spacecraft Ground Systems

    NASA Technical Reports Server (NTRS)

    Li, Zhenping; Savkli, Cetin; Jones, Lori

    2007-01-01

    Autonomic computing for spacecraft ground systems increases system reliability and reduces the cost of spacecraft operations and software maintenance. In this paper, we present an autonomic computing solution for spacecraft ground systems at NASA Goddard Space Flight Center (GSFC), which consists of an open standard for a message-oriented architecture referred to as the GMSEC (Goddard Mission Services Evolution Center) architecture, and an autonomic computing tool, the Criteria Action Table (CAT). This solution has been used in many upgraded ground systems for NASA's missions, and provides a framework for developing solutions with higher autonomic maturity.
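
    The Criteria Action Table concept (rules that match incoming message fields and trigger responses) can be illustrated with the hypothetical sketch below; the message fields, thresholds, and action names are invented and are not the GMSEC or CAT interfaces.

      # Hypothetical criteria/action table reacting to telemetry-style messages.
      RULES = [
          {"criteria": lambda msg: msg.get("battery_v", 99.0) < 24.0,
           "action": "page_operator"},
          {"criteria": lambda msg: msg.get("pass_status") == "complete",
           "action": "start_level0_processing"},
      ]

      def dispatch(message):
          """Return the actions whose criteria the incoming message satisfies."""
          return [rule["action"] for rule in RULES if rule["criteria"](message)]

      if __name__ == "__main__":
          print(dispatch({"battery_v": 22.5}))          # ['page_operator']
          print(dispatch({"pass_status": "complete"}))  # ['start_level0_processing']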

  4. Knowledge Acquisition for the Onboard Planner of an Autonomous Spacecraft

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Rajan, Kanna

    1997-01-01

    This paper discusses the knowledge acquisition issues involved in transitioning their novel technology into space flight software, developing the planner in the context of a large software project, and completing the work under a compressed development schedule.

  5. Physics Simulation Software for Autonomous Propellant Loading and Gas House Autonomous System Monitoring

    NASA Technical Reports Server (NTRS)

    Regalado Reyes, Bjorn Constant

    2015-01-01

    1. Kennedy Space Center (KSC) is developing a mobile launching system with autonomous propellant loading capabilities for liquid-fueled rockets. An autonomous system will be responsible for monitoring and controlling the storage, loading and transferring of cryogenic propellants. The Physics Simulation Software will reproduce the sensor data seen during the delivery of cryogenic fluids including valve positions, pressures, temperatures and flow rates. The simulator will provide insight into the functionality of the propellant systems and demonstrate the effects of potential faults. This will provide verification of the communications protocols and the autonomous system control. 2. The High Pressure Gas Facility (HPGF) stores and distributes hydrogen, nitrogen, helium and high pressure air. The hydrogen and nitrogen are stored in cryogenic liquid state. The cryogenic fluids pose several hazards to operators and the storage and transfer equipment. Constant monitoring of pressures, temperatures and flow rates is required in order to maintain the safety of personnel and equipment during the handling and storage of these commodities. The Gas House Autonomous System Monitoring software will be responsible for constantly observing and recording sensor data, identifying and predicting faults and relaying hazard and operational information to the operators.

  6. The Logical Extension

    NASA Technical Reports Server (NTRS)

    2003-01-01

    The same software controlling autonomous and crew-assisted operations for the International Space Station (ISS) is enabling commercial enterprises to integrate and automate manual operations, also known as decision logic, in real time across complex and disparate networked applications, databases, servers, and other devices, all with quantifiable business benefits. Auspice Corporation, of Framingham, Massachusetts, developed the Auspice TLX (The Logical Extension) software platform to effectively mimic the human decision-making process. Auspice TLX automates operations across extended enterprise systems, where any given infrastructure can include thousands of computers, servers, switches, and modems that are connected, and therefore, dependent upon each other. The concept behind the Auspice software spawned from a computer program originally developed in 1981 by Cambridge, Massachusetts-based Draper Laboratory for simulating tasks performed by astronauts aboard the Space Shuttle. At the time, the Space Shuttle Program was dependent upon paper-based procedures for its manned space missions, which typically averaged 2 weeks in duration. As the Shuttle Program progressed, NASA began increasing the length of manned missions in preparation for a more permanent space habitat. Acknowledging the need to relinquish paper-based procedures in favor of an electronic processing format to properly monitor and manage the complexities of these longer missions, NASA realized that Draper's task simulation software could be applied to its vision of year-round space occupancy. In 1992, Draper was awarded a NASA contract to build User Interface Language software to enable autonomous operations of a multitude of functions on Space Station Freedom (the station was redesigned in 1993 and converted into the international venture known today as the ISS).

  7. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George I.; Stetson, Howard K.

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739), and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.
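
    The core idea of the requirements tracker described above is a mapping from reported execution line numbers to SRS requirements. The hypothetical sketch below shows that mapping in miniature; the line numbers, requirement identifiers, and trace feed are invented stand-ins for the Timeliner-TLX line-number reporting.

      # Illustrative requirements tracker over a stream of executed line numbers.
      REQUIREMENT_MAP = {
          12: "SRS-101: open fill valve before starting transfer",
          27: "SRS-204: stop transfer on low-pressure alarm",
      }

      def trace(executed_lines):
          """Return the requirements exercised by the observed execution trace."""
          covered = {}
          for line in executed_lines:
              if line in REQUIREMENT_MAP and line not in covered:
                  covered[line] = REQUIREMENT_MAP[line]
          return covered

      if __name__ == "__main__":
          for line, requirement in trace([5, 12, 13, 27, 30]).items():
              print(f"line {line}: {requirement}")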

  8. Autonomous Real Time Requirements Tracing

    NASA Technical Reports Server (NTRS)

    Plattsmier, George; Stetson, Howard

    2014-01-01

    One of the more challenging aspects of software development is the ability to verify and validate the functional software requirements dictated by the Software Requirements Specification (SRS) and the Software Detail Design (SDD). Ensuring the software has achieved the intended requirements is the responsibility of the Software Quality team and the Software Test team. The utilization of Timeliner-TLX(TM) Auto-Procedures for relocating ground operations positions to ISS automated on-board operations has begun the transition that would be required for manned deep space missions with minimal crew requirements. This transition also moves the auto-procedures from the procedure realm into the flight software arena, and as such the operational requirements and testing will be more structured and rigorous. The auto-procedures would be required to meet NASA software standards as specified in the Software Safety Standard (NASA-STD-8719), the Software Engineering Requirements (NPR 7150), the Software Assurance Standard (NASA-STD-8739), and also the Human Rating Requirements (NPR-8705). The Autonomous Fluid Transfer System (AFTS) test-bed utilizes the Timeliner-TLX(TM) Language for development of autonomous command and control software. The Timeliner-TLX(TM) system has the unique feature of providing the current line of the statement in execution during real-time execution of the software. This execution line number internal reporting unlocks the capability of monitoring the execution autonomously by use of a companion Timeliner-TLX(TM) sequence, as the line number reporting is embedded inside the Timeliner-TLX(TM) execution engine. This negates I/O processing of this type of data, as the line number status of executing sequences is built in as a function reference. This paper will outline the design and capabilities of the AFTS Autonomous Requirements Tracker, which traces and logs SRS requirements as they are being met during real-time execution of the targeted system. It is envisioned that real-time requirements tracing will greatly assist the movement of auto-procedures to flight software, enhancing the software assurance of auto-procedures and also their acceptance as reliable commanders.

  9. Demonstration of Autonomous Rendezvous Technology (DART) Project Summary

    NASA Technical Reports Server (NTRS)

    Rumford, Timothy E.

    2003-01-01

    Since the 1960s, NASA has performed numerous rendezvous and docking missions. The common element of all US rendezvous and docking missions is that the spacecraft has always been piloted by astronauts. Only the Russian Space Program has developed and demonstrated an autonomous capability. The Demonstration of Autonomous Rendezvous Technology (DART) project, currently funded under NASA's Space Launch Initiative (SLI) Cycle I, provides a key step in establishing an autonomous rendezvous capability for the United States. DART's objective is to demonstrate, in space, the hardware and software necessary for autonomous rendezvous. Orbital Sciences Corporation intends to integrate an Advanced Video Guidance Sensor and Autonomous Rendezvous and Proximity Operations algorithms into a Pegasus upper stage in order to demonstrate the capability to autonomously rendezvous with a target currently in orbit. The DART mission will occur in April 2004. The launch site will be Vandenberg AFB and the launch vehicle will be a Pegasus XL equipped with a Hydrazine Auxiliary Propulsion System 4th stage. All mission objectives will be completed within a 24-hour period. The paper provides a summary of mission objectives, a mission overview, and a discussion of the design features of the chase and target vehicles.

  10. Autonomous Scheduling Requirements for Agile Cubesat Constellations in Earth Observation

    NASA Astrophysics Data System (ADS)

    Nag, S.; Li, A. S. X.; Kumar, S.

    2017-12-01

    Distributed Space Missions such as formation flight and constellations, are being recognized as important Earth Observation solutions to increase measurement samples over space and time. Cubesats are increasing in size (27U, 40 kg) with increasing capabilities to host imager payloads. Given the precise attitude control systems emerging commercially, Cubesats now have the ability to slew and capture images within short notice. Prior literature has demonstrated a modular framework that combines orbital mechanics, attitude control and scheduling optimization to plan the time-varying orientation of agile Cubesats in a constellation such that they maximize the number of observed images, within the constraints of hardware specs. Schedule optimization is performed on the ground autonomously, using dynamic programming with two levels of heuristics, verified and improved upon using mixed integer linear programming. Our algorithm-in-the-loop simulation applied to Landsat's use case, captured up to 161% more Landsat images than nadir-pointing sensors with the same field of view, on a 2-satellite constellation over a 12-hour simulation. In this paper, we will derive the requirements for the above algorithm to run onboard small satellites such that the constellation can make time-sensitive decisions to slew and capture images autonomously, without ground support. We will apply the above autonomous algorithm to a time critical use case - monitoring of precipitation and subsequent effects on floods, landslides and soil moisture, as quantified by the NASA Unified Weather Research and Forecasting Model. Since the latency between these event occurrences is quite low, they make a strong case for autonomous decisions among satellites in a constellation. The algorithm can be implemented in the Plan Execution Interchange Language - NASA's open source technology for automation, used to operate the International Space Station and LADEE's in flight software - enabling a controller-in-the-loop demonstration. The autonomy software can then be integrated with NASA's open source Core Flight Software, ported onto a Raspberry Pi 3.0 for a software-in-the-loop demonstration. Future use cases can be time critical events such as cloud movement, storms or other disasters, and in conjunction with other platforms in a Sensor Web.
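
    The scheduling problem sketched in this record (choose imaging opportunities to maximize value subject to slew constraints) lends itself to a small dynamic program. The toy version below is purely illustrative: a single satellite, a fixed minimum slew/settle time between captures, and made-up opportunity values, not the authors' two-level heuristic scheduler.

      # Toy dynamic program: pick imaging opportunities maximizing total value,
      # with at least `slew_s` seconds between consecutive captures.
      from functools import lru_cache

      def best_schedule_value(opportunities, slew_s=60):
          """opportunities: list of (time_s, value) tuples."""
          opps = sorted(opportunities)

          @lru_cache(maxsize=None)
          def solve(i, last_time):
              if i == len(opps):
                  return 0.0
              t, value = opps[i]
              skip = solve(i + 1, last_time)       # do not take this opportunity
              take = 0.0
              if last_time is None or t - last_time >= slew_s:
                  take = value + solve(i + 1, t)   # take it and advance the clock
              return max(skip, take)

          return solve(0, None)

      if __name__ == "__main__":
          opps = [(0, 3.0), (30, 5.0), (90, 4.0), (100, 2.0)]
          print(best_schedule_value(opps))  # -> 9.0 (capture at t=30 and t=90)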

  11. Autonomous Deep-Space Optical Navigation Project

    NASA Technical Reports Server (NTRS)

    D'Souza, Christopher

    2014-01-01

    This project will advance the Autonomous Deep-space navigation capability applied to the Autonomous Rendezvous and Docking (AR&D) Guidance, Navigation and Control (GNC) system by testing it on hardware, particularly in a flight processor, with a goal of limited testing in the Integrated Power, Avionics and Software (IPAS) facility with the ARCM (Asteroid Retrieval Crewed Mission) DRO (Distant Retrograde Orbit) Autonomous Rendezvous and Docking (AR&D) scenario. The technology to be harnessed is called 'optical flow', also known as 'visual odometry'. It is being matured in automotive and SLAM (Simultaneous Localization and Mapping) applications but has yet to be applied to spacecraft navigation. In light of the tremendous potential of this technique, we believe that NASA needs to design an optical navigation architecture that will use this technique and is flexible enough to be applicable to navigating around planetary bodies, such as asteroids.
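
    'Optical flow' / 'visual odometry' estimates the camera's own motion from how tracked features move between frames. The hypothetical fragment below conveys only the most basic flavor of that idea (averaging 2-D feature displacements); real systems estimate full six-degree-of-freedom motion with outlier rejection, and nothing here reflects the project's actual architecture.

      # Crude 2-D translation estimate from matched feature positions in two frames.
      def estimate_translation(features_prev, features_curr):
          """Average per-feature displacement between consecutive frames."""
          n = len(features_prev)
          dx = sum(c[0] - p[0] for p, c in zip(features_prev, features_curr)) / n
          dy = sum(c[1] - p[1] for p, c in zip(features_prev, features_curr)) / n
          return dx, dy

      if __name__ == "__main__":
          prev = [(10.0, 10.0), (50.0, 20.0), (30.0, 40.0)]
          curr = [(12.0, 11.0), (52.0, 21.0), (32.0, 41.0)]
          print(estimate_translation(prev, curr))  # -> (2.0, 1.0)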

  12. Autonomous docking system for space structures and satellites

    NASA Astrophysics Data System (ADS)

    Prasad, Guru; Tajudeen, Eddie; Spenser, James

    2005-05-01

    Aximetric proposes a Distributed Command and Control (C2) architecture for autonomous on-orbit assembly in space with our unique vision and sensor driven docking mechanism. Aximetric is currently working on IP-based distributed control strategies, docking/mating plate, alignment and latching mechanism, umbilical structure/cord designs, and hardware/software in a closed loop architecture for smart autonomous demonstration utilizing proven developments in sensor and docking technology. These technologies can be effectively applied to many transferring/conveying and on-orbit servicing applications to include the capturing and coupling of space bound vehicles and components. The autonomous system will be a "smart" system that will incorporate a vision system used for identifying, tracking, locating and mating the transferring device to the receiving device. A robustly designed coupler for the transfer of the fuel will be integrated. Advanced sealing technology will be utilized for isolation and purging of resulting cavities from the mating process and/or from the incorporation of other electrical and data acquisition devices used as part of the overall smart system.

  13. Development of an automated electrical power subsystem testbed for large spacecraft

    NASA Technical Reports Server (NTRS)

    Hall, David K.; Lollar, Louis F.

    1990-01-01

    The NASA Marshall Space Flight Center (MSFC) has developed two autonomous electrical power system breadboards. The first breadboard, the autonomously managed power system (AMPS), is a two-power-channel system featuring energy generation and storage and 24 kW of switchable loads, all under computer control. The second breadboard, the space station module/power management and distribution (SSM/PMAD) testbed, is a two-bus 120-Vdc model of the Space Station power subsystem featuring smart switchgear and multiple knowledge-based control systems. NASA/MSFC is combining these two breadboards to form a complete autonomous source-to-load power system called the large autonomous spacecraft electrical power system (LASEPS). LASEPS is a high-power, intelligent, physical electrical power system testbed which can be used to derive and test new power system control techniques, new power switching components, and new energy storage elements in a more accurate and realistic fashion. LASEPS has the potential to be interfaced with other spacecraft subsystem breadboards in order to simulate an entire space vehicle. The two individual systems, the combined systems (hardware and software), and the current and future uses of LASEPS are described.

  14. A New Look at NASA: Strategic Research In Information Technology

    NASA Technical Reports Server (NTRS)

    Alfano, David; Tu, Eugene (Technical Monitor)

    2002-01-01

    This viewgraph presentation provides information on research undertaken by NASA to facilitate the development of information technologies. Specific ideas covered here include: 1) Bio/nano technologies: biomolecular and nanoscale systems and tools for assembly and computing; 2) Evolvable hardware: autonomous self-improving, self-repairing hardware and software for survivable space systems in extreme environments; 3) High Confidence Software Technologies: formal methods, high-assurance software design, and program synthesis; 4) Intelligent Controls and Diagnostics: Next generation machine learning, adaptive control, and health management technologies; 5) Revolutionary computing: New computational models to increase capability and robustness to enable future NASA space missions.

  15. Situation Awareness of Onboard System Autonomy

    NASA Technical Reports Server (NTRS)

    Schreckenghost, Debra; Thronesbery, Carroll; Hudson, Mary Beth

    2005-01-01

    We have developed intelligent agent software for onboard system autonomy. Our approach is to provide control agents that automate crew and vehicle systems, and operations assistants that aid humans in working with these autonomous systems. We use the 3-Tier control architecture to develop the control agent software that automates system reconfiguration and routine fault management. We use the Distributed Collaboration and Interaction (DCI) System to develop the operations assistants that provide human services, including situation summarization, event notification, activity management, and support for manual commanding of autonomous systems. In this paper we describe how the operations assistants aid situation awareness of the autonomous control agents. We also describe our evaluation of the DCI System to support control engineers during a ground test at Johnson Space Center (JSC) of the Post Processing System (PPS) for regenerative water recovery.

  16. Systems Architecture for Fully Autonomous Space Missions

    NASA Technical Reports Server (NTRS)

    Esper, Jamie; Schnurr, R.; VanSteenberg, M.; Brumfield, Mark (Technical Monitor)

    2002-01-01

    The NASA Goddard Space Flight Center is working to develop a revolutionary new system architecture concept in support of fully autonomous missions. As part of GSFC's contribution to the New Millennium Program (NMP) Space Technology 7 Autonomy and on-Board Processing (ST7-A) Concept Definition Study, the system incorporates the latest commercial Internet and software development ideas and extends them into NASA ground and space segment architectures. The unique challenges facing the exploration of remote and inaccessible locales and the need to incorporate corresponding autonomy technologies within reasonable cost necessitate the re-thinking of traditional mission architectures. A measure of the resiliency of this architecture in its application to a broad range of future autonomy missions will depend on its effectiveness in leveraging from commercial tools developed for the personal computer and Internet markets. Specialized test stations and supporting software become a thing of the past as spacecraft take advantage of the extensive tools and research investments of billion-dollar commercial ventures. The projected improvements of the Internet and supporting infrastructure go hand-in-hand with market pressures that provide continuity in research. By taking advantage of consumer-oriented methods and processes, space-flight missions will continue to leverage on investments tailored to provide better services at reduced cost. The application of ground and space segment architectures each based on Local Area Networks (LAN), the use of personal computer-based operating systems, and the execution of activities and operations through a Wide Area Network (Internet) enable a revolution in spacecraft mission formulation, implementation, and flight operations. Hardware and software design, development, integration, test, and flight operations are all tied-in closely to a common thread that enables the smooth transitioning between program phases. The application of commercial software development techniques lays the foundation for delivery of product-oriented flight software modules and models. Software can then be readily applied to support the on-board autonomy required for mission self-management. An on-board intelligent system, based on advanced scripting languages, facilitates the mission autonomy required to offload ground system resources, and enables the spacecraft to manage itself safely through an efficient and effective process of reactive planning, science data acquisition, synthesis, and transmission to the ground. Autonomous ground systems in turn coordinate and support schedule contact times with the spacecraft. Specific autonomy software modules on-board include mission and science planners, instrument and subsystem control, and fault tolerance response software, all residing within a distributed computing environment supported through the flight LAN. Autonomy also requires the minimization of human intervention between users on the ground and the spacecraft, and hence calls for the elimination of the traditional operations control center as a funnel for data manipulation. Basic goal-oriented commands are sent directly from the user to the spacecraft through a distributed internet-based payload operations "center". The ensuing architecture calls for the use of spacecraft as point extensions on the Internet. This paper will detail the system architecture implementation chosen to enable cost-effective autonomous missions with applicability to a broad range of conditions.
It will define the structure needed for implementation of such missions, including software and hardware infrastructures. The overall architecture is then laid out as a common thread in the mission life cycle from formulation through implementation and flight operations.

  17. CCSDS File Delivery Protocol (CFDP): Why it's Useful and How it Works

    NASA Technical Reports Server (NTRS)

    Ray, Tim

    2003-01-01

    Reliable delivery of data products is often required across space links. For example, a NASA mission will require reliable delivery of images produced by an on-board detector. Many missions have their own (unique) way of accomplishing this, requiring custom software. Many missions also require manual operations (e.g. the telemetry receiver software keeps track of what data is missing, and a person manually inputs the appropriate commands to request retransmissions). The Consultative Committee for Space Data Systems (CCSDS) developed the CCSDS File Delivery Protocol (CFDP) specifically for this situation. CFDP is an international standard communication protocol that provides reliable delivery of data products. It is designed for use across space links. It will work well if run over the widely used CCSDS Telemetry and Telecommand protocols. However, it can be run over any protocol, and will work well as long as the underlying protocol delivers a reasonable portion of the data. The CFDP receiver will autonomously determine what data is missing, and request retransmissions as needed. The CFDP sender will autonomously perform the requested transmissions. When the entire data product is delivered, the CFDP receiver will let the CFDP sender know that the transaction has completed successfully. The result is that custom software becomes standard, and manual operations become autonomous. This paper will consider various ways of achieving reliable file delivery, explain why CFDP is the optimal choice for use over space links, explain how the core protocol works, and give some guidance on how to best utilize CFDP within various mission scenarios. It will also touch on additional features of CFDP, as well as other uses for CFDP (e.g. the loading of on-board memory and tables).
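
    The abstract's core loop (the receiver tracks which file segments arrived, autonomously requests retransmission of the gaps, and acknowledges completion) can be sketched roughly as follows. This is an illustrative Python sketch of that deferred-retransmission pattern, not the CCSDS reference implementation; the segment size and the lossy_link callable are assumptions made for the example.

      # Illustrative sketch of CFDP-style reliable file delivery (not CCSDS flight code).
      # The sender transmits fixed-size segments; the receiver records gaps, requests
      # retransmission of only the missing offsets, and reports completion.

      SEGMENT = 1024  # bytes per file segment (assumed value)

      def segments(data):
          """Yield (offset, chunk) pairs covering the whole file."""
          for off in range(0, len(data), SEGMENT):
              yield off, data[off:off + SEGMENT]

      class Receiver:
          def __init__(self, total_len):
              self.total_len = total_len
              self.chunks = {}

          def accept(self, off, chunk):
              self.chunks[off] = chunk

          def missing_offsets(self):
              """Autonomously determine which segments never arrived (the NAK list)."""
              return [off for off in range(0, self.total_len, SEGMENT)
                      if off not in self.chunks]

      def reliable_delivery(data, lossy_link):
          """Deliver data across a link that may drop segments; return True when complete."""
          rx = Receiver(len(data))
          pending = list(segments(data))            # first pass over the whole product
          while pending:
              for off, chunk in pending:
                  if lossy_link(off):               # link delivers "a reasonable portion"
                      rx.accept(off, chunk)
              # receiver NAKs the gaps; sender retransmits only those segments
              pending = [(off, data[off:off + SEGMENT]) for off in rx.missing_offsets()]
          return True                               # "Finished" indication to the sender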

  18. The HAL 9000 Space Operating System Real-Time Planning Engine Design and Operations Requirements

    NASA Technical Reports Server (NTRS)

    Stetson, Howard; Watson, Michael D.; Shaughnessy, Ray

    2012-01-01

    In support of future deep space manned missions, an autonomous/automated vehicle, providing crew autonomy and an autonomous response planning system, will be required due to the light time delays in communication. Vehicle capabilities as a whole must provide for tactical response to vehicle system failures and failures induced by space environmental effects, for risk mitigation of permanent loss of communication with Earth, and for assured crew return capabilities. The complexity of human-rated space systems and the limited crew sizes and crew skills mix drive the need for a robust autonomous capability on-board the vehicle. The HAL 9000 Space Operating System [2], designed for such missions and spacecraft, includes the first distributed real-time planning / re-planning system. This paper will detail the software architecture of the multiple planning engine system, and the interface design for plan changes, approval and implementation that is performed autonomously. Operations scenarios will be defined for analysis of the planning engines' operations and their requirements for nominal and off-nominal activities. An assessment of the distributed real-time re-planning system, in the defined operations environment, will be provided, as well as findings pertaining to the vehicle, crew, and mission control requirements needed for implementation.

  19. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schrenkenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low-level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  20. Autonomous Payload Operations Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Stetson, Howard K.; Deitsch, David K.; Cruzen, Craig A.; Haddock, Angie T.

    2007-01-01

    Operating the International Space Station (ISS) involves many complex crew-tended, ground-operated and combined systems. Over the life of the ISS program, it has become evident that having automated and autonomous systems on board allows more to be accomplished while reducing the workload of the crew and ground operators. Engineers at the National Aeronautics and Space Administration's (NASA) Marshall Space Flight Center in Huntsville, Alabama, working in collaboration with The Charles Stark Draper Laboratory, have developed an autonomous software system that uses the Timeliner User Interface Language and expert logic to continuously monitor ISS payload systems, issue commands and signal ground operators as required. This paper describes the development history of the system, its concept of operation and components. The paper also discusses the testing process as well as the facilities used to develop the system. The paper concludes with a description of future enhancement plans for use on the ISS as well as potential applications to Lunar and Mars exploration systems.
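
    The abstract does not describe the monitoring logic in detail, but the monitor-and-respond pattern it attributes to the Timeliner-based system can be sketched as follows, with Python standing in for Timeliner sequences. All telemetry names, limits, and command strings are hypothetical placeholders.

      # Rough sketch of a continuous payload-monitoring step in the spirit of the
      # system described above. Names, limits, and commands are invented.

      PAYLOAD_LIMITS = {"rack_temp_C": (0.0, 40.0), "rack_current_A": (0.0, 8.0)}

      def monitor_step(read_telemetry, send_command, notify_ground):
          """Check each telemetry point against its limits and respond if violated."""
          readings = read_telemetry()                      # e.g. {"rack_temp_C": 43.2, ...}
          for name, value in readings.items():
              lo, hi = PAYLOAD_LIMITS.get(name, (float("-inf"), float("inf")))
              if not (lo <= value <= hi):
                  send_command(f"SAFE_{name.upper()}")     # expert-logic response
                  notify_ground(f"{name} out of limits: {value}")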

  1. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this flexibility also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  2. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: (1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; and (2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  3. Simulation and Control Lab Development for Power and Energy Management for NASA Manned Deep Space Missions

    NASA Technical Reports Server (NTRS)

    McNelis, Anne M.; Beach, Raymond F.; Soeder, James F.; McNelis, Nancy B.; May, Ryan; Dever, Timothy P.; Trase, Larry

    2014-01-01

    The development of distributed hierarchical and agent-based control systems will allow for reliable autonomous energy management and power distribution for on-orbit missions. Power is one of the most critical systems on board a space vehicle, requiring quick response time when a fault or emergency is identified. As NASA's missions with human presence extend beyond low Earth orbit, autonomous control of vehicle power systems will be necessary and will need to function reliably for long periods of time. In the design of autonomous electrical power control systems there is a need to dynamically simulate and verify the EPS controller functionality prior to use on-orbit. This paper presents the work at NASA Glenn Research Center in Cleveland, Ohio, where the development of a controls laboratory is being completed that will be utilized to demonstrate advanced prototype EPS controllers for space, aeronautical and terrestrial applications. The control laboratory hardware and software, and the application of an autonomous controller for demonstration with the ISS electrical power system, are the subjects of this paper.

  4. VML 3.0 Reactive Sequencing Objects and Matrix Math Operations for Attitude Profiling

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher A.; Riedel, Joseph E.

    2012-01-01

    VML (Virtual Machine Language) has been used as the sequencing flight software on over a dozen JPL deep-space missions, most recently flying on GRAIL and JUNO. In conjunction with the NASA SBIR entitled "Reactive Rendezvous and Docking Sequencer", VML version 3.0 has been enhanced to include object-oriented element organization, built-in queuing operations, and sophisticated matrix / vector operations. These improvements allow VML scripts to easily perform much of the work that formerly would have required a great deal of expensive flight software development to realize. Autonomous turning and tracking makes considerable use of new VML features. Profiles generated by flight software are managed using object-oriented VML data constructs executed in discrete time by the VML flight software. VML vector and matrix operations provide the ability to calculate and supply quaternions to the attitude controller flight software which produces torque requests. Using VML-based attitude planning components eliminates flight software development effort, and reduces corresponding costs. In addition, the direct management of the quaternions allows turning and tracking to be tied in with sophisticated high-level VML state machines. These state machines provide autonomous management of spacecraft operations during critical tasks like a hypothetical Mars sample return rendezvous and docking. State machines created for autonomous science observations can also use this sort of attitude planning system, allowing heightened autonomy levels to reduce operations costs. VML state machines cannot be considered merely sequences - they are reactive logic constructs capable of autonomous decision making within a well-defined domain. The state machine approach enabled by VML 3.0 is progressing toward flight capability with a wide array of applicable mission activities.
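
    As an illustration of the kind of vector and matrix computation the abstract attributes to VML 3.0, the sketch below (written in Python rather than VML, with an assumed body axis and target direction) builds the quaternion that rotates a spacecraft body axis onto a commanded pointing direction; an attitude controller would then servo to that quaternion. Real turn-and-track profiling also enforces rate and acceleration limits, which are omitted here.

      import numpy as np

      def quat_from_two_vectors(body_axis, target_dir):
          """Quaternion (x, y, z, w) rotating body_axis onto target_dir.

          Illustrative only; not the VML implementation.
          """
          a = body_axis / np.linalg.norm(body_axis)
          b = target_dir / np.linalg.norm(target_dir)
          axis = np.cross(a, b)
          s = np.linalg.norm(axis)          # sin(angle)
          c = np.dot(a, b)                  # cos(angle)
          if s < 1e-12:
              # Degenerate case: aligned (identity) or antiparallel (crude 180 deg
              # about x; a general implementation would pick any axis perpendicular to a).
              return np.array([0.0, 0.0, 0.0, 1.0]) if c > 0 else np.array([1.0, 0.0, 0.0, 0.0])
          half = np.arctan2(s, c) / 2.0
          xyz = (axis / s) * np.sin(half)
          return np.array([xyz[0], xyz[1], xyz[2], np.cos(half)])

      # Example: point the +Z instrument boresight at a target line-of-sight vector.
      q_cmd = quat_from_two_vectors(np.array([0.0, 0.0, 1.0]),
                                    np.array([0.3, -0.4, 0.87]))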

  5. CrossTalk. The Journal of Defense Software Engineering. Volume 26, Number 1

    DTIC Science & Technology

    2013-02-01

    ANTS) mission that may be used to explore the asteroid belt. Basically, the mission entails 1,000 two-pound autonomous space vehicles that will be ... used to collect data from asteroids that will be periodically transmitted back to Earth. For autonomous operation, the ANTS will need to ... a priori information. In other words, these indicators are used to support any one of a number of situation assessments that have been predetermined

  6. NASA/NBS (National Aeronautics and Space Administration/National Bureau of Standards) standard reference model for telerobot control system architecture (NASREM)

    NASA Technical Reports Server (NTRS)

    Albus, James S.; Mccain, Harry G.; Lumia, Ronald

    1989-01-01

    The document describes the NASA Standard Reference Model (NASREM) Architecture for the Space Station Telerobot Control System. It defines the functional requirements and high-level specifications of the control system for the NASA Space Station Flight Telerobot Servicer, serving as a reference document for the functional specification, and as a guideline for the development of the control system architecture, of the IOC Flight Telerobot Servicer. The NASREM telerobot control system architecture defines a set of standard modules and interfaces which facilitate software design, development, validation, and test, and make possible the integration of telerobotics software from a wide variety of sources. Standard interfaces also provide the software hooks necessary to incrementally upgrade future Flight Telerobot Systems as new capabilities develop in computer science, robotics, and autonomous system control.

  7. RIACS Workshop on the Verification and Validation of Autonomous and Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles; Visser, Willem; Simmons, Reid

    2001-01-01

    The long-term future of space exploration at NASA is dependent on the full exploitation of autonomous and adaptive systems: careful monitoring of missions from Earth, as is the norm now, will be infeasible due to the sheer number of proposed missions and the communication lag for deep-space missions. Mission managers are, however, worried about the reliability of these more intelligent systems. The main focus of the workshop was to address these worries, and hence we invited NASA engineers working on autonomous and adaptive systems and researchers interested in the verification and validation (V&V) of software systems. The dual purpose of the meeting was to: (1) make NASA engineers aware of the V&V techniques they could be using; and (2) make the V&V community aware of the complexity of the systems NASA is developing.

  8. NASA's Swarm Missions: The Challenge of Building Autonomous Software

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Hinchey, Mike; Rash, James; Rouff, Christopher

    2004-01-01

    The days of watching a massive manned cylinder thrust spectacularly off a platform into space might rapidly become ancient history when the National Aeronautics and Space Administration (NASA) introduces its new millennium mission class. Motivated by the need to gather more data than is possible with a single spacecraft, scientists have developed a new class of missions based on the efficiency and cooperative nature of a hive culture. The missions, aptly dubbed nanoswarms, will be little more than mechanized colonies cooperating in their exploration of the solar system. Each swarm mission can have hundreds or even thousands of cooperating intelligent spacecraft that work in teams. The spacecraft must operate independently for long periods both in teams and individually, as well as have autonomic properties - self-healing, -configuring, -optimizing, and -protecting - to survive the harsh space environment. One swarm mission under concept development for 2020 to 2030 is the Autonomous Nano Technology Swarm (ANTS), in which a thousand picospacecraft, each weighing less than three pounds, will work cooperatively to explore the asteroid belt. Some spacecraft will form teams to catalog asteroid properties, such as mass, density, morphology, and chemical composition, using their respective miniature scientific instruments. Others will communicate with the data gatherers and send updates to mission elements on Earth. For software and systems development, this is uncharted territory that calls for revolutionary techniques.

  9. Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed

    DTIC Science & Technology

    2006-12-01

    Guidance and Navigation Software Architecture Design for the Autonomous Multi-Agent Physically Interacting Spacecraft (AMPHIS) Test Bed, by Blake D. Eikenberry. Approved for public release; distribution is unlimited.

  10. Sensor Webs: Autonomous Rapid Response to Monitor Transient Science Events

    NASA Technical Reports Server (NTRS)

    Mandl, Dan; Grosvenor, Sandra; Frye, Stu; Sherwood, Robert; Chien, Steve; Davies, Ashley; Cichy, Ben; Ingram, Mary Ann; Langley, John; Miranda, Felix

    2005-01-01

    To better understand how physical phenomena, such as volcanic eruptions, evolve over time, multiple sensor observations over the duration of the event are required. Using sensor web approaches that integrate original detections by in-situ sensors and global-coverage, lower-resolution, on-orbit assets with automated rapid response observations from high-resolution sensors, more observations of significant events can be made with increased temporal, spatial, and spectral resolution. This paper describes experiments using Earth Observing 1 (EO-1) along with other space and ground assets to implement progressive mission autonomy to identify, locate, and image phenomena such as wildfires, volcanoes, floods, and ice breakup with high-resolution instruments. The software that plans, schedules and controls the various satellite assets is used to form ad hoc constellations which enable collaborative autonomous image collections triggered by transient phenomena. This software is both flight- and ground-based, works in concert to run all of the required assets cohesively, and includes model-based artificial intelligence software.
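
    A minimal sketch of the trigger-and-retask pattern described above: a coarse detection in low-resolution data generates an urgent observation request for a high-resolution asset. The asset name, trigger threshold, and request format are invented for illustration.

      # Sketch of the detection-to-retasking pattern; all names and values are placeholders.

      HOTSPOT_THRESHOLD_K = 350.0   # hypothetical brightness-temperature trigger

      def check_low_res_scene(scene):
          """Return (lat, lon) trigger locations found in a coarse thermal scene."""
          return [(px["lat"], px["lon"]) for px in scene
                  if px["brightness_temp_K"] > HOTSPOT_THRESHOLD_K]

      def build_rapid_response_requests(scene, asset="high_res_imager"):
          """Turn each trigger into an observation request for a high-resolution asset."""
          return [{"asset": asset, "lat": lat, "lon": lon, "priority": "urgent"}
                  for lat, lon in check_low_res_scene(scene)]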

  11. Autonomous power system brassboard

    NASA Technical Reports Server (NTRS)

    Merolla, Anthony

    1992-01-01

    The Autonomous Power System (APS) brassboard is a 20 kHz power distribution system which has been developed at NASA Lewis Research Center, Cleveland, Ohio. The brassboard exists to provide a realistic hardware platform capable of testing artificially intelligent (AI) software. The brassboard's power circuit topology is based upon a Power Distribution Control Unit (PDCU), which is a subset of an advanced development 20 kHz electrical power system (EPS) testbed, originally designed for Space Station Freedom (SSF). The APS program is designed to demonstrate the application of intelligent software as a fault detection, isolation, and recovery methodology for space power systems. This report discusses both the hardware and software elements used to construct the present configuration of the brassboard. The brassboard power components are described. These include the solid-state switches (herein referred to as switchgear), transformers, sources, and loads. Closely linked to this power portion of the brassboard is the first level of embedded control. Hardware used to implement this control and its associated software is discussed. An Ada software program, developed by Lewis Research Center's Space Station Freedom Directorate for their 20 kHz testbed, is used to control the brassboard's switchgear, as well as monitor key brassboard parameters through sensors located within these switches. The Ada code is downloaded from a PC/AT, and is resident within the 8086 microprocessor-based embedded controllers. The PC/AT is also used for smart terminal emulation, capable of controlling the switchgear as well as displaying data from them. Intelligent control is provided through use of a TI Explorer and the Autonomous Power Expert (APEX) LISP software. Real-time load scheduling is implemented through use of a 'C' program-based scheduling engine. The methods of communication between these computers and the brassboard are explored. In order to evaluate the features of both the brassboard hardware and intelligent controlling software, fault circuits have been developed and integrated as part of the brassboard. A description of these fault circuits and their function is included. The brassboard has become an extremely useful test facility, promoting artificial intelligence (AI) applications for power distribution systems. However, there are elements of the brassboard which could be enhanced, thus improving system performance. Modifications and enhancements to improve the brassboard's operation are discussed.
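
    A much-simplified illustration of the fault detection, isolation, and recovery idea described above (not the APEX expert-system logic itself): monitor the switchgear current sensors and open the switch on any branch showing an overcurrent. Switch names and current limits are placeholders.

      # Simplified FDIR step for a power distribution brassboard; names and limits invented.

      SWITCH_CURRENT_LIMIT_A = {"SW1": 10.0, "SW2": 10.0, "SW3": 5.0}

      def fdir_step(read_switch_currents, open_switch, log_event):
          """Detect an overcurrent at any switchgear sensor and isolate that branch."""
          for switch, amps in read_switch_currents().items():
              if amps > SWITCH_CURRENT_LIMIT_A.get(switch, float("inf")):
                  open_switch(switch)                       # isolate the faulted branch
                  log_event(f"Overcurrent on {switch}: {amps:.1f} A; branch isolated")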

  12. Autonomous robot software development using simple software components

    NASA Astrophysics Data System (ADS)

    Burke, Thomas M.; Chung, Chan-Jin

    2004-10-01

    Developing software to control a sophisticated lane-following, obstacle-avoiding, autonomous robot can be demanding and beyond the capabilities of novice programmers - but it doesn't have to be. A creative software design utilizing only basic image processing and a little algebra has been employed to control the LTU-AISSIG autonomous robot - a contestant in the 2004 Intelligent Ground Vehicle Competition (IGVC). This paper presents a software design equivalent to that used during the IGVC, but with much of the complexity removed. The result is an autonomous robot software design that is robust, reliable, and can be implemented by programmers with a limited understanding of image processing. This design provides a solid basis for further work in autonomous robot software, as well as an interesting and achievable robotics project for students.
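
    A sketch of what a "basic image processing and a little algebra" lane-following step might look like, assuming a grayscale forward-camera image as a NumPy array. The threshold, gain, and sign convention are invented, and this is not the LTU-AISSIG code.

      import numpy as np

      def steering_from_image(gray, white_threshold=200, gain=0.01):
          """Return a steering command: negative steers left, positive steers right."""
          mask = gray > white_threshold              # bright pixels ~ painted lane lines
          ys, xs = np.nonzero(mask)
          if xs.size == 0:
              return 0.0                             # no lane lines seen; hold course
          lane_center = xs.mean()                    # centroid column of lane pixels
          image_center = gray.shape[1] / 2.0
          return gain * (lane_center - image_center) # proportional steer toward lane center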

  13. Interesting viewpoints to those who will put Ada into practice

    NASA Technical Reports Server (NTRS)

    Carlsson, Arne

    1986-01-01

    Ada will most probably be used as the programming language for computers in the NASA Space Station. It is reasonable to suppose that Ada will be used for at least the embedded computers, because the high software costs for these embedded computers were the reason why Ada activities were initiated about ten years ago. The on-board computers are designed for use in space applications, where maintenance by man is impossible. All manipulation of such computers has to be performed in an autonomous way or remotely with commands from the ground. In a manned Space Station some maintenance work can be performed by service people on board, but there are still a lot of applications which require autonomous computers, for example, vital Space Station functions and unmanned orbital transfer vehicles. Those aspects which have come out of the analysis of Ada characteristics, together with experience of the requirements for embedded on-board computers in space applications, are examined.

  14. Precise Image-Based Motion Estimation for Autonomous Small Body Exploration

    NASA Technical Reports Server (NTRS)

    Johnson, Andrew Edie; Matthies, Larry H.

    2000-01-01

    We have developed and tested a software algorithm that enables onboard autonomous motion estimation near small bodies using descent camera imagery and laser altimetry. Through simulation and testing, we have shown that visual feature tracking can decrease uncertainty in spacecraft motion to a level that makes landing on small, irregularly shaped, bodies feasible. Possible future work will include qualification of the algorithm as a flight experiment for the Deep Space 4/Champollion comet lander mission currently under study at the Jet Propulsion Laboratory.

  15. Design, Development and Testing of the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) Guidance, Navigation and Control System

    NASA Technical Reports Server (NTRS)

    Wagenknecht, J.; Fredrickson, S.; Manning, T.; Jones, B.

    2003-01-01

    Engineers at NASA Johnson Space Center have designed, developed, and tested a nanosatellite-class free-flyer intended for future external inspection and remote viewing of human spaceflight activities. The technology demonstration system, known as the Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam), has been integrated into the approximate form and function of a flight system. The primary focus has been to develop a system capable of providing external views of the International Space Station. The Mini AERCam system is spherical-shaped and less than eight inches in diameter. It has a full suite of guidance, navigation, and control hardware and software, and is equipped with two digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations. Tests have been performed in both a six degree-of-freedom closed-loop orbital simulation and on an air-bearing table. The Mini AERCam system can also be used as a test platform for evaluating algorithms and relative navigation for autonomous proximity operations and docking around the Space Shuttle Orbiter or the ISS.

  16. Geometry-Based Observability Metric

    NASA Technical Reports Server (NTRS)

    Eaton, Colin; Naasz, Bo

    2012-01-01

    The Satellite Servicing Capabilities Office (SSCO) is currently developing and testing Goddard's Natural Feature Image Recognition (GNFIR) software for autonomous rendezvous and docking missions. GNFIR has flight heritage and is still being developed and tailored for future missions with non-cooperative targets: (1) DEXTRE Pointing Package System on the International Space Station, (2) Relative Navigation System (RNS) on the Space Shuttle for the fourth Hubble Servicing Mission.

  17. Multiple Autonomous Discrete Event Controllers for Constellations

    NASA Technical Reports Server (NTRS)

    Esposito, Timothy C.

    2003-01-01

    The Multiple Autonomous Discrete Event Controllers for Constellations (MADECC) project is an effort within the National Aeronautics and Space Administration Goddard Space Flight Center's (NASA/GSFC) Information Systems Division to develop autonomous positioning and attitude control for constellation satellites. It will be accomplished using traditional control theory and advanced coordination algorithms developed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL). This capability will be demonstrated in the discrete event control test-bed located at JHU/APL. This project will be modeled for the Leonardo constellation mission, but is intended to be adaptable to any constellation mission. To develop a common software architecture, the controllers will only model very high-level responses. For instance, after determining that a maneuver must be made, the MADECC system will output a (Delta)V (velocity change) value. Lower level systems must then decide which thrusters to fire and for how long to achieve that (Delta)V.
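
    The division of responsibility described above can be illustrated as follows: the high-level controller outputs only a (Delta)V vector, and a lower layer selects a thruster and computes the burn duration from basic rocket algebra (burn time = mass x |Delta V| / thrust). The thruster table and spacecraft mass below are invented for the sketch.

      import numpy as np

      THRUSTERS = {            # body-frame thrust direction and force (N); hypothetical values
          "+X": (np.array([1.0, 0.0, 0.0]), 4.0),
          "-X": (np.array([-1.0, 0.0, 0.0]), 4.0),
          "+Y": (np.array([0.0, 1.0, 0.0]), 4.0),
          "-Y": (np.array([0.0, -1.0, 0.0]), 4.0),
      }

      def burn_plan(delta_v_mps, mass_kg=150.0):
          """Choose the best-aligned thruster and burn time for a commanded delta-V."""
          dv_mag = np.linalg.norm(delta_v_mps)
          if dv_mag == 0.0:
              return None, 0.0
          dv_hat = delta_v_mps / dv_mag
          name, (direction, force) = max(THRUSTERS.items(),
                                         key=lambda kv: float(np.dot(kv[1][0], dv_hat)))
          burn_time_s = mass_kg * dv_mag / force     # from F = m*a and a*t = |delta-V|
          return name, burn_time_s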

  18. Network, system, and status software enhancements for the autonomously managed electrical power system breadboard. Volume 1: Project summary

    NASA Technical Reports Server (NTRS)

    Mckee, James W.

    1990-01-01

    This volume (1 of 4) gives a summary of the original AMPS software system configuration, points out some of the problem areas in the original software design that this project is to address, and in the appendix collects all the bimonthly status reports. The purpose of AMPS is to provide a self-reliant system to control the generation and distribution of power in the space station. The software in the AMPS breadboard can be divided into three levels: the operating environment software, the protocol software, and the station specific software. This project deals only with the operating environment software and the protocol software. The present station specific software will not change except as necessary to conform to new data formats.

  19. Modular Software for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hartman, K. R.; Weidow, D. A.; Berry, D. L.; Oza, D. H.; Long, A. C.; Joyce, E.; Steger, W. L.

    1996-01-01

    The Goddard Space Flight Center Flight Dynamics and Mission Operations Divisions have jointly investigated the feasibility of engineering modular Global Positioning System (GPS) navigation software to support both real-time flight and ground postprocessing configurations. The goals of this effort are to define standard GPS data interfaces and to engineer standard, reusable navigation software components that can be used to build a broad range of GPS navigation support applications. The paper discusses the GPS modular software (GMOD) system and operations concepts, major requirements, candidate software architecture, feasibility assessment and recommended software interface standards. In addition, ongoing efforts to broaden the scope of the initial study and to develop modular software to support autonomous navigation using GPS are addressed.

  20. Autonomous Science Analysis with the New Millennium Program-Autonomous Sciencecraft Experiment

    NASA Astrophysics Data System (ADS)

    Doggett, T.; Davies, A. G.; Castano, R. A.; Baker, V. R.; Dohm, J. M.; Greeley, R.; Williams, K. K.; Chien, S.; Sherwood, R.

    2002-12-01

    The NASA New Millennium Program (NMP) is a testbed for new, high-risk technologies, including new software and hardware. The Autonomous Sciencecraft Experiment (ASE), which will fly on the Air Force Research Laboratory TechSat-21 mission in 2006, is one such NMP mission and is managed by the Jet Propulsion Laboratory, California Institute of Technology. TechSat-21 consists of three satellites, each equipped with an X-band Synthetic Aperture Radar (SAR), that will occupy a 13-day repeat-track Earth orbit. The main science objectives of ASE are to demonstrate that process-related change detection and feature identification can be conducted autonomously during space flight, leading to autonomous onboard retargeting of the spacecraft. This mission will observe transient geological and environmental processes using SAR. Examples of geologic processes that may be observed and investigated include active volcanism, the movement of sand dunes and transient features in desert environments, water flooding, and the formation and break-up of lake ice. Science software onboard the spacecraft will allow autonomous processing and formation of SAR images and extraction of scientific information. The subsequent analyses, performed on images formed onboard from the SAR data, will include feature identification using scalable feature "templates" for each target, change detection through comparison of current and archived images, and science discovery, a search for other features of interest in each image. This approach results in obtaining the same science return for a reduced amount of resource use (such as downlink) when compared to that from a mission operating without ASE technology. Redundant data is discarded. The science-driven goals of ASE will evolve during the ASE mission through onboard replanning software that can re-task satellite operations. If necessary, as a result of a discovery made autonomously by onboard science processing, existing observation sequences will be pre-empted to obtain data of potentially high scientific content. Flight validation of this software will enable radically different missions with significant onboard decision-making and novel science concepts (onboard decision making and selective data return). This work has been carried out at the Jet Propulsion Laboratory-California Institute of Technology, under contract to NASA.
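
    Of the onboard analyses listed above, the change-detection step is the simplest to sketch: difference the current image against an archived image of the same target and flag the scene if enough pixels changed, which could then pre-empt the observation plan. The thresholds below are invented, and this is not the ASE flight code.

      import numpy as np

      def change_detected(current, archived, pixel_delta=0.2, changed_fraction=0.05):
          """Return True if enough co-registered pixels differ from the archived scene."""
          diff = np.abs(current.astype(float) - archived.astype(float))
          fraction = np.mean(diff > pixel_delta)
          return fraction > changed_fraction

      # If a change is detected, onboard replanning could insert a follow-up observation, e.g.:
      # if change_detected(img_now, img_then): planner.request_observation(target, "high")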

  1. Space Station Freedom ECLSS: A step toward autonomous regenerative life support systems

    NASA Technical Reports Server (NTRS)

    Dewberry, Brandon S.

    1990-01-01

    The Environmental Control and Life Support System (ECLSS) is a Freedom Station distributed system with inherent applicability to extensive automation, primarily due to its comparatively long control system latencies. These allow longer contemplation times in which to form a more intelligent control strategy and to prevent and diagnose faults. The regenerative nature of the Space Station Freedom ECLSS will introduce closed-loop complexities never before encountered in life support systems. A study to determine ECLSS automation approaches has been completed. The ECLSS baseline software and system processes could be augmented with more advanced fault management and regenerative control systems for a more autonomous evolutionary system, as well as serving as a firm foundation for future regenerative life support systems. Emerging advanced software technology and tools can be successfully applied to fault management, but a fully automated life support system will require research and development of regenerative control systems and models. The baseline Environmental Control and Life Support System utilizes ground tests in development of batch chemical and microbial control processes. Long duration regenerative life support systems will require more active chemical and microbial feedback control systems which, in turn, will require advancements in regenerative life support models and tools. These models can be verified using ground and on-orbit life support test and operational data, and used in the engineering analysis of proposed intelligent instrumentation feedback and flexible process control technologies for future autonomous regenerative life support systems, including the evolutionary Space Station Freedom ECLSS.

  2. Metrics of a Paradigm for Intelligent Control

    NASA Technical Reports Server (NTRS)

    Hexmoor, Henry

    1999-01-01

    We present metrics for quantifying organizational structures of complex control systems intended for controlling long-lived robotic or other autonomous applications commonly found in space applications. Such advanced control systems are often called integration platforms or agent architectures. Reported metrics span concerns about time, resources, software engineering, and complexities in the world.

  3. Design and implementation of a compliant robot with force feedback and strategy planning software

    NASA Technical Reports Server (NTRS)

    Premack, T.; Strempek, F. M.; Solis, L. A.; Brodd, S. S.; Cutler, E. P.; Purves, L. R.

    1984-01-01

    Force-feedback robotics techniques are being developed for automated precision assembly and servicing of NASA space flight equipment. Design and implementation of a prototype robot which provides compliance and monitors forces is in progress. Computer software to specify assembly steps and make force-feedback adjustments during assembly has been coded and tested for three generically different precision mating problems. A model program demonstrates that a suitably autonomous robot can plan its own strategy.

  4. Space Telecommunications Radio System STRS Cognitive Radio

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Handler, Louis M.

    2013-01-01

    Radios today are evolving from awareness toward cognition. A software-defined radio (SDR) provides the most capability for integrating autonomic decision-making ability and allows the incremental evolution toward a cognitive radio. This cognitive radio technology will impact NASA space communications in areas such as spectrum utilization, interoperability, network operations, and radio resource management over a wide range of operating conditions. NASA's cognitive radio will build upon the infrastructure being developed by Space Telecommunication Radio System (STRS) SDR technology. This paper explores the feasibility of inserting cognitive capabilities in the NASA STRS architecture and the interfaces between the cognitive engine and the STRS radio. The STRS architecture defines methods that can inform the cognitive engine about the radio environment so that the cognitive engine can learn autonomously from experience, and take appropriate actions to adapt the radio operating characteristics and optimize performance.

  5. The Autonomous Sciencecraft and applications to future science missions

    NASA Astrophysics Data System (ADS)

    Chien, S.

    2006-05-01

    The Autonomous Sciencecraft Software has operated the Earth Observing One (EO-1) Mission for over 5000 science observations [Chien et al. 2005a]. This software enables onboard analysis of data to drive: 1. production of rapid alert and summary products, 2. data editing, and 3. informing subsequent observations. This methodology has been applied to more effectively study volcano, flooding, and cryosphere processes on Earth. In this talk we discuss how this software enables new paradigms for science missions and discuss the types of science phenomena that can now be more readily studied (e.g. dynamic investigations, large-scale searches for specific events). We also describe a range of Earth, Solar, and Space science applications under concept study for onboard autonomy. Finally, we describe ongoing work to link EO-1 with other spacecraft and in-situ sensor networks to enable a sensorweb for monitoring dynamic science events [Chien et al. 2005b]. S. Chien, R. Sherwood, D. Tran, B. Cichy, G. Rabideau, R. Castano, A. Davies, D. Mandl, S. Frye, B. Trout, S. Shulman, D. Boyer, "Using Autonomy Flight Software to Improve Science Return on Earth Observing One," Journal of Aerospace Computing, Information, & Communication, April 2005, AIAA. S. Chien, B. Cichy, A. Davies, D. Tran, G. Rabideau, R. Castano, R. Sherwood, D. Mandl, S. Frye, S. Shulman, J. Jones, S. Grosvenor, "An Autonomous Earth Observing Sensorweb," IEEE Intelligent Systems, May-June 2005, pp. 16-24.

  6. Lean Development with the Morpheus Simulation Software

    NASA Technical Reports Server (NTRS)

    Brogley, Aaron C.

    2013-01-01

    The Morpheus project is an autonomous robotic testbed currently in development at NASA's Johnson Space Center (JSC) with support from other centers. Its primary objectives are to test new 'green' fuel propulsion systems and to demonstrate the capability of the Autonomous Lander Hazard Avoidance Technology (ALHAT) sensor, provided by the Jet Propulsion Laboratory (JPL), on a lunar landing trajectory. If successful, these technologies and lessons learned from the Morpheus testing cycle may be incorporated into a landing descent vehicle used on the Moon, an asteroid, or Mars. In an effort to reduce development costs and cycle time, the project employs lean development engineering practices in its development of flight and simulation software. The Morpheus simulation makes use of existing software packages where possible to reduce the development time. The development and testing of flight software occurs primarily through frequent test operation of the vehicle and incrementally increasing the scope of each test. With rapid development cycles, loss of the vehicle or of the mission is a real risk, but efficient progress in development would not be possible without accepting that risk.

  7. Autonomous Aerobraking: A Design, Development, and Feasibility Study

    NASA Technical Reports Server (NTRS)

    Prince, Jill L. H.; Powell, Richard W.; Murri, Dan

    2011-01-01

    Aerobraking has been used four times to decrease the apoapsis of a spacecraft in a captured orbit around a planetary body with a significant atmosphere, using atmospheric drag to decelerate the spacecraft. While aerobraking requires minimal fuel, the long time required for aerobraking demands both a large operations staff and large Deep Space Network resources. A study to automate aerobraking has been sponsored by the NASA Engineering and Safety Center to determine the initial feasibility of equipping a spacecraft with the onboard capability for autonomous aerobraking, thus saving millions of dollars incurred by a large aerobraking operations workforce and continuous DSN coverage. This paper describes the need for autonomous aerobraking, the development of the Autonomous Aerobraking Development Software that includes an ephemeris estimator, an atmospheric density estimator, and maneuver calculation, and the plan forward for continuation of this study.

  8. Implementation and Simulation Results using Autonomous Aerobraking Development Software

    NASA Technical Reports Server (NTRS)

    Maddock, Robert W.; DwyerCianciolo, Alicia M.; Bowes, Angela; Prince, Jill L. H.; Powell, Richard W.

    2011-01-01

    An Autonomous Aerobraking software system is currently under development with support from the NASA Engineering and Safety Center (NESC) that would move typically ground-based operations functions onboard an aerobraking spacecraft, reducing mission risk and mission cost. The suite of software that will enable autonomous aerobraking is the Autonomous Aerobraking Development Software (AADS) and consists of an ephemeris model, onboard atmosphere estimator, temperature and loads prediction, and a maneuver calculation. The software calculates the maneuver time, magnitude and direction commands to maintain the spacecraft periapsis parameters within design structural load and/or thermal constraints. The AADS is currently being tested in simulations at Mars, with plans also to evaluate feasibility and performance at Venus and Titan.
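
    A greatly simplified illustration of the corridor-control idea behind the maneuver calculation (not the AADS algorithms): if the predicted peak heating for the next drag pass falls outside an allowed corridor, command a small apoapsis maneuver to raise or lower periapsis. All limits, the gain, and the heating prediction are invented placeholders.

      # Simplified aerobraking corridor check; positive delta-V raises periapsis (less heating).

      HEAT_RATE_MAX = 0.25       # W/cm^2, upper corridor bound (hypothetical)
      HEAT_RATE_MIN = 0.10       # W/cm^2, lower corridor bound (hypothetical)
      GAIN_MPS_PER_WCM2 = 2.0    # maneuver size per unit of corridor violation (hypothetical)

      def corridor_maneuver(predicted_peak_heat_rate):
          """Return a signed apoapsis delta-V (m/s) for the next drag pass."""
          if predicted_peak_heat_rate > HEAT_RATE_MAX:
              return GAIN_MPS_PER_WCM2 * (predicted_peak_heat_rate - HEAT_RATE_MAX)
          if predicted_peak_heat_rate < HEAT_RATE_MIN:
              return -GAIN_MPS_PER_WCM2 * (HEAT_RATE_MIN - predicted_peak_heat_rate)
          return 0.0   # within corridor; no maneuver this orbit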

  9. HyspIRI Intelligent Payload Module(IPM) and Benchmarking Algorithms for Upload

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel

    2010-01-01

    Features: Hardware: (a) Xilinx Virtex-5 (GSFC SpaceCube 2); (b) 2 x 400 MHz PPC; (c) 100 MHz bus; (d) 2 x 512 MB SDRAM; (e) dual Gigabit Ethernet. Supports Linux kernel 2.6.31 (gcc version 4.2.2). Supports software running in stand-alone mode for better performance. Can stream raw data up to 800 Mbps. Ready for operations. Software application examples: band-stripping algorithms (cloud, sulfur, flood, thermal, SWIL, NDVI, NDWI, SIWI, oil spills, algae blooms, etc.); corrections (geometric, radiometric, atmospheric); Core Flight System / dynamic software bus; CCSDS File Delivery Protocol; Delay Tolerant Network; CASPER onboard planning; fault monitoring/recovery software; S/C command and telemetry software; data compression; Sensor Web for Autonomous Mission Operations.
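
    One of the band-stripping products named above, NDVI, is simple enough to show directly: it is computed per pixel from co-registered red and near-infrared bands using the standard formula (NIR - Red) / (NIR + Red). The band arrays below stand in for the instrument data stream.

      import numpy as np

      def ndvi(red, nir, eps=1e-6):
          """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
          red = red.astype(float)
          nir = nir.astype(float)
          return (nir - red) / (nir + red + eps)   # eps guards against divide-by-zero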

  10. Space Station man-machine automation trade-off analysis

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Bard, J.; Feinberg, A.

    1985-01-01

    The man-machine automation tradeoff methodology presented is one of four research tasks comprising the autonomous spacecraft system technology (ASST) project. ASST was established to identify and study system-level design problems for autonomous spacecraft. Using the Space Station as an example spacecraft system requiring a certain level of autonomous control, a system-level, man-machine automation tradeoff methodology is presented that: (1) optimizes man-machine mixes for different ground and on-orbit crew functions subject to cost, safety, weight, power, and reliability constraints, and (2) plots the best incorporation plan for new, emerging technologies by weighing cost, relative availability, reliability, safety, importance to out-year missions, and ease of retrofit. Although the methodology takes a fairly straightforward approach to valuing human productivity, it is still sensitive to the important subtleties associated with designing a well-integrated man-machine system. These subtleties include considerations such as crew preference to retain certain spacecraft control functions, or valuing human integration/decision capabilities over equivalent hardware/software where appropriate.

  11. A Space Station robot walker and its shared control software

    NASA Technical Reports Server (NTRS)

    Xu, Yangsheng; Brown, Ben; Aoki, Shigeru; Yoshida, Tetsuji

    1994-01-01

    In this paper, we first briefly review the updated self-mobile space manipulator (SMSM) configuration and testbed. The new robot is capable of projecting cameras anywhere on the interior or exterior of Space Station Freedom (SSF), and will be an ideal tool for inspecting connectors, structures, and other facilities on SSF. Experiments have been performed under two gravity compensation systems and a full-scale model of a segment of SSF. This paper presents a real-time shared control architecture that enables the robot to coordinate autonomous locomotion and teleoperation input for reliable walking on SSF. Autonomous locomotion can be executed based on a CAD model and off-line trajectory planning, or can be guided by a vision system with neural network identification. Teleoperation control can be specified by a real-time graphical interface and a free-flying hand controller. SMSM will be a valuable assistant for astronauts in inspection and other EVA missions.

  12. Overview of computational control research at UT Austin

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    1989-01-01

    An overview of current research activities at UT Austin is presented to discuss certain technical issues in the following areas: (1) Computer-Aided Nonlinear Control Design: In this project, the describing function method is employed for the nonlinear control analysis and design of a flexible spacecraft equipped with pulse-modulated reaction jets. The INCA program has been enhanced to allow the numerical calculation of describing functions as well as the nonlinear limit cycle analysis capability in the frequency domain; (2) Robust Linear Quadratic Gaussian (LQG) Compensator Synthesis: Robust control design techniques and software tools are developed for flexible space structures with parameter uncertainty. In particular, an interactive, robust multivariable control design capability is being developed for the INCA program; and (3) LQR-Based Autonomous Control System for the Space Station: In this project, real-time implementation of an LQR-based autonomous control system is investigated for the space station with time-varying inertias and with significant multibody dynamic interactions.

  13. A navigation and control system for an autonomous rescue vehicle in the space station environment

    NASA Technical Reports Server (NTRS)

    Merkel, Lawrence

    1991-01-01

    A navigation and control system was designed and implemented for an orbital autonomous rescue vehicle envisioned to retrieve astronauts or equipment in the case that they become disengaged from the space station. The rescue vehicle, termed the Extra-Vehicular Activity Retriever (EVAR), has an on-board inertial measurement unit and GPS receivers for self-state estimation, a laser range imager (LRI) and cameras for object state estimation, and a data link for reception of space station state information. The states of the retriever and objects (obstacles and the target object) are estimated by inertial state propagation which is corrected via measurements from the GPS, the LRI system, or the camera system. Kalman filters are utilized to perform sensor fusion and estimate the state propagation errors. Control actuation is performed by a Manned Maneuvering Unit (MMU). Phase plane control techniques are used to control the rotational and translational state of the retriever. The translational controller provides station-keeping or motion along either Clohessy-Wiltshire trajectories or straight-line trajectories in the LVLH frame of any sufficiently observed object or of the space station. The software was used to successfully control a prototype EVAR on an air bearing floor facility, and a simulated EVAR operating in a simulated orbital environment. The design of the navigation system and the control system are presented. Also discussed are the hardware systems and the overall software architecture.
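
    The Clohessy-Wiltshire trajectories mentioned above have a standard closed-form solution for relative motion about a circular reference orbit, which a translational controller of this kind can use to plan approaches. The sketch below is that textbook solution (x radial, y along-track, z cross-track, n the target's mean motion in rad/s), not the EVAR flight code.

      import numpy as np

      def cw_propagate(r0, v0, n, t):
          """Propagate relative position r0 (m) and velocity v0 (m/s) by t seconds."""
          x0, y0, z0 = r0
          vx0, vy0, vz0 = v0
          s, c = np.sin(n * t), np.cos(n * t)
          # Closed-form Clohessy-Wiltshire (Hill) solution for a circular reference orbit.
          x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
          y = 6 * (s - n * t) * x0 + y0 - (2 / n) * (1 - c) * vx0 + (1 / n) * (4 * s - 3 * n * t) * vy0
          z = z0 * c + (vz0 / n) * s
          vx = 3 * n * x0 * s + vx0 * c + 2 * vy0 * s
          vy = 6 * n * x0 * (c - 1) - 2 * vx0 * s + vy0 * (4 * c - 3)
          vz = -z0 * n * s + vz0 * c
          return np.array([x, y, z]), np.array([vx, vy, vz])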

  14. Spacecube: A Family of Reconfigurable Hybrid On-Board Science Data Processors

    NASA Technical Reports Server (NTRS)

    Flatley, Thomas P.

    2015-01-01

    SpaceCube is a family of Field Programmable Gate Array (FPGA) based on-board science data processing systems developed at the NASA Goddard Space Flight Center (GSFC). The goal of the SpaceCube program is to provide 10x to 100x improvements in on-board computing power while lowering relative power consumption and cost. SpaceCube is based on the Xilinx Virtex family of FPGAs, which include processor, FPGA logic and digital signal processing (DSP) resources. These processing elements are leveraged to produce a hybrid science data processing platform that accelerates the execution of algorithms by distributing computational functions to the most suitable elements. This approach enables the implementation of complex on-board functions that were previously limited to ground based systems, such as on-board product generation, data reduction, calibration, classification, event/feature detection, data mining and real-time autonomous operations. The system is fully reconfigurable in flight, including data parameters, software and FPGA logic, through either ground commanding or autonomously in response to detected events/features in the instrument data stream.

  15. The Jet Propulsion Laboratory shared control architecture and implementation

    NASA Technical Reports Server (NTRS)

    Backes, Paul G.; Hayati, Samad

    1990-01-01

    A hardware and software environment for shared control of telerobot task execution has been implemented. Modes of task execution range from fully teleoperated to fully autonomous, as well as shared, where hand controller inputs from the human operator are mixed with autonomous system inputs in real time. The objective of the shared control environment is to aid the telerobot operator during task execution by merging real-time operator control from hand controllers with autonomous control to simplify task execution for the operator. The operator is the principal command source and can assign as much autonomy for a task as desired. The shared control hardware environment consists of two PUMA 560 robots, two 6-axis force reflecting hand controllers, Universal Motor Controllers for each of the robots and hand controllers, a SUN4 computer, and VME chassis containing 68020 processors and input/output boards. The operator interface for shared control, the User Macro Interface (UMI), is a menu-driven interface to design a task and assign the levels of teleoperated and autonomous control. The operator also sets up the system monitor which checks safety limits during task execution. Cartesian-space degrees of freedom for teleoperated and/or autonomous control inputs are selected within UMI, as well as the weightings for the teleoperation and autonomous inputs. These are then used during task execution to determine the mix of teleoperation and autonomous inputs. Some of the autonomous control primitives available to the user are Joint-Guarded-Move, Cartesian-Guarded-Move, Move-To-Touch, Pin-Insertion/Removal, Door/Crank-Turn, Bolt-Turn, and Slide. The operator can execute a task using pure teleoperation or mix control execution from the autonomous primitives with teleoperated inputs. Presently the shared control environment supports single-arm task execution. Work is underway to provide the shared control environment for dual-arm control. Teleoperation during shared control is only Cartesian space control and no force-reflection is provided. Force-reflecting teleoperation and joint space operator inputs are planned extensions to the environment.
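
    The per-axis mixing of hand-controller and autonomous Cartesian commands described above can be sketched as a simple weighted blend. The weights, axis ordering, and example values below are invented for illustration; in the real system they are assigned through the UMI.

      import numpy as np

      def shared_command(teleop_vel, auto_vel, teleop_weight):
          """Blend two 6-DOF Cartesian velocity commands axis by axis.

          teleop_weight is a length-6 array in [0, 1]; 1.0 = pure teleoperation,
          0.0 = pure autonomy for that degree of freedom.
          """
          w = np.clip(np.asarray(teleop_weight, dtype=float), 0.0, 1.0)
          return w * np.asarray(teleop_vel) + (1.0 - w) * np.asarray(auto_vel)

      # Example: operator controls translation while autonomy controls orientation.
      cmd = shared_command(teleop_vel=[0.02, 0.0, 0.01, 0.0, 0.0, 0.0],
                           auto_vel=[0.0, 0.0, 0.0, 0.001, 0.0, -0.002],
                           teleop_weight=[1, 1, 1, 0, 0, 0])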

  16. Multi-Agent Diagnosis and Control of an Air Revitalization System for Life Support in Space

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Kowing, Jeffrey; Nieten, Joseph; Graham, Jeffrey S.; Schreckenghost, Debra; Bonasso, Pete; Fleming, Land D.; MacMahon, Matt; Thronesbery, Carroll

    2000-01-01

    An architecture of interoperating agents has been developed to provide control and fault management for advanced life support systems in space. In this adjustable autonomy architecture, software agents coordinate with human agents and provide support in novel fault management situations. This architecture combines the Livingstone model-based mode identification and reconfiguration (MIR) system with the 3T architecture for autonomous flexible command and control. The MIR software agent performs model-based state identification and diagnosis. MIR identifies novel recovery configurations and the set of commands required for the recovery. The 3T procedural executive and the human operator use the diagnoses and recovery recommendations, and provide command sequencing. User interface extensions have been developed to support human monitoring of both 3T and MIR data and activities. This architecture has been demonstrated performing control and fault management for an oxygen production system for air revitalization in space. The software operates in a dynamic simulation testbed.

  17. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Astrophysics Data System (ADS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honour of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  18. NASA's Optical Program on Ascension Island: Bringing MCAT to Life as the Eugene Stansbery-Meter Class Autonomous Telescope (ES-MCAT)

    NASA Technical Reports Server (NTRS)

    Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.

    2017-01-01

    In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).

  19. Space Transportation Avionics Technology Symposium. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The focus of the symposium was to examine existing and planned avionics technology processes and products and to recommend necessary changes for strengthening priorities and program emphases. Innovative changes in avionics technology development and design processes, identified during the symposium, are needed to support the increasingly complex, multi-vehicle, integrated, autonomous space-based systems. Key technology advances make such a major initiative viable at this time: digital processing capabilities, integrated on-board test/checkout methods, easily reconfigurable laboratories, and software design and production techniques.

  20. Space Transportation Avionics Technology Symposium. Volume 2: Conference Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The focus of the symposium was to examine existing and planned avionics technology processes and products and to recommend necessary changes for strengthening priorities and program emphases. Innovative changes in avionics technology development and design processes are needed to support the increasingly complex, multi-vehicle, integrated, autonomous space-based systems. Key technology advances make such a major initiative viable at this time: digital processing capabilities, integrated on-board test/checkout methods, easily reconfigurable laboratories, and software design and production techniques.

  1. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  2. ANTS: Applying A New Paradigm for Lunar and Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Clark, P. E.; Curtis, S. A.; Rilee, M. L.

    2002-01-01

    ANTS (Autonomous Nano-Technology Swarm), a mission architecture consisting of a large (1000 member) swarm of picoclass (1 kg) totally autonomous spacecraft with both adaptable and evolvable heuristic systems, is being developed as a NASA advanced mission concept, and is here examined as a paradigm for lunar surface exploration. As the capacity and complexity of hardware and software, demands for bandwidth, and the sophistication of goals for lunar and planetary exploration have increased, greater cost constraints have led to fewer resources and thus the need to operate spacecraft with less frequent human contact. At present, autonomous operation of spacecraft systems allows spacecraft to 'safe' themselves and survive when conditions threaten spacecraft safety. To further develop spacecraft capability, NASA is at the forefront of development of new mission architectures which involve the use of Intelligent Software Agents (ISAs), performing experiments in space and on the ground to advance deliberative and collaborative autonomous control techniques. Selected missions in current planning stages require small groups of spacecraft weighing tens, instead of hundreds, of kilograms to cooperate at a tactical level to select and schedule measurements to be made by appropriate instruments onboard. Such missions will be characterizing rapidly unfolding real-time events on a routine basis. The next level of development, which we are considering here, is the use of autonomous systems at the strategic level, to explore remote terranes, potentially involving large surveys or detailed reconnaissance.

  3. Computer science: Key to a space program renaissance. The 1981 NASA/ASEE summer study on the use of computer science and technology in NASA. Volume 2: Appendices

    NASA Technical Reports Server (NTRS)

    Freitas, R. A., Jr. (Editor); Carlson, P. A. (Editor)

    1983-01-01

    Adoption of an aggressive computer science research and technology program within NASA will: (1) enable new mission capabilities such as autonomous spacecraft, reliability and self-repair, and low-bandwidth intelligent Earth sensing; (2) lower manpower requirements, especially in the areas of Space Shuttle operations, by making fuller use of control center automation, technical support, and internal utilization of state-of-the-art computer techniques; (3) reduce project costs via improved software verification, software engineering, enhanced scientist/engineer productivity, and increased managerial effectiveness; and (4) significantly improve internal operations within NASA with electronic mail, managerial computer aids, an automated bureaucracy and uniform program operating plans.

  4. Perception, planning, and control for walking on rugged terrain

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Krotkov, Eric

    1991-01-01

    The CMU Planetary Rover project is developing a six-legged walking robot capable of autonomously navigating, exploring, and acquiring samples in rugged, unknown environments. To gain experience with the problems involved in walking on rugged terrain, a full-scale prototype leg was built and mounted on a carriage that rolls along overhead rails. Issues addressed in developing the software system to autonomously walk the leg through rugged terrain are described. In particular, the insights gained into perceiving and modeling rugged terrain, controlling the legged mechanism, interacting with the ground, choosing safe yet effective footfalls, and planning efficient leg moves through space are described.

  5. Adaptive Tunable Laser Spectrometer for Space Applications

    NASA Technical Reports Server (NTRS)

    Flesch, Gregory; Keymeulen, Didier

    2010-01-01

    An architecture and process for the rapid prototyping and subsequent development of an adaptive tunable laser absorption spectrometer (TLS) are described. Our digital hardware/firmware/software platform is both reconfigurable at design time as well as autonomously adaptive in real-time for both post-integration and post-launch situations. The design expands the range of viable target environments and enhances tunable laser spectrometer performance in extreme and even unpredictable environments. Through rapid prototyping with a commercial RTOS/FPGA platform, we have implemented a fully operational tunable laser spectrometer (using a highly sensitive second harmonic technique). With this prototype, we have demonstrated autonomous real-time adaptivity in the lab with simulated extreme environments.

  6. Intelligent Systems Technologies for Ops

    NASA Technical Reports Server (NTRS)

    Smith, Ernest E.; Korsmeyer, David J.

    2012-01-01

    As NASA supports International Space Station assembly complete operations through 2020 (or later) and prepares for future human exploration programs, there is additional emphasis in the manned spaceflight program on finding more efficient and effective ways of providing ground-based mission support. Since 2006 this search for improvement has led to a significant cross-fertilization between the NASA advanced software development community and the manned spaceflight operations community. A variety of mission operations systems and tools have been developed over the past decades as NASA has operated the Mars robotic missions, the Space Shuttle, and the International Space Station. NASA Ames Research Center has been developing and applying its advanced intelligent systems research to mission operations tools for unmanned Mars missions since 2001 and for manned operations with NASA Johnson Space Center since 2006. In particular, the fundamental advanced software development work under the Exploration Technology Program, and the experience and capabilities developed for mission operations systems for the Mars surface missions (Spirit/Opportunity, Phoenix Lander, and MSL), have enhanced the development and application of advanced mission operation systems for the International Space Station and future spacecraft. This paper provides an update on the status of the development and deployment of a variety of intelligent systems technologies adopted for manned mission operations, and some discussion of the planned work for Autonomous Mission Operations in future human exploration. We discuss several specific projects between the Ames Research Center and the Johnson Space Center's Mission Operations Directorate, how these technologies and projects are enhancing mission operations support for the International Space Station, and how they support the current Autonomous Mission Operations Project for future human exploration programs.

  7. OAST Space Theme Workshop. Volume 3: Working group summary. 4: Software (E-4). A. Summary. B. Technology needs (form 1). C. Priority assessment (form 2)

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability; to increased system integrity and autonomy; and to improved communications among the program users, the program developers, and the programs themselves. There is a need for quantum improvement in software development methods and for increasing the awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.

  8. Experimenting with an Evolving Ground/Space-based Software Architecture to Enable Sensor Webs

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Frye, Stuart

    2005-01-01

    A series of ongoing experiments is being conducted at the NASA Goddard Space Flight Center to explore integrated ground and space-based software architectures enabling sensor webs. A sensor web, as defined by Steve Talabac at NASA Goddard Space Flight Center (GSFC), is a coherent set of distributed nodes interconnected by a communications fabric, that collectively behave as a single, dynamically adaptive, observing system. The nodes can be comprised of satellites, ground instruments, computing nodes, etc. Sensor web capability requires autonomous management of constellation resources. This becomes progressively more important as more and more satellites share resources, such as communication channels and ground stations, while automatically coordinating their activities. There have been five ongoing activities, which include an effort to standardize a set of middleware. This paper will describe one set of activities using the Earth Observing 1 satellite, which used a variety of ground and flight software along with other satellites and ground sensors to prototype a sensor web. This activity allowed us to explore the difficulties that occur in the assembly of sensor webs given today's technology. We will present an overview of the software system architecture, some key experiments, and lessons learned to facilitate better sensor webs in the future.

  9. Lessons Learned from Autonomous Sciencecraft Experiment

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Sherwood, Rob; Tran, Daniel; Cichy, Benjamin; Rabideau, Gregg; Castano, Rebecca; Davies, Ashley; Mandl, Dan; Frye, Stuart; Trout, Bruce

    2005-01-01

    An Autonomous Science Agent has been flying onboard the Earth Observing One spacecraft since 2003. This software enables the spacecraft to autonomously detect and respond to science events occurring on the Earth, such as volcanoes, flooding, and snow melt. The package includes AI-based software systems that perform science data analysis, deliberative planning, and run-time robust execution. This software is in routine use to fly the EO-1 mission. In this paper we briefly review the agent architecture and discuss lessons learned from this multi-year flight effort pertinent to deployment of software agents to critical applications.

  10. AMO EXPRESS: A Command and Control Experiment for Crew Autonomy Onboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Stetson, Howard K.; Frank, Jeremy; Cornelius, Randy; Haddock, Angie; Wang, Lui; Garner, Larry

    2015-01-01

    NASA is investigating a range of future human spaceflight missions, including both Mars-distance and Near Earth Object (NEO) targets. Of significant importance for these missions is the balance between crew autonomy and vehicle automation. As distance from Earth results in increasing communication delays, future crews need both the capability and authority to independently make decisions. However, small crews cannot take on all functions performed by ground today, and so vehicles must be more automated to reduce the crew workload for such missions. NASA's Advanced Exploration Systems Program-funded Autonomous Mission Operations (AMO) project conducted an autonomous command and control experiment on-board the International Space Station that demonstrated single-action intelligent procedures for crew command and control. The target problem was to enable crew initialization of a facility-class rack with power and thermal interfaces, involving core and payload command and telemetry processing, without support from ground controllers. This autonomous operations capability is enabling in scenarios such as initialization of a medical facility to respond to a crew medical emergency, and is representative of other spacecraft autonomy challenges. The experiment was conducted using the Expedite the Processing of Experiments for Space Station (EXPRESS) rack 7, which was located in the Port 2 location within the U.S. Laboratory onboard the International Space Station (ISS). Activation and deactivation of this facility is time consuming and operationally intensive, requiring coordination of three flight control positions, 47 nominal steps, 57 commands, 276 telemetry checks, and coordination of multiple ISS systems (both core and payload). Utilization of Draper Laboratory's Timeliner software, deployed on-board the ISS within the Command and Control (C&C) computers and the Payload computers, allowed development of the automated procedures specific to ISS without having to certify and employ novel software for procedure development and execution. The procedures contained as much of the ground procedure logic and actions as possible, including fault detection and recovery capabilities.
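
    To make the flavor of such single-action procedures concrete, the sketch below shows the generic pattern only (it is not Timeliner syntax or the actual EXPRESS rack procedure): each step issues a command, confirms the expected telemetry within a time limit, and falls back to a recovery action otherwise. The `send_command` and `read_telemetry` interfaces, the telemetry point name, and the limits are hypothetical.

    ```python
    import time

    # Generic pattern only -- not Timeliner syntax or the actual EXPRESS rack
    # procedures. `send_command` and `read_telemetry` are hypothetical interfaces
    # supplied by the caller (e.g. wrappers around command/telemetry services).

    def step(send_command, read_telemetry, cmd, tlm_point, low, high,
             timeout_s=30.0, on_fail=None):
        """One procedure step: issue a command, then confirm telemetry is in limits."""
        send_command(cmd)
        deadline = time.time() + timeout_s
        while time.time() < deadline:
            if low <= read_telemetry(tlm_point) <= high:
                return True                      # step verified; procedure continues
            time.sleep(1.0)
        if on_fail:
            on_fail(cmd, tlm_point)              # fault detection / recovery hook
        return False

    # A rack activation would chain dozens of such steps, e.g.
    #   step(send, read, "RACK_POWER_ON", "RACK_28V_BUS_V", 26.0, 30.0, on_fail=safe_down)
    ```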

  11. Software for autonomous astronomical observatories: challenges and opportunities in the age of big data

    NASA Astrophysics Data System (ADS)

    Sybilski, Piotr W.; Pawłaszek, Rafał; Kozłowski, Stanisław K.; Konacki, Maciej; Ratajczak, Milena; Hełminiak, Krzysztof G.

    2014-07-01

    We present the software solution developed for a network of autonomous telescopes, deployed and tested in Solaris Project. The software aims to fulfil the contemporary needs of distributed autonomous observatories housing medium sized telescopes: ergonomics, availability, security and reusability. The datafication of such facilities seems inevitable and we give a preliminary study of the challenges and opportunities waiting for software developers. Project Solaris is a global network of four 0.5 m autonomous telescopes conducting a survey of eclipsing binaries in the Southern Hemisphere. The Project's goal is to detect and characterise circumbinary planets using the eclipse timing method. The observatories are located on three continents, and the headquarters coordinating and monitoring the network is in Poland. All four are operational as of December 2013.

  12. Monitoring Floods with NASA's ST6 Autonomous Sciencecraft Experiment: Implications on Planetary Exploration

    NASA Technical Reports Server (NTRS)

    Ip, Felipe; Dohm, J. M.; Baker, V. R.; Castano, B.; Chien, S.; Cichy, B.; Davies, A. G.; Doggett, T.; Greeley, R.; Sherwood, R.

    2005-01-01

    NASA's New Millennium Program (NMP) Autonomous Sciencecraft Experiment (ASE) [1-3] has been successfully demonstrated in Earth orbit. NASA has identified the development of an autonomously operating spacecraft as a necessity for an expanded program of missions exploring the Solar System. The versatile ASE spacecraft command and control, image formation, and science processing software was uploaded to the Earth Observing 1 (EO-1) spacecraft in early 2004 and has been undergoing onboard testing since May 2004 for the near real-time detection of surface modification related to transient geological and hydrological processes such as volcanism [4], ice formation and retreat [5], and flooding [6]. Space autonomy technology developed as part of ASE creates the new capability to autonomously detect, assess, react to, and monitor dynamic events such as flooding. Part of the challenge has been the difficulty of observing flooding in real time at sufficient temporal resolution; more importantly, it is the large spatial extent of most drainage networks, coupled with the size of the data sets that must be downlinked from satellites, that makes it difficult to monitor flooding from space. Below is a description of the algorithms (referred to as ASE Floodwater Classifiers) used in tandem with the Hyperion spectrometer instrument on EO-1 to identify flooding and some of the test results.
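
    The abstract does not give the classifier details, so the sketch below is only a generic per-pixel water test of the kind such classifiers build on: water is dark in the near-infrared relative to the green band, so an NDWI-style normalized difference, thresholded per pixel, flags likely flood water. The band indices and threshold are placeholders, not Hyperion calibration values.

    ```python
    import numpy as np

    # Generic per-pixel water test, not the actual ASE Floodwater Classifier.
    # Band indices and threshold are illustrative placeholders.

    def flood_mask(cube, green_band=20, nir_band=50, threshold=0.1):
        """cube: (rows, cols, bands) reflectance array -> boolean water mask."""
        green = cube[:, :, green_band].astype(float)
        nir = cube[:, :, nir_band].astype(float)
        ndwi = (green - nir) / np.maximum(green + nir, 1e-6)   # normalized difference
        return ndwi > threshold

    # Onboard, the flagged-pixel fraction of a scene (or its change against a
    # baseline acquisition) is the kind of trigger that would request follow-up imaging.
    ```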

  13. Scheduling lessons learned from the Autonomous Power System

    NASA Technical Reports Server (NTRS)

    Ringer, Mark J.

    1992-01-01

    The Autonomous Power System (APS) project at NASA LeRC is designed to demonstrate the applications of integrated intelligent diagnosis, control, and scheduling techniques to space power distribution systems. The project consists of three elements: the Autonomous Power Expert System (APEX) for Fault Diagnosis, Isolation, and Recovery (FDIR); the Autonomous Intelligent Power Scheduler (AIPS) to efficiently assign activity start times and resources; and power hardware (Brassboard) to emulate a space-based power system. The AIPS scheduler was tested within the APS system. This scheduler is able to efficiently assign available power to the requesting activities and share this information with other software agents within the APS system in order to implement the generated schedule. The AIPS scheduler is also able to cooperatively recover from fault situations by rescheduling the affected loads on the Brassboard in conjunction with the APEX FDIR system. AIPS served as a learning tool and an initial scheduling testbed for the integration of FDIR and automated scheduling systems. Many lessons were learned from the AIPS scheduler and are now being integrated into a new scheduler called SCRAP (Scheduler for Continuous Resource Allocation and Planning). This paper serves three purposes: an overview of the AIPS implementation, lessons learned from the AIPS scheduler, and a brief section on how these lessons are being applied to the new SCRAP scheduler.
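
    As a minimal sketch of the scheduling idea (not the AIPS implementation), the code below slides each activity's start time forward until its power request fits under the remaining available-power profile; activities that never fit are left unscheduled as candidates for shedding or renegotiation.

    ```python
    # Minimal greedy sketch of the scheduling idea, not the AIPS implementation.
    # activities: list of (name, power_kw, duration_slots); available: kW per slot.

    def schedule(activities, available, horizon):
        profile = [available] * horizon              # remaining power in each time slot
        plan = {}
        for name, power, duration in sorted(activities, key=lambda a: -a[1]):
            for start in range(horizon - duration + 1):
                window = profile[start:start + duration]
                if all(p >= power for p in window):
                    for t in range(start, start + duration):
                        profile[t] -= power          # commit the power in that window
                    plan[name] = start
                    break
            else:
                plan[name] = None                    # unschedulable: candidate for shedding
        return plan

    print(schedule([("heater", 2.0, 3), ("pump", 1.5, 2), ("exp1", 3.0, 4)], 4.0, 10))
    ```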

  14. Software Engineering and Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.

    2006-01-01

    We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.

  15. Intelligence Applied to Air Vehicles

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Gross, Anthony R.; Fletcher, L. Skip; Zornetzer, Steven (Technical Monitor)

    2000-01-01

    The exponential growth in information technology has provided the potential for air vehicle capabilities that were previously unavailable to mission and vehicle designers. The increasing capabilities of computer hardware and software, including new developments such as neural networks, provide a new balance of work between humans and machines. This paper will describe several NASA projects, and review results and conclusions from ground and flight investigations where vehicle intelligence was developed and applied to aeronautical and space systems. In the first example, flight results from a neural network flight control demonstration will be reviewed. Using a highly modified F-15 aircraft, a NASA/Dryden experimental flight test program has demonstrated how the neural network software can correctly identify and respond to changes in aircraft stability and control characteristics. Using its on-line learning capability, the neural net software would identify that something in the vehicle has changed, then reconfigure the flight control computer system to adapt to those changes. The results of the Remote Agent software project will be presented. This capability will reduce the cost of future spacecraft operations as computers become "thinking" partners along with humans. In addition, the paper will describe the objectives and plans for the autonomous airplane program and the autonomous rotorcraft project, along with the technologies to be developed for them.

  16. Relative Navigation of Formation Flying Satellites

    NASA Technical Reports Server (NTRS)

    Long, Anne; Kelbel, David; Lee, Taesul; Leung, Dominic; Carpenter, Russell; Gramling, Cheryl; Bauer, Frank (Technical Monitor)

    2002-01-01

    The Guidance, Navigation, and Control Center (GNCC) at Goddard Space Flight Center (GSFC) has successfully developed high-accuracy autonomous satellite navigation systems using the National Aeronautics and Space Administration's (NASA's) space and ground communications systems and the Global Positioning System (GPS). In addition, an autonomous navigation system that uses celestial object sensor measurements is currently under development and has been successfully tested using real Sun and Earth horizon measurements. The GNCC has developed advanced spacecraft systems that provide autonomous navigation and control of formation flyers in near-Earth, high-Earth, and libration point orbits. To support this effort, the GNCC is assessing the relative navigation accuracy achievable for proposed formations using GPS, intersatellite crosslink, ground-to-satellite Doppler, and celestial object sensor measurements. This paper evaluates the performance of these relative navigation approaches for three proposed missions with two or more vehicles maintaining relatively tight formations. High-fidelity simulations were performed to quantify the absolute and relative navigation accuracy as a function of navigation algorithm and measurement type. Realistically simulated measurements were processed using the extended Kalman filter implemented in the GPS Enhanced Onboard Navigation System (GEONS) flight software developed by GSFC GNCC. Solutions obtained by simultaneously estimating all satellites in the formation were compared with the results obtained using a simpler approach based on differencing independently estimated state vectors.
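
    The simpler approach mentioned at the end can be illustrated in a few lines (this is not GEONS flight code): each spacecraft's filter produces an absolute state estimate and covariance, the relative state is their difference, and, if the estimation errors are assumed independent, the covariances add; a joint filter can do better because it models the correlated errors explicitly. The numerical values below are arbitrary.

    ```python
    import numpy as np

    # Illustration of relative navigation by differencing independently estimated
    # states; not GEONS flight code. Covariance addition assumes uncorrelated errors.

    def relative_state(x_chief, P_chief, x_deputy, P_deputy):
        """x_*: 6-vectors [r; v] (km, km/s); P_*: 6x6 covariances."""
        dx = x_deputy - x_chief          # relative position/velocity estimate
        P_rel = P_chief + P_deputy       # valid only if the two error sources are uncorrelated
        return dx, P_rel

    x1 = np.array([7000.0, 0, 0, 0, 7.5, 0]); P1 = np.eye(6) * 1e-4
    x2 = np.array([7000.2, 0.1, 0, 0, 7.5, 0.01]); P2 = np.eye(6) * 1e-4
    dx, P = relative_state(x1, P1, x2, P2)
    ```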

  17. Video semaphore decoding for free-space optical communication

    NASA Astrophysics Data System (ADS)

    Last, Matthew; Fisher, Brian; Ezekwe, Chinwuba; Hubert, Sean M.; Patel, Sheetal; Hollar, Seth; Leibowitz, Brian S.; Pister, Kristofer S. J.

    2001-04-01

    Using real-time image processing we have demonstrated a low bit-rate free-space optical communication system at a range of more than 20 km with an average optical transmission power of less than 2 mW. The transmitter is an autonomous one-cubic-inch microprocessor-controlled sensor node with a laser diode output. The receiver is a standard CCD camera with a 1-inch aperture lens, and both hardware and software implementations of the video semaphore decoding algorithm. With this system sensor data can be reliably transmitted 21 km from San Francisco to Berkeley.
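
    A minimal sketch of the decoding idea (not the actual hardware or software implementation described above): the laser blinks data well below the camera frame rate, so averaging the tracked spot's intensity over the frames in each bit period and thresholding recovers the on-off-keyed bit stream. The frame rate, bit rate, and threshold in the comments are assumptions for illustration.

    ```python
    # Illustrative on-off-keying decoder, not the flight or ground implementation.
    # frame_intensities: per-frame intensity of the tracked laser spot.

    def decode_bits(frame_intensities, frames_per_bit, threshold):
        bits = []
        for i in range(0, len(frame_intensities) - frames_per_bit + 1, frames_per_bit):
            window = frame_intensities[i:i + frames_per_bit]
            bits.append(1 if sum(window) / len(window) > threshold else 0)
        return bits

    # e.g. a 30 fps camera and a 3 bit/s link give frames_per_bit = 10; the
    # threshold sits between the "laser off" and "laser on" intensity clusters.
    ```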

  18. Control of a free-flying robot manipulator system

    NASA Technical Reports Server (NTRS)

    Alexander, H.; Cannon, R. H., Jr.

    1985-01-01

    The goal of the research is to develop and test control strategies for a self-contained, free-flying space robot. Such a robot would perform operations in space similar to those currently handled by astronauts during extravehicular activity (EVA). The focus of the work is to develop and carry out a program of research with a series of physical Satellite Robot Simulator Vehicles (SRSVs), two-dimensionally freely mobile laboratory models of autonomous free-flying space robots such as might perform extravehicular functions associated with operation of a space station or repair of orbiting satellites. The development of the SRSV and of some of the controller subsystems is described. The two-link arm was fitted to the SRSV base, and researchers explored the open-loop characteristics of the arm and thruster actuators. Work began on building the software foundation necessary for use of the on-board computer, as well as hardware and software for a local vision system for target identification and tracking.

  19. The SSM/PMAD automated test bed project

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module/Power Management and Distribution (SSM/PMAD) autonomous subsystem project was initiated in 1984. The project's goal has been to design and develop an autonomous, user-supportive PMAD test bed simulating the SSF Hab/Lab module(s). An eighteen kilowatt SSM/PMAD test bed model with a high degree of automated operation has been developed. This advanced automation test bed contains three expert/knowledge based systems that interact with one another and with other more conventional software residing in up to eight distributed 386-based microcomputers to perform the necessary tasks of real-time and near real-time load scheduling, dynamic load prioritizing, and fault detection, isolation, and recovery (FDIR).

  20. Architectures and Evaluation for Adjustable Control Autonomy for Space-Based Life Support Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

    In the past five years, a number of automation applications for control of crew life support systems have been developed and evaluated in the Adjustable Autonomy Testbed at NASA's Johnson Space Center. This paper surveys progress on an adjustable autonomous control architecture for situations where software and human operators work together to manage anomalies and other system problems. When problems occur, the level of control autonomy can be adjusted, so that operators and software agents can work together on diagnosis and recovery. In 1997 adjustable autonomy software was developed to manage gas transfer and storage in a closed life support test. Four crewmembers lived and worked in a chamber for 91 days, with both air and water recycling. CO2 was converted to O2 by gas processing systems and wheat crops. With the automation software, significantly fewer hours were spent monitoring operations. System-level validation testing of the software by interactive hybrid simulation revealed problems both in software requirements and implementation. Since that time, we have been developing multi-agent approaches for automation software and human operators, to cooperatively control systems and manage problems. Each new capability has been tested and demonstrated in realistic dynamic anomaly scenarios, using the hybrid simulation tool.

  1. Operability driven space system concept with high leverage technologies

    NASA Astrophysics Data System (ADS)

    Woo, Henry H.

    1997-01-01

    One of the common objectives of future launch and space transfer systems is to achieve low-cost and effective operational capability by automating processes from pre-launch to the end of mission. Hierarchical and integrated mission management, system management, autonomous GN&C, and integrated micro-nano avionics technologies are critical to extend or revitalize the exploitation of space. These high-leverage hardware and software technologies are essential to space transfer, orbital systems, Earth-To-Orbit (ETO), commercial and military aviation, and planetary systems. This paper covers the driving issues, goals, and requirements definition, supported with typical concepts and utilization of multi-use technologies. The approach and method result in a practical system architecture and lower-level design concepts.

  2. Autonomous Flight Safety System Road Test

    NASA Technical Reports Server (NTRS)

    Simpson, James C.; Zoemer, Roger D.; Forney, Chris S.

    2005-01-01

    On February 3, 2005, Kennedy Space Center (KSC) conducted the first Autonomous Flight Safety System (AFSS) test on a moving vehicle -- a van driven around the KSC industrial area. A subset of the Phase III design was used, consisting of a single computer, GPS receiver, and GPS antenna. The description and results of this road test are given in this report. AFSS is a joint KSC and Wallops Flight Facility project that is in its third phase of development. AFSS is an independent subsystem intended for use with Expendable Launch Vehicles that uses tracking data from redundant onboard sensors to autonomously make flight termination decisions using software-based rules implemented on redundant flight processors. The goals of this project are to increase capabilities by allowing launches from locations that do not have or cannot afford extensive ground-based range safety assets, to decrease range costs, and to decrease reaction time for special situations.
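
    As an illustration only (not the AFSS rule set or any certified logic), the sketch below shows the general shape of software-based rules evaluated on redundant processors: each processor checks its tracking solution against keep-in boundaries, and a terminate decision requires agreement among processors. The boundary box, fixes, and vote threshold are made-up values.

    ```python
    # Illustrative only: not the AFSS rule set or certified flight-termination logic.

    def violates_boundary(lat, lon, boundary):
        """boundary: list of (lat_min, lat_max, lon_min, lon_max) keep-in boxes."""
        return not any(lo_a <= lat <= hi_a and lo_o <= lon <= hi_o
                       for lo_a, hi_a, lo_o, hi_o in boundary)

    def terminate_vote(solutions, boundary, required=2):
        """solutions: per-processor (lat, lon) fixes; vote before terminating."""
        votes = sum(violates_boundary(lat, lon, boundary) for lat, lon in solutions)
        return votes >= required

    keep_in = [(28.0, 29.0, -81.0, -80.0)]          # hypothetical box near the launch area
    print(terminate_vote([(28.5, -80.6), (28.5, -80.6), (30.2, -80.6)], keep_in))  # False: only 1 of 3 votes
    ```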

  3. Software Agents Applications Using Real-Time CORBA

    NASA Astrophysics Data System (ADS)

    Fowell, S.; Ward, R.; Nielsen, M.

    This paper describes current projects being performed by SciSys in the area of the use of software agents, built using CORBA middleware, to improve operations within autonomous satellite/ground systems. These concepts have been developed and demonstrated in a series of experiments variously funded by ESA's Technology Flight Opportunity Initiative (TFO) and Leading Edge Technology for SMEs (LET-SME), and the British National Space Centre's (BNSC) National Technology Programme. Some of this earlier work has already been reported in [1]. This paper will address the trends, issues and solutions associated with this software agent architecture concept, together with its implementation using CORBA within an on-board environment, that is to say taking account of its real-time and resource-constrained nature.

  4. Characterizing the Severity of Autonomic Cardiovascular Dysfunction after Spinal Cord Injury Using a Novel 24 Hour Ambulatory Blood Pressure Analysis Software.

    PubMed

    Popok, David W; West, Christopher R; Hubli, Michele; Currie, Katharine D; Krassioukov, Andrei V

    2017-02-01

    Cardiovascular disease is one of the leading causes of morbidity and mortality in the spinal cord injury (SCI) population. SCI may disrupt autonomic cardiovascular homeostasis, which can lead to persistent hypotension, irregular diurnal rhythmicity, and the development of autonomic dysreflexia (AD). There is currently no software available to perform automated detection and evaluation of cardiovascular autonomic dysfunction(s) such as those generated from 24 h ambulatory blood pressure monitoring (ABPM) recordings in the clinical setting. The objective of this study is to compare the efficacy of a novel 24 h ABPM Autonomic Dysfunction Detection Software against manual detection and to use the software to demonstrate the relationships between level of injury and the degree of autonomic cardiovascular impairment in a large cohort of individuals with SCI. A total of 46 individuals with cervical (group 1, n = 37) or high thoracic (group 2, n = 9) SCI participated in the study. Outcome measures included the frequency and severity of AD, frequency of hypotensive events, and diurnal variations in blood pressure and heart rate. There was good agreement between the software and manual detection of AD events (Bland-Altman limits of agreement = ±1.458 events). Cervical SCI presented with more frequent (p = 0.0043) and severe AD (p = 0.0343) than did high thoracic SCI. Cervical SCI exhibited higher systolic and diastolic blood pressure during the night and lower heart rate during the day than high thoracic SCI. In conclusion, our ABPM AD Detection Software was equally as effective in detecting the frequency and severity of AD and hypotensive events as manual detection, suggesting that this software can be used in the clinical setting to expedite ABPM analyses.

  5. Autonomous dexterous end-effectors for space robotics

    NASA Technical Reports Server (NTRS)

    Bekey, George A.; Iberall, Thea; Liu, Huan

    1989-01-01

    The development of a knowledge-based controller is summarized for the Belgrade/USC robot hand, a five-fingered end effector, designed for maximum autonomy. The biological principles of the hand and its architecture are presented. The conceptual and software aspects of the grasp selection system are discussed, including both the effects of the geometry of the target object and the task to be performed. Some current research issues are presented.

  6. Software control architecture for autonomous vehicles

    NASA Astrophysics Data System (ADS)

    Nelson, Michael L.; DeAnda, Juan R.; Fox, Richard K.; Meng, Xiannong

    1999-07-01

    The Strategic-Tactical-Execution Software Control Architecture (STESCA) is a tri-level approach to controlling autonomous vehicles. Using an object-oriented approach, STESCA has been developed as a generalization of the Rational Behavior Model (RBM). STESCA was initially implemented for the Phoenix Autonomous Underwater Vehicle (Naval Postgraduate School -- Monterey, CA), and is currently being implemented for the Pioneer AT land-based wheeled vehicle. The goals of STESCA are twofold. First is to create a generic framework to simplify the process of creating a software control architecture for autonomous vehicles of any type. Second is to allow mission specification by 'anyone' with minimal training to control the overall vehicle functionality. This paper describes the prototype implementation of STESCA for the Pioneer AT.
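
    A minimal class skeleton can make the tri-level idea concrete (these are not the actual STESCA classes): the strategic level holds mission goals, the tactical level expands the current goal into behaviors, and the execution level drives vehicle-specific hardware, which is the part that gets subclassed for a new vehicle. The class and behavior names are illustrative.

    ```python
    # Minimal skeleton of a tri-level control split; not the actual STESCA code.

    class ExecutionLevel:
        def run(self, behavior):                 # vehicle-specific actuation
            raise NotImplementedError

    class TacticalLevel:
        def behaviors_for(self, goal):           # map a goal to a behavior sequence
            return [f"plan:{goal}", f"do:{goal}"]

    class StrategicLevel:
        def __init__(self, goals, tactical, execution):
            self.goals, self.tactical, self.execution = list(goals), tactical, execution

        def run_mission(self):
            for goal in self.goals:
                for behavior in self.tactical.behaviors_for(goal):
                    self.execution.run(behavior)

    class WheeledExecution(ExecutionLevel):      # e.g. a Pioneer AT-like platform
        def run(self, behavior):
            print("executing", behavior)

    StrategicLevel(["waypoint_1", "survey_area"], TacticalLevel(), WheeledExecution()).run_mission()
    ```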

  7. Intermediate Levels of Autonomy within the SSM/PMAD Breadboard

    NASA Technical Reports Server (NTRS)

    Dugal-Whitehead, Norma R.; Walls, Bryan

    1995-01-01

    The Space Station Module Power Management and Distribution (SSM/PMAD) breadboard is a test bed for the development of advanced power system control and automation. Software control in the SSM/PMAD breadboard is through cooperating systems, called Autonomous Agents. Agents can be a mixture of algorithmic software and expert systems. The early SSM/PMAD system was envisioned as being completely autonomous. It soon became apparent, though, that there would always be a need for human intervention, at least as long as a human interacts with the system in any way. In a system designed only for autonomous operation, manual intervention meant taking full control of the whole system and losing whatever expertise was in the system. Several methods for allowing humans to interact at an appropriate level of control were developed. This paper examines some of these intermediate modes of autonomy. The least intrusive mode is simple monitoring. The ability to modify future behavior by altering a schedule involves high-level interaction. Modification of operating activities comes next. The coarsest mode of control is unplanned operation of individual power system components. Each of these levels is integrated into the SSM/PMAD breadboard, with support for the user (such as warnings of the consequences of control decisions) at every level.

  8. Autonomous satellite command and control: A comparison with other military systems

    NASA Technical Reports Server (NTRS)

    Kruchten, Robert J.; Todd, Wayne

    1988-01-01

    Existing satellite concepts of operation depend on readily available experts and are extremely manpower intensive. Areas of expertise required include mission planning, mission data interpretation, telemetry monitoring, and anomaly resolution. The concepts of operation have evolved to their current state in part because space systems have tended to be treated more as research and development assets rather than as operational assets. These methods of satellite command and control will be inadequate in the future because of the availability, survivability, and capability of human experts. Because space systems have extremely high reliability and limited access, they offer challenges not found in other military systems. Thus, automation techniques used elsewhere are not necessarily applicable to space systems. A program to make satellites much more autonomous has been developed, using a variety of advanced software techniques. The problem the program is addressing, some possible solutions, the goals of the Rome Air Development Center (RADC) program, the rationale as to why the goals are reasonable, and the current program status are discussed. Also presented are some of the concepts used in the program and how they differ from more traditional approaches.

  9. Developing Software for NASA Missions in the New Millennia

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    NASA is working on new mission concepts for exploration of the solar system. The concepts for these missions include swarms of hundreds of cooperating intelligent spacecraft which will be able to work in teams and gather more data than current single spacecraft missions. These spacecraft will not only have to operate independently for long periods of time on their own and in teams, but will also need to have autonomic properties of self healing, self configuring, self optimizing and self protecting for them to survive in the harsh space environment. Software for these types of missions has never been developed before and represents some of the challenges of software development in the new millennia. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm missions NASA is considering. The ANTS mission will use a swarm of one thousand pico-spacecraft that weigh less than five pounds. Using an insect colony analog, ANTS will explore the asteroid belt and catalog the mass, density, morphology, and chemical composition of the asteroids. Due to the size of the spacecraft, each will only carry a single miniaturized science instrument, which will require them to cooperate in searching for asteroids that are of scientific interest. This article also discusses the ANTS mission, the properties the spacecraft will need, and how that will affect future software development.

  10. Software defined coherent lidar (SD-Cl) architecture

    NASA Astrophysics Data System (ADS)

    Laghezza, F.; Onori, D.; Scotti, F.; Bogoni, A.

    2017-09-01

    In recent years, thanks to innovation in optical and electro-optical components, space-based light detection and ranging (Lidar) systems have had great success as an alternative to passive radiometers or microwave sensors [1]. One of the most important applications for space-based Lidars is the measurement of a target's distance and its related properties, e.g., topography, surface roughness and reflectivity, gravity and mass, which provide useful information for surface mapping, as well as semi-autonomous landing functionalities on low-gravity bodies (moons and asteroids). These kinds of systems are often called Lidar altimeters or laser rangefinders.

  11. The Personal Satellite Assistant: An Internal Spacecraft Autonomous Mobile Monitor

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Gawdiak, Yuri; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper presents an overview of the research and development effort at the NASA Ames Research Center to create an internal spacecraft autonomous mobile monitor capable of performing intra-vehicular sensing activities by autonomously navigating onboard the International Space Station. We describe the capabilities, mission roles, rationale, high-level functional requirements, and design challenges for an autonomous mobile monitor. The rapid prototyping design methodology used, in which five prototypes of increasing fidelity are designed, is described, as well as the status of these prototypes, of which two are operational and being tested, and one is actively being designed. The physical test facilities used to perform ground testing are briefly described, including a micro-gravity test facility that permits a prototype to propel itself in 3 dimensions with 6 degrees of freedom as if it were in a micro-gravity environment. We also describe an overview of the autonomy framework and its components, including the software simulators used in the development process. Sample mission test scenarios are also described. The paper concludes with a discussion of future and related work followed by the summary.

  12. Physics based model for online fault detection in autonomous cryogenic loading system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kashani, Ali; Ponizhovskaya, Ekaterina; Luchinsky, Dmitry

    2014-01-29

    We report the progress in the development of the chilldown model for a rapid cryogenic loading system developed at NASA-Kennedy Space Center. The nontrivial characteristic feature of the analyzed chilldown regime is its active control by dump valves. The two-phase flow model of the chilldown is approximated as one-dimensional homogeneous fluid flow with no slip condition for the interphase velocity. The model is built using commercial SINDA/FLUINT software. The results of numerical predictions are in good agreement with the experimental time traces. The obtained results pave the way to the application of the SINDA/FLUINT model as a verification tool for the design and algorithm development required for autonomous loading operation.

  13. Implementation of Autonomous Control Technology for Plant Growth Chambers

    NASA Technical Reports Server (NTRS)

    Costello, Thomas A.; Sager, John C.; Krumins, Valdis; Wheeler, Raymond M.

    2002-01-01

    The Kennedy Space Center has significant infrastructure for research using controlled environment plant growth chambers. Such research supports development of bioregenerative life support technology for long-term space missions. Most of the existing chambers in Hangar L and Little L will be moved to the new Space Experiment Research and Processing Laboratory (SERPL) in the summer of 2003. The impending move has created an opportunity to update the control system technologies to allow for greater flexibility, less labor for set-up and maintenance, better diagnostics, better reliability and easier data retrieval. Part of these improvements can be realized using hardware which communicates through an ethernet connection to a central computer for supervisory control but can be operated independently of the computer during routine run-time. Both the hardware and software functionality of an envisioned system were tested on a prototype plant growth chamber (CEC-4) in Hangar L. Based upon these tests, recommendations for hardware and software selection and system design for implementation in SERPL are included.

  14. Flightspeed Integral Image Analysis Toolkit

    NASA Technical Reports Server (NTRS)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order of magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
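
    FIIAT itself is a C library; the short Python illustration below shows only the integral-image structure it relies on, not the library's API: after one cumulative-sum pass, the sum of any rectangular region costs four table lookups, which is what makes the texture and descriptor computations fast.

    ```python
    import numpy as np

    # Illustration of the integral-image idea in Python; not the FIIAT C API.

    def integral_image(img):
        """Cumulative sum over rows and columns, zero-padded on top/left."""
        ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
        ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
        return ii

    def rect_sum(ii, r0, c0, r1, c1):
        """Sum of img[r0:r1, c0:c1] in O(1) using four lookups."""
        return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

    img = np.arange(16, dtype=np.uint8).reshape(4, 4)
    ii = integral_image(img)
    assert rect_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()
    ```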

  15. The Flight Telerobotic Servicer (FTS) - A focus for automation and robotics on the Space Station

    NASA Technical Reports Server (NTRS)

    Hinkal, Sanford W.; Andary, James F.; Watzin, James G.; Provost, David E.

    1987-01-01

    The concept, fundamental design principles, and capabilities of the FTS, a multipurpose telerobotic system for use on the Space Station and Space Shuttle, are discussed. The FTS is intended to assist the crew in the performance of extravehicular tasks; the telerobot will also be used on the Orbital Maneuvering Vehicle to service free-flyer spacecraft. The FTS will be capable of both teleoperation and autonomous operation; eventually it may also utilize ground control. By careful selection of the functional architecture and a modular approach to the hardware and software design, the FTS can accept developments in artificial intelligence and newer, more advanced sensors, such as machine vision and collision avoidance.

  16. Robotic Assembly of Truss Structures for Space Systems and Future Research Plans

    NASA Technical Reports Server (NTRS)

    Doggett, William

    2002-01-01

    Many initiatives under study by both the space science and earth science communities require large space systems, i.e. with apertures greater than 15 m or dimensions greater than 20 m. This paper reviews the effort in NASA Langley Research Center's Automated Structural Assembly Laboratory which laid the foundations for robotic construction of these systems. In the Automated Structural Assembly Laboratory reliable autonomous assembly and disassembly of an 8 meter planar structure composed of 102 truss elements covered by 12 panels was demonstrated. The paper reviews the hardware and software design philosophy which led to reliable operation during weeks of near continuous testing. Special attention is given to highlight the features enhancing assembly reliability.

  17. NASA Tech Briefs, January 2004

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Topics covered include: Multisensor Instrument for Real-Time Biological Monitoring; Sensor for Monitoring Nanodevice-Fabrication Plasmas; Backed Bending Actuator; Compact Optoelectronic Compass; Micro Sun Sensor for Spacecraft; Passive IFF: Autonomous Nonintrusive Rapid Identification of Friendly Assets; Finned-Ladder Slow-Wave Circuit for a TWT; Directional Radio-Frequency Identification Tag Reader; Integrated Solar-Energy-Harvesting and -Storage Device; Event-Driven Random-Access-Windowing CCD Imaging System; Stroboscope Controller for Imaging Helicopter Rotors; Software for Checking State-charts; Program Predicts Broadband Noise from a Turbofan Engine; Protocol for a Delay-Tolerant Data-Communication Network; Software Implements a Space-Mission File-Transfer Protocol; Making Carbon-Nanotube Arrays Using Block Copolymers: Part 2; Modular Rake of Pitot Probes; Preloading To Accelerate Slow-Crack-Growth Testing; Miniature Blimps for Surveillance and Collection of Samples; Hybrid Automotive Engine Using Ethanol-Burning Miller Cycle; Fabricating Blazed Diffraction Gratings by X-Ray Lithography; Freeze-Tolerant Condensers; The StarLight Space Interferometer; Champagne Heat Pump; Controllable Sonar Lenses and Prisms Based on ERFs; Measuring Gravitation Using Polarization Spectroscopy; Serial-Turbo-Trellis-Coded Modulation with Rate-1 Inner Code; Enhanced Software for Scheduling Space-Shuttle Processing; Bayesian-Augmented Identification of Stars in a Narrow View; Spacecraft Orbits for Earth/Mars-Lander Radio Relay; and Self-Inflatable/Self-Rigidizable Reflectarray Antenna.

  18. Hazard Detection Software for Lunar Landing

    NASA Technical Reports Server (NTRS)

    Huertas, Andres; Johnson, Andrew E.; Werner, Robert A.; Montgomery, James F.

    2011-01-01

    The Autonomous Landing and Hazard Avoidance Technology (ALHAT) Project is developing a system for safe and precise manned lunar landing that involves novel sensors, but also specific algorithms. ALHAT has selected imaging LIDAR (light detection and ranging) as the sensing modality for onboard hazard detection because imaging LIDARs can rapidly generate direct measurements of the lunar surface elevation from high altitude. Then, starting with the LIDAR-based Hazard Detection and Avoidance (HDA) algorithm developed for Mars landing, JPL has developed a mature set of HDA software for the manned lunar landing problem. Landing hazards exist everywhere on the Moon, and many of the more desirable landing sites are near the most hazardous terrain, so HDA is needed to autonomously and safely land payloads over much of the lunar surface. The HDA requirements used in the ALHAT project are to detect hazards that are 0.3 m tall or higher and slopes that are 5° or greater. Steep slopes, rocks, cliffs, and gullies are all hazards for landing and, by computing the local slope and roughness in an elevation map, all of these hazards can be detected. The algorithm in this innovation is used to measure slope and roughness hazards. In addition to detecting these hazards, the HDA capability also is able to find a safe landing site free of these hazards for a lunar lander with diameter .15 m over most of the lunar surface. This software includes an implementation of the HDA algorithm, software for generating simulated lunar terrain maps for testing, hazard detection performance analysis tools, and associated documentation. The HDA software has been deployed to Langley Research Center and integrated into the POST II Monte Carlo simulation environment. The high-fidelity Monte Carlo simulations determine the required ground spacing between LIDAR samples (ground sample distances) and the noise on the LIDAR range measurement. This simulation has also been used to determine the effect of viewing on hazard detection performance. The software has also been deployed to Johnson Space Center and integrated into the ALHAT real-time Hardware-in-the-Loop testbed.
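
    A much-simplified illustration of the slope/roughness idea (not JPL's HDA algorithm, which fits lander-footprint planes and measures roughness as residuals from them): from a gridded elevation map, flag cells whose local slope exceeds the 5° limit or whose local relief over a small window exceeds the 0.3 m limit. The grid spacing, window size, and synthetic terrain are assumptions.

    ```python
    import numpy as np

    # Much-simplified hazard flagging on a gridded elevation map; not the HDA algorithm.

    def hazard_map(dem, cell_size, slope_limit_deg=5.0, relief_limit_m=0.3, win=3):
        dz_dy, dz_dx = np.gradient(dem, cell_size)
        slope_deg = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
        relief = np.zeros_like(dem)
        half = win // 2
        for r in range(half, dem.shape[0] - half):
            for c in range(half, dem.shape[1] - half):
                patch = dem[r - half:r + half + 1, c - half:c + half + 1]
                relief[r, c] = patch.max() - patch.min()   # local peak-to-peak height
        return (slope_deg > slope_limit_deg) | (relief > relief_limit_m)

    dem = np.zeros((50, 50)); dem[20:23, 20:23] += 0.5     # synthetic 0.5 m boulder
    print(hazard_map(dem, cell_size=0.1).sum(), "hazardous cells flagged")
    ```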

  19. Flying the ST-5 Constellation with "Plug and Play" Autonomy Components and the GMSEC Bus

    NASA Technical Reports Server (NTRS)

    Shendock, Bob; Witt, Ken; Stanley, Jason; Mandl, Dan; Coyle, Steve

    2006-01-01

    The Space Technology 5 (ST5) Project, part of NASA's New Millennium Program, will consist of a constellation of three micro-satellites. This viewgraph document presents the components that will allow it to operate in an autonomous mode. The ST-5 constellation will use the GSFC Mission Services Evolution Center (GMSEC) architecture to enable cost effective model based operations. The ST-5 mission will demonstrate several principles of self managing software components.

  20. Towards Autonomous Operation of Robonaut 2

    NASA Technical Reports Server (NTRS)

    Badger, Julia M.; Hart, Stephen W.; Yamokoski, J. D.

    2011-01-01

    The Robonaut 2 (R2) platform, as shown in Figure 1, was designed through a collaboration between NASA and General Motors to be a capable robotic assistant with dexterity similar to that of a suited astronaut [1]. An R2 robot was sent to the International Space Station (ISS) in February 2011 and, in doing so, became the first humanoid robot in space. Its capabilities are presently being tested and expanded to increase its usefulness to the crew. Current work on R2 includes the addition of a mobility platform to allow the robot to complete tasks (such as cleaning, maintenance, or simple construction activities) both inside and outside of the ISS. To support these new activities, R2's software architecture is being developed to provide efficient ways of programming robust and autonomous behavior. In particular, a multi-tiered software architecture is proposed that combines principles of low-level feedback control with higher-level planners that accomplish behavioral goals at the task level given the run-time context, user constraints, the health of the system, and so on. The proposed architecture is shown in Figure 2. At the lowest level, the resource level, there exist the various sensory and motor signals available to the system. The sensory signals for a robot such as R2 include multiple channels of force/torque data, joint or Cartesian positions calculated through the robot's proprioception, and signals derived from objects observable by its cameras.

  1. Contingency Software in Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn; Patterson-Hine, Ann

    2006-01-01

    This viewgraph presentation reviews the development of contingency software for autonomous systems. Autonomous vehicles currently have a limited capacity to diagnose and mitigate failures. There is a need to be able to handle a broader range of contingencies. The goals of the project are to: (1) speed up diagnosis and mitigation of anomalous situations; (2) automatically handle contingencies, not just failures; (3) enable projects to select a degree of autonomy consistent with their needs and to incrementally introduce more autonomy; and (4) augment on-board fault protection with verified contingency scripts.

  2. Autonomous software: Myth or magic?

    NASA Astrophysics Data System (ADS)

    Allan, A.; Naylor, T.; Saunders, E. S.

    2008-03-01

    We discuss work by the eSTAR project which demonstrates a fully closed-loop autonomous system for the follow-up of possible micro-lensing anomalies. Not only are the initial micro-lensing detections followed up in real time, but ongoing events are prioritised and continually monitored, with the returned data being analysed automatically. If the "smart software" running the observing campaign detects a planet-like anomaly, further follow-up will be scheduled autonomously and other telescopes and telescope networks alerted to the possible planetary detection. We further discuss the implications of this, and how such projects can be used to build more general autonomous observing and control systems.
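
    A minimal sketch of the prioritisation loop (not the eSTAR implementation): each ongoing event carries a score derived from the automated analysis of its returned data, the agent always requests the highest-scoring event next, and scores are revised as new photometry arrives. The event identifiers and scores below are hypothetical.

    ```python
    # Minimal prioritisation sketch; not the eSTAR implementation.

    class FollowUpQueue:
        def __init__(self):
            self.scores = {}                      # event id -> current priority score

        def update(self, event_id, score):
            self.scores[event_id] = score         # new data revises the priority

        def next_target(self):
            if not self.scores:
                return None
            return max(self.scores, key=self.scores.get)

    q = FollowUpQueue()
    q.update("OGLE-2008-BLG-001", 0.2)            # hypothetical id: routine monitoring
    q.update("OGLE-2008-BLG-002", 0.9)            # hypothetical id: possible anomaly
    print(q.next_target())                        # the anomalous event is observed first
    ```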

  3. Automation of the space station core module power management and distribution system

    NASA Technical Reports Server (NTRS)

    Weeks, David J.

    1988-01-01

    Under the Advanced Development Program for Space Station, Marshall Space Flight Center has been developing advanced automation applications for the Power Management and Distribution (PMAD) system inside the Space Station modules for the past three years. The Space Station Module Power Management and Distribution System (SSM/PMAD) test bed features three artificial intelligence (AI) systems coupled with conventional automation software functioning in an autonomous or closed-loop fashion. The AI systems in the test bed include a baseline scheduler/dynamic rescheduler (LES), a load shedding management system (LPLMS), and a fault recovery and management expert system (FRAMES). This test bed will be part of the NASA Systems Autonomy Demonstration for 1990 featuring cooperating expert systems in various Space Station subsystem test beds. It is concluded that advanced automation technology involving AI approaches is sufficiently mature to begin applying the technology to current and planned spacecraft applications including the Space Station.

  4. Technology test results from an intelligent, free-flying robot for crew and equipment retrieval in space

    NASA Technical Reports Server (NTRS)

    Erickson, J.; Goode, R.; Grimm, K.; Hess, C.; Norsworthy, R.; Anderson, G.; Merkel, L.; Phinney, D.

    1992-01-01

    The ground-based demonstrations of Extra Vehicular Activity (EVA) Retriever, a voice-supervised, intelligent, free-flying robot, are designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The EVA Retriever software is required to autonomously plan and execute a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles with subsequent object handover. The software architecture incorporates a hierarchical decomposition of the control system that is horizontally partitioned into five major functional subsystems: sensing, perception, world model, reasoning, and acting. The design provides for supervised autonomy as the primary mode of operation. It is intended to be an evolutionary system improving in capability over time and as it earns crew trust through reliable and safe operation. This paper gives an overview of the hardware, a focus on software, and a summary of results achieved recently from both computer simulations and air bearing floor demonstrations. Limitations of the technology used are evaluated. Plans for the next phase, during which moving targets and obstacles drive real-time behavior requirements, are discussed.

  5. Technology test results from an intelligent, free-flying robot for crew and equipment retrieval in space

    NASA Astrophysics Data System (ADS)

    Erickson, Jon D.; Goode, R.; Grimm, K. A.; Hess, Clifford W.; Norsworthy, Robert S.; Anderson, Greg D.; Merkel, L.; Phinney, Dale E.

    1992-03-01

    The ground-based demonstrations of Extra Vehicular Activity (EVA) Retriever, a voice-supervised, intelligent, free-flying robot, are designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the space station. The EVA Retriever software is required to autonomously plan and execute a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles with subsequent object handover. The software architecture incorporates a hierarchical decomposition of the control system that is horizontally partitioned into five major functional subsystems: sensing, perception, world model, reasoning, and acting. The design provides for supervised autonomy as the primary mode of operation. It is intended to be an evolutionary system improving in capability over time and as it earns crew trust through reliable and safe operation. This paper gives an overview of the hardware, a focus on software, and a summary of results achieved recently from both computer simulations and air bearing floor demonstrations. Limitations of the technology used are evaluated. Plans for the next phase, during which moving targets and obstacles drive real-time behavior requirements, are discussed.

  6. Challenges in verification and validation of autonomous systems for space exploration

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Jonsson, Ari

    2005-01-01

    Space exploration applications offer a unique opportunity for the development and deployment of autonomous systems, due to limited communications, large distances, and great expense of direct operation. At the same time, the risk and cost of space missions lead to a reluctance to take on new, complex, and difficult-to-understand technology. A key issue in addressing these concerns is the validation of autonomous systems. In recent years, higher-level autonomous systems have been applied in space applications. In this presentation, we will highlight those autonomous systems, and discuss issues in validating these systems. We will then look to future demands on validating autonomous systems for space, and identify promising technologies and open issues.

  7. Monitoring Space Weather Hazards caused by geomagnetic disturbances with Space Hazard Monitor (SHM) systems

    NASA Astrophysics Data System (ADS)

    Xu, Z.; Gannon, J. L.; Peek, T. A.; Lin, D.

    2017-12-01

    One space weather hazard is Geomagnetically Induced Currents (GICs) in electric power transmission systems, driven by the geoelectric field that is naturally induced during geomagnetic disturbances (GMDs). GICs are a potentially catastrophic threat to bulk power systems; for instance, the Quebec blackout of March 1989 was caused by GMDs during a significant magnetic storm. To monitor GMDs, the autonomous Space Hazard Monitor (SHM) system was recently developed. The system combines magnetic field measurements from magnetometers with geoelectric field measurements from electrodes. In this presentation, we introduce the six SHM sites that have been deployed across the continental US. The data from the magnetometers are processed with the Multiple Observatory Geomagnetic Data Analysis Software (MOGDAS), and the statistical results are presented here. They reveal not only the impacts of space weather over the continental US but also the potential for improved instrumentation to provide better space weather monitoring.

  8. Multi-Agent Software Design and Engineering for Human Centered Collaborative Autonomous Space Systems: NASA Intelligent Systems

    NASA Technical Reports Server (NTRS)

    Bradshaw, Jeffrey M.

    2005-01-01

    Detailed results of this three-year project are available in 37 publications, including 7 book chapters, 3 journal articles, and 27 refereed conference proceedings. In addition, various aspects of the project were the subject of 31 invited presentations and 6 tutorials at international conferences and workshops. Good descriptions of prior and ongoing work on foundational technologies in Brahms, KAoS, NOMADS, and the PSA project can be found in numerous publications not listed here.

  9. The Rational Behavior Model: A Multi-Paradigm, Tri-Level Software Architecture for the Control of Autonomous Vehicles

    DTIC Science & Technology

    1993-03-01

    possible over an RF link when surfaced and over acoustic telemetry when submerged. Lockheed Missiles and Space Company has been awarded the contract to...ACL), is purely hierarchical and consists of three major components: the Data Manager, the ACL Controller, and the Model-Based Reasoner (MBR). The...Data Manager receives, processes, and analyzes sensor and status data for use by the MBR and ACL Controller. The ACL Controller communicates commands

  10. The Joint Tactical Aerial Resupply Vehicle Impact on Sustainment Operations

    DTIC Science & Technology

    2017-06-09

    Artificial Intelligence, Sustainment Operations, Rifle Company, Autonomous Aerial Resupply, Joint Tactical Autonomous Aerial Resupply System...Integrations and Development System AI Artificial Intelligence ARCIC Army Capabilities Integration Center ARDEC Armament Research, Development and...semi-autonomous systems, and fully autonomous systems. Autonomy of machines depends on sophisticated software, including Artificial Intelligence

  11. Enabling New Operations Concepts for Lunar and Mars Exploration

    NASA Astrophysics Data System (ADS)

    Jaap, John; Maxwell, Theresa

    2005-02-01

    The planning and scheduling of human space activities is an expensive and time-consuming task that seldom provides the crew with the control, flexibility, or insight that they need. During the past thirty years, scheduling software has seen only incremental improvements; however, software limitations continue to prevent even evolutionary improvements in the ``operations concept'' that is used for human space missions. Space missions are planned on the ground long before they are executed in space, and the crew has little input or influence on the schedule. In recent years the crew has been presented with a ``job jar'' of activities that they can do whenever they have time, but the contents of the jar are limited to tasks that do not use scarce shared resources and do not have external timing constraints. Consequently, the crew has no control over the schedule of the majority of their own tasks. As humans venture farther from earth for longer durations, it will become imperative that they have the ability to plan and schedule not only their own activities, but also the unattended activities of the systems, equipment, and robots on the journey with them. Significant software breakthroughs are required to enable the change in the operations concept. The crew does not have the time to build or modify the schedule by hand. They only need to issue a request to schedule a task and the system should automatically do the rest. Of course, the crew should not be required to build the complete schedule. Controllers on the ground should contribute the models and schedules where they have better knowledge. The system must allow multiple simultaneous users, some on earth and some in space. The Mission Operations Laboratory at NASA's Marshall Space Flight Center has been researching and prototyping a modeling schema, scheduling engine, and system architecture that can enable the needed paradigm shift - it can make the crew autonomous. This schema and engine can be the core of a planning and scheduling system that would enable multiple planners, some on the earth and some in space, to build one integrated timeline. Its modeling schema can capture all the task requirements; its scheduling engine can build the schedule automatically; and its architecture can allow those (on earth and in space) with the best knowledge of the tasks to schedule them. This paper describes the enabling technology and proposes an operations concept for astronauts autonomously scheduling their activities and the activities around them.
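
    The request-driven scheduling idea described above can be illustrated with a small sketch: a task model carrying duration, resource, and timing constraints, and an engine that places each request at the first feasible slot on a shared timeline. The field names, the first-fit policy, and the example tasks are assumptions for illustration; they are not the Marshall modeling schema or scheduling engine.

      # Illustrative sketch only: a task model and a first-fit scheduling engine.
      # Field names and the first-fit policy are assumptions, not the MSFC schema.
      from dataclasses import dataclass, field

      @dataclass
      class Task:
          name: str
          duration: float          # hours
          resource: str            # single shared resource needed
          earliest: float = 0.0    # external timing constraints
          latest: float = 24.0

      @dataclass
      class Timeline:
          reservations: list = field(default_factory=list)  # (resource, start, end, task)

          def _free(self, resource, start, end):
              return all(not (r == resource and start < e and s < end)
                         for r, s, e, _ in self.reservations)

          def request(self, task, step=0.25):
              """Place the task at the first feasible start time, if any."""
              t = task.earliest
              while t + task.duration <= task.latest:
                  if self._free(task.resource, t, t + task.duration):
                      self.reservations.append((task.resource, t, t + task.duration, task.name))
                      return t
                  t += step
              return None  # could not satisfy constraints; caller must renegotiate

      timeline = Timeline()
      print(timeline.request(Task("glovebox run", 2.0, "glovebox", earliest=8.0)))
      print(timeline.request(Task("filter swap", 1.0, "glovebox", earliest=8.0)))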

  12. Enabling New Operations Concepts for Lunar and Mars Exploration

    NASA Technical Reports Server (NTRS)

    Jaap, John; Maxwell, Theresa

    2005-01-01

    The planning and scheduling of human space activities is an expensive and time-consuming task that seldom provides the crew with the control, flexibility, or insight that they need. During the past thirty years, scheduling software has seen only incremental improvements; however, software limitations continue to prevent even evolutionary improvements in the operations concept that is used for human space missions. Space missions are planned on the ground long before they are executed in space, and the crew has little input or influence on the schedule. In recent years the crew has been presented with a job jar of activities that they can do whenever they have time, but the contents of the jar are limited to tasks that do not use scarce shared resources and do not have external timing constraints. Consequently, the crew has no control over the schedule of the majority of their own tasks. As humans venture farther from earth for longer durations, it will become imperative that they have the ability to plan and schedule not only their own activities, but also the unattended activities of the systems, equipment, and robots on the journey with them. Significant software breakthroughs are required to enable the change in the operations concept. The crew does not have the time to build or modify the schedule by hand. They only need to issue a request to schedule a task and the system should automatically do the rest. Of course, the crew should not be required to build the complete schedule. Controllers on the ground should contribute the models and schedules where they have better knowledge. The system must allow multiple simultaneous users, some on earth and some in space. The Mission Operations Laboratory at NASA's Marshall Space Flight Center has been researching and prototyping a modeling schema, scheduling engine, and system architecture that can enable the needed paradigm shift - it can make the crew autonomous. This schema and engine can be the core of a planning and scheduling system that would enable multiple planners, some on the earth and some in space, to build one integrated timeline. Its modeling schema can capture all the task requirements; its scheduling engine can build the schedule automatically, and its architecture can allow those (on earth and in space) with the best knowledge of the tasks to schedule them. This paper describes the enabling technology and proposes an operations concept for astronauts autonomously scheduling their activities and the activities around them.

  13. Reconfigurable Software for Controlling Formation Flying

    NASA Technical Reports Server (NTRS)

    Mueller, Joseph B.

    2006-01-01

    Software for a system to control the trajectories of multiple spacecraft flying in formation is being developed to reflect underlying concepts of (1) a decentralized approach to guidance and control and (2) reconfigurability of the control system, including reconfigurability of the software and of control laws. The software is organized as a modular network of software tasks. The computational load for both determining relative trajectories and planning maneuvers is shared equally among all spacecraft in a cluster. The flexibility and robustness of the software are apparent in the fact that tasks can be added, removed, or replaced during flight. In a computational simulation of a representative formation-flying scenario, it was demonstrated that the following are among the services performed by the software: Uploading of commands from a ground station and distribution of the commands among the spacecraft, Autonomous initiation and reconfiguration of formations, Autonomous formation of teams through negotiations among the spacecraft, Working out details of high-level commands (e.g., shapes and sizes of geometrically complex formations), Implementation of a distributed guidance law providing autonomous optimization and assignment of target states, and Implementation of a decentralized, fuel-optimal, impulsive control law for planning maneuvers.

  14. NASA Docking System (NDS) Interface Definitions Document (IDD). Revision C, Nov. 2010

    NASA Technical Reports Server (NTRS)

    2010-01-01

    The NASA Docking System (NDS) mating system supports low approach velocity docking and provides a modular and reconfigurable standard interface, supporting crewed and autonomous vehicles during mating and assembly operations. The NDS is NASA's implementation of the emerging International Docking System Standard (IDSS) using low impact docking technology. All NDS configurations can mate with the configuration specified in the IDSS Interface Definition Document (IDD) released September 21, 2010. The NDS evolved from the Low Impact Docking System (LIDS). The acronym international Low Impact Docking System (iLIDS) is also used to describe this system. NDS and iLIDS may be used interchangeably. Some of the heritage documentation and implementations (e.g., software command names) used on NDS will continue to use the LIDS acronym. The NDS IDD defines the interface characteristics and performance capability of the NDS, including uses ranging from crewed to autonomous space vehicles and from low earth orbit to deep space exploration. The responsibility for developing space vehicles and for making them technically and operationally compatible with the NDS rests with the vehicle providers. Host vehicle examples include crewed/uncrewed spacecraft, space station modules, elements, etc. Within this document, any docking space vehicle will be referred to as the host vehicle. This document defines the NDS-to-NDS interfaces, as well as the NDS-to-host vehicle interfaces and performance capability.

  15. An Autonomous Gps-Denied Unmanned Vehicle Platform Based on Binocular Vision for Planetary Exploration

    NASA Astrophysics Data System (ADS)

    Qin, M.; Wan, X.; Shao, Y. Y.; Li, S. Y.

    2018-04-01

    Vision-based navigation has become an attractive solution for autonomous navigation in planetary exploration. This paper presents our work on designing and building an autonomous, vision-based, GPS-denied unmanned vehicle and on developing ARFM (Adaptive Robust Feature Matching) based VO (Visual Odometry) software for its autonomous navigation. The hardware system is mainly composed of a binocular stereo camera, a pan-and-tilt unit, a master machine, and a tracked chassis. The ARFM-based VO software system contains four modules: camera calibration, ARFM-based 3D reconstruction, position and attitude calculation, and BA (Bundle Adjustment). Two VO experiments were carried out using both outdoor images from an open dataset and indoor images captured by our vehicle; the results demonstrate that our vision-based unmanned vehicle is able to achieve autonomous localization and has the potential for future planetary exploration.
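
    The core geometry behind a stereo VO pipeline of this kind can be sketched briefly: recover depth from disparity for the reconstruction step, then chain per-frame relative motions into a trajectory for the position and attitude step. The focal length, baseline, and per-frame motions below are illustrative assumptions; ARFM feature matching and bundle adjustment are omitted.

      # Minimal stereo-VO geometry sketch (assumed camera parameters; matching and BA omitted).
      import numpy as np

      def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
          """Stereo depth: Z = f * B / d for a rectified pinhole pair."""
          return focal_px * baseline_m / np.maximum(disparity_px, 1e-6)

      def compose(T_world, R_rel, t_rel):
          """Chain a per-frame relative motion (R, t) onto the accumulated pose."""
          T_rel = np.eye(4)
          T_rel[:3, :3] = R_rel
          T_rel[:3, 3] = t_rel
          return T_world @ T_rel

      # Dead-reckoned trajectory from assumed per-frame motion estimates.
      T = np.eye(4)
      for R_rel, t_rel in [(np.eye(3), np.array([0.0, 0.0, 0.5])),
                           (np.eye(3), np.array([0.1, 0.0, 0.5]))]:
          T = compose(T, R_rel, t_rel)
      print("vehicle position:", T[:3, 3])
      print("example depths (m):", depth_from_disparity(np.array([12.0, 30.0])))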

  16. Enhancements to the KATE model-based reasoning system

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1994-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. This report describes two software efforts which enhance the functionality and usability of KATE. The first addition, a flow solver, adds to KATE a tool for modeling the flow of liquid in a pipe system. The second addition adds support for editing KATE knowledge base files to the Emacs editor. The body of this report discusses design and implementation issues having to do with these two tools. It will be useful to anyone maintaining or extending either the flow solver or the editor enhancements.

  17. Evolution of a radio communication relay system

    NASA Astrophysics Data System (ADS)

    Nguyen, Hoa G.; Pezeshkian, Narek; Hart, Abraham; Burmeister, Aaron; Holz, Kevin; Neff, Joseph; Roth, Leif

    2013-05-01

    Providing long-distance non-line-of-sight control for unmanned ground robots has long been recognized as a problem, considering the nature of the required high-bandwidth radio links. In the early 2000s, the DARPA Mobile Autonomous Robot Software (MARS) program funded the Space and Naval Warfare Systems Center (SSC) Pacific to demonstrate a capability for autonomous mobile communication relaying on a number of Pioneer laboratory robots. This effort also resulted in the development of ad hoc networking radios and software that were later leveraged in the development of a more practical and logistically simpler system, the Automatically Deployed Communication Relays (ADCR). Funded by the Joint Ground Robotics Enterprise and internally by SSC Pacific, several generations of ADCR systems introduced increasingly more capable hardware and software for automatic maintenance of communication links through deployment of static relay nodes from mobile robots. This capability was finally tapped in 2010 to fulfill an urgent need from theater. 243 kits of ruggedized, robot-deployable communication relays were produced and sent to Afghanistan to extend the range of EOD and tactical ground robots in 2012. This paper provides a summary of the evolution of the radio relay technology at SSC Pacific, and then focuses on the latest two stages, the Manually-Deployed Communication Relays and the latest effort to automate the deployment of these ruggedized and fielded relay nodes.

  18. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2012-01-01

    The Simulation Software, KATE (Knowledgebase Autonomous Test Engineer), is used to demonstrate the automatic identification of faults in a system. The ACLO (Autonomous Cryogenics Loading Operation) project uses KATE to monitor and find faults in the loading of the cryogenics into a vehicle fuel tank. The KATE software interfaces with the IHM (Integrated Health Management) systems bus to communicate with other systems that are part of ACLO. One system that KATE uses the IHM bus to communicate with is AIS (Advanced Inspection System). KATE will send messages to AIS when there is a detected anomaly. These messages include requests for visual inspection of specific valves and pressure gauges, and control messages to have AIS open or close manual valves. My goals include implementing the connection to the IHM bus within KATE and for the AIS project. I will also be working on implementing changes to KATE's UI and implementing the physics objects in KATE that will model portions of the cryogenics loading operation.

  19. Multi-Spacecraft Autonomous Positioning System

    NASA Technical Reports Server (NTRS)

    Anzalone, Evan

    2015-01-01

    As the number of spacecraft in simultaneous operation continues to grow, there is an increased dependency on ground-based navigation support. The current baseline system for deep space navigation utilizes Earth-based radiometric tracking, requiring long-duration observations to perform orbit determination and generate a state update. The age, complexity, and high utilization of the ground assets pose a risk to spacecraft navigation performance. In order to perform complex operations at large distances from Earth, such as extraterrestrial landing and proximity operations, autonomous systems are required. With increasingly complex mission operations, the need for frequent and Earth-independent navigation capabilities is further reinforced. The Multi-spacecraft Autonomous Positioning System (MAPS) takes advantage of the growing interspacecraft communication network and infrastructure to allow for Earth-autonomous state measurements to enable network-based space navigation. A notional concept of operations is given in figure 1. This network is already being implemented and routinely used in Martian communications through the use of the Mars Reconnaissance Orbiter and Mars Odyssey spacecraft as relays for surface assets. The growth of this communications architecture is continued through MAVEN and future potential commercial Mars telecom orbiters. This growing network provides an initial Mars-local capability for inter-spacecraft communication and navigation. These navigation updates are enabled by cross-communication between assets in the network, coupled with onboard navigation estimation routines to integrate packet travel time to generate ranging measurements. Inter-spacecraft communication allows for frequent state broadcasts and time updates from trusted references. The architecture is a software-based solution, enabling its implementation on a wide variety of current assets, with the operational constraints and measurement accuracy determined by onboard systems.
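
    The ranging idea, deriving a distance from the travel time of a time-tagged packet exchanged between assets, can be sketched as follows. The timestamps and clock-bias handling are illustrative assumptions; a real system would fold such measurements into an onboard orbit-determination filter.

      # Sketch of packet-travel-time ranging between two networked spacecraft.
      # Timestamps and clock-bias handling are illustrative assumptions.
      C = 299_792_458.0  # speed of light, m/s

      def one_way_range(t_transmit, t_receive, clock_bias=0.0):
          """Range from one-way light time; biased if the two clocks are not synchronized."""
          return C * (t_receive - t_transmit - clock_bias)

      def two_way_range(t_send, t_echo_received, turnaround=0.0):
          """Two-way (transponded) range cancels the first-order clock bias."""
          return 0.5 * C * (t_echo_received - t_send - turnaround)

      # Example: a packet that took roughly 0.667 s one way corresponds to about 200,000 km.
      print(one_way_range(100.000000, 100.667128))
      print(two_way_range(100.0, 101.334256))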

  20. Supervising Remote Humanoids Across Intermediate Time Delay

    NASA Technical Reports Server (NTRS)

    Hambuchen, Kimberly; Bluethmann, William; Goza, Michael; Ambrose, Robert; Rabe, Kenneth; Allan, Mark

    2006-01-01

    The President's Vision for Space Exploration, laid out in 2004, relies heavily upon robotic exploration of the lunar surface in early phases of the program. Prior to the arrival of astronauts on the lunar surface, these robots will be required to be controlled across space and time, posing a considerable challenge for traditional telepresence techniques. Because time delays will be measured in seconds, not minutes as is the case for Mars Exploration, uploading the plan for a day seems excessive. An approach for controlling humanoids under intermediate time delay is presented. This approach uses software running within a ground control cockpit to predict an immersed robot supervisor's motions which the remote humanoid autonomously executes. Initial results are presented.

  1. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic and algorithmic computing needs. These needs could both be met using a general computing workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed to demonstrate the ability of an expert system to autonomously control the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. Integration options are explored and several possible solutions are presented.

  2. Launch Commit Criteria Monitoring Agent

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Dan A.; Kelly, Andrew O.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has developed and deployed a software agent to monitor the Space Shuttle's ground processing telemetry stream. The application, the Launch Commit Criteria Monitoring Agent, increases situational awareness for system and hardware engineers during Shuttle launch countdown. The agent provides autonomous monitoring of the telemetry stream, automatically alerts system engineers when predefined criteria have been met, identifies limit warnings and violations of launch commit criteria, aids Shuttle engineers through troubleshooting procedures, and provides additional insight to verify appropriate troubleshooting of problems by contractors. The agent has successfully detected launch commit criteria warnings and violations on a simulated playback data stream. Efficiency and safety are improved through increased automation.
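
    The core rule evaluation such an agent performs, comparing each telemetry sample against warning and violation limits and alerting engineers when a limit is crossed, can be sketched as below. The measurement names and limit values are invented for illustration and are not actual launch commit criteria.

      # Sketch of limit monitoring over a telemetry stream.
      # Measurement names and limits are invented for illustration only.
      LIMITS = {
          "tank_pressure_psi": {"warn": (20.0, 45.0), "violate": (15.0, 50.0)},
          "valve_temp_degC":   {"warn": (-10.0, 60.0), "violate": (-20.0, 80.0)},
      }

      def assess(name, value):
          lo_v, hi_v = LIMITS[name]["violate"]
          lo_w, hi_w = LIMITS[name]["warn"]
          if not lo_v <= value <= hi_v:
              return "VIOLATION"
          if not lo_w <= value <= hi_w:
              return "WARNING"
          return "OK"

      stream = [("tank_pressure_psi", 42.0), ("tank_pressure_psi", 47.5), ("valve_temp_degC", 85.0)]
      for name, value in stream:
          status = assess(name, value)
          if status != "OK":
              print(f"{status}: {name} = {value}")  # alert the system engineer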

  3. Development of Algorithms for Control of Humidity in Plant Growth Chambers

    NASA Technical Reports Server (NTRS)

    Costello, Thomas A.

    2003-01-01

    Algorithms were developed to control humidity in plant growth chambers used for research on bioregenerative life support at Kennedy Space Center. The algorithms used the computed water vapor pressure (based on measured air temperature and relative humidity) as the process variable, with time-proportioned outputs to operate the humidifier and de-humidifier. Algorithms were based upon proportional-integral-differential (PID) and Fuzzy Logic schemes and were implemented using I/O Control software (OPTO-22) to define and download the control logic to an autonomous programmable logic controller (PLC, ultimate ethernet brain and assorted input-output modules, OPTO-22), which performed the monitoring and control logic processing, as well as the physical control of the devices that effected the targeted environment in the chamber. During limited testing, the PLCs successfully implemented the intended control schemes and attained a control resolution for humidity of less than 1%. The algorithms have potential to be used not only with autonomous PLCs but could also be implemented within network-based supervisory control programs. This report documents unique control features that were implemented within the OPTO-22 framework and makes recommendations regarding future uses of the hardware and software for biological research by NASA.
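
    The control structure described above, a water vapor pressure process variable computed from measured temperature and relative humidity, a PID loop, and time-proportioned outputs for the humidifier and dehumidifier, can be sketched as follows. The Magnus-type saturation formula is a standard approximation; the gains, period, and setpoint are illustrative assumptions rather than the values used in the chambers.

      # Sketch: vapor-pressure process variable, PID, and time-proportioned output.
      # Gains, period, and setpoint are illustrative; only the structure mirrors the report.
      import math

      def vapor_pressure_kpa(temp_c, rh_percent):
          """Actual vapor pressure from T and RH using a Magnus-type saturation formula."""
          es = 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))  # saturation, kPa
          return es * rh_percent / 100.0

      class PID:
          def __init__(self, kp, ki, kd):
              self.kp, self.ki, self.kd = kp, ki, kd
              self.integral, self.prev_err = 0.0, 0.0
          def step(self, setpoint, measured, dt):
              err = setpoint - measured
              self.integral += err * dt
              deriv = (err - self.prev_err) / dt
              self.prev_err = err
              return self.kp * err + self.ki * self.integral + self.kd * deriv

      def time_proportioned(output, period_s=10.0):
          """Map a -1..1 controller output to humidifier / dehumidifier on-times."""
          humidifier_on = max(0.0, min(output, 1.0)) * period_s
          dehumidifier_on = max(0.0, min(-output, 1.0)) * period_s
          return humidifier_on, dehumidifier_on

      pid = PID(kp=2.0, ki=0.1, kd=0.0)
      pv = vapor_pressure_kpa(temp_c=23.0, rh_percent=65.0)
      print(time_proportioned(pid.step(setpoint=1.9, measured=pv, dt=1.0)))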

  4. Software for Automation of Real-Time Agents, Version 2

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Estlin, Tara; Gaines, Daniel; Schaffer, Steve; Chouinard, Caroline; Engelhardt, Barbara; Wilklow, Colette; Mutz, Darren; Knight, Russell; Rabideau, Gregg

    2005-01-01

    Version 2 of Closed Loop Execution and Recovery (CLEaR) has been developed. CLEaR is an artificial intelligence computer program for use in planning and execution of actions of autonomous agents, including, for example, Deep Space Network (DSN) antenna ground stations, robotic exploratory ground vehicles (rovers), robotic aircraft (UAVs), and robotic spacecraft. CLEaR automates the generation and execution of command sequences, monitoring the sequence execution, and modifying the command sequence in response to execution deviations and failures as well as new goals for the agent to achieve. The development of CLEaR has focused on the unification of planning and execution to increase the ability of the autonomous agent to perform under tight resource and time constraints coupled with uncertainty in how many resources and how much time will be required to perform a task. This unification is realized by extending the traditional three-tier robotic control architecture by increasing the interaction between the software components that perform deliberation and reactive functions. The increase in interaction reduces the need to replan, enables earlier detection of the need to replan, and enables replanning to occur before an agent enters a state of failure.

  5. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1988-01-01

    Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic and algorithmic needs. Both of these needs could be met using a general purpose workstation running both symbolic and algorithmic codes, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by the NASA Ames Research Center in conjunction with the Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. The integration options and several possible solutions are presented.

  6. Space station automation of common module power management and distribution

    NASA Technical Reports Server (NTRS)

    Miller, W.; Jones, E.; Ashworth, B.; Riedesel, J.; Myers, C.; Freeman, K.; Steele, D.; Palmer, R.; Walsh, R.; Gohring, J.

    1989-01-01

    The purpose is to automate a breadboard-level Power Management and Distribution (PMAD) system which possesses many functional characteristics of a specified Space Station power system. The automation system was built upon a 20 kHz ac source with redundant power buses. There are two power distribution control units which furnish power to six load centers which in turn enable load circuits based upon a system-generated schedule. The progress in building this specified autonomous system is described. Automation of Space Station Module PMAD was accomplished by segmenting the complete task into the following four independent tasks: (1) develop a detailed approach for PMAD automation; (2) define the software and hardware elements of automation; (3) develop the automation system for the PMAD breadboard; and (4) select an appropriate host processing environment.

  7. Control of intelligent robots in space

    NASA Technical Reports Server (NTRS)

    Freund, E.; Buehler, CH.

    1989-01-01

    In view of space activities like the International Space Station, the Man-Tended Free-Flyer (MTFF), and free-flying platforms, the development of intelligent robotic systems is gaining increasing importance. The range of applications that have to be performed by robotic systems in space includes, e.g., the execution of experiments in space laboratories, the service and maintenance of satellites and flying platforms, the support of automatic production processes or the assembly of large network structures. Some of these tasks will require the development of bi-armed or of multiple robotic systems including functional redundancy. For the development of robotic systems which are able to perform this variety of tasks a hierarchically structured modular concept of automation is required. This concept is characterized by high flexibility as well as by automatic specialization to the particular sequence of tasks that have to be performed. On the other hand it has to be designed such that the human operator can influence or guide the system on different levels of control, supervision, and decision. This leads to requirements for the hardware and software concept which permit a range of application of the robotic systems from telemanipulation to autonomous operation. The realization of this goal requires strong efforts in the development of new methods, software and hardware concepts, and the integration into an automation concept.

  8. Target Trailing With Safe Navigation for Maritime Autonomous Surface Vehicles

    NASA Technical Reports Server (NTRS)

    Wolf, Michael; Kuwata, Yoshiaki; Zarzhitsky, Dimitri V.

    2013-01-01

    This software implements a motion-planning module for a maritime autonomous surface vehicle (ASV). The module trails a given target while also avoiding static and dynamic surface hazards. When surface hazards are other moving boats, the motion planner must apply the International Regulations for Avoiding Collisions at Sea (COLREGS). A key subset of these rules has been implemented in the software. In case contact with the target is lost, the software can receive and follow a "reacquisition route," provided by a complementary system, until the target is reacquired. The programmatic intention is that the trailed target is a submarine, although any mobile naval platform could serve as the target. The algorithmic approach to combining motion with a (possibly moving) goal location, while avoiding local hazards, may be applicable to robotic rovers, automated landing systems, and autonomous airships. The software operates in JPL's CARACaS (Control Architecture for Robotic Agent Command and Sensing) software architecture and relies on other modules for environmental perception data and information on the predicted detectability of the target, as well as the low-level interface to the boat controls.
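
    One small piece of the COLREGS reasoning such a planner must encode, classifying an encounter from relative geometry and deciding whether own ship must give way, can be sketched as below. The angle thresholds follow a common reading of the head-on, crossing, and overtaking rules, but the exact values and the simplified decision logic are assumptions for illustration, not the CARACaS implementation.

      # Sketch of a simplified COLREGS encounter classification (illustrative thresholds).
      def wrap180(angle_deg):
          return (angle_deg + 180.0) % 360.0 - 180.0

      def classify(own_heading, contact_bearing, contact_heading):
          """Classify the encounter as seen from own ship; angles in degrees true."""
          rel_bearing = wrap180(contact_bearing - own_heading)        # where the contact is
          heading_diff = abs(wrap180(contact_heading - own_heading))  # course difference
          if abs(rel_bearing) > 112.5:
              return "overtaking", False            # contact abaft the beam: it keeps clear
          if heading_diff > 170.0 and abs(rel_bearing) < 10.0:
              return "head-on", True                # both vessels alter course to starboard
          if 0.0 < rel_bearing <= 112.5:
              return "crossing", True               # contact on starboard bow: give way
          return "crossing", False                  # contact on port bow: stand on

      print(classify(own_heading=0.0, contact_bearing=45.0, contact_heading=270.0))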

  9. An integrated dexterous robotic testbed for space applications

    NASA Technical Reports Server (NTRS)

    Li, Larry C.; Nguyen, Hai; Sauer, Edward

    1992-01-01

    An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview of the system hardware and software configurations is presented, and the implementation of subsystem functions is discussed.

  10. An Introduction to Flight Software Development: FSW Today, FSW 2010

    NASA Technical Reports Server (NTRS)

    Gouvela, John

    2004-01-01

    Experience and knowledge gained from ongoing maintenance of Space Shuttle Flight Software and new development projects including Cockpit Avionics Upgrade are applied to projected needs of the National Space Exploration Vision through Spiral 2. Lessons learned from these current activities are applied to create a sustainable, reliable model for development of critical software to support Project Constellation. This presentation introduces the technologies, methodologies, and infrastructure needed to produce and sustain high-quality software. It will propose what is needed to support a Vision for Space Exploration that places demands on the innovation and productivity needed to support future space exploration. The technologies in use today within FSW development include tools that provide requirements tracking, integrated change management, and modeling and simulation software. Specific challenges that have been met include the introduction and integration of a Commercial Off-The-Shelf (COTS) Real-Time Operating System for critical functions. Though technology prediction has proved to be imprecise, Project Constellation requirements will need continued integration of new technology with evolving methodologies and changing project infrastructure. Targets for continued technology investment are integrated health monitoring and management, self-healing software, standard payload interfaces, autonomous operation, and improvements in training. Emulation of the target hardware will also allow significant streamlining of development and testing. The methodologies in use today for FSW development are object-oriented UML design, iterative development using independent components, as well as rapid prototyping. In addition, Lean Six Sigma and CMMI play a critical role in the quality and efficiency of the workforce processes. Over the next six years, we expect these methodologies to merge with other improvements into a consolidated office culture with all processes being guided by automated office assistants. The infrastructure in use today includes strict software development and configuration management procedures, including strong control of resource management and critical skills coverage. This will evolve to a fully integrated staff organization with efficient and effective communication throughout all levels guided by a Mission-Systems Architecture framework with focus on risk management and attention toward inevitable product obsolescence. This infrastructure of computing equipment, software and processes will itself be subject to technological change and need for management of change and improvement.

  11. Autonomous Navigation Above the GNSS Constellations and Beyond: GPS Navigation for the Magnetospheric Multiscale Mission and SEXTANT Pulsar Navigation Demonstration

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke

    2017-01-01

    This talk will describe two first-of-their-kind technology demonstrations attached to ongoing NASA science missions, both of which aim to extend the range of autonomous spacecraft navigation far from the Earth. First, we will describe the onboard GPS navigation system for the Magnetospheric Multiscale (MMS) mission which is currently operating in elliptic orbits reaching nearly halfway to the Moon. The MMS navigation system is a key outgrowth of a larger effort at NASA Goddard Space Flight Center to advance high-altitude Global Navigation Satellite System (GNSS) navigation on multiple fronts, including developing Global Positioning System receivers and onboard navigation software, running simulation studies, and leading efforts to characterize and protect signals at high-altitude in the so-called GNSS Space-Service Volume (SSV). In the second part of the talk, we will describe the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) mission that aims to make the first in-space demonstration of X-ray pulsar navigation (XNAV). SEXTANT is attached to the NASA astrophysics mission Neutron-star Interior Composition ExploreR (NICER) whose International Space Station mounted X-ray telescope is investigating the fundamental physics of extremes in gravity, material density, and electromagnetic fields found in neutron stars, and whose instrument provides a nearly ideal navigation sensor for XNAV.

  12. Galileo IOV Electrical Power Subsystem Relies On Li-Ion Battery Charge Management Controlled By Hardware

    NASA Astrophysics Data System (ADS)

    Douay, N.

    2011-10-01

    In the frame of the GALILEO In-Orbit Validation program, which is composed of 4 satellites, Thales Alenia Space France has designed, developed, and tested the Electrical Power Subsystem. The design builds on some classical choices: a 50 V regulated main power bus provided by the PCDU manufactured by Terma (DK); a solar array, manufactured by Dutch-Space (NL), using Ga-As triple-junction technology from Azur Space Power Solar GmbH; a SAFT (FR) lithium-ion battery, for which a cell package balancing function is required; and a Solar Array Drive Mechanism, provided by RUAG Space Switzerland, to transfer the power. Beyond these, the subsystem features fully autonomous, failure-tolerant battery charge management able to operate even after a complete unavailability of the on-board software. The battery charge management is implemented such that priority is always given to satisfying the satellite main bus needs in order to maintain the main bus regulation under MEA control. This battery charge management principle provides very high reliability and operational robustness. The paper describes: the battery charge management concept, using a combination of PCDU hardware and monitoring of the relevant battery lines; the functional aspects of the single-point-failure-free S4R (Sequential Switching Shunt Switch Regulator) and the associated performance; and the failure modes isolated and passivated by this architecture. The paper also addresses the characteristics and performance of the autonomous balancing function.

  13. Telerobot local-remote control architecture for space flight program applications

    NASA Technical Reports Server (NTRS)

    Zimmerman, Wayne; Backes, Paul; Steele, Robert; Long, Mark; Bon, Bruce; Beahan, John

    1993-01-01

    The JPL Supervisory Telerobotics (STELER) Laboratory has developed and demonstrated a unique local-remote robot control architecture which enables management of intermittent communication bus latencies and delays such as those expected for ground-remote operation of Space Station robotic systems via the Tracking and Data Relay Satellite System (TDRSS) communication platform. The current work at JPL in this area has focused on enhancing the technologies and transferring the control architecture to hardware and software environments which are more compatible with projected ground and space operational environments. At the local site, the operator updates the remote worksite model using stereo video and a model overlay/fitting algorithm which outputs the location and orientation of the object in free space. That information is relayed to the robot User Macro Interface (UMI) to enable programming of the robot control macros. This capability runs on a single Silicon Graphics Inc. machine. The operator can employ either manual teleoperation, shared control, or supervised autonomous control to manipulate the intended object. The remote site controller, called the Modular Telerobot Task Execution System (MOTES), runs in a multi-processor VME environment and performs the task sequencing, task execution, trajectory generation, closed loop force/torque control, task parameter monitoring, and reflex action. This paper describes the new STELER architecture implementation, and also documents the results of the recent autonomous docking task execution using the local site and MOTES.

  14. Launch vehicle operations cost reduction through artificial intelligence techniques

    NASA Technical Reports Server (NTRS)

    Davis, Tom C., Jr.

    1988-01-01

    NASA's Kennedy Space Center has attempted to develop AI methods in order to reduce the cost of launch vehicle ground operations as well as to improve the reliability and safety of such operations. Attention is presently given to cost savings estimates for systems involving launch vehicle firing-room software and hardware real-time diagnostics, as well as the nature of configuration control and the real-time autonomous diagnostics of launch-processing systems by these means. Intelligent launch decisions and intelligent weather forecasting are additional applications of AI being considered.

  15. Lessons Learned in the Livingstone 2 on Earth Observing One Flight Experiment

    NASA Technical Reports Server (NTRS)

    Hayden, Sandra C.; Sweet, Adam J.; Shulman, Seth

    2005-01-01

    The Livingstone 2 (L2) model-based diagnosis software is a reusable diagnostic tool for monitoring complex systems. In 2004, L2 was integrated with the JPL Autonomous Sciencecraft Experiment (ASE) and deployed on-board Goddard's Earth Observing One (EO-1) remote sensing satellite, to monitor and diagnose the EO-1 space science instruments and imaging sequence. This paper reports on lessons learned from this flight experiment. The goals for this experiment, including validation of minimum success criteria and of a series of diagnostic scenarios, have all been successfully met. Long-term operations in space are on-going, as a test of the maturity of the system, with L2 performance remaining flawless. L2 has demonstrated the ability to track the state of the system during nominal operations, detect simulated abnormalities in operations and isolate failures to their root cause fault. Specific advances demonstrated include diagnosis of ambiguity groups rather than a single fault candidate; hypothesis revision given new sensor evidence about the state of the system; and the capability to check for faults in a dynamic system without having to wait until the system is quiescent. The major benefits of this advanced health management technology are to increase mission duration and reliability through intelligent fault protection, and robust autonomous operations with reduced dependency on supervisory operations from Earth. The work-load for operators will be reduced by telemetry of processed state-of-health information rather than raw data. The long-term vision is that of making diagnosis available to the onboard planner or executive, allowing autonomy software to re-plan in order to work around known component failures. For a system that is expected to evolve substantially over its lifetime, as for the International Space Station, the model-based approach has definite advantages over rule-based expert systems and limit-checking fault protection systems, as these do not scale well. The model-based approach facilitates reuse of the L2 diagnostic software; only the model of the system to be diagnosed and telemetry monitoring software has to be rebuilt for a new system or expanded for a growing system. The hierarchical L2 model supports modularity and expandability, and as such is a suitable solution for integrated system health management as envisioned for systems-of-systems.
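
    The ambiguity-group and hypothesis-revision behavior described above can be illustrated with a toy consistency check: every fault candidate whose signature matches all observations so far is retained, and the candidate set shrinks as new sensor evidence arrives. The component names, fault signatures, and evidence below are invented for illustration and are unrelated to the EO-1 model.

      # Toy illustration of ambiguity groups and hypothesis revision.
      # Components, symptoms, and evidence are invented; this is not the L2 algorithm.
      FAULT_SIGNATURES = {
          "camera_power_fault":   {"no_image": True,  "bus_undervolt": True},
          "imager_stuck_shutter": {"no_image": True,  "bus_undervolt": False},
          "downlink_dropout":     {"no_image": False, "bus_undervolt": False},
      }

      def candidates(observations, hypotheses=None):
          """Keep every fault whose signature is consistent with all observations so far."""
          pool = hypotheses if hypotheses is not None else FAULT_SIGNATURES
          return {name for name in pool
                  if all(FAULT_SIGNATURES[name].get(k) == v for k, v in observations.items())}

      # First symptom: no image returned -> ambiguity group of two candidates.
      group = candidates({"no_image": True})
      print(group)

      # New evidence: the bus voltage is nominal -> hypothesis revision isolates one fault.
      group = candidates({"no_image": True, "bus_undervolt": False}, hypotheses=group)
      print(group)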

  16. GROVER: An autonomous vehicle for ice sheet research

    NASA Astrophysics Data System (ADS)

    Trisca, G. O.; Robertson, M. E.; Marshall, H.; Koenig, L.; Comberiate, M. A.

    2013-12-01

    The Goddard Remotely Operated Vehicle for Exploration and Research or Greenland Rover (GROVER) is a science enabling autonomous robot specifically designed to carry a low-power, large bandwidth radar for snow accumulation mapping over the Greenland Ice Sheet. This new and evolving technology enables reduced cost and increased safety for polar research. GROVER was field tested at Summit, Greenland in May 2013. The robot traveled over 30 km and was controlled both by line of sight wireless and completely autonomously with commands and telemetry via the Iridium Satellite Network, from Summit as well as remotely from Boise, Idaho. Here we describe GROVER's unique abilities and design. The software stack features a modular design that can be adapted for any application that requires autonomous behavior, reliable communications using different technologies and low level control of peripherals. The modules are built to communicate using the publisher-subscriber design pattern to maximize data-reuse and allow for graceful failures at the software level, along with the ability to be loaded or unloaded on-the-fly, enabling the software to adopt different behaviors based on power constraints or specific processing needs. These modules can also be loaded or unloaded remotely for servicing and telemetry can be configured to contain any kind of information being generated by the sensors or scientific instruments. The hardware design protects the electronic components and the control system can change functional parameters based on sensor input. Power failure modes built into the hardware prevent the vehicle from running out of energy permanently by monitoring voltage levels and triggering software reboots when the levels match pre-established conditions. This guarantees that the control software will be operational as soon as there is enough charge to sustain it, giving the vehicle increased longevity in case of a temporary power loss. GROVER demonstrates that autonomous rovers can be a revolutionary tool for data collection, and that both the technology and the software are available and ready to be implemented to create scientific data collection platforms.
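
    The publisher-subscriber module pattern described above can be sketched with a minimal in-process message bus: modules publish named topics, other modules subscribe, and a failing subscriber is dropped rather than taking the stack down. The topic names and module behavior are illustrative assumptions, not the GROVER software.

      # Minimal publisher-subscriber bus sketch; topics and modules are illustrative only.
      from collections import defaultdict

      class Bus:
          def __init__(self):
              self.subscribers = defaultdict(list)

          def subscribe(self, topic, callback):
              self.subscribers[topic].append(callback)

          def publish(self, topic, message):
              for callback in list(self.subscribers[topic]):
                  try:
                      callback(message)
                  except Exception as err:           # fail gracefully: drop the bad subscriber
                      print(f"dropping subscriber on '{topic}': {err}")
                      self.subscribers[topic].remove(callback)

      bus = Bus()
      bus.subscribe("power/voltage", lambda v: print(f"telemetry module logged {v:.2f} V"))
      bus.subscribe("power/voltage",
                    lambda v: print("low-power mode engaged") if v < 11.5 else None)
      bus.publish("power/voltage", 12.6)
      bus.publish("power/voltage", 11.2)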

  17. Software Architecture for Anti-Submarine Warfare Unmanned Surface Vehicles

    DTIC Science & Technology

    2016-09-01

    discussion about software systems that could be used to control these systems to make the jobs of the human operators easier. B. RESEARCH QUESTIONS... research study. To better understand the role of artificial intelligence in designing autonomous systems, S. Russell and P. Norvig jointly authored a...artificial intelligence, and autonomous systems. This serves as the framework for the real design challenge. 1. Protecting the Battle Group The United

  18. MER Surface Phase; Blurring the Line Between Fault Protection and What is Supposed to Happen

    NASA Technical Reports Server (NTRS)

    Reeves, Glenn E.

    2008-01-01

    An assessment of the limitations of communication with the MER rovers and of how those constraints drove the system design, flight software, and fault protection architecture, blurring the line between traditional fault protection and expected nominal behavior and requiring the most novel autonomous and semi-autonomous elements of the vehicle software, including communication, surface mobility, attitude knowledge acquisition, fault protection, and the activity arbitration service.

  19. KSC-2014-3534

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Former astronaut Greg Johnson, executive director of the Center for the Advancement of Science in Space, talks to Florida middle school students and their teachers before the start of the Zero Robotics finals competition at NASA Kennedy Space Center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition where the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  20. KSC-2014-3539

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Former astronaut Greg Johnson, executive director of the Center for the Advancement of Science in Space, talks to Florida middle school students and their teachers before the start of the Zero Robotics finals competition at NASA Kennedy Space Center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition where the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  1. KSC-2014-3538

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Former astronaut Greg Johnson, executive director of the Center for the Advancement of Science in Space, talks to Florida middle school students and their teachers before the start of the Zero Robotics finals competition at NASA Kennedy Space Center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition where the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  2. KSC-2014-3541

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Florida middle school students and their teachers watch the Zero Robotics finals competition broadcast live via WebEx from the International Space Station. The Florida teams are at the Space Station Processing Facility at NASA's Kennedy Space Center in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition where the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  3. Packet telemetry and packet telecommand - The new generation of spacecraft data handling techniques

    NASA Technical Reports Server (NTRS)

    Hooke, A. J.

    1983-01-01

    Because of the rising costs and reduced reliability associated with customized spacecraft and ground network hardware and software, the standardized Packet Telemetry and Packet Telecommand concepts are emerging as viable alternatives. In each concept, autonomous packets of data, created within ground and space application processes through the use of standard formatting techniques, are switched end-to-end through the space data network to their destination application processes through the use of standard transfer protocols. Because the intermediate data networks can be designed to be completely mission independent, this approach can facilitate a high degree of automation and interoperability. The goals of the Consultative Committee for Space Data Systems are the adoption of an international guideline for future space telemetry formatting based on the Packet Telemetry concept, and the advancement of the NASA-ESA Working Group's Packet Telecommand concept to a level of maturity parallel to that of Packet Telemetry. Both the Packet Telemetry and Packet Telecommand concepts are reviewed.
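
    The self-describing packets at the heart of these concepts carry a small standard header so that mission-independent networks can route them end to end. The sketch below packs a CCSDS-style six-byte primary header (version, type, APID, sequence flags and count, and data length); the field layout follows the familiar space packet format, but the example APID, sequence count, and payload are assumptions for illustration.

      # Sketch of a CCSDS-style space packet primary header (6 bytes) plus payload.
      # Example APID, sequence count, and payload are illustrative values.
      import struct

      def make_packet(apid, seq_count, payload, version=0, pkt_type=0, sec_hdr=0, seq_flags=0b11):
          word1 = (version << 13) | (pkt_type << 12) | (sec_hdr << 11) | (apid & 0x7FF)
          word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
          word3 = len(payload) - 1                      # length field = payload octets minus one
          return struct.pack(">HHH", word1, word2, word3) + payload

      packet = make_packet(apid=0x1AB, seq_count=42, payload=b"\x01\x02\x03\x04")
      print(packet.hex())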

  4. Autonomous Science Operations Technologies for Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Barnes, P. K.; Haddock, A. T.; Cruzen, C. A.

    2018-02-01

    Autonomous Science Operations Technologies for Deep Space Gateway (DSG) provides an overview of how the DSG would benefit from autonomous systems that use proven technologies to perform telemetry monitoring and science operations.

  5. Optimized autonomous operations of a 20 K space hydrogen sorption cryocooler

    NASA Astrophysics Data System (ADS)

    Borders, J.; Morgante, G.; Prina, M.; Pearson, D.; Bhandari, P.

    2004-06-01

    A fully redundant hydrogen sorption cryocooler is being developed for the European Space Agency Planck mission, dedicated to the measurement of the temperature anisotropies of the cosmic microwave background radiation with unprecedented sensitivity and resolution [Advances in Cryogenic Engineering 45A (2000) 499]. In order to achieve this ambitious scientific task, this cooler is required to provide a stable temperature reference (~20 K) and appropriate cooling (~1 W) to the two instruments on-board, with a flight operational lifetime of 18 months. During mission operations, communication with the spacecraft will be possible only in a restricted time window, not longer than 2 h/day. This implies the need for an operations control structure with the robustness required to safely perform autonomous procedures. The cooler performance depends on many operating parameters (such as the temperatures of the pre-cooling stages and the warm radiator), so the operation control system needs the capability to adapt to variations in these boundary conditions while maintaining safe operating procedures. An engineering bread board (EBB) cooler was assembled and tested to evaluate the behavior of the system under conditions simulating flight operations, and the test data were used to refine and improve the operation control software. In order to minimize scientific data loss, the cooler is required to detect all possible failure modes and to autonomously react to them by taking the appropriate action in a rapid fashion. Various procedures and schemes, both general and specific in nature, were developed, tested and implemented to achieve these goals. In general, the robustness to malfunctions was increased by implementing an automatic classification of anomalies into different levels according to the seriousness of the error. The response is therefore proportional to the failure level. Specifically, the start-up sequence duration was significantly reduced, allowing a much faster activation of the system, particularly useful in case of restarts after inadvertent shutdowns arising from malfunctions in the spacecraft. The capacity of the system to detect J-T plugs was increased to the point that the cooler is able to autonomously distinguish actual contaminant clogging from gas flow reductions due to off-nominal operating conditions. Once a plug is confirmed, the software autonomously energizes, and subsequently turns off, a J-T defrost heater until the clog is removed, bringing the system back to normal operating conditions. In this paper, all the cooler Operational Modes are presented, together with the description of the logic structure of the procedures and the advantages they produce for the operations.
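
    The leveled-anomaly scheme described above lends itself to a small illustration. The sketch below is hypothetical (the thresholds, names, and responses are invented for this example and are not taken from the Planck flight software); it only shows the pattern of classifying an anomaly by severity and responding proportionally, including a defrost-style response to a confirmed plug.

    ```python
    from enum import IntEnum

    class AnomalyLevel(IntEnum):
        NOMINAL = 0      # no action
        WARNING = 1      # log and adjust setpoints
        SERIOUS = 2      # run a recovery procedure (e.g., J-T defrost heater cycle)
        CRITICAL = 3     # safe the cooler and wait for the daily ground contact

    def classify_flow_anomaly(flow, expected_flow, precool_temp, precool_limit):
        """Classify a reduced J-T flow reading against the flow expected for the
        current boundary conditions (all thresholds are hypothetical)."""
        if precool_temp > precool_limit:
            return AnomalyLevel.WARNING      # off-nominal pre-cooling, not a plug
        deficit = (expected_flow - flow) / expected_flow
        if deficit < 0.05:
            return AnomalyLevel.NOMINAL
        if deficit < 0.30:
            return AnomalyLevel.SERIOUS      # likely contaminant plug: defrost
        return AnomalyLevel.CRITICAL

    RESPONSES = {
        AnomalyLevel.NOMINAL:  lambda: None,
        AnomalyLevel.WARNING:  lambda: print("log anomaly, retune setpoints"),
        AnomalyLevel.SERIOUS:  lambda: print("energize J-T defrost heater, then re-check"),
        AnomalyLevel.CRITICAL: lambda: print("shut down sorption beds, enter safe mode"),
    }

    level = classify_flow_anomaly(flow=0.6, expected_flow=1.0,
                                  precool_temp=19.5, precool_limit=21.0)
    RESPONSES[level]()
    ```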

  6. A Feasible Approach for Implementing Greater Levels of Satellite Autonomy

    NASA Astrophysics Data System (ADS)

    Lindsay, Steve; Zetocha, Paul

    2002-01-01

    In this paper, we propose a means for achieving increasingly autonomous satellite operations. We begin with a brief discussion of the current state-of-the-art in satellite ground operations and flight software, as well as the real and perceived technical and political obstacles to increasing the levels of autonomy on today's satellites. We then present a list of system requirements that address these hindrances and include the artificial intelligence (AI) technologies with the potential to satisfy these requirements. We conclude with a discussion of how the space industry can use this information to incorporate increased autonomy. From past experience we know that autonomy will not just "happen," and we know that the expensive course of manually intensive operations simply cannot continue. Our goal is to present the aerospace industry with an analysis that will begin moving us in the direction of autonomous operations.

  7. Autonomous docking ground demonstration

    NASA Technical Reports Server (NTRS)

    Lamkin, Steve L.; Le, Thomas Quan; Othon, L. T.; Prather, Joseph L.; Eick, Richard E.; Baxter, Jim M.; Boyd, M. G.; Clark, Fred D.; Spehar, Peter T.; Teters, Rebecca T.

    1991-01-01

    The Autonomous Docking Ground Demonstration is an evaluation of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the guidance, navigation, and control (GN&C) software. The docking mechanism being used was developed for the Apollo/Soyuz Test Program. This demonstration will be conducted using the 6-DOF Dynamic Test System (DTS). The DTS simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration, the laser sensor will be mounted on the target vehicle and the retroreflectors will be on the chase vehicle. This arrangement was chosen to prevent potential damage to the laser. The laser sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved.

  8. Intelligent Vision Systems Independent Research and Development (IR&D) 2006

    NASA Technical Reports Server (NTRS)

    Patrick, Clinton; Chavis, Katherine

    2006-01-01

    This report summarizes the results of research conducted under the 2006 Independent Research and Development (IR&D) program at Marshall Space Flight Center (MSFC), Redstone Arsenal, Alabama. The focus of this IR&D is neural network (NN) technology provided by Imagination Engines, Incorporated (IEI) of St. Louis, Missouri. The technology already has many commercial, military, and governmental applications, and a rapidly growing list of other potential spin-offs. The goal for this IR&D is the implementation and demonstration of the technology for autonomous robotic operations, first in software and ultimately in one or more hardware realizations. Testing is targeted specifically to the MSFC Flat Floor, but may also include other robotic platforms at MSFC, as time and funds permit. For the purpose of this report, the NN technology will be referred to by IEI's designation for a subset configuration of its patented technology suite: Self-Training Autonomous Neural Network Object (STANNO).

  9. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen that several large robotic telescopes will be deployed in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed the Test-Bed Telescope (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data on the observed objects in near real time. This contribution describes the current status of the project.

  10. Work Experience Report

    NASA Technical Reports Server (NTRS)

    Guo, Daniel

    2017-01-01

    The NASA Platform for Autonomous Systems (NPAS) toolkit is currently being used at the NASA John C. Stennis Space Center (SSC) to develop the INSIGHT program, which will autonomously monitor and control the Nitrogen System of the High Pressure Gas Facility (HPGF) on site. The INSIGHT program needs generic timing capabilities in order to perform timing-based actions such as pump usage timing and sequence step timing. The purpose of this project was to develop a timing module that could fulfill these requirements and be adaptable for expanded use in the future. The code was written on the Gensym G2 software platform, the same platform as INSIGHT, and was written generically to ensure compatibility with any G2 program. Currently, the module has two timing capabilities, a stopwatch function and a countdown function. Although the module has gone through some functionality testing, actual integration of the module into NPAS and the INSIGHT program is contingent on the module passing later checks.
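
    Since the report describes the two timing capabilities only at a high level, a minimal sketch of what such a generic module might look like is given below in Python rather than G2; the class names and the pump/step usage are invented for illustration.

    ```python
    import time

    class Stopwatch:
        """Elapsed-time timer, e.g. for tracking cumulative pump usage."""
        def __init__(self):
            self._start = None
            self.elapsed = 0.0

        def start(self):
            if self._start is None:
                self._start = time.monotonic()

        def stop(self):
            if self._start is not None:
                self.elapsed += time.monotonic() - self._start
                self._start = None

    class Countdown:
        """Fixed-duration timer, e.g. for sequence step timing."""
        def __init__(self, duration_s: float):
            self.deadline = time.monotonic() + duration_s

        def expired(self) -> bool:
            return time.monotonic() >= self.deadline

    # Usage sketch: limit a hypothetical pump run to 30 seconds.
    pump_timer, step = Stopwatch(), Countdown(30.0)
    pump_timer.start()
    while not step.expired():
        time.sleep(0.5)     # ... monitor and control the nitrogen system here ...
        break               # shortened for the example
    pump_timer.stop()
    print(f"pump ran for {pump_timer.elapsed:.1f} s")
    ```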

  11. AGATE: Adversarial Game Analysis for Tactical Evaluation

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terrance L.

    2013-01-01

    AGATE generates a set of ranked strategies that enables an autonomous vehicle to track/trail another vehicle that is trying to break contact using evasive tactics. The software is efficient (it can be run on a laptop), scales well with environmental complexity, and is suitable for use onboard an autonomous vehicle. The software will run in near real time (2 Hz) on most commercial laptops. Existing software is usually run offline in a planning mode and is not used to actively control an unmanned vehicle. JPL has developed a system for AGATE that uses adversarial game theory (AGT) methods (in particular, leader-follower and pursuit-evasion) to enable an autonomous vehicle (AV) to maintain tracking/trailing operations on a target that is employing evasive tactics. The AV trailing, tracking, and reacquisition operations are characterized by imperfect information and are an example of a non-zero-sum game (a positive payoff for the AV is not necessarily an equal loss for the target being tracked and, potentially, additional adversarial boats). Previously, JPL successfully applied the Nash equilibrium method for onboard control of an autonomous ground vehicle (AGV) travelling over hazardous terrain.
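
    For readers unfamiliar with leader-follower reasoning, a toy sketch follows. The payoff matrices and action labels are invented, and this is not the AGATE algorithm itself; it only shows how a ranked set of leader strategies can be produced by assuming the evader best-responds to each candidate action.

    ```python
    import numpy as np

    # Hypothetical payoffs for a one-shot leader-follower (Stackelberg) game:
    # rows = AV (leader) actions, columns = evader (follower) actions.
    # Non-zero-sum: the AV's gain is not exactly the evader's loss.
    av_payoff = np.array([[ 3, -1,  0],      # shadow, sprint, loiter vs turn, dash, hide
                          [ 1,  2, -2],
                          [ 0,  0,  1]])
    evader_payoff = np.array([[-2,  2,  1],
                              [ 0, -3,  2],
                              [ 1,  1, -1]])

    def stackelberg_ranking(leader_u, follower_u):
        """Rank leader actions by their payoff assuming the follower best-responds."""
        ranked = []
        for a in range(leader_u.shape[0]):
            br = int(np.argmax(follower_u[a]))     # follower's best response to action a
            ranked.append((leader_u[a, br], a, br))
        return sorted(ranked, reverse=True)        # ranked strategy set, best first

    for value, a, br in stackelberg_ranking(av_payoff, evader_payoff):
        print(f"AV action {a}: payoff {value} if the evader best-responds with action {br}")
    ```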

  12. Space Station power system autonomy demonstration

    NASA Technical Reports Server (NTRS)

    Kish, James A.; Dolce, James L.; Weeks, David J.

    1988-01-01

    The Systems Autonomy Demonstration Program (SADP) represents NASA's major effort to demonstrate, through a series of complex ground experiments, the application and benefits of applying advanced automation technologies to the Space Station project. Lewis Research Center (LeRC) and Marshall Space Flight Center (MSFC) will first jointly develop an autonomous power system using existing Space Station testbed facilities at each center. The subsequent 1990 power-thermal demonstration will then involve the cooperative operation of the LeRC/MSFC power system with the Johnson Space Center's (JSC) thermal control and DMS/OMS testbed facilities. The testbeds and expert systems at each of the NASA centers will be interconnected via communication links. The appropriate knowledge-based technology will be developed for each testbed and applied to problems requiring intersystem cooperation. Primary emphasis will be focused on failure detection and classification, system reconfiguration, planning and scheduling of electrical power resources, and integration of knowledge-based and conventional control system software into the design and operation of Space Station testbeds.

  13. Control of autonomous ground vehicles: a brief technical review

    NASA Astrophysics Data System (ADS)

    Babak, Shahian-Jahromi; Hussain, Syed A.; Karakas, Burak; Cetin, Sabri

    2017-07-01

    This paper presents a brief review of the developments achieved in autonomous vehicle systems technology. A concise history of autonomous driver assistance systems is presented, followed by a review of current state-of-the-art sensor technology used in autonomous vehicles. Standard sensor fusion methods that have recently been explored are discussed. Finally, advances in embedded software methodologies that define the logic between sensory information and actuation decisions are reviewed.
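
    As a concrete example of a standard sensor fusion method of the kind such reviews cover, the following minimal scalar Kalman update blends two range measurements of differing quality; the sensors and numbers are illustrative only, not drawn from the paper.

    ```python
    def fuse(estimate, variance, measurement, meas_variance):
        """One scalar Kalman update: blend a prior estimate with a new measurement."""
        gain = variance / (variance + meas_variance)
        new_estimate = estimate + gain * (measurement - estimate)
        new_variance = (1.0 - gain) * variance
        return new_estimate, new_variance

    # Fuse a precise lidar range with a noisier radar range for the same obstacle.
    x, p = 25.0, 4.0                    # prior: 25 m, variance 4 m^2
    x, p = fuse(x, p, 24.2, 1.0)        # lidar measurement, variance 1 m^2
    x, p = fuse(x, p, 26.5, 9.0)        # radar measurement, variance 9 m^2
    print(f"fused range: {x:.2f} m, variance {p:.2f} m^2")
    ```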

  14. A Virtual Ocean Observatory for Climate and Ocean Science: Synergistic Applications for SWOT and XOVWM

    NASA Astrophysics Data System (ADS)

    Arabshahi, P.; Howe, B. M.; Chao, Y.; Businger, S.; Chien, S.

    2010-12-01

    We present a virtual ocean observatory (VOO) that supports climate and ocean science as addressed in the NRC decadal survey. The VOO is composed of an autonomous software system, in-situ and space-based sensing assets, data sets, and interfaces to ocean and atmosphere models. The purposes of this observatory and its output data products are: 1) to support SWOT mission planning, 2) to serve as a vanguard for fusing SWOT, XOVWM, and in-situ data sets through fusion of OSTM (SWOT proxy) and QuikSCAT (XOVWM proxy) data with in-situ data, and 3) to serve as a feed-forward platform for high-resolution measurements of ocean surface topography (OST) in island and coastal environments utilizing space-based and in-situ adaptive sampling. The VOO will enable models capable of simulating and estimating realistic oceanic processes and atmospheric forcing of the ocean in these environments. Such measurements are critical in understanding the oceans' effects on global climate. The information systems innovations of the VOO are: 1. Development of an autonomous software platform for automated mission planning and for combining science data products of QuikSCAT and OSTM with complementary in-situ data sets to deliver new data products. This software will present first-step demonstrations of technology that, once matured, will offer increased operational capability to SWOT by providing automated planning and new science data sets using automated workflows. The future data sets to be integrated include those from SWOT and XOVWM. 2. A capstone demonstration of the effort utilizes the elements developed in (1) above to achieve adaptive in-situ sampling through feedback from space-based assets via the SWOT simulator. This effort will directly contribute to orbit design during the experimental phase (first 6-9 months) of the SWOT mission through high-resolution regional atmospheric and ocean modeling and sampling. It will also contribute to SWOT science via integration of in-situ data, QuikSCAT and OSTM data sets, and models, thus serving as a technology pathfinder for SWOT and XOVWM data fusion; and will contribute to SWOT operations via data fusion and mission planning technology. The goals of our project are as follows: (a) Develop and test the VOO, including hardware, in-situ science platforms (Seagliders) and instruments, and two autonomous software modules: 1) automated data fusion/assimilation, and 2) automated planning technology; (b) Generate new data sets (OST data in the Hawaiian Islands region) from fusion of in-situ data with QuikSCAT and OSTM data; (c) Integrate data sets derived from the VOO into the SWOT simulator for improved SWOT mission planning; (d) Demonstrate via Hawaiian Islands region field experiments and simulation the operational capability of the VOO to generate improved hydrologic cycle/ocean science, in particular: mesoscale and submesoscale ocean circulation including velocities, vorticity, and stress measurements, which are important to the modeling of ocean currents, eddies and mixing.

  15. KSC-2014-3536

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Kennedy Space Center Director and former astronaut Bob Cabana talks to Florida middle school students and their teachers during the Zero Robotics finals competition at the center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  16. KSC-2014-3535

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Kennedy Space Center Director and former astronaut Bob Cabana talks to Florida middle school students and their teachers during the Zero Robotics finals competition at the center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  17. KSC-2014-3537

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Kennedy Space Center Director and former astronaut Bob Cabana talks to Florida middle school students and their teachers during the Zero Robotics finals competition at the center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  18. Autonomous Aerobraking: Thermal Analysis and Response Surface Development

    NASA Technical Reports Server (NTRS)

    Dec, John A.; Thornblom, Mark N.

    2011-01-01

    A high-fidelity thermal model of the Mars Reconnaissance Orbiter was developed for use in an autonomous aerobraking simulation study. Response surface equations were derived from the high-fidelity thermal model and integrated into the autonomous aerobraking simulation software. The high-fidelity thermal model was developed using the Thermal Desktop software and used in all phases of the analysis. The exclusive use of Thermal Desktop represented a change from previously developed aerobraking thermal analysis methodologies. Comparisons were made between the Thermal Desktop solutions and those developed for the previous aerobraking thermal analyses performed on the Mars Reconnaissance Orbiter during aerobraking operations. A variable sensitivity screening study was performed to reduce the number of variables carried in the response surface equations. Thermal analysis and response surface equation development were performed for autonomous aerobraking missions at Mars and Venus.
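
    A response surface equation is simply a cheap surrogate fitted to a set of high-fidelity model runs so it can be evaluated onboard. The sketch below is illustrative only: the input variables, data values, and quadratic form are assumptions for the example, not the MRO thermal model or its screened variable set.

    ```python
    import numpy as np

    # Hypothetical training data: density and velocity at periapsis from a handful of
    # high-fidelity thermal-model runs, and the resulting peak structural temperature.
    rho  = np.array([40., 50., 55., 60., 70., 80., 90., 95.])          # kg/km^3
    vel  = np.array([3.40, 3.50, 3.45, 3.60, 3.40, 3.55, 3.60, 3.50])  # km/s
    temp = np.array([305., 338., 352., 378., 384., 428., 462., 466.])  # K

    # Quadratic response surface: T ~ c0 + c1*rho + c2*v + c3*rho^2 + c4*v^2 + c5*rho*v
    X = np.column_stack([np.ones_like(rho), rho, vel, rho**2, vel**2, rho * vel])
    coeffs, *_ = np.linalg.lstsq(X, temp, rcond=None)

    def predict_peak_temp(r, v):
        """Cheap surrogate standing in for the high-fidelity thermal model."""
        return float(np.dot(coeffs, [1.0, r, v, r * r, v * v, r * v]))

    print(f"predicted peak temperature: {predict_peak_temp(75.0, 3.55):.1f} K")
    ```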

  19. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S., Jr.

    2013-01-01

    Working on the ACLO (Autonomous Cryogenics Loading Operations) project, I have had the opportunity to add functionality to the physics simulation software known as KATE (Knowledgebase Autonomous Test Engineer), to create a new application allowing WYSIWYG (what-you-see-is-what-you-get) creation of KATE schematic files, and to begin a preliminary design and implementation of a new subsystem that will provide vision services on the IHM (Integrated Health Management) bus. The functionality I added to KATE over the past few months includes a dynamic visual representation of the fluid height in a pipe based on the number of gallons of fluid in the pipe, and implementation of the IHM bus connection within KATE. I also fixed a broken feature in the system called the Browser Display, implemented many bug fixes, and made changes to the GUI (Graphical User Interface).
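
    One of the items mentioned, the fluid height in a pipe derived from a gallon count, reduces to inverting the circular-segment area formula for a horizontal cylinder. The sketch below is an illustrative stand-in for that calculation, not the KATE implementation; the pipe dimensions are made up.

    ```python
    import math

    def fill_height(volume_gal, diameter_ft, length_ft):
        """Fluid height in a horizontal cylindrical pipe, found by bisecting the
        circular-segment area formula (illustrative, not the KATE code)."""
        r = diameter_ft / 2.0
        target_area = (volume_gal / 7.4805) / length_ft   # ft^2; 7.4805 gal per ft^3
        lo, hi = 0.0, diameter_ft
        for _ in range(60):                               # bisection on a monotone function
            h = 0.5 * (lo + hi)
            # area of a circular segment of height h in a circle of radius r
            area = r * r * math.acos((r - h) / r) - (r - h) * math.sqrt(2 * r * h - h * h)
            lo, hi = (h, hi) if area < target_area else (lo, h)
        return 0.5 * (lo + hi)

    print(f"height: {fill_height(volume_gal=20.0, diameter_ft=1.0, length_ft=10.0):.3f} ft")
    ```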

  20. Towards Supervising Remote Dexterous Robots Across Time Delay

    NASA Technical Reports Server (NTRS)

    Hambuchen, Kimberly; Bluethmann, William; Goza, Michael; Ambrose, Robert; Wheeler, Kevin; Rabe, Ken

    2006-01-01

    The President's Vision for Space Exploration, laid out in 2004, relies heavily upon robotic exploration of the lunar surface in early phases of the program. Prior to the arrival of astronauts on the lunar surface, these robots will be required to be controlled across space and time, posing a considerable challenge for traditional telepresence techniques. Because time delays will be measured in seconds, not minutes as is the case for Mars exploration, uploading a plan for an entire day seems excessive. An approach for controlling dexterous robots under intermediate time delay is presented, in which software running within a ground control cockpit predicts the intention of an immersed robot supervisor, and then the remote robot autonomously executes the supervisor's intended tasks. Initial results are presented.

  1. Autonomous Cryogenic Load Operations: Knowledge-Based Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Schrading, J. Nicolas

    2013-01-01

    The Knowledge-Based Autonomous Test Engineer (KATE) program has a long history at KSC. Now a part of the Autonomous Cryogenic Load Operations (ACLO) mission, this software system has been sporadically developed over the past 20 years. Originally designed to provide health and status monitoring for a simple water-based fluid system, it was proven to be a capable autonomous test engineer for determining sources of failure in the system. As part of a new goal to provide this same anomaly-detection capability for a complicated cryogenic fluid system, software engineers, physicists, interns and KATE experts are working to upgrade the software capabilities and graphical user interface. Much progress was made during this effort to improve KATE. A display of the entire cryogenic system's graph, with nodes for components and edges for their connections, was added to the KATE software. A searching functionality was added to the new graph display, so that users could easily center their screen on specific components. The GUI was also modified so that it displayed information relevant to the new project goals. In addition, work began on adding new pneumatic and electronic subsystems into the KATE knowledge base, so that it could provide health and status monitoring for those systems. Finally, many fixes for bugs, memory leaks, and memory errors were implemented and the system was moved into a state in which it could be presented to stakeholders. Overall, the KATE system was improved and necessary additional features were added so that a presentation of the program and its functionality in the next few months would be a success.

  2. Autonomous Cryogenic Load Operations: KSC Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Shrading, Nicholas J.

    2012-01-01

    The KSC Autonomous Test Engineer (KATE) program has a long history at KSC. Now a part of the Autonomous Cryogenic Load Operations (ACLO) mission, this software system has been sporadically developed over the past 20+ years. Originally designed to provide health and status monitoring for a simple water-based fluid system, it was proven to be a capable autonomous test engineer for determining sources of failure in the system. As part of a new goal to provide this same anomaly-detection capability for a complicated cryogenic fluid system, software engineers, physicists, interns and KATE experts are working to upgrade the software capabilities and graphical user interface. Much progress was made during this effort to improve KATE. A display of the entire cryogenic system's graph, with nodes for components and edges for their connections, was added to the KATE software. A searching functionality was added to the new graph display, so that users could easily center their screen on specific components. The GUI was also modified so that it displayed information relevant to the new project goals. In addition, work began on adding new pneumatic and electronic subsystems into the KATE knowledge base, so that it could provide health and status monitoring for those systems. Finally, many fixes for bugs, memory leaks, and memory errors were implemented and the system was moved into a state in which it could be presented to stakeholders. Overall, the KATE system was improved and necessary additional features were added so that a presentation of the program and its functionality in the next few months would be a success.

  3. Sampling Technique for Robust Odorant Detection Based on MIT RealNose Data

    NASA Technical Reports Server (NTRS)

    Duong, Tuan A.

    2012-01-01

    This technique enhances the detection capability of the autonomous RealNose system from MIT to detect odorants and their concentrations in noisy and transient environments. The low-cost, portable system with low power consumption will operate at high speed and is suited for unmanned and remotely operated long-life applications. A deterministic mathematical model was developed to detect odorants and calculate their concentration in noisy environments. Real data from MIT's NanoNose was examined, from which a signal conditioning technique was proposed to enable robust odorant detection for the RealNose system. Its sensitivity can reach sub-part-per-billion (sub-ppb) levels. A Space Invariant Independent Component Analysis (SPICA) algorithm was developed to deal with non-linear mixing that is an over-complete case, and it is used as a preprocessing step to recover the original odorant sources for detection. This approach, combined with the Cascade Error Projection (CEP) Neural Network algorithm, was used to perform odorant identification. Signal conditioning is used to identify potential processing windows to enable robust detection for autonomous systems. So far, the software has been developed and evaluated with current data sets provided by the MIT team. However, continuous data streams are made available where even the occurrence of a new odorant is unannounced and needs to be noticed by the system autonomously before its unambiguous detection. The challenge for the software is to be able to separate the potential valid signal of the odorant from the noisy transition region when the odorant is just introduced.

  4. Autonomous Navigation With Ground Station One-Way Forward-Link Doppler Data

    NASA Technical Reports Server (NTRS)

    Horstkamp, G. M.; Niklewski, D. J.; Gramling, C. J.

    1996-01-01

    The National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) has spent several years developing operational onboard navigation systems (ONS's) to provide real time autonomous, highly accurate navigation products for spacecraft using NASA's space and ground communication systems. The highly successful Tracking and Data Relay Satellite (TDRSS) ONS (TONS) experiment on the Explorer Platform/Extreme Ultraviolet (EP/EUV) spacecraft, launched on June 7, 1992, flight demonstrated the ONS for high accuracy navigation using TDRSS forward link communication services. In late 1994, a similar ONS experiment was performed using EP/EUV flight hardware (the ultrastable oscillator and Doppler extractor card in one of the TDRSS transponders) and ground system software to demonstrate the feasibility of using an ONS with ground station forward link communication services. This paper provides a detailed evaluation of ground station-based ONS performance of data collected over a 20 day period. The ground station ONS (GONS) experiment results are used to project the expected performance of an operational system. The GONS processes Doppler data derived from scheduled ground station forward link services using a sequential estimation algorithm enhanced by a sophisticated process noise model to provide onboard orbit and frequency determination. Analysis of the GONS experiment performance indicates that real time onboard position accuracies of better than 125 meters (1 sigma) are achievable with two or more 5-minute contacts per day for the EP/EUV 525 kilometer altitude, 28.5 degree inclination orbit. GONS accuracy is shown to be a function of the fidelity of the onboard propagation model, the frequency/geometry of the tracking contacts, and the quality of the tracking measurements. GONS provides a viable option for using autonomous navigation to reduce operational costs for upcoming spacecraft missions with moderate position accuracy requirements.
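
    For orientation, the forward-link Doppler observable used by such a system can be modeled, to first order, as a frequency shift proportional to the line-of-sight range-rate plus the onboard oscillator bias. The sketch below uses made-up link numbers and is not the GONS filter itself, which wraps this kind of measurement model in a sequential estimator with a sophisticated process-noise model.

    ```python
    C = 299_792_458.0          # speed of light, m/s

    def predicted_doppler_shift(range_rate_mps, f_transmit_hz, osc_bias_hz=0.0):
        """First-order one-way forward-link Doppler model: the shift seen on board
        depends on the line-of-sight range-rate and the onboard oscillator bias."""
        return -f_transmit_hz * range_rate_mps / C + osc_bias_hz

    def range_rate_from_doppler(shift_hz, f_transmit_hz, osc_bias_hz=0.0):
        """Invert the model; the residual against the orbit-propagated prediction is
        what a sequential (Kalman-type) filter would use to correct the state."""
        return -(shift_hz - osc_bias_hz) * C / f_transmit_hz

    # Illustrative numbers only: 2.1 GHz forward link, 6 km/s closing rate, 12 Hz bias.
    shift = predicted_doppler_shift(-6000.0, 2.1e9, osc_bias_hz=12.0)
    print(f"predicted shift: {shift:.1f} Hz, "
          f"recovered range-rate: {range_rate_from_doppler(shift, 2.1e9, 12.0):.1f} m/s")
    ```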

  5. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Simpson, James

    2010-01-01

    The Autonomous Flight Safety System (AFSS) is an independent, self-contained subsystem mounted onboard a launch vehicle. AFSS was developed by, and is owned by, the US Government. It autonomously makes flight termination/destruct decisions using configurable, software-based rules implemented on redundant flight processors that use data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.
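
    A rule-based termination check of this general shape can be sketched as follows. The rules, thresholds, and the simple majority vote across redundant navigation solutions are hypothetical illustrations, not the actual Range Safety rule set or the AFSS voting logic.

    ```python
    # Hypothetical, simplified flight-termination rule check. Real AFSS rules are
    # configured by Range Safety and run on redundant processors with GPS/IMU inputs.
    RULES = [
        {"name": "lateral corridor", "test": lambda s: abs(s["crossrange_km"]) > 15.0},
        {"name": "altitude floor",   "test": lambda s: s["alt_km"] < 1.0 and s["t_s"] > 20.0},
        {"name": "nav data stale",   "test": lambda s: s["nav_age_s"] > 2.0},
    ]

    def evaluate(nav_solutions):
        """Return True (terminate) only if a majority of the redundant navigation
        solutions violate at least one configured rule."""
        votes = sum(1 for sol in nav_solutions
                    if any(rule["test"](sol) for rule in RULES))
        return votes > len(nav_solutions) // 2

    state_a = {"crossrange_km": 18.2, "alt_km": 12.0, "t_s": 45.0, "nav_age_s": 0.1}
    state_b = {"crossrange_km": 17.9, "alt_km": 12.1, "t_s": 45.0, "nav_age_s": 0.1}
    print("TERMINATE" if evaluate([state_a, state_b]) else "continue")
    ```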

  6. Contingency Software in Autonomous Systems: Technical Level Briefing

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Patterson-Hines, Ann

    2006-01-01

    Contingency management is essential to the robust operation of complex systems such as spacecraft and Unpiloted Aerial Vehicles (UAVs). Automatic contingency handling allows a faster response to unsafe scenarios with reduced human intervention on low-cost and extended missions. Results, applied to the Autonomous Rotorcraft Project and Mars Science Lab, pave the way to more resilient autonomous systems.

  7. Autonomous docking ground demonstration (category 3)

    NASA Technical Reports Server (NTRS)

    Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.

    1991-01-01

    The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees-of-Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS, and supporting computer configuration), including their capabilities and limitations, will be provided. A demonstration architecture drawing and photographs of the test configuration will be presented.

  8. Autonomous docking ground demonstration (category 3)

    NASA Astrophysics Data System (ADS)

    Lamkin, Steve L.; Eick, Richard E.; Baxter, James M.; Boyd, M. G.; Clark, Fred D.; Lee, Thomas Q.; Othon, L. T.; Prather, Joseph L.; Spehar, Peter T.; Teders, Rebecca J.

    The NASA Johnson Space Center (JSC) is involved in the development of an autonomous docking ground demonstration. The demonstration combines the technologies, expertise and facilities of the JSC Tracking and Communications Division (EE), Structures and Mechanics Division (ES), and the Navigation, Guidance and Control Division (EG) and their supporting contractors. The autonomous docking ground demonstration is an evaluation of the capabilities of the laser sensor system to support the docking phase (12 ft to contact) when operated in conjunction with the Guidance, Navigation and Control Software. The docking mechanism being used was developed for the Apollo Soyuz Test Program. This demonstration will be conducted using the Six-Degrees-of-Freedom (6-DOF) Dynamic Test System (DTS). The DTS environment simulates the Space Station Freedom as the stationary or target vehicle and the Orbiter as the active or chase vehicle. For this demonstration the laser sensor will be mounted on the target vehicle and the retroreflectors on the chase vehicle. This arrangement was used to prevent potential damage to the laser. The sensor system, GN&C, and 6-DOF DTS will be operated closed-loop. Initial conditions to simulate vehicle misalignments, translational and rotational, will be introduced within the constraints of the systems involved. A detailed description of each of the demonstration components (e.g., sensor system, GN&C, 6-DOF DTS, and supporting computer configuration), including their capabilities and limitations, will be provided. A demonstration architecture drawing and photographs of the test configuration will be presented.

  9. KSC-2014-3542

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Former astronaut Greg Johnson, at left, executive director of the Center for the Advancement of Science in Space, and NASA Kennedy Space Center Director Bob Cabana visit with Florida middle school students and their teachers before the start of the Zero Robotics finals competition at NASA Kennedy Space Center's Space Station Processing Facility in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  10. JAXA-NASA Interoperability Demonstration for Application of DTN Under Simulated Rain Attenuation

    NASA Technical Reports Server (NTRS)

    Suzuki, Kiyoshisa; Inagawa, Shinichi; Lippincott, Jeff; Cecil, Andrew J.

    2014-01-01

    As is well known, K-band or higher-band communications in the space link segment often experience intermittent disruptions caused by heavy rainfall. In view of keeping data integrity and establishing autonomous operations in such situations, it is important to consider introducing a tolerance mechanism such as Delay/Disruption Tolerant Networking (DTN). The Consultative Committee for Space Data Systems (CCSDS) is studying DTN as part of the standardization activities for space data systems. As a contribution to CCSDS and a feasibility study for future utilization of DTN, the Japan Aerospace Exploration Agency (JAXA) and the National Aeronautics and Space Administration (NASA) conducted an interoperability demonstration to confirm its tolerance mechanism and capability for automatic operation using the Data Relay Test Satellite (DRTS) space link and its ground terminals. Both parties used the Interplanetary Overlay Network (ION) open source software, including the Bundle Protocol, the Licklider Transmission Protocol, and Contact Graph Routing. This paper introduces the contents of the interoperability demonstration and its results.
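
    Contact Graph Routing plans bundle forwarding over a schedule of known contacts. The following is a deliberately simplified earliest-arrival search in that spirit; the contact plan and node names are invented, and the ION flight implementation is far richer (volume limits, route caching, multiple routes, etc.).

    ```python
    import heapq

    # A contact is a scheduled transmission opportunity: (sender, receiver, start, end, owlt),
    # with times and one-way light time (owlt) in seconds.
    CONTACTS = [
        ("ground", "drts",    0,   600, 0.13),   # ground station -> relay satellite
        ("drts",   "orbiter", 300, 900, 0.25),
        ("ground", "orbiter", 1200, 1500, 0.40), # later direct pass
    ]

    def earliest_arrival(source, dest, t0):
        """Earliest time a bundle leaving `source` at `t0` can reach `dest`."""
        best = {source: t0}
        queue = [(t0, source)]
        while queue:
            t, node = heapq.heappop(queue)
            if node == dest:
                return t
            for snd, rcv, start, end, owlt in CONTACTS:
                if snd != node or t > end:
                    continue                      # contact is over or not from this node
                arrival = max(t, start) + owlt    # wait for the contact to open, then send
                if arrival < best.get(rcv, float("inf")):
                    best[rcv] = arrival
                    heapq.heappush(queue, (arrival, rcv))
        return None

    print(f"bundle arrives at t = {earliest_arrival('ground', 'orbiter', 100.0):.2f} s")
    ```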

  11. Relative navigation and attitude determination using a GPS/INS integrated system near the International Space Station

    NASA Astrophysics Data System (ADS)

    Um, Jaeyong

    2001-08-01

    The Space Integrated GPS/INS (SIGI) sensor is the primary navigation and attitude determination source for the International Space Station (ISS). The SIGI was successfully demonstrated on-orbit for the first time in the SIGI Orbital Attitude Readiness (SOAR) demonstration on the Space Shuttle Atlantis in May 2000. Numerous proximity operations near the ISS have been and will be performed over the lifetime of the Station. The development of an autonomous relative navigation system is needed to improve the safety and efficiency of vehicle operations near the ISS. A hardware simulation study was performed for GPS-based relative navigation using the state vector difference approach and the interferometric approach in the absence of multipath. The interferometric approach, in which the relative states are estimated directly, showed comparable results for a 1 km baseline. One of the most pressing current technical issues is the design of an autonomous relative navigation system in the proximity of the ISS, where GPS signals are blocked and maneuvers happen frequently. An integrated GPS/INS system is investigated for the possibility of a fully autonomous relative navigation system. Another application of GPS measurements is determination of the vehicle's orientation in space. This study used the SOAR experiment data to characterize the SIGI's on-orbit performance for attitude determination. A cold start initialization algorithm was developed for integer ambiguity resolution in any initial orientation. The original integer ambiguity resolution algorithm used in the SIGI was developed for terrestrial applications, an operational limitation that reduced its effectiveness in space. The new algorithm was tested using the SOAR data and has been incorporated in the current SIGI flight software. The attitude estimation performance was examined using two different GPS/INS integration algorithms. The GPS/INS attitude solution using the SOAR data was as accurate as 0.06 deg (RMS) in three axes with multipath mitigation. Other improvements to the attitude determination algorithm were the development of a faster integer ambiguity resolution method and the incorporation of line bias modeling.

  12. Autonomous and Autonomic Systems: A Paradigm for Future Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walter F.; Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2004-01-01

    NASA increasingly will rely on autonomous systems concepts, not only in the mission control centers on the ground, but also on spacecraft and on rovers and other assets on extraterrestrial bodies. Autonomy enables not only reduced operations costs, but also adaptable, goal-driven functionality of mission systems. Space missions lacking autonomy will be unable to achieve the full range of advanced mission objectives, given that human control under dynamic environmental conditions will not be feasible due, in part, to the unavoidably high signal propagation latency and constrained data rates of mission communications links. While autonomy cost-effectively supports accomplishment of mission goals, autonomicity supports survivability of remote mission assets, especially when human tending is not feasible. Autonomic system properties (which ensure self-configuring, self-optimizing, self-healing, and self-protecting behavior) conceptually may enable space missions of a higher order than any previously flown. Analysis of two NASA agent-based systems previously prototyped, and of a proposed future mission involving numerous cooperating spacecraft, illustrates how autonomous and autonomic system concepts may be brought to bear on future space missions.

  13. Swarmathon 2017

    NASA Image and Video Library

    2017-04-19

    In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's second annual Swarmathon, 20 teams representing 22 minority serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the moon or Mars.

  14. Swarmathon 2018

    NASA Image and Video Library

    2018-04-18

    In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.

  15. Swarmathon 2018

    NASA Image and Video Library

    2018-04-17

    In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.

  16. Information Handling is the Problem

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.

    2001-01-01

    This slide presentation reviews the concerns surrounding the automation of information handling. There are two types of decision support software that support most Space Station flight controllers: one is very simple, and the other is very complex. A middle ground is sought; this is the motivation for the Human Centered Autonomous and Assistant Systems Testbed (HCAAST) Project. The aim is to study flight controllers at work, and in the bigger picture, with particular attention to how they handle information and how coordination of multiple teams is performed. The focus of the project is on intelligent assistants that help flight controllers handle information.

  17. Trusted Autonomy for Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Freed, Michael; Bonasso, Pete; Ingham, Mitch; Kortenkamp, David; Perix, John

    2005-01-01

    NASA has long supported research on intelligent control technologies that could allow space systems to operate autonomously or with reduced human supervision. Proposed uses range from automated control of entire space vehicles, to mobile robots that assist or substitute for astronauts, to vehicle systems such as life support that interact with other systems in complex ways and require constant vigilance. The potential for pervasive use of such technology to extend the kinds of missions that are possible in practice is well understood, as is its potential to radically improve the robustness, safety and productivity of diverse mission systems. Despite its acknowledged potential, intelligent control capabilities are rarely used in space flight systems. Perhaps the most famous example of intelligent control on a spacecraft is the Remote Agent system flown on the Deep Space One mission (1998 - 2001). However, even in this case, the role of the intelligent control element, originally intended to have full control of the spacecraft for the duration of the mission, was reduced to having partial control for a two-week non-critical period. Even this level of mission acceptance was exceptional. In most cases, mission managers consider intelligent control systems an unacceptable source of risk and elect not to fly them. Overall, the technology is not trusted. From the standpoint of those who need to decide whether to incorporate this technology, lack of trust is easy to understand. Intelligent high-level control means allowing software to make decisions that are too complex for conventional software. The decision-making behavior of these systems is often hard to understand and inspect, and thus hard to evaluate. Moreover, such software is typically designed and implemented either as a research product or custom-built for a particular mission. In the former case, software quality is unlikely to be adequate for flight qualification and the functionality provided by the system is likely driven largely by the need to publish innovative work. In the latter case, the mission represents the first use of the system, a risky proposition even for relatively simple software.

  18. First Image from a Mars Rover Choosing a Target

    NASA Image and Video Library

    2010-03-23

    This true-color image is the result of the first observation of a target selected autonomously by NASA Mars Exploration Rover Opportunity using newly developed and uploaded software named Autonomous Exploration for Gathering Increased Science, or AEGIS.

  19. Single-Frequency GPS Relative Navigation in a High Ionosphere Orbital Environment

    NASA Technical Reports Server (NTRS)

    Conrad, Patrick R.; Naasz, Bo J.

    2007-01-01

    The Global Positioning System (GPS) provides a convenient source for space vehicle relative navigation measurements, especially for low Earth orbit formation flying and autonomous rendezvous mission concepts. For single-frequency GPS receivers, ionospheric path delay can be a significant error source if not properly mitigated. In particular, ionospheric effects are known to cause significant radial position error bias and add dramatically to relative state estimation error if the onboard navigation software does not force the use of measurements from common or shared GPS space vehicles. Results from GPS navigation simulations are presented for a pair of space vehicles flying in formation and using GPS pseudorange measurements to perform absolute and relative orbit determination. With careful measurement selection techniques, relative state estimation accuracy of better than 20 cm with standard GPS pseudorange processing, and better than 10 cm with single-differenced pseudorange processing, is shown.
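
    The single-differencing step credited here with the accuracy improvement is easy to state: subtracting the two receivers' pseudoranges to a common GPS space vehicle cancels the SV clock error and, over short baselines, most of the shared ionospheric delay that biases single-frequency measurements. A minimal sketch with made-up pseudorange values follows.

    ```python
    def single_difference(pr_chief, pr_deputy, common_svs):
        """Single-differenced pseudoranges across two receivers for shared GPS SVs.
        Differencing cancels the SV clock error and, over short baselines, most of
        the common ionospheric path delay."""
        return {sv: pr_chief[sv] - pr_deputy[sv]
                for sv in common_svs if sv in pr_chief and sv in pr_deputy}

    # Illustrative pseudoranges in meters (values are made up).
    chief  = {"G05": 21_304_512.8, "G12": 23_118_904.1, "G21": 20_991_330.6}
    deputy = {"G05": 21_304_391.4, "G12": 23_118_820.7, "G29": 22_450_001.0}

    # Forcing the use of common SVs is the measurement-selection step the paper stresses.
    common = set(chief) & set(deputy)
    print(single_difference(chief, deputy, common))
    ```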

  20. Integration of symbolic and algorithmic hardware and software for the automation of space station subsystems

    NASA Technical Reports Server (NTRS)

    Gregg, Hugh; Healey, Kathleen; Hack, Edmund; Wong, Carla

    1987-01-01

    Traditional expert systems, such as diagnostic and training systems, interact with users only through a keyboard and screen, and are usually symbolic in nature. Expert systems that require access to data bases, complex simulations and real-time instrumentation have both symbolic as well as algorithmic computing needs. These needs could both be met using a general purpose workstation running both symbolic and algorithmic code, or separate, specialized computers networked together. The latter approach was chosen to implement TEXSYS, the thermal expert system, developed by NASA Ames Research Center in conjunction with Johnson Space Center to demonstrate the ability of an expert system to autonomously monitor the thermal control system of the space station. TEXSYS has been implemented on a Symbolics workstation, and will be linked to a microVAX computer that will control a thermal test bed. This paper will explore the integration options, and present several possible solutions.

  1. KSC-2014-3540

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – Florida middle school students and their teachers greet students from other locations via WebEx before the start of the Zero Robotics finals competition. The Florida teams are at the Space Station Processing Facility at NASA's Kennedy Space Center in Florida. Students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  2. System architecture for asynchronous multi-processor robotic control system

    NASA Technical Reports Server (NTRS)

    Steele, Robert D.; Long, Mark; Backes, Paul

    1993-01-01

    The architecture for the Modular Telerobot Task Execution System (MOTES) as implemented in the Supervisory Telerobotics (STELER) Laboratory is described. MOTES is the software component of the remote site of a local-remote telerobotic system which is being developed for NASA for space applications, in particular Space Station Freedom applications. The system is being developed to provide control and supervised autonomous control to support both space based operation and ground-remote control with time delay. The local-remote architecture places task planning responsibilities at the local site and task execution responsibilities at the remote site. This separation allows the remote site to be designed to optimize task execution capability within a limited computational environment such as is expected in flight systems. The local site task planning system could be placed on the ground where few computational limitations are expected. MOTES is written in the Ada programming language for a multiprocessor environment.

  3. The NASA/Army Autonomous Rotorcraft Project

    NASA Technical Reports Server (NTRS)

    Whalley, M.; Freed, M.; Takahashi, M.; Christian, D.; Patterson-Hine, A.; Schulein, G.; Harris, R.

    2002-01-01

    An overview of the NASA Ames Research Center Autonomous Rotorcraft Project (ARP) is presented. The project brings together several technologies to address NASA and US Army autonomous vehicle needs, including a reactive planner for mission planning and execution, control system design incorporating a detailed understanding of the platform dynamics, and health monitoring and diagnostics. A candidate reconnaissance and surveillance mission is described. The autonomous agent architecture and its application to the candidate mission are presented. Details of the vehicle hardware and software development are provided.

  4. Generalized Software Architecture Applied to the Continuous Lunar Water Separation Process and the Lunar Greenhouse Amplifier

    NASA Technical Reports Server (NTRS)

    Perusich, Stephen; Moos, Thomas; Muscatello, Anthony

    2011-01-01

    This innovation provides the user with autonomous on-screen monitoring, embedded computations, and tabulated output for two new processes. The software was originally written for the Continuous Lunar Water Separation Process (CLWSP), but was found to be general enough to be applicable to the Lunar Greenhouse Amplifier (LGA) as well, with minor alterations. The resultant program should have general applicability to many laboratory processes (see figure). The objective for these programs was to create a software application that would provide both autonomous monitoring and data storage, along with manual manipulation. The software also allows operators the ability to input experimental changes and comments in real time without modifying the code itself. Common process elements, such as thermocouples, pressure transducers, and relative humidity sensors, are easily incorporated into the program in various configurations, along with specialized devices such as photodiode sensors. The goal of the CLWSP research project is to design, build, and test a new method to continuously separate, capture, and quantify water from a gas stream. The application is any In-Situ Resource Utilization (ISRU) process that desires to extract or produce water from lunar or planetary regolith. The present work is aimed at circumventing current problems and ultimately producing a system capable of continuous operation at moderate temperatures that can be scaled over a large capacity range depending on the ISRU process. The goal of the LGA research project is to design, build, and test a new type of greenhouse that could be used on the moon or Mars. The LGA uses super greenhouse gases (SGGs) to absorb long-wavelength radiation, thus creating a highly efficient greenhouse at a future lunar or Mars outpost. Silica-based glass, although highly efficient at trapping heat, is heavy, fragile, and not suitable for space greenhouse applications. Plastics are much lighter and resilient, but are not efficient for absorbing long-wavelength infrared radiation and therefore will lose more heat to the environment compared to glass. The LGA unit uses a transparent polymer antechamber that surrounds part of the greenhouse and encases the SGGs, thereby minimizing infrared losses through the plastic windows. With ambient temperatures at the lunar poles at 50 C, the LGA should provide a substantial enhancement to currently conceived lunar greenhouses. Positive results obtained from this project could lead to a future large-scale system capable of running autonomously on the Moon, Mars, and beyond. The software for both applications needs to run the entire unit and all of its subprocesses; however, throughout testing, many variables and parameters need to be changed as more is learned about the system's operation. The software provides the versatility to permit its operation to change as user requirements evolve.
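
    The configurable monitoring-and-logging pattern described above can be sketched generically. The sensor names, simulated readings, and file name below are placeholders for illustration, not the actual CLWSP/LGA code; real deployments would replace the read callbacks with data-acquisition hardware calls.

    ```python
    import csv, random, time
    from dataclasses import dataclass
    from typing import Callable

    @dataclass
    class Sensor:
        name: str
        units: str
        read: Callable[[], float]      # swapped for real DAQ reads in the lab

    # The sensor list is pure configuration, so thermocouples, pressure transducers,
    # or photodiodes can be added without touching the monitoring loop itself.
    SENSORS = [
        Sensor("bed_temp", "C", lambda: 24.0 + random.gauss(0, 0.3)),
        Sensor("line_pressure", "kPa", lambda: 101.3 + random.gauss(0, 0.5)),
    ]

    def monitor(sensors, cycles, period_s, logfile="process_log.csv"):
        """Autonomous monitoring loop: sample every sensor, tabulate, and log."""
        with open(logfile, "w", newline="") as f:
            writer = csv.writer(f)
            writer.writerow(["time_s"] + [f"{s.name} ({s.units})" for s in sensors])
            for i in range(cycles):
                writer.writerow([i * period_s] + [round(s.read(), 3) for s in sensors])
                time.sleep(period_s)

    monitor(SENSORS, cycles=3, period_s=0.1)   # shortened run for the example
    ```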

  5. Evaluation of a Mobile Platform for Proof-of-Concept Autonomous Site Selection and Preparation

    NASA Astrophysics Data System (ADS)

    Gammell, Jonathan

    A mobile robotic platform for Autonomous Site Selection and Preparation (ASSP) was developed for an analogue deployment to Mauna Kea, Hawai`i. A team of rovers performed an autonomous Ground Penetrating Radar (GPR) survey and constructed a level landing pad. They used interchangeable payloads that allowed the GPR and blade to be easily exchanged. Autonomy was accomplished by integrating the individual hardware devices with software based on the ArgoSoft framework previously developed at UTIAS. The rovers were controlled by an on-board netbook. The successes and failures of the devices and software modules are evaluated within. Recommendations are presented to address problems discovered during the deployment and to guide future research on the platform.

  6. Autonomous Instrument Placement for Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Leger, P. Chris; Maimone, Mark

    2009-01-01

    Autonomous Instrument Placement (AutoPlace) is onboard software that enables a Mars Exploration Rover to act autonomously in using its manipulator to place scientific instruments on or near designated rock and soil targets. Prior to the development of AutoPlace, it was necessary for human operators on Earth to plan every motion of the manipulator arm in a time-consuming process that included downlinking of images from the rover, analysis of images and creation of commands, and uplinking of commands to the rover. AutoPlace incorporates image analysis and planning algorithms into the onboard rover software, eliminating the need for the downlink/uplink command cycle. Many of these algorithms are derived from the existing ground-based image analysis and planning algorithms, with modifications and augmentations for onboard use.
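
    The place-instrument sequence described above can be pictured, very roughly, as the pipeline below: analyze imagery of the designated target, check that an arm plan exists, and only then execute. The Target structure, the analyze_images and plan_arm_motion stubs, and the reachability check are all hypothetical stand-ins, not MER flight code.

```python
# Hypothetical sketch of an onboard place-instrument sequence in the spirit of
# AutoPlace: image analysis selects a target point, a reach plan is checked,
# and the arm is commanded. All names and checks here are illustrative only.
from dataclasses import dataclass

@dataclass
class Target:
    x: float   # target point in the rover frame, meters
    y: float
    z: float

def analyze_images(stereo_pair) -> Target:
    # Placeholder for onboard stereo/terrain analysis of the designated target.
    return Target(0.8, 0.1, -0.3)

def plan_arm_motion(target: Target):
    # Placeholder kinematic/collision check; returns a step plan or None.
    reachable = (target.x**2 + target.y**2 + target.z**2) ** 0.5 < 1.0
    return ["deploy", "approach", "contact"] if reachable else None

def place_instrument(stereo_pair) -> bool:
    target = analyze_images(stereo_pair)
    plan = plan_arm_motion(target)
    if plan is None:
        return False                 # refuse an unsafe placement, await new target
    for step in plan:
        print("executing arm step:", step)
    return True

print("placement ok:", place_instrument(stereo_pair=None))
```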

  7. Project Morpheus: Morpheus 1.5A Lander Failure Investigation Results

    NASA Technical Reports Server (NTRS)

    Devolites, Jennifer L.; Olansen, Jon B.; Munday, Stephen R.

    2013-01-01

    On August 9, 2012, the Morpheus 1.5A vehicle crashed shortly after liftoff from the Kennedy Space Center. The loss was limited to the vehicle itself, and the event had been pre-declared to be a test failure and not a mishap. The Morpheus project is demonstrating advanced technologies for in-space and planetary surface vehicles, including: autonomous flight control, landing site hazard identification and safe site selection, relative surface and hazard navigation, precision landing, modular reusable flight software, and a high-performance, non-toxic, cryogenic liquid oxygen and liquid methane integrated main engine and attitude control propulsion system. A comprehensive failure investigation isolated the fault to the Inertial Measurement Unit (IMU) data path to the flight computer. Several improvements have been identified and implemented for the 1.5B and 1.5C vehicles.

  8. Space station automation study. Volume 1: Executive summary. Autonomous systems and assembly

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The purpose of the Space Station Automation Study (SSAS) was to develop informed technical guidance for NASA personnel in the use of autonomy and autonomous systems to implement space station functions.

  9. Autonomous Control of Space Reactor Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belle R. Upadhyaya; K. Zhao; S.R.P. Perillo

    2007-11-30

    Autonomous and semi-autonomous control is a key element of space reactor design in order to meet the mission requirements of safety, reliability, survivability, and life expectancy. In terrestrial nuclear power plants, human operators are available to perform intelligent control functions that are necessary for both normal and abnormal operational conditions.

  10. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.
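
    As a toy illustration of why model checking helps here, the sketch below exhaustively explores every reachable state of a tiny invented autonomy model (an operating mode plus a power resource) and reports the first state that violates a safety property, the kind of early error detection the article advocates. The model and the property are made up for the example.

```python
# Toy explicit-state model check of a small autonomy model: breadth-first
# exploration of all reachable (mode, power) states, flagging any state that
# violates a safety property. The model itself is invented for illustration.
from collections import deque

def successors(state):
    mode, power = state
    nexts = []
    if mode == "idle":
        nexts.append(("observing", power - 2))   # start an observation
    if mode == "observing":
        nexts.append(("idle", power - 1))        # finish and go idle
        nexts.append(("safe", power))            # fault response
    if mode in ("idle", "safe") and power < 10:
        nexts.append((mode, power + 3))          # recharge
    return [(m, min(p, 10)) for m, p in nexts]

def safe(state):
    _, power = state
    return power >= 0                            # safety property: never negative

def model_check(initial):
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        if not safe(s):
            return s                             # counterexample state
        for n in successors(s):
            if n not in seen:
                seen.add(n)
                frontier.append(n)
    return None

print("violation found:", model_check(("idle", 4)))
```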

  11. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  12. Agent Technology, Complex Adaptive Systems, and Autonomic Systems: Their Relationships

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Rouff, Christopher; Hinchey, Mike

    2004-01-01

    To reduce the cost of future spaceflight missions and to perform new science, NASA has been investigating autonomous ground and space flight systems. These cost-reduction goals are further complicated by the nanosatellites planned for future science data-gathering, which will have large communications delays and at times be out of contact with ground control for extended periods of time. This paper describes two prototype agent-based systems, the Lights-out Ground Operations System (LOGOS) and the Agent Concept Testbed (ACT), developed at NASA Goddard Space Flight Center (GSFC) to demonstrate autonomous operations of future space flight missions, and their autonomic properties. The paper discusses the architecture of the two agent-based systems, operational scenarios of both, and the two systems' autonomic properties.

  13. HRVanalysis: A Free Software for Analyzing Cardiac Autonomic Activity

    PubMed Central

    Pichot, Vincent; Roche, Frédéric; Celle, Sébastien; Barthélémy, Jean-Claude; Chouchou, Florian

    2016-01-01

    Since the pioneering studies of the 1960s, heart rate variability (HRV) has become an increasingly used non-invasive tool for examining cardiac autonomic functions and dysfunctions in various populations and conditions. Many calculation methods have been developed to address these issues, each with their strengths and weaknesses. Although its interpretation may remain difficult, this technique provides, from a non-invasive approach, reliable physiological information that was previously inaccessible, in many fields including death and health prediction, training and overtraining, cardiac and respiratory rehabilitation, sleep-disordered breathing, large cohort follow-ups, children's autonomic status, anesthesia, and neurophysiological studies. In this context, we developed HRVanalysis, software for analyzing HRV that has been used and improved for over 20 years and is thus designed to meet laboratory requirements. The main strength of HRVanalysis is its wide application scope. In addition to standard analysis over short and long periods of RR intervals, the software allows time-frequency analysis using the wavelet transform as well as analysis of autonomic nervous system status on surrounding scored events and on preselected labeled areas. Moreover, the interface is designed for easy study of large cohorts, including batch-mode signal processing to avoid running repetitive operations. Results are displayed as figures or saved in TXT files directly usable in statistical software. Recordings can arise from RR or EKG files produced by different types of devices, such as heart rate monitors, Holter EKGs, polygraphs, and data acquisition systems. HRVanalysis can be downloaded freely from the Web page at https://anslabtools.univ-st-etienne.fr. HRVanalysis is meticulously maintained and developed for in-house laboratory use. In this article, after a brief description of the context, we present an overall view of HRV analysis and describe the methodological approach of the different techniques provided by the software. PMID:27920726
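
    For readers unfamiliar with the standard time-domain HRV measures this kind of software computes, the sketch below shows two of them, SDNN and RMSSD, calculated from a short made-up series of RR intervals. It illustrates the textbook definitions only and is not code from HRVanalysis.

```python
# Minimal time-domain HRV calculation (SDNN, RMSSD) over an RR-interval series.
# The sample RR values are made up; this is not code from HRVanalysis.
import math

def sdnn(rr_ms):
    # Standard deviation of all RR intervals (sample standard deviation).
    mean = sum(rr_ms) / len(rr_ms)
    return math.sqrt(sum((x - mean) ** 2 for x in rr_ms) / (len(rr_ms) - 1))

def rmssd(rr_ms):
    # Root mean square of successive RR-interval differences.
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

rr = [812, 790, 845, 830, 805, 798, 860, 842]   # successive RR intervals, ms
print(f"mean HR  {60000 / (sum(rr) / len(rr)):.1f} bpm")
print(f"SDNN     {sdnn(rr):.1f} ms")
print(f"RMSSD    {rmssd(rr):.1f} ms")
```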

  14. KSC-2014-3543

    NASA Image and Video Library

    2014-08-15

    CAPE CANAVERAL, Fla. – The Kennedy Space Center Visitor Complex Spaceperson poses for a photo with Carver Middle School students and their teacher from Orlando, Florida, during the Zero Robotics finals competition at NASA Kennedy Space Center's Space Station Processing Facility in Florida. The team, whose members belong to the After School All-Stars, was a regional winner and advanced to the final competition. For the competition, students designed software to control Synchronized Position Hold Engage and Reorient Experimental Satellites, or SPHERES, and competed with other teams locally. Zero Robotics is a robotics programming competition in which the robots are SPHERES. The competition starts online, where teams program the SPHERES to solve an annual challenge. After several phases of virtual competition in a simulation environment that mimics the real SPHERES, finalists are selected to compete in a live championship aboard the space station. Students compete to win a technically challenging game by programming their strategies into the SPHERES satellites. The programs are autonomous and the students cannot control the satellites during the test. Photo credit: NASA/Daniel Casper

  15. Architecting Communication Network of Networks for Space System of Systems

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul B.; Hayden, Jeffrey L.

    2008-01-01

    The National Aeronautics and Space Administration (NASA) and the Department of Defense (DoD) are planning Space System of Systems (SoS) to address the new challenges of space exploration, defense, communications, navigation, Earth observation, and science. In addition, these complex systems must provide interoperability, enhanced reliability, common interfaces, dynamic operations, and autonomy in system management. Both NASA and the DoD have chosen to meet the new demands with high data rate communication systems and space Internet technologies that bring Internet Protocols (IP), routers, servers, software, and interfaces to space networks to enable as much autonomous operation of those networks as possible. These technologies reduce the cost of operations and, with higher bandwidths, support the expected voice, video, and data needed to coordinate activities at each stage of an exploration mission. In this paper, we discuss, in a generic fashion, how the architectural approaches and processes are being developed and used for defining a hypothetical communication and navigation network infrastructure to support lunar exploration. Examples are given of the products generated by the architecture development process.

  16. Comprehensive visual field test & diagnosis system in support of astronaut health and performance

    NASA Astrophysics Data System (ADS)

    Fink, Wolfgang; Clark, Jonathan B.; Reisman, Garrett E.; Tarbell, Mark A.

    Long duration spaceflight, permanent human presence on the Moon, and future human missions to Mars will require autonomous medical care to address both expected and unexpected risks. An integrated non-invasive visual field test & diagnosis system is presented for the identification, characterization, and automated classification of visual field defects caused by the spaceflight environment. This system will support the onboard medical provider and astronauts on space missions with an innovative, non-invasive, accurate, sensitive, and fast visual field test. It includes a database for examination data, and a software package for automated visual field analysis and diagnosis. The system will be used to detect and diagnose conditions affecting the visual field, while in space and on Earth, permitting the timely application of therapeutic countermeasures before astronaut health or performance are impaired. State-of-the-art perimetry devices are bulky, thereby precluding application in a spaceflight setting. In contrast, the visual field test & diagnosis system requires only a touchscreen-equipped computer or touchpad device, which may already be in use for other purposes (i.e., no additional payload), and custom software. The system has application in routine astronaut assessment (Clinical Status Exam), pre-, in-, and post-flight monitoring, and astronaut selection. It is deployable in operational space environments, such as aboard the International Space Station or during future missions to or permanent presence on the Moon and Mars.

  17. An autonomous observation and control system based on EPICS and RTS2 for Antarctic telescopes

    NASA Astrophysics Data System (ADS)

    Zhang, Guang-yu; Wang, Jian; Tang, Peng-yi; Jia, Ming-hao; Chen, Jie; Dong, Shu-cheng; Jiang, Fengxin; Wu, Wen-qing; Liu, Jia-jing; Zhang, Hong-fei

    2016-01-01

    For unattended telescopes in Antarctica, remote operation and autonomous observation and control are essential. An EPICS- (Experimental Physics and Industrial Control System) and RTS2- (Remote Telescope System, 2nd Version) based autonomous observation and control system with remote operation is introduced in this paper. EPICS is a set of open-source software tools, libraries, and applications developed collaboratively and used worldwide to create distributed soft real-time control systems for scientific instruments, while RTS2 is an open-source environment for control of a fully autonomous observatory. Drawing on the respective advantages of EPICS and RTS2, a combined integrated software framework for autonomous observation and control is established that uses RTS2 for the astronomical observation functions and EPICS for the device control of the telescope. A command and status interface between EPICS and RTS2 is designed so that EPICS IOC (Input/Output Controller) components integrate into RTS2 directly. To meet the specifications and requirements of the control system of a telescope in Antarctica, core components named Executor and Auto-focus are designed and implemented for autonomous observation, with a remote-operation user interface based on a browser-server model. The whole system, including the telescope, was tested at Lijiang Observatory in Yunnan Province in practical observations, demonstrating autonomous observation and control, including telescope control, camera control, dome control, and weather information acquisition under both local and remote operation.
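
    A minimal sketch of the kind of command/status bridging described above is shown below, assuming the pyepics client library is available: an RTS2-style executor decision is forwarded to a device IOC by writing a process variable, and the device status is read back. The PV names and the rts2_requests_dome_open stub are invented placeholders, not the interface of the actual system.

```python
# Sketch of a command/status bridge between an RTS2-style executor and EPICS
# device IOCs, assuming pyepics is installed (pip install pyepics). The PV
# names and the rts2_* callback are invented placeholders, not the actual
# interface of either system.
from epics import caget, caput

PV_DOME_CMD    = "DOME:CMD"      # hypothetical PV: 1 = open, 0 = close
PV_DOME_STATUS = "DOME:STATUS"   # hypothetical PV: 0 = closed, 1 = open

def rts2_requests_dome_open(weather_ok: bool) -> bool:
    # Stand-in for the RTS2 executor's decision to observe tonight.
    return weather_ok

def bridge_step(weather_ok: bool) -> str:
    if rts2_requests_dome_open(weather_ok):
        caput(PV_DOME_CMD, 1)                      # forward command to the IOC
    else:
        caput(PV_DOME_CMD, 0)
    status = caget(PV_DOME_STATUS)                 # read device status back
    return "open" if status == 1 else "closed"

if __name__ == "__main__":
    print("dome reported:", bridge_step(weather_ok=True))
```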

  18. Sustainable and Autonomic Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Sterritt, Roy; Rouff, Christopher; Rash, James L.; Truszkowski, Walter

    2006-01-01

    Visions for future space exploration have long term science missions in sight, resulting in the need for sustainable missions. Survivability is a critical property of sustainable systems and may be addressed through autonomicity, an emerging paradigm for self-management of future computer-based systems based on inspiration from the human autonomic nervous system. This paper examines some of the ongoing research efforts to realize these survivable systems visions, with specific emphasis on developments in Autonomic Policies.

  19. Multi-agent robotic systems and applications for satellite missions

    NASA Astrophysics Data System (ADS)

    Nunes, Miguel A.

    A revolution in the space sector is happening. It is expected that in the next decade there will be more satellites launched than in the previous sixty years of space exploration. Major challenges are associated with this growth of space assets, such as the autonomy and management of large groups of satellites, in particular small satellites. There are two main objectives for this work. First, a flexible and distributed software architecture is presented to expand the possibilities of spacecraft autonomy and in particular autonomous motion in attitude and position. The approach taken is based on the concept of distributed software agents, also referred to as a multi-agent robotic system. Agents are defined as software programs that are social, reactive, and proactive, autonomously maximizing the chances of achieving the set goals. Part of the work is to demonstrate that a multi-agent robotic system is a feasible approach for different problems of autonomy such as satellite attitude determination and control and autonomous rendezvous and docking. The second main objective is to develop a method to optimize multi-satellite configurations in space, also known as satellite constellations. This automated method generates new optimal mega-constellation designs for Earth observations and fast revisit times on large ground areas. The optimal satellite constellation can be used by researchers as the baseline for new missions. The first contribution of this work is the development of a new multi-agent robotic system for distributing the attitude determination and control subsystem for HiakaSat. The multi-agent robotic system is implemented and tested on the satellite hardware-in-the-loop testbed that simulates a representative space environment. The results show that the newly proposed system for this particular case achieves an equivalent control performance when compared to the monolithic implementation. In terms of computational efficiency, it is found that the multi-agent robotic system has a consistently lower CPU load of 0.29 +/- 0.03 compared to 0.35 +/- 0.04 for the monolithic implementation, a 17.1 % reduction. The second contribution of this work is the development of a multi-agent robotic system for the autonomous rendezvous and docking of multiple spacecraft. To compute the maneuvers, guidance, navigation, and control algorithms are implemented as part of the multi-agent robotic system. The navigation and control functions are implemented using existing algorithms, but one important contribution of this section is the introduction of a new six-degrees-of-freedom guidance method which is part of the guidance, navigation, and control architecture. This new method is an explicit solution to the guidance problem, and is particularly useful for real-time guidance for attitude and position, as opposed to typical guidance methods which are based on numerical solutions and therefore are computationally intensive. A simulation scenario is run for docking four CubeSats deployed radially from a launch vehicle. Considering fully actuated CubeSats, the simulations show docking maneuvers that are successfully completed within 25 minutes, which is approximately 30% of a full orbital period in low Earth orbit. The final section investigates the problem of optimization of satellite constellations for fast revisit time, and introduces a new method to generate different constellation configurations that are evaluated with a genetic algorithm. Two case studies are presented. The first is the optimization of a constellation for rapid coverage of the oceans of the globe in 24 hours or less. Results show that for an 80 km sensor swath width, 50 satellites are required to cover the oceans with a 24 hour revisit time. The second constellation configuration study focuses on the optimization for the rapid coverage of the North Atlantic Tracks for air traffic monitoring in 3 hours or less. The results show that for a fixed swath width of 160 km and a 3 hour revisit time, 52 satellites are required.
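
    The constellation optimization step can be illustrated with the toy genetic algorithm below, which evolves (planes, satellites per plane, inclination) triples through selection, crossover, and mutation. The fitness function is a crude invented surrogate; in the work described above, candidate designs are scored by actual revisit-time evaluation from orbit propagation.

```python
# Toy genetic algorithm over constellation design variables (number of planes,
# satellites per plane, inclination). The fitness function is an invented
# stand-in for the real revisit-time evaluation.
import random
random.seed(1)

def random_design():
    return (random.randint(2, 12),          # orbital planes
            random.randint(1, 10),          # satellites per plane
            random.uniform(40.0, 98.0))     # inclination, degrees

def fitness(d):
    planes, per_plane, inc = d
    total = planes * per_plane
    coverage_proxy = total * (0.5 + inc / 196.0)   # invented surrogate metric
    return coverage_proxy - 0.8 * total            # reward coverage, penalize size

def crossover(a, b):
    return tuple(random.choice(pair) for pair in zip(a, b))

def mutate(d):
    planes, per_plane, inc = d
    return (max(2, planes + random.choice((-1, 0, 1))),
            max(1, per_plane + random.choice((-1, 0, 1))),
            min(98.0, max(40.0, inc + random.uniform(-3, 3))))

pop = [random_design() for _ in range(30)]
for _ in range(40):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                              # elitist selection
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(20)]
best = max(pop, key=fitness)
print("best design (planes, sats/plane, inclination):", best)
```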

  20. Experimental results in autonomous landing approaches by dynamic machine vision

    NASA Astrophysics Data System (ADS)

    Dickmanns, Ernst D.; Werner, Stefan; Kraus, S.; Schell, R.

    1994-07-01

    The 4-D approach to dynamic machine vision, exploiting full spatio-temporal models of the process to be controlled, has been applied to onboard autonomous landing approaches of aircraft. Aside from image sequence processing, for which it was developed initially, it is also used for data fusion from a range of sensors. By prediction-error feedback, an internal representation of the aircraft state relative to the runway in 3-D space and time is servo-maintained in the interpretation process, from which the required control applications are derived. The validity and efficiency of the approach have been proven both in hardware-in-the-loop simulations and in flight experiments with a twin turboprop aircraft Do128 under perturbations from cross winds and wind gusts. The software package has been ported to 'C' and onto a new transputer image processing platform; the system has been expanded for bifocal vision with two cameras of different focal length mounted fixed relative to each other on a two-axis platform for viewing direction control.
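
    The prediction-error feedback idea can be illustrated with a one-dimensional example: an internal model predicts the aircraft's lateral offset from the runway centerline, and each noisy vision measurement corrects the estimate through the innovation, as in a simple Kalman filter. The dynamics, noise levels, and gain below are invented for the illustration and are not taken from the 4-D vision system.

```python
# One-dimensional illustration of prediction-error feedback: a simple Kalman
# filter servo-maintains an estimate of the lateral offset from the runway
# centerline using noisy measurements. All numbers are invented for the example.
import random
random.seed(0)

dt, q, r = 0.1, 0.01, 0.25        # time step, process and measurement variances
x_true, v_true = 5.0, -0.5        # true offset (m) and known drift rate (m/s)
x_est, p = 0.0, 10.0              # initial estimate and its variance

for _ in range(50):
    x_true += v_true * dt
    z = x_true + random.gauss(0.0, r ** 0.5)     # vision measurement

    x_pred = x_est + v_true * dt                 # internal-model prediction
    p_pred = p + q
    innovation = z - x_pred                      # prediction error
    gain = p_pred / (p_pred + r)
    x_est = x_pred + gain * innovation           # correct the internal state
    p = (1.0 - gain) * p_pred

print(f"true offset {x_true:.2f} m, estimated {x_est:.2f} m")
```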

  1. Autonomous Assembly of Modular Structures in Space and on Extraterrestrial Locations

    NASA Astrophysics Data System (ADS)

    Alhorn, Dean C.

    2005-02-01

    The new U.S. National Vision for Space Exploration requires many new enabling technologies to accomplish the goals of space commercialization and returning humans to the Moon and other extraterrestrial environments. Traditionally, flight elements are complete sub-systems requiring humans to complete the integration and assembly. These bulky structures also require the use of heavy launch vehicles to send the units to a desired location. This philosophy necessitates a high degree of safety and numerous space walks at significant cost. Future space mission costs must be reduced and safety increased to reasonably achieve exploration goals. One proposed concept is the autonomous assembly of space structures. This concept is an affordable, reliable solution to in-space and extraterrestrial assembly. Assembly is performed autonomously when two components join after determining that specifications are correct. Local sensors continue to monitor joint integrity after assembly, which is critical for safety and structural reliability. Achieving this concept requires a change in space structure design philosophy and the development of innovative technologies to perform autonomous assembly. Assembly of large space structures will require significant numbers of integrity sensors. Thus, simple, low-cost sensors are integral to the success of this concept. This paper addresses these issues and proposes a novel concept for assembling space structures autonomously. Core technologies required to achieve in-space assembly are presented. These core technologies are critical to the goal of utilizing space in a cost-efficient and safe manner. Additionally, these novel technologies can be applied to other systems both on Earth and in extraterrestrial environments.

  2. Autonomous Assembly of Modular Structures in Space and on Extraterrestrial Locations

    NASA Technical Reports Server (NTRS)

    Alhorn, Dean C.

    2005-01-01

    The new U.S. National Vision for Space Exploration requires many new enabling technologies to accomplish the goals of space commercialization and returning humans to the Moon and other extraterrestrial environments. Traditionally, flight elements are complete subsystems requiring humans to complete the integration and assembly. These bulky structures also require the use of heavy launch vehicles to send the units to a desired location. This philosophy necessitates a high degree of safety and numerous space walks at significant cost. Future space mission costs must be reduced and safety increased to reasonably achieve exploration goals. One proposed concept is the autonomous assembly of space structures. This concept is an affordable, reliable solution to in-space and extraterrestrial assembly. Assembly is performed autonomously when two components join after determining that specifications are correct. Local sensors continue to monitor joint integrity after assembly, which is critical for safety and structural reliability. Achieving this concept requires a change in space structure design philosophy and the development of innovative technologies to perform autonomous assembly. Assembly of large space structures will require significant numbers of integrity sensors. Thus, simple, low-cost sensors are integral to the success of this concept. This paper addresses these issues and proposes a novel concept for assembling space structures autonomously. Core technologies required to achieve in-space assembly are presented. These core technologies are critical to the goal of utilizing space in a cost-efficient and safe manner. Additionally, these novel technologies can be applied to other systems both on Earth and in extraterrestrial environments.

  3. An Autonomous Data Reduction Pipeline for Wide Angle EO Systems

    NASA Astrophysics Data System (ADS)

    Privett, G.; George, S.; Feline, W.; Ash, A.; Routledge, G.

    The UK's National Space and Security Policy states that the identification of potential on-orbit collisions and re-entry warning over the UK is of high importance, and is driving requirements for indigenous Space Situational Awareness (SSA) systems. To meet these requirements, options are being examined, including the creation of a distributed network of simple, low-cost commercial off-the-shelf electro-optical sensors to support survey work and catalogue maintenance. This paper outlines work at Dstl examining whether data obtained using readily-deployable equipment could significantly enhance UK SSA capability and support cross-cueing between multiple deployed systems. To effectively exploit data from this distributed sensor architecture, a data handling system is required to autonomously detect satellite trails in a manner that pragmatically handles highly variable target intensities, periodicity, and rates of apparent motion. The processing and collection strategies must be tailored to specific mission sets to ensure effective detections of platforms as diverse as stable geostationary satellites and low-altitude CubeSats. Data captured during the Automated Transfer Vehicle-5 (ATV-5) de-orbit trial and images captured of a rocket body break-up and a deployed de-orbit sail have been employed to inform the development of a prototype processing pipeline for autonomous on-site processing. The approach taken employs tools such as Astrometry.Net and DAOPHOT from the astronomical community, together with image processing and orbit determination software developed in-house by Dstl. Interim results from the automated analysis of data collected from wide angle sensors are described, together with the current perceived limitations of the proposed system and our plans for future development.
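
    The core detection step, finding a line-like trail among point-like stars, can be sketched as below on a synthetic frame: threshold above the background, discard isolated bright pixels, and test whether the survivors are collinear. This is only an illustration of the idea; the Dstl pipeline itself builds on tools such as Astrometry.Net and DAOPHOT together with in-house software.

```python
# Simplified satellite-trail detection on a synthetic star field: threshold
# above the background, reject isolated (star-like) bright pixels, then test
# whether the remaining pixels are collinear. Illustrative only.
import numpy as np
rng = np.random.default_rng(2)

frame = rng.normal(100.0, 5.0, size=(200, 200))        # sky background
for r, c in rng.integers(0, 200, size=(30, 2)):
    frame[r, c] += 300.0                                # point-like stars
for i in range(120):
    frame[40 + i // 2, 30 + i] += 150.0                 # a satellite trail

def find_trail(img, nsigma=5.0):
    bright = img > img.mean() + nsigma * img.std()
    neigh = np.zeros_like(img, dtype=int)
    for dy in (-1, 0, 1):                      # count bright 8-neighbours
        for dx in (-1, 0, 1):
            if dy or dx:
                neigh += np.roll(np.roll(bright, dy, axis=0), dx, axis=1)
    cand = bright & (neigh > 0)                # isolated star pixels drop out
    ys, xs = np.nonzero(cand)
    if len(xs) < 20:
        return False
    pts = np.column_stack((xs, ys)).astype(float)
    pts -= pts.mean(axis=0)
    _, vecs = np.linalg.eigh(np.cov(pts.T))    # smallest-eigenvalue vector is
    dist = np.abs(pts @ vecs[:, 0])            # the normal to the best-fit line
    return float(np.mean(dist < 2.0)) > 0.8    # mostly collinear -> trail

print("trail detected:", find_trail(frame))
```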

  4. Deploying the NASA Meter Class Autonomous Telescope (MCAT) on Ascension Island

    NASA Technical Reports Server (NTRS)

    Lederer, S. M.; Pace, L.; Hickson, P.; Cowardin, H. M.; Frith, J.; Buckalew, B.; Glesne, T.; Maeda, R.; Douglas, D.; Nishimoto, D.

    2015-01-01

    NASA has successfully constructed the 1.3m Meter Class Autonomous Telescope (MCAT) facility on Ascension Island in the South Atlantic Ocean. MCAT is an optical telescope designed specifically to collect ground-based data for the statistical characterization of orbital debris ranging from Low Earth Orbit (LEO) through Middle Earth Orbits (MEO) and beyond to Geo Transfer and Geosynchronous Orbits (GTO/GEO). The location of Ascension Island has two distinct advantages. First, the near-equatorial location fills a significant longitudinal gap in the Ground-based Electro-Optical Deep Space Surveillance (GEODSS) network of telescopes, and second, it allows access to objects in Low Inclination Low-Earth Orbits (LILO). The MCAT facility will be controlled by a sophisticated software suite that operates the dome and telescope, assesses sky and weather conditions, conducts all necessary calibrations, defines an observing strategy (as dictated by weather, sky conditions and the observing plan for the night), and carries out the observations. It then reduces the collected data via four primary observing modes ranging from tracking previously cataloged objects to conducting general surveys for detecting uncorrelated debris. Nightly observing plans, as well as the resulting text file of reduced data, will be transferred to and from Ascension, respectively, via a satellite connection. Post-processing occurs at NASA Johnson Space Center. Construction began in September, 2014 with dome and telescope installation occurring in April through early June, 2015. First light was achieved in June, 2015. Acceptance testing, full commissioning, and calibration of this soon-to-be fully autonomous system commenced in summer 2015. The initial characterization of the system from these tests is presented herein.

  5. Deploying the NASA Meter Class Autonomous Telescope (MCAT) on Ascension Island

    NASA Astrophysics Data System (ADS)

    Lederer, S.; Pace, L. F.; Hickson, P.; Glesne, T.; Cowardin, H. M.; Frith, J. M.; Buckalew, B.; Maeda, R.; Douglas, D.; Nishimoto, D.

    NASA has successfully constructed the 1.3m Meter Class Autonomous Telescope (MCAT) facility on Ascension Island in the South Atlantic Ocean. MCAT is an optical telescope designed specifically to collect ground-based data for the statistical characterization of orbital debris ranging from Low Earth Orbit (LEO) through Middle Earth Orbits (MEO) and beyond to Geo Transfer and Geosynchronous Orbits (GTO/GEO). The location of Ascension Island has two distinct advantages. First, the near-equatorial location fills a significant longitudinal gap in the Ground-based Electro-Optical Deep Space Surveillance (GEODSS) network of telescopes, and second, it allows access to objects in Low Inclination Low-Earth Orbits (LILO). The MCAT facility will be controlled by a sophisticated software suite that operates the dome and telescope, assesses sky and weather conditions, conducts all necessary calibrations, defines an observing strategy (as dictated by weather, sky conditions, and the observing plan for the night), and carries out the observations. It then reduces the collected data via four primary observing modes ranging from tracking previously cataloged objects to conducting general surveys for detecting uncorrelated debris. Nightly observing plans, as well as the resulting text file of reduced data, will be transferred to and from Ascension, respectively, via a satellite connection. Post-processing occurs at NASA Johnson Space Center. Construction began in September, 2014 with dome and telescope installation occurring in April through early June, 2015. First light was achieved in June, 2015. Acceptance testing, full commissioning, and calibration of this soon-to-be fully autonomous system commenced in summer 2015. The initial characterization of the system from these tests is presented herein.

  6. Autonomous scheduling technology for Earth orbital missions

    NASA Technical Reports Server (NTRS)

    Srivastava, S.

    1982-01-01

    The development of a dynamic autonomous system (DYASS) of resources for the mission support of near-Earth NASA spacecraft is discussed and the current NASA space data system is described from a functional perspective. The future (late 80's and early 90's) NASA space data system is discussed. The DYASS concept, the autonomous process control, and the NASA space data system are introduced. Scheduling and related disciplines are surveyed. DYASS as a scheduling problem is also discussed. Artificial intelligence and knowledge representation is considered as well as the NUDGE system and the I-Space system.

  7. 75 FR 75621 - Office of Commercial Space Transportation; Waiver of Autonomous Reentry Restriction for a Reentry...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-06

    ... Space Transportation; Waiver of Autonomous Reentry Restriction for a Reentry Vehicle AGENCY: Federal... concerns two petitions for waiver submitted to the Federal Aviation Administration (FAA) by Space Exploration Technologies Corp. (SpaceX): A petition to waive the requirement that a waiver petition be...

  8. Centralized Alert-Processing and Asset Planning for Sensorwebs

    NASA Technical Reports Server (NTRS)

    Castano, Rebecca; Chien, Steve A.; Rabideau, Gregg R.; Tang, Benyang

    2010-01-01

    A software program provides a Sensorweb architecture for alert-processing, event detection, asset allocation and planning, and visualization. It automatically tasks and re-tasks various types of assets such as satellites and robotic vehicles in response to alerts (fire, weather) extracted from various data sources, including low-level Webcam data. JPL has adapted considerable Sensorweb infrastructure that had been previously applied to NASA Earth Science applications. This NASA Earth Science Sensorweb has been in operational use since 2003, and has proven reliability of the Sensorweb technologies for robust event detection and autonomous response using space and ground assets. Unique features of the software include flexibility to a range of detection and tasking methods including those that require aggregation of data over spatial and temporal ranges, generality of the response structure to represent and implement a range of response campaigns, and the ability to respond rapidly.
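
    The alert-to-tasking flow can be pictured schematically as below: an incoming alert is filtered by confidence and matched to an asset capable of the requested observation. The alert fields, asset names, and matching rule are invented for the illustration and do not reflect the actual Sensorweb implementation.

```python
# Schematic alert-to-tasking flow: filter alerts, match them to capable assets,
# and emit an observation request. All names and rules here are invented.
from dataclasses import dataclass

@dataclass
class Alert:
    kind: str        # e.g. "fire", "flood"
    lat: float
    lon: float
    confidence: float

ASSETS = {"EO-sat-A": {"fire", "flood"}, "rover-B": {"fire"}}

def plan_response(alert: Alert):
    if alert.confidence < 0.7:          # ignore weak detections
        return None
    for asset, capabilities in ASSETS.items():
        if alert.kind in capabilities:
            return {"asset": asset, "target": (alert.lat, alert.lon),
                    "observation": alert.kind}
    return None                         # no capable asset available

print(plan_response(Alert("fire", 34.2, -118.1, 0.92)))
```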

  9. Automating Mission Scheduling for Space-Based Observatories

    NASA Technical Reports Server (NTRS)

    Pell, Barney; Muscettola, Nicola; Hansson, Othar; Mohan, Sunil

    1998-01-01

    In this paper we describe the use of our planning and scheduling framework, HSTS, to reduce the complexity of science mission planning. This work is part of an overall project to enable a small team of scientists to control the operations of a spacecraft. The present process is highly labor intensive. Users (scientists and operators) rely on a non-codified understanding of the different spacecraft subsystems and of their operating constraints. They use a variety of software tools to support their decision making process. This paper considers the types of decision making that need to be supported/automated, the nature of the domain constraints and the capabilities needed to address them successfully, and the nature of external software systems with which the core planning/scheduling engine needs to interact. HSTS has been applied to science scheduling for EUVE and Cassini and is being adapted to support autonomous spacecraft operations in the New Millennium initiative.

  10. Integrating Multiple Autonomous Underwater Vessels, Surface Vessels and Aircraft into Oceanographic Research Vessel Operations

    NASA Astrophysics Data System (ADS)

    McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.

    2012-12-01

    Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion, and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Optimally integrating these systems into research vessel data management and operational planning systems involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs, and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption-Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need to have considerable onboard autonomy, namely adaptive sampling capabilities using their own onboard sensor data stream analysis. We discuss Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore, and in the near future event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs, and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented describing methods and results for use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and finally challenges that remain in addressing these technological issues. Significantly, the use of UAS on oceanographic research vessels is just beginning. We report on several initial field efforts which demonstrated that UAS improve spatial and temporal mapping of ocean features, as well as monitoring of marine mammal populations, ocean color, sea ice and wave fields, and air-sea gas exchange. These studies, however, also confirm the challenges for shipboard computer systems ingesting and archiving UAS high-resolution video, SAR, and lidar data. We describe the successful inclusion of DTN communications for: 1) passing video data between two UAS or between a UAS and the ship; 2) including ASVs as communication nodes for AUVs; and 3) extending adaptive sampling software from AUVs and ASVs to include UAS. In conclusion, we describe how autonomous sampling systems may be best integrated into shipboard oceanographic vessel research to provide new and more comprehensive time-space ocean and atmospheric data collection that is important not only for scientific study, but also for sustainable ocean management, including emergency response capabilities. The recent examples of such integrated studies highlighted here confirm that ocean and atmospheric studies can be pursued more cost-effectively, and in some cases only accomplished, by combining underwater, surface, and aircraft autonomous systems with research vessel operations.

  11. Intelligent (Autonomous) Power Controller Development for Human Deep Space Exploration

    NASA Technical Reports Server (NTRS)

    Soeder, James; Raitano, Paul; McNelis, Anne

    2016-01-01

    As NASA's Evolvable Mars Campaign and other exploration initiatives continue to mature, they have identified the need for more autonomous operation of the power system. For current human space operations such as the International Space Station, the paradigm is to perform the planning, operation, and fault diagnosis from the ground. However, the dual problems of communication lag and limited communication bandwidth beyond geosynchronous orbit underscore the need to change the operation methodology for human operation in deep space. To address this need, for the past several years the Glenn Research Center has had an effort to develop an autonomous power controller for human deep space vehicles. This presentation discusses the present roadmap for deep space exploration along with a description of a conceptual power system architecture for exploration modules. It then contrasts the present ground-centric control and management architecture, with limited autonomy on board the spacecraft, with an advanced autonomous power control system that features ground-based monitoring and a spacecraft mission manager with autonomous control of all core systems, including power. It then presents a functional breakdown of the autonomous power control system and examines its operation in both normal and fault modes. Finally, it discusses progress made in the development of a real-time power system model and how it is being used to evaluate the performance of the controller as well as for verification of the overall operation.
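
    One small piece of what such a controller must do in a fault mode, shedding lower-priority loads when generation drops, can be sketched as below. The load names, power levels, priorities, and shedding rule are invented for the example and are not the Glenn Research Center design.

```python
# Very simplified picture of an autonomous power controller's fault response:
# keep the most critical loads that fit within the available power and shed the
# rest. Load names, powers, and priorities are invented for illustration.
LOADS = [  # (name, power_kW, priority: lower number = more critical)
    ("life_support", 3.0, 0),
    ("comm",         1.0, 1),
    ("science",      2.0, 2),
    ("heaters_aux",  1.5, 3),
]

def shed_loads(available_kw):
    kept, used = [], 0.0
    for name, power, _ in sorted(LOADS, key=lambda load: load[2]):
        if used + power <= available_kw:       # most critical loads first
            kept.append(name)
            used += power
    return kept

print("nominal   :", shed_loads(8.0))    # everything stays powered
print("fault mode:", shed_loads(4.5))    # controller sheds low-priority loads
```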

  12. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
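
    The testbed concept, scripted scenarios driving hardware simulations with injectable faults and a replayable log, can be pictured with the minimal harness below. The PowerSim model, the fault type, and the autonomy response are placeholders invented for the illustration, not the JSC testbed.

```python
# Minimal fault-injection harness: a scripted scenario drives a simple
# hardware simulation, a fault can be injected at a chosen step, and every
# step is logged for later replay. All models and responses are placeholders.
class PowerSim:
    def __init__(self):
        self.bus_voltage = 120.0
    def step(self, fault=None):
        if fault == "bus_undervoltage":
            self.bus_voltage = 90.0
        return self.bus_voltage

def autonomy_response(voltage):
    return "shed noncritical loads" if voltage < 100.0 else "nominal"

def run_scenario(fault_at_step=None, steps=5):
    sim, log = PowerSim(), []
    for k in range(steps):
        fault = "bus_undervoltage" if k == fault_at_step else None
        v = sim.step(fault)
        log.append((k, v, autonomy_response(v)))   # record for replay
    return log

for entry in run_scenario(fault_at_step=2):
    print(entry)
```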

  13. Planning for the V&V of infused software technologies for the Mars Science Laboratory Mission

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Fesq, Lorraine M.; Ingham, Michel D.; Klein, Suzanne L.; Nelson, Stacy D.

    2004-01-01

    NASA's Mars Science Laboratory (MSL) rover mission is planning to make use of advanced software technologies in order to support fulfillment of its ambitious science objectives. The mission plans to adopt the Mission Data System (MDS) as the mission software architecture, and plans to make significant use of on-board autonomous capabilities for the rover software.

  14. Software architecture of biomimetic underwater vehicle

    NASA Astrophysics Data System (ADS)

    Praczyk, Tomasz; Szymak, Piotr

    2016-05-01

    Autonomous underwater vehicles are vehicles that are entirely or partly independent of human decisions. In order to obtain operational independence, the vehicles have to be equipped with specialized software. The main task of the software is to move the vehicle along a trajectory with collision avoidance. Moreover, the software also has to manage the different devices installed on the vehicle board, e.g., to start and stop cameras, sonars, etc. In addition to the software embedded on the vehicle board, software that allows the operator to manage the vehicle is also necessary. Its task is to define the mission of the vehicle, to start and stop the mission, to send emergency commands, to monitor vehicle parameters, and to control the vehicle in remotely operated mode. An important objective of the software is also to support development and tests of other software components. To this end, a simulation environment is necessary, i.e., a simulation model of the vehicle and all its key devices, a model of the sea environment, and software to visualize the behavior of the vehicle. The paper presents the architecture of the software designed for the biomimetic autonomous underwater vehicle (BAUV) that is being constructed within the framework of a scientific project financed by the Polish National Center of Research and Development.

  15. Swarmathon 2017

    NASA Image and Video Library

    2017-04-19

    A sign at the Kennedy Space Center Visitor Complex announces the second annual Swarmathon competition. Students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of cubes with AprilTags, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's second annual Swarmathon, 20 teams representing 22 minority serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the moon or Mars.

  16. An intelligent, free-flying robot

    NASA Technical Reports Server (NTRS)

    Reuter, G. J.; Hess, C. W.; Rhoades, D. E.; Mcfadin, L. W.; Healey, K. J.; Erickson, J. D.

    1988-01-01

    The ground-based demonstration of EVA Retriever, a voice-supervised, intelligent, free-flying robot, is designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The major objective of the EVA Retriever Project is to design, develop, and evaluate an integrated robotic hardware and on-board software system which autonomously: (1) performs system activation and check-out, (2) searches for and acquires the target, (3) plans and executes a rendezvous while continuously tracking the target, (4) avoids stationary and moving obstacles, (5) reaches for and grapples the target, (6) returns to transfer the object, and (7) returns to base.
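
    The seven autonomous functions listed above map naturally onto a sequential state machine, sketched below; each phase check is a placeholder, and the sketch is illustrative only, not the EVA Retriever flight software.

```python
# The retrieval sequence as a simple sequential state machine; execute_phase is
# a placeholder for the real sensing and control of each phase.
PHASES = ["checkout", "search_and_acquire", "rendezvous_and_track",
          "avoid_obstacles", "grapple", "transfer_object", "return_to_base"]

def execute_phase(phase: str) -> bool:
    print("executing:", phase)
    return True

def retrieve() -> bool:
    for phase in PHASES:
        if not execute_phase(phase):
            print("phase failed, holding safe state at:", phase)
            return False
    return True

retrieve()
```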

  17. An intelligent, free-flying robot

    NASA Technical Reports Server (NTRS)

    Reuter, G. J.; Hess, C. W.; Rhoades, D. E.; Mcfadin, L. W.; Healey, K. J.; Erickson, J. D.; Phinney, Dale E.

    1989-01-01

    The ground based demonstration of the extensive extravehicular activity (EVA) Retriever, a voice-supervised, intelligent, free flying robot, is designed to evaluate the capability to retrieve objects (astronauts, equipment, and tools) which have accidentally separated from the Space Station. The major objective of the EVA Retriever Project is to design, develop, and evaluate an integrated robotic hardware and on-board software system which autonomously: (1) performs system activation and check-out; (2) searches for and acquires the target; (3) plans and executes a rendezvous while continuously tracking the target; (4) avoids stationary and moving obstacles; (5) reaches for and grapples the target; (6) returns to transfer the object; and (7) returns to base.

  18. Large autonomous spacecraft electrical power system (LASEPS)

    NASA Technical Reports Server (NTRS)

    Dugal-Whitehead, Norma R.; Johnson, Yvette B.

    1992-01-01

    NASA Marshall Space Flight Center is creating a large high-voltage electrical power system testbed called LASEPS. This testbed is being developed to simulate an end-to-end power system from power generation and source to loads. When the system is completed, it will have several power configurations, including several battery configurations. These configurations are: two 120 V batteries, one or two 150 V batteries, and one 250 to 270 V battery. This breadboard encompasses varying levels of autonomy, from remote power converters to conventional software control to expert system control of the power system elements. In this paper, the construction and provisions of this breadboard are discussed.

  19. Advanced avionics concepts: Autonomous spacecraft control

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A large increase in space operations activities is expected because of Space Station Freedom (SSF), long-range lunar base missions, and Mars exploration. Space operations will also increase as a result of space commercialization (especially the increase in satellite networks). It is anticipated that the level of satellite servicing operations will grow tenfold from the current level within the next 20 years. This growth can be sustained only if the cost effectiveness of space operations is improved; here, cost effectiveness means operational efficiency combined with the proper level of effectiveness. A concept of advanced avionics, autonomous spacecraft control, is presented that will enable the desired growth as well as maintain the cost effectiveness (operational efficiency) of satellite servicing operations. The concept of advanced avionics that allows autonomous spacecraft control is described along with a brief description of each component. Some of the benefits of autonomous operations are also described. A technology utilization breakdown is provided in terms of applications.

  20. Laser Range and Bearing Finder for Autonomous Missions

    NASA Technical Reports Server (NTRS)

    Granade, Stephen R.

    2004-01-01

    NASA has recently re-confirmed its interest in autonomous systems as an enabling technology for future missions. In order for autonomous missions to be possible, highly capable relative sensor systems are needed to determine an object's distance, direction, and orientation. This is true whether the mission is autonomous in-space assembly, rendezvous and docking, or rover surface navigation. Advanced Optical Systems, Inc. has developed a wide-angle laser range and bearing finder (RBF) for autonomous space missions. The laser RBF has a number of features that make it well-suited for autonomous missions. It has an operating range of 10 m to 5 km, with a 5 deg field of view. Its wide field of view removes the need for scanning systems such as gimbals, eliminating moving parts and making the sensor simpler and space qualification easier. Its range accuracy is 1% or better. It is designed to operate either as a stand-alone sensor or in tandem with a sensor that returns range, bearing, and orientation at close ranges, such as NASA's Advanced Video Guidance Sensor. We have assembled the initial prototype and are currently testing it. We will discuss the laser RBF's design and specifications. Keywords: laser range and bearing finder, autonomous rendezvous and docking, space sensors, on-orbit sensors, advanced video guidance sensor

  1. User Needs and Advances in Space Wireless Sensing and Communications

    NASA Technical Reports Server (NTRS)

    Kegege, Obadiah

    2017-01-01

    Decades of space exploration and technology trends for future missions show the need for new approaches in space/planetary sensor networks, observatories, internetworking, and communications/data delivery to Earth. The user needs discussed in this talk include interviews with several scientists and reviews of mission concepts for the next generation of sensors, observatories, and planetary surface missions. These observatories and sensors are envisioned to operate in extreme environments, with advanced autonomy, where communication to Earth is sometimes intermittent and delayed. These sensor nodes require software-defined networking capabilities in order to learn and adapt to the environment, collect science data, internetwork, and communicate. Some use cases also require the intelligence to manage network functions (either as a host), mobility, and security, and to interface data to the physical radio/optical layer. For instance, on a planetary surface, autonomous sensor nodes would create their own ad hoc network, with some nodes handling communication between the wireless sensor networks and orbiting relay satellites. A section of this talk covers the advances in space communication and internetworking to support future space missions. NASA's Space Communications and Navigation (SCaN) program continues to evolve with the development of optical communication, a new vision of the integrated network architecture with more capabilities, and the adoption of CCSDS space internetworking protocols. Advances in wireless communications hardware and electronics have enabled software-defined networking (DVB-S2, VCM, ACM, DTN, ad hoc, etc.) protocols for improved wireless communication and network management. Developing technologies to fulfill these user needs for wireless communications and adopting standardized communication/internetworking protocols will be a huge benefit to future planetary missions, space observatories, and manned missions to other planets.

  2. Project WISH: The Emerald City

    NASA Technical Reports Server (NTRS)

    Oz, Hayrani; Slonksnes, Linda (Editor); Rogers, James W. (Editor); Sherer, Scott E. (Editor); Strosky, Michelle A. (Editor); Szmerekovsky, Andrew G. (Editor); Klupar, G. Joseph (Editor)

    1990-01-01

    The preliminary design of a permanently manned autonomous space oasis (PEMASO), including its pertinent subsystems, was performed during the 1990 Winter and Spring quarters. The purpose for the space oasis was defined and the preliminary design work was started with emphasis placed on the study of orbital mechanics, power systems and propulsion systems. A rotating torus was selected as the preliminary configuration, and overall size, mass and location of some subsystems within the station were addressed. Computer software packages were utilized to determine station transfer parameters and thus the preliminary propulsion requirements. Power and propulsion systems were researched to determine feasible configurations and many conventional schemes were ruled out. Vehicle dynamics and control, mechanical and life support systems were also studied. For each subsystem studied, the next step in the design process to be performed during the continuation of the project was also addressed.

  3. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation (see diagram) include: a) Boot Mode, which is initiated at application of power to the processor card and runs memory diagnostics; it may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode, which allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode, a predefined, minimum-power safehold configuration with power to experiments removed and carrier functionality minimized; it is used to troubleshoot problems that occur during flight. d) Operations Mode, which is used for normal experiment carrier operations; it may be entered only via ground command from Safe Mode.
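
    The four modes described above can be pictured as the small transition table below. Only the rule that Operations Mode is entered by ground command from Safe Mode is stated in the abstract; the remaining transitions are plausible inferences added for the illustration and are not the actual SET-1 flight software logic.

```python
# Hedged sketch of the four operating modes as a transition table. Only the
# Safe -> Operations ground-command rule comes from the abstract; the other
# transitions are assumptions made for the illustration.
TRANSITIONS = {
    ("BOOT", "diagnostics_done"):        "SAFE",
    ("SAFE", "ground_cmd_operations"):   "OPERATIONS",
    ("SAFE", "ground_cmd_maintenance"):  "MAINTENANCE",
    ("MAINTENANCE", "ground_cmd_safe"):  "SAFE",
    ("OPERATIONS", "fault_detected"):    "SAFE",
    ("OPERATIONS", "ground_cmd_boot"):   "BOOT",
}

def next_mode(mode: str, event: str) -> str:
    return TRANSITIONS.get((mode, event), mode)   # unrecognized events keep mode

mode = "BOOT"
for event in ("diagnostics_done", "ground_cmd_operations", "fault_detected"):
    mode = next_mode(mode, event)
    print(f"{event:24s} -> {mode}")
```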

  4. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  5. Autonomous Flight Safety System September 27, 2005, Aircraft Test

    NASA Technical Reports Server (NTRS)

    Simpson, James C.

    2005-01-01

    This report describes the first aircraft test of the Autonomous Flight Safety System (AFSS). The test was conducted on September 27, 2005, near Kennedy Space Center (KSC) using a privately-owned single-engine plane and evaluated the performance of several basic flight safety rules using real-time data onboard a moving aerial vehicle. This test follows the first road test of AFSS conducted in February 2005 at KSC. AFSS is a joint KSC and Wallops Flight Facility (WFF) project that is in its third phase of development. AFSS is an independent subsystem intended for use with Expendable Launch Vehicles that uses tracking data from redundant onboard sensors to autonomously make flight termination decisions using software-based rules implemented on redundant flight processors. The goals of this project are to increase capabilities by allowing launches from locations that do not have or cannot afford extensive ground-based range safety assets, to decrease range costs, and to decrease reaction time for special situations. The mission rules are configured for each operation by the responsible Range Safety authorities and can be loosely grouped into four major categories: Parameter Threshold Violations, Physical Boundary Violations (present position and instantaneous impact point, or IIP), Gate Rules (static and dynamic), and a Green-Time Rule. Examples of each of these rules were evaluated during this aircraft test.
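
    A schematic of how such rule categories might be evaluated on board is sketched below, with one example from each category. Every threshold, the rectangular boundary, the green-time interpretation, and the vehicle state are invented for the illustration; the real mission rules are configured per operation by the responsible Range Safety authorities.

```python
# Schematic evaluation of illustrative flight safety rules: a parameter
# threshold, a boundary check on the instantaneous impact point, and a
# green-time check. All numbers and the state dictionary are invented.
def parameter_threshold_ok(state):
    return state["dynamic_pressure_kpa"] < 80.0           # illustrative limit

def impact_point_ok(state):
    lat, lon = state["iip"]                                # predicted impact point
    return 28.0 <= lat <= 29.0 and -81.5 <= lon <= -80.0   # stay inside the box

def green_time_ok(state):
    # Treated here simply as a required time margin (s); the actual green-time
    # rule definition is mission-specific.
    return state["green_time_margin_s"] > 10.0

def flight_termination_required(state):
    rules = (parameter_threshold_ok, impact_point_ok, green_time_ok)
    return not all(rule(state) for rule in rules)

state = {"dynamic_pressure_kpa": 45.0, "iip": (28.5, -80.6),
         "green_time_margin_s": 42.0}
print("terminate:", flight_termination_required(state))
```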

  6. Expert system issues in automated, autonomous space vehicle rendezvous

    NASA Technical Reports Server (NTRS)

    Goodwin, Mary Ann; Bochsler, Daniel C.

    1987-01-01

    The problems involved in automated autonomous rendezvous are briefly reviewed, and the Rendezvous Expert (RENEX) expert system is discussed with reference to its goals, approach used, and knowledge structure and contents. RENEX has been developed to support streamlining operations for the Space Shuttle and Space Station programs and to aid definition of mission requirements for the autonomous portions of rendezvous for the Mars Surface Sample Return and Comet Nucleus Sample Return unmanned missions. The experience with RENEX to date and recommendations for further development are presented.

  7. Engineering Ultimate Self-Protection in Autonomic Agents for Space Exploration Missions

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy; Hinchey, Mike

    2005-01-01

    NASA's Exploration Initiative (EI) will push space exploration missions to the limit. Future missions will be required to be self-managing as well as self-directed, in order to meet the challenges of human and robotic space exploration. We discuss security and self-protection in autonomic agent-based systems, and propose the ultimate self-protection mechanism for such systems: self-destruction. Like other metaphors in Autonomic Computing, this is inspired by biological systems, and is the analog of biological apoptosis. Finally, we discuss the role it might play in future NASA space exploration missions.

  8. Development of an Algorithm to Perform a Comprehensive Study of Autonomic Dysreflexia in Animals with High Spinal Cord Injury Using a Telemetry Device.

    PubMed

    Popok, David; West, Christopher; Frias, Barbara; Krassioukov, Andrei V

    2016-07-29

    Spinal cord injury (SCI) is a debilitating neurological condition characterized by somatic and autonomic dysfunctions. In particular, SCI above the mid-thoracic level can lead to a potentially life-threatening hypertensive condition called autonomic dysreflexia (AD) that is often triggered by noxious or non-noxious somatic or visceral stimuli below the level of injury. One of the most common triggers of AD is the distension of pelvic viscera, such as during bladder and bowel distension or evacuation. This protocol presents a novel pattern recognition algorithm, implemented in Java-based software, for studying the fluctuations of cardiovascular parameters as well as the number, severity, and duration of spontaneously occurring AD events. The software applies the pattern recognition algorithm to hemodynamic data such as systolic blood pressure (SBP) and heart rate (HR) extracted from telemetry recordings of conscious and unrestrained animals before and after thoracic (T3) complete transection. With this software, hemodynamic parameters and episodes of AD can be detected and analyzed with minimal experimenter bias.
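
    The core of such an algorithm is a threshold-and-duration scan over the SBP trace, with the concurrent HR change used to characterize each episode. The sketch below only illustrates that idea; the thresholds, window length, and scoring are invented and are not those used by the authors' software.

```python
def detect_ad_episodes(sbp, baseline_sbp, rise_mmHg=20.0, min_samples=30):
    """Return (start, end) sample-index pairs where SBP stays at least
    rise_mmHg above baseline for min_samples consecutive samples.
    Severity and HR changes would be scored per episode in a fuller version."""
    episodes, start = [], None
    for i, value in enumerate(sbp):
        if value >= baseline_sbp + rise_mmHg:
            if start is None:
                start = i
        else:
            if start is not None and i - start >= min_samples:
                episodes.append((start, i))
            start = None
    if start is not None and len(sbp) - start >= min_samples:
        episodes.append((start, len(sbp)))
    return episodes
```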

  9. Ground Operations Autonomous Control and Integrated Health Management

    NASA Technical Reports Server (NTRS)

    Daniels, James

    2014-01-01

    The Ground Operations Autonomous Control and Integrated Health Management plays a key role for future ground operations at NASA. The software integrated into this system is G2 2011 (Gensym). The purpose of this report is to describe the Ground Operations Autonomous Control and Integrated Health Management with the use of the G2 Gensym software and the G2 NASA toolkit for Integrated System Health Management (ISHM), which is a Computer Software Configuration Item (CSCI). The decision rationale for the use of the G2 platform is to develop a modular capability for ISHM and autonomous control (AC). Toolkit modules include knowledge bases that are generic and can be applied in any application domain module, which maximizes reusability, maintainability, systematic evolution, portability, and scalability. Engine modules are generic, while application modules represent the domain model of a specific application. Furthermore, the NASA toolkit, a set of modules developed since 2006, makes it possible to create application domain models quickly, using pre-defined objects that include sensor and component libraries for typical fluid, electrical, and mechanical systems.

  10. Development of a Commercially Viable, Modular Autonomous Robotic Systems for Converting any Vehicle to Autonomous Control

    NASA Technical Reports Server (NTRS)

    Parish, David W.; Grabbe, Robert D.; Marzwell, Neville I.

    1994-01-01

    A Modular Autonomous Robotic System (MARS), consisting of a modular autonomous vehicle control system that can be retrofitted onto any vehicle to convert it to autonomous control and support a modular payload for multiple applications, is being developed. The MARS design is scalable, reconfigurable, and cost-effective due to the use of modern open system architecture design methodologies, including serial control bus technology to simplify system wiring and enhance scalability. The design is augmented with modular, object-oriented (C++) software implementing a hierarchy of five levels of control: teleoperated, continuous guidepath following, periodic guidepath following, absolute position autonomous navigation, and relative position autonomous navigation. The present effort is focused on producing a system that is commercially viable for routine autonomous patrolling of known, semistructured environments, such as environmental monitoring of chemical and petroleum refineries, exterior physical security and surveillance, perimeter patrolling, and intrafacility transport applications.

  11. A Unified Approach to Model-Based Planning and Execution

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Dorais, Gregory A.; Fry, Chuck; Levinson, Richard; Plaunt, Christian; Norvig, Peter (Technical Monitor)

    2000-01-01

    Writing autonomous software is complex, requiring the coordination of functionally and technologically diverse software modules. System and mission engineers must rely on specialists familiar with the different software modules to translate requirements into application software. Also, each module often encodes the same requirement in different forms. The results are high costs and reduced reliability due to the difficulty of tracking discrepancies in these encodings. In this paper we describe a unified approach to planning and execution that we believe provides a single representational and computational framework for an autonomous agent. We identify the four main components whose interplay provides the basis for the agent's autonomous behavior: the domain model, the plan database, the plan running module, and the planner modules. This representational and problem-solving approach can be applied at all levels of the architecture of a complex agent, such as Remote Agent. In the rest of the paper we briefly describe the Remote Agent architecture. The new agent architecture proposed here aims at achieving the full Remote Agent functionality. We then give the fundamental ideas behind the new agent architecture and point out some implications of the structure of the architecture, mainly in the area of reactivity and interaction between reactive and deliberative decision making. We conclude with related work and current status.

  12. Enabling Autonomous Rover Science through Dynamic Planning and Scheduling

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel; Chouinard, Caroline; Fisher, Forest; Castano, Rebecca; Judd, Michele; Nesnas, Issa

    2005-01-01

    This paper describes how dynamic planning and scheduling techniques can be used onboard a rover to autonomously adjust rover activities in support of science goals. These goals could be identified by scientists on the ground or could be identified by onboard data-analysis software. Several different types of dynamic decisions are described, including the handling of opportunistic science goals identified during rover traverses, preserving high priority science targets when resources, such as power, are unexpectedly over-subscribed, and dynamically adding additional, ground-specified science targets when rover actions are executed more quickly than expected. After describing our specific system approach, we discuss some of the particular challenges we have examined to support autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations.
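
    One of the decisions described, shedding lower-priority science targets when a resource such as power is unexpectedly over-subscribed, can be illustrated with a simple greedy pass. The names and numbers below are invented and this is not the onboard planner; it only shows the flavor of the decision.

```python
def fit_to_power_budget(goals, available_wh):
    """goals: (name, priority, energy_wh) tuples; higher priority is kept first.
    Returns the goals retained after shedding those that no longer fit."""
    kept, used = [], 0.0
    for name, priority, energy in sorted(goals, key=lambda g: -g[1]):
        if used + energy <= available_wh:
            kept.append(name)
            used += energy
    return kept

# An opportunistic target competes with planned activities for a reduced 50 Wh budget.
plan = [("drive_to_waypoint", 10, 30.0),
        ("opportunistic_rock_image", 7, 15.0),
        ("low_priority_panorama", 2, 20.0)]
print(fit_to_power_budget(plan, available_wh=50.0))  # the panorama is shed
```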

  13. Joint NASA Ames/Langley Experimental Evaluation of Integrated Air/Ground Operations for En Route Free Maneuvering

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Kopardekar, Parimal; Battiste, Vernol; Doble, Nathan; Johnson, Walter; Lee, Paul; Prevot, Thomas; Smith, Nancy

    2005-01-01

    In order to meet the anticipated future demand for air travel, the National Aeronautics and Space Administration (NASA) is investigating a new concept of operations known as Distributed Air-Ground Traffic Management (DAG-TM). Under the En Route Free Maneuvering component of DAG-TM, appropriately equipped autonomous aircraft self-separate from other autonomous aircraft and from managed aircraft that continue to fly under today's Instrument Flight Rules (IFR). Controllers provide separation services between IFR aircraft and assign traffic flow management constraints to all aircraft. To address concept feasibility issues pertaining to integrated air/ground operations at various traffic levels, NASA Ames and Langley Research Centers conducted a joint human-in-the-loop experiment. Professional airline pilots and air traffic controllers flew a total of 16 scenarios under four conditions: mixed autonomous/managed operations at three traffic levels and a baseline all-managed condition at the lowest traffic level. These scenarios included en route flights and descents to a terminal area meter fix in airspace modeled after the Dallas Ft. Worth area. Pilots of autonomous aircraft met controller-assigned meter fix constraints with high success. Separation violations by subject pilots did not appear to vary with traffic level and were mainly attributable to software errors and procedural lapses. Controller workload was lower for mixed flight conditions, even at higher traffic levels. Pilot workload was deemed acceptable under all conditions. Controllers raised several safety concerns, most of which pertained to the occurrence of near-term conflicts between autonomous and managed aircraft. These issues are being addressed through better compatibility between air and ground systems and refinements to air and ground procedures.

  14. Space station automation study. Volume 1: Executive summary. Autonomous systems and assembly

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The purpose of the space station automation study (SSAS) was to develop informed technical guidance for NASA personnel in the use of autonomy and autonomous systems to implement space station functions. The initial step taken by NASA in organizing the SSAS was to form and convene a panel of recognized expert technologists in automation, space sciences, and aerospace engineering to produce a space station automation plan.

  15. Preliminary Design of an Autonomous Amphibious System

    DTIC Science & Technology

    2016-09-01

    changing vehicle dynamics will require innovative new autonomy algorithms. The developed software architecture, drive-by-wire kit, and supporting communications architecture are described, along with the drive-by-wire design, software maturation plans, and planned drive-by-wire refinements.

  16. Cybersecurity for aerospace autonomous systems

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2015-05-01

    High profile breaches have occurred across numerous information systems. One area where attacks are particularly problematic is autonomous control systems. This paper considers the aerospace information system, focusing on elements that interact with autonomous control systems (e.g., onboard UAVs). It discusses the trust placed in the autonomous systems and supporting systems (e.g., navigational aids) and how this trust can be validated. Approaches to remotely detect UAV compromise, without relying on the onboard software (on a potentially compromised system) as part of the process, are discussed. How different levels of autonomy (task-based, goal-based, mission-based) impact this remote characterization is considered.

  17. Fault Management Techniques in Human Spaceflight Operations

    NASA Technical Reports Server (NTRS)

    O'Hagan, Brian; Crocker, Alan

    2006-01-01

    This paper discusses human spaceflight fault management operations. Fault detection and response capabilities available in current US human spaceflight programs (Space Shuttle and International Space Station) are described while emphasizing system design impacts on operational techniques and constraints. Preflight and inflight processes along with products used to anticipate, mitigate and respond to failures are introduced. Examples of operational products used to support failure responses are presented. Possible improvements in the state of the art, as well as prioritization and success criteria for their implementation are proposed. This paper describes how the architecture of a command and control system impacts operations in areas such as the required fault response times, automated vs. manual fault responses, use of workarounds, etc. The architecture includes the use of redundancy at the system and software function level, software capabilities, use of intelligent or autonomous systems, number and severity of software defects, etc. This in turn drives which Caution and Warning (C&W) events should be annunciated, C&W event classification, operator display designs, crew training, flight control team training, and procedure development. Other factors impacting operations are the complexity of a system, skills needed to understand and operate a system, and the use of commonality vs. optimized solutions for software and responses. Fault detection, annunciation, safing responses, and recovery capabilities are explored using real examples to uncover underlying philosophies and constraints. These factors directly impact operations in that the crew and flight control team need to understand what happened, why it happened, what the system is doing, and what, if any, corrective actions they need to perform. If a fault results in multiple C&W events, or if several faults occur simultaneously, the root cause(s) of the fault(s), as well as their vehicle-wide impacts, must be determined in order to maintain situational awareness. This allows both automated and manual recovery operations to focus on the real cause of the fault(s). An appropriate balance must be struck between correcting the root cause failure and addressing the impacts of that fault on other vehicle components. Lastly, this paper presents a strategy for using lessons learned to improve the software, displays, and procedures in addition to determining what is a candidate for automation. Enabling technologies and techniques are identified to promote system evolution from one that requires manual fault responses to one that uses automation and autonomy where they are most effective. These considerations include the value in correcting software defects in a timely manner, automation of repetitive tasks, making time-critical responses autonomous, etc. The paper recommends the appropriate use of intelligent systems to determine the root causes of faults and correctly identify separate unrelated faults.

  18. Towards an Autonomous Space In-Situ Marine Sensorweb

    NASA Technical Reports Server (NTRS)

    Chien, S.; Doubleday, J.; Tran, D.; Thompson, D.; Mahoney, G.; Chao, Y.; Castano, R.; Ryan, J.; Kudela, R.; Palacios, S.; hide

    2009-01-01

    We describe ongoing efforts to integrate and coordinate space and marine assets to enable autonomous response to dynamic ocean phenomena such as algal blooms, eddies, and currents. Thus far we have focused on the use of remote sensing assets (e.g. satellites) but future plans include expansions to use a range of in-situ sensors such as gliders, autonomous underwater vehicles, and buoys/moorings.

  19. Technology for an intelligent, free-flying robot for crew and equipment retrieval in space

    NASA Technical Reports Server (NTRS)

    Erickson, J. D.; Reuter, G. J.; Healey, Kathleen J.; Phinney, D. E.

    1990-01-01

    Crew rescue and equipment retrieval is a Space Station Freedom requirement. During Freedom's lifetime, there is a high probability that a number of objects will accidentally become separated. Members of the crew, replacement units, and key tools are examples. Retrieval of these objects within a short time is essential. Systems engineering studies were conducted to identify system requirements and candidate approaches. One such approach, based on a voice-supervised, intelligent, free-flying robot, was selected for further analysis. A ground-based technology demonstration, now in its second phase, was designed to provide an integrated robotic hardware and software testbed supporting design of a space-borne system. The ground system, known as the EVA Retriever, is examining the problem of autonomously planning and executing a target rendezvous, grapple, and return to base while avoiding stationary and moving obstacles. The current prototype is an anthropomorphic manipulator unit with dexterous arms and hands attached to a robot body and latched in a manned maneuvering unit. A precision air-bearing floor is used to simulate space. Sensor data include two vision systems and force/proximity/tactile sensors on the hands and arms. Planning for a shuttle flight experiment is underway. A set of scenarios and strawman requirements were defined to support conceptual development. Initial design activities are expected to begin in late 1989 with the flight occurring in 1994. The flight hardware and software will be based on lessons learned from both the ground prototype and computer simulations.

  20. The Curiosity Mars Rover's Fault Protection Engine

    NASA Technical Reports Server (NTRS)

    Benowitz, Ed

    2014-01-01

    The Curiosity Rover, currently operating on Mars, contains flight software onboard to autonomously handle aspects of system fault protection. Over 1000 monitors and 39 responses are present in the flight software. Orchestrating these behaviors is the flight software's fault protection engine. In this paper, we discuss the engine's design, responsibilities, and present some lessons learned for future missions.
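
    The abstract gives no internals beyond the counts of monitors and responses, so the following is only a generic sketch of how a monitor-to-response engine can be organized; the data structures, priorities, and response semantics are invented, not Curiosity's.

```python
class FaultProtectionEngine:
    """Toy engine: monitors are predicates over telemetry; each maps to a
    response. Tripped responses run one at a time, highest priority first."""

    def __init__(self):
        self.monitors = {}   # name -> (check_fn, priority)
        self.responses = {}  # name -> response_fn

    def register(self, name, check_fn, response_fn, priority=0):
        self.monitors[name] = (check_fn, priority)
        self.responses[name] = response_fn

    def step(self, telemetry):
        tripped = [(prio, name) for name, (check, prio) in self.monitors.items()
                   if check(telemetry)]
        for _, name in sorted(tripped, reverse=True):
            self.responses[name](telemetry)

engine = FaultProtectionEngine()
engine.register("battery_undervolt",
                check_fn=lambda t: t["bus_v"] < 24.0,
                response_fn=lambda t: print("shedding non-critical loads"),
                priority=10)
engine.step({"bus_v": 23.5})  # prints the response action
```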

  1. Adapting the RoboCup Simulation for Autonomous Vehicle Team Information Fusion and Decision Making Experimentation

    DTIC Science & Technology

    2010-06-01

    researchers outside the government to produce the kinds of algorithms and software that would easily transition into solutions for teams of autonomous ... vehicles for military scenarios. To accomplish this, we began modifying the RoboCup soccer game step-by-step to incorporate rules that simulate these

  2. Onboard Processing and Autonomous Operations on the IPEX Cubesat

    NASA Technical Reports Server (NTRS)

    Chien, Steve; Doubleday, Joshua; Ortega, Kevin; Flatley, Tom; Crum, Gary; Geist, Alessandro; Lin, Michael; Williams, Austin; Bellardo, John; Puig-Suari, Jordi; hide

    2012-01-01

    IPEX is a 1U CubeSat sponsored by the NASA Earth Science Technology Office (ESTO), the goals of which are to: (1) flight validate high performance flight computing, (2) flight validate onboard instrument data processing and product generation software, (3) flight validate autonomous operations for instrument processing, and (4) enhance NASA outreach and university ties.

  3. Real time health monitoring and control system methodology for flexible space structures

    NASA Astrophysics Data System (ADS)

    Jayaram, Sanjay

    This dissertation is concerned with the Near Real-time Autonomous Health Monitoring of Flexible Space Structures. The dynamics of multi-body flexible systems are uncertain due to factors such as high non-linearity, consideration of higher modal frequencies, high dimensionality, multiple inputs and outputs, operational constraints, and unexpected failures of sensors and/or actuators. Hence, a systematic framework for developing a high-fidelity dynamic model of a flexible structural system needs to be established. The fault detection mechanism that will be an integrated part of an autonomous health monitoring system comprises detecting abnormalities in the sensors and/or actuators and correcting these detected faults (if possible). Actuator faults are rectified by applying a robust control law and robust measures capable of detecting and recovering/replacing the actuators. The fault-tolerant concept applied to the sensors takes the form of an Extended Kalman Filter (EKF). The EKF weighs the information coming from multiple sensors (redundant sensors used to measure the same information), automatically identifies the faulty sensors, and forms the best estimate from the remaining sensors. The mechanization comprises instrumenting flexible deployable panels (a solar array) with multiple angular position and rate sensors connected to the data acquisition system. The sensors give position and rate information of the solar panel in all three axes (i.e., roll, pitch, and yaw). The position data correspond to the steady-state response, and the rate data give better insight into the transient response of the system. This is a critical factor for real-time autonomous health monitoring. MATLAB (and/or C++) software will be used for high-fidelity modeling and the fault-tolerant mechanism.
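
    The faulty-sensor rejection idea, excluding a redundant sensor whose measurement is inconsistent with its peers before forming the estimate, can be shown without the full filter. The sketch below uses a simple median-based gate as a stand-in for the innovation test an EKF would apply; it is illustrative only and is not the dissertation's formulation.

```python
import statistics

def fuse_redundant(measurements, gate=3.0):
    """measurements: (sensor_id, value) pairs from redundant sensors that
    nominally observe the same quantity. A sensor is flagged faulty when it
    deviates from the median by more than gate times a robust spread estimate;
    the remaining sensors are averaged to form the estimate."""
    values = [v for _, v in measurements]
    med = statistics.median(values)
    spread = statistics.median([abs(v - med) for v in values]) or 1e-6
    good = [(sid, v) for sid, v in measurements if abs(v - med) <= gate * spread]
    faulty = [sid for sid, v in measurements if abs(v - med) > gate * spread]
    estimate = sum(v for _, v in good) / len(good)
    return estimate, faulty

print(fuse_redundant([("rate_a", 0.51), ("rate_b", 0.49), ("rate_c", 2.40)]))
# -> (0.5, ['rate_c'])
```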

  4. Advancing Autonomous Operations for Deep Space Vehicles

    NASA Technical Reports Server (NTRS)

    Haddock, Angie T.; Stetson, Howard K.

    2014-01-01

    Starting in January 2012, the Advanced Exploration Systems (AES) Autonomous Mission Operations (AMO) Project began to investigate the ability to create and execute "single button" crew-initiated autonomous activities [1]. NASA Marshall Space Flight Center (MSFC) designed and built a fluid transfer hardware test-bed to use as a sub-system target for the investigation of intelligent procedures that would command and control the fluid transfer test-bed, perform self-monitoring during fluid transfers, detect anomalies and faults, isolate the fault, and recover the procedure's function that was being executed, all without operator intervention. In addition to the development of intelligent procedures, the team is also exploring various methods for autonomous activity execution, where a planned timeline of activities is executed autonomously, and an initial analysis of crew procedure development. This paper will detail the development of intelligent procedures for the NASA MSFC Autonomous Fluid Transfer System (AFTS) as well as the autonomous plan execution capabilities being investigated. Manned deep space missions, with extreme communication delays with Earth-based assets, present significant challenges for what the on-board procedure content will encompass as well as the planned execution of the procedures.
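
    A "single button" intelligent procedure pairs each commanded step with a verification and a recovery action. The schematic below is only a sketch of that pattern; the step structure and retry policy are invented, not the AFTS procedures.

```python
def run_procedure(steps, telemetry):
    """steps: list of (command_fn, check_fn, recover_fn) tuples.
    Each command is issued, its effect is verified against telemetry, and on
    failure the recovery action runs before the step is retried once."""
    for command, check, recover in steps:
        command()
        if check(telemetry):
            continue
        recover()         # e.g., isolate the fault by closing a valve
        command()         # retry the step once after recovery
        if not check(telemetry):
            return False  # give up; a flight system would safe itself here
    return True
```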

  5. Development of Mission Enabling Infrastructure — Cislunar Autonomous Positioning System (CAPS)

    NASA Astrophysics Data System (ADS)

    Cheetham, B. W.

    2017-10-01

    Advanced Space, LLC is developing the Cislunar Autonomous Positioning System (CAPS) which would provide a scalable and evolvable architecture for navigation to reduce ground congestion and improve operations for missions throughout cislunar space.

  6. Technologies for Human Exploration

    NASA Technical Reports Server (NTRS)

    Drake, Bret G.

    2014-01-01

    Access to Space, Chemical Propulsion, Advanced Propulsion, In-Situ Resource Utilization, Entry, Descent, Landing and Ascent, Humans and Robots Working Together, Autonomous Operations, In-Flight Maintenance, Exploration Mobility, Power Generation, Life Support, Space Suits, Microgravity Countermeasures, Autonomous Medicine, Environmental Control.

  7. Autonomous Integrated Receive System (AIRS) requirements definition. Volume 3: Performance and simulation

    NASA Technical Reports Server (NTRS)

    Chie, C. M.; Su, Y. T.; Lindsey, W. C.; Koukos, J.

    1984-01-01

    The autonomous and integrated aspects of the operation of the AIRS (Autonomous Integrated Receive System) are discussed from a system operation point of view. The advantages of AIRS compared to the existing SSA receive chain equipment are highlighted. The three modes of AIRS operation are addressed in detail. The configurations of the AIRS are defined as a function of the operating modes and the user signal characteristics. Each AIRS configuration selection is made up of three components: the hardware, the software algorithms, and the parameters used by these algorithms. A comparison between AIRS and the wide dynamics demodulation (WDD) is provided. The organization of the AIRS analytical/simulation software is described. The modeling and analysis used to simulate the performance of the PN subsystem are documented. The frequency acquisition technique using a frequency-locked loop is also documented. Doppler compensation implementation is described. The technological aspects of employing CCDs for PN acquisition are addressed.

  8. Autonomous Exploration for Gathering Increased Science

    NASA Technical Reports Server (NTRS)

    Bornstein, Benjamin J.; Castano, Rebecca; Estlin, Tara A.; Gaines, Daniel M.; Anderson, Robert C.; Thompson, David R.; DeGranville, Charles K.; Chien, Steve A.; Tang, Benyang; Burl, Michael C.; hide

    2010-01-01

    The Autonomous Exploration for Gathering Increased Science System (AEGIS) provides automated targeting for remote sensing instruments on the Mars Exploration Rover (MER) mission, which at the time of this reporting has had two rovers exploring the surface of Mars (see figure). Currently, targets for rover remote-sensing instruments must be selected manually based on imagery already on the ground with the operations team. AEGIS enables the rover flight software to analyze imagery onboard in order to autonomously select and sequence targeted remote-sensing observations in an opportunistic fashion. In particular, this technology will be used to automatically acquire sub-framed, high-resolution, targeted images taken with the MER panoramic cameras. This software provides: 1) Automatic detection of terrain features in rover camera images, 2) Feature extraction for detected terrain targets, 3) Prioritization of terrain targets based on a scientist target feature set, and 4) Automated re-targeting of rover remote-sensing instruments at the highest priority target.
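
    The four functions listed map onto a small detect-score-select-retarget pipeline. In the sketch below the detector is stubbed and the feature names and weights are placeholders; this is not the AEGIS algorithm, only the shape of the flow.

```python
def detect_targets(image):
    # Placeholder detector: would return features found in the camera image.
    return [{"id": 1, "size": 0.8, "albedo": 0.3},
            {"id": 2, "size": 0.2, "albedo": 0.9}]

def score(target, preference):
    # Score a detected target against scientist-supplied feature weights.
    return sum(w * target.get(feature, 0.0) for feature, w in preference.items())

def select_and_retarget(image, preference, point_instrument):
    targets = detect_targets(image)
    best = max(targets, key=lambda t: score(t, preference))
    point_instrument(best["id"])   # re-point the remote-sensing instrument
    return best

# Example: the scientists prefer large, dark targets.
select_and_retarget(None, {"size": 1.0, "albedo": -0.5},
                    point_instrument=lambda tid: print("imaging target", tid))
```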

  9. 3D Reconfigurable MPSoC for Unmanned Spacecraft Navigation

    NASA Astrophysics Data System (ADS)

    Dekoulis, George

    2016-07-01

    This paper describes the design of a new lightweight spacecraft navigation system for unmanned space missions. The system addresses the demands for more efficient autonomous navigation in the near-Earth environment or deep space. The proposed instrumentation is directly suitable for unmanned systems operation and testing of new airborne prototypes for remote sensing applications. The system features a new sensor technology and significant improvements over existing solutions. Fluxgate type sensors have been traditionally used in unmanned defense systems such as target drones, guided missiles, rockets and satellites, however, the guidance sensors' configurations exhibit lower specifications than the presented solution. The current implementation is based on a recently developed material in a reengineered optimum sensor configuration for unprecedented low-power consumption. The new sensor's performance characteristics qualify it for spacecraft navigation applications. A major advantage of the system is the efficiency in redundancy reduction achieved in terms of both hardware and software requirements.

  10. A framework for building real-time expert systems

    NASA Technical Reports Server (NTRS)

    Lee, S. Daniel

    1991-01-01

    The Space Station Freedom is an example of complex systems that require both traditional and artificial intelligence (AI) real-time methodologies. It was mandated that Ada should be used for all new software development projects. The station also requires distributed processing. Catastrophic failures on the station can cause the transmission system to malfunction for a long period of time, during which ground-based expert systems cannot provide any assistance to the crisis situation on the station. This is even more critical for other NASA projects that would have longer transmission delays (e.g., the lunar base, Mars missions, etc.). To address these issues, a distributed agent architecture (DAA) is proposed that can support a variety of paradigms based on both traditional real-time computing and AI. The proposed testbed for DAA is an autonomous power expert (APEX) which is a real-time monitoring and diagnosis expert system for the electrical power distribution system of the space station.

  11. Orbital Express Advanced Video Guidance Sensor: Ground Testing, Flight Results and Comparisons

    NASA Technical Reports Server (NTRS)

    Pinson, Robin M.; Howard, Richard T.; Heaton, Andrew F.

    2008-01-01

    Orbital Express (OE) was a successful mission demonstrating automated rendezvous and docking. The 2007 mission consisted of two spacecraft, the Autonomous Space Transport Robotic Operations (ASTRO) and the Next Generation Serviceable Satellite (NEXTSat) that were designed to work together and test a variety of service operations in orbit. The Advanced Video Guidance Sensor, AVGS, was included as one of the primary proximity navigation sensors on board the ASTRO. The AVGS was one of four sensors that provided relative position and attitude between the two vehicles. Marshall Space Flight Center was responsible for the AVGS software and testing (especially the extensive ground testing), flight operations support, and analyzing the flight data. This paper briefly describes the historical mission, the data taken on-orbit, the ground testing that occurred, and finally comparisons between flight data and ground test data for two different flight regimes.

  12. Integrating CLIPS applications into heterogeneous distributed systems

    NASA Technical Reports Server (NTRS)

    Adler, Richard M.

    1991-01-01

    SOCIAL is an advanced, object-oriented development tool for integrating intelligent and conventional applications across heterogeneous hardware and software platforms. SOCIAL defines a family of 'wrapper' objects called agents, which incorporate predefined capabilities for distributed communication and control. Developers embed applications within agents and establish interactions between distributed agents via non-intrusive message-based interfaces. This paper describes a predefined SOCIAL agent that is specialized for integrating C Language Integrated Production System (CLIPS)-based applications. The agent's high-level Application Programming Interface supports bidirectional flow of data, knowledge, and commands to other agents, enabling CLIPS applications to initiate interactions autonomously, and respond to requests and results from heterogeneous remote systems. The design and operation of CLIPS agents are illustrated with two distributed applications that integrate CLIPS-based expert systems with other intelligent systems for isolating and mapping problems in the Space Shuttle Launch Processing System at the NASA Kennedy Space Center.

  13. Swarmathon 2018

    NASA Image and Video Library

    2018-04-17

    Students from Montgomery College in Rockville, Maryland, follow the progress of their Swarmie robots during the Swarmathon competition at the Kennedy Space Center Visitor Complex. Students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.

  14. Swarmathon 2018

    NASA Image and Video Library

    2018-04-18

    In the Swarmathon competition at the Kennedy Space Center Visitor Complex, students were asked to develop computer code for the small robots, programming them to look for "resources" in the form of AprilTag cubes, similar to barcodes. To add to the challenge, obstacles in the form of simulated rocks were placed in the competition arena. Teams developed search algorithms for the Swarmies to operate autonomously, communicating and interacting as a collective swarm similar to ants foraging for food. In the spaceport's third annual Swarmathon, 23 teams representing 24 minority-serving universities and community colleges were invited to develop software code to operate these innovative robots known as "Swarmies" to help find resources when astronauts explore distant locations, such as the Moon or Mars.

  15. Autonomous manipulation on a robot: Summary of manipulator software functions

    NASA Technical Reports Server (NTRS)

    Lewis, R. A.

    1974-01-01

    A six degree-of-freedom computer-controlled manipulator is examined, and the relationships between the arm's joint variables and 3-space are derived. Arm trajectories using sequences of third-degree polynomials to describe the time history of each joint variable are presented and two approaches to the avoidance of obstacles are given. The equations of motion for the arm are derived and then decomposed into time-dependent factors and time-independent coefficients. Several new and simplifying relationships among the coefficients are proven. Two sample trajectories are analyzed in detail for purposes of determining the most important contributions to total force in order that relatively simple approximations to the equations of motion can be used.
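
    The third-degree polynomial segments mentioned here are fully determined by boundary conditions on joint position and velocity. The sketch below is just the standard cubic-interpolation arithmetic, not the 1974 manipulator code.

```python
def cubic_coefficients(q0, qT, v0, vT, T):
    """Coefficients of q(t) = a0 + a1*t + a2*t**2 + a3*t**3 satisfying
    q(0)=q0, q'(0)=v0, q(T)=qT, q'(T)=vT for one joint over one segment."""
    a0, a1 = q0, v0
    a2 = (3.0 * (qT - q0) - (2.0 * v0 + vT) * T) / T**2
    a3 = (2.0 * (q0 - qT) + (v0 + vT) * T) / T**3
    return a0, a1, a2, a3

def evaluate(coeffs, t):
    a0, a1, a2, a3 = coeffs
    return a0 + a1 * t + a2 * t**2 + a3 * t**3

# Rest-to-rest move of one joint from 0 to 1 rad over 2 s.
c = cubic_coefficients(0.0, 1.0, 0.0, 0.0, 2.0)
print(evaluate(c, 1.0))   # midpoint of a rest-to-rest cubic: 0.5 rad
```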

  16. Development of Autonomous Aerobraking (Phase 1)

    NASA Technical Reports Server (NTRS)

    Murri, Daniel G.; Powell, Richard W.; Prince, Jill L.

    2012-01-01

    The NASA Engineering and Safety Center received a request from Mr. Daniel Murri (NASA Technical Fellow for Flight Mechanics) to develop an autonomous aerobraking capability. An initial evaluation for all phases of this assessment was approved to proceed at the NESC Review Board meeting. The purpose of phase 1 of this study was to provide an assessment of the feasibility of autonomous aerobraking. During this phase, atmospheric, aerodynamic, and thermal models for a representative spacecraft were developed for both the onboard algorithm known as Autonomous Aerobraking Development Software, and a ground-based "truth" simulation developed for testing purposes. The results of the phase 1 assessment are included in this report.

  17. A Buyer Behaviour Framework for the Development and Design of Software Agents in E-Commerce.

    ERIC Educational Resources Information Center

    Sproule, Susan; Archer, Norm

    2000-01-01

    Software agents are computer programs that run in the background and perform tasks autonomously as delegated by the user. This paper blends models from marketing research and findings from the field of decision support systems to build a framework for the design of software agents to support e-commerce buying applications. (Contains 35…

  18. Autonomous Relative Navigation for Formation-Flying Satellites Using GPS

    NASA Technical Reports Server (NTRS)

    Gramling, Cheryl; Carpenter, J. Russell; Long, Anne; Kelbel, David; Lee, Taesul

    2000-01-01

    The Goddard Space Flight Center is currently developing advanced spacecraft systems to provide autonomous navigation and control of formation flyers. This paper discusses autonomous relative navigation performance for a formation of four eccentric, medium-altitude Earth-orbiting satellites using Global Positioning System (GPS) Standard Positioning Service (SPS) and "GPS-like" intersatellite measurements. The performance of several candidate relative navigation approaches is evaluated. These analyses indicate that an autonomous relative navigation position accuracy of 1 meter root-mean-square can be achieved by differencing high-accuracy filtered solutions if only measurements from common GPS space vehicles are used in the independently estimated solutions.

  19. The Challenges of Sensing and Repairing Software Defects in Autonomous Systems

    DTIC Science & Technology

    2014-05-09

    Report period: 28 Nov 2012 – 24 Feb 2014. The available excerpts describe an automated repair approach that uses fault localization to focus edit locations, existing code to provide the seed of new repairs, and fitness approximation to reduce required testing.

  20. A six-legged rover for planetary exploration

    NASA Technical Reports Server (NTRS)

    Simmons, Reid; Krotkov, Eric; Bares, John

    1991-01-01

    To survive the rigors and isolation of planetary exploration, an autonomous rover must be competent, reliable, and efficient. This paper presents the Ambler, a six-legged robot featuring orthogonal legs and a novel circulating gait, which has been designed for traversal of rugged, unknown environments. An autonomous software system that integrates perception, planning, and real-time control has been developed to walk the Ambler through obstacle strewn terrain. The paper describes the information and control flow of the walking system, and how the design of the mechanism and software combine to achieve competent walking, reliable behavior in the face of unexpected failures, and efficient utilization of time and power.

  1. Challenges of Developing New Classes of NASA Self-Managing Mission

    NASA Technical Reports Server (NTRS)

    Hinchey, M. G.; Rash, J. I.; Truszkowski, W. F.; Rouff, C. A.; Sterritt, R.

    2005-01-01

    NASA is proposing increasingly complex missions that will require a high degree of autonomy and autonomicity. These missions pose heretofore unforeseen problems and raise issues that have not been well addressed by the community. Assuring success of such missions will require new software development techniques and tools. This paper discusses some of the challenges that NASA and the rest of the software development community are facing in developing these increasingly complex systems. We give an overview of a proposed NASA mission as well as techniques and tools that are being developed to address autonomic management and the complexity issues inherent in these missions.

  2. SMART Solar Sail

    NASA Technical Reports Server (NTRS)

    Curtis, Steven A.

    2005-01-01

    A report summarizes the design concept of a super miniaturized autonomous reconfigurable technology (SMART) solar sail, a proposed deployable, fully autonomous solar sail for use in very fine station-keeping of a spacecraft. The SMART solar sail would include a reflective film stretched among nodes of a SMART space frame made partly of nanotubule struts. A microelectromechanical system (MEMS) at each vertex of the frame would spool and unspool nanotubule struts between itself and neighboring nodes to vary the shape of the frame. The MEMSs would be linked, either wirelessly or by thin wires within the struts, to an evolvable neural software system (ENSS) that would control the MEMSs to reconfigure the sail as needed. The solar sail would be highly deformable from an initially highly compressed configuration, yet also capable of enabling very fine maneuvering of the spacecraft by means of small sail-surface deformations. The SMART solar sail would be connected to the main body of the spacecraft by a SMART multi-tether structure, which would include MEMS actuators like those of the frame plus tethers in the form of longer versions of the struts in the frame.

  3. Nasa's Ant-Inspired Swarmie Robots

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.

    2016-01-01

    As humans push further beyond the grasp of Earth, robotic missions in advance of human missions will play an increasingly important role. These robotic systems will find and retrieve valuable resources as part of an in-situ resource utilization (ISRU) strategy. They will need to be highly autonomous while maintaining high task performance levels. NASA Kennedy Space Center has teamed up with the Biological Computation Lab at the University of New Mexico to create a swarm of small, low-cost, autonomous robots to be used as a ground-based research platform for ISRU missions. The behavior of the robot swarm mimics the central-place foraging strategy of ants to find and collect resources in a previously unmapped environment and return those resources to a central site. This talk will guide the audience through the Swarmie robot project from its conception by students in a New Mexico research lab to its robot trials in an outdoor parking lot at NASA. The software technologies and techniques used on the project will be discussed, as well as various challenges and solutions that were encountered by the development team along the way.
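
    Central-place foraging of this kind is commonly written as a per-robot behavior loop: search, pick up, return to the collection zone, drop off, repeat. The state names and transitions below are generic and are not the Swarmie code.

```python
import random

def forage_step(state, robot):
    """One control tick of a toy central-place forager.
    state: 'SEARCH', 'RETURN', or 'DROPOFF'; robot: dict of simple flags."""
    if state == "SEARCH":
        robot["heading"] += random.uniform(-0.3, 0.3)  # correlated random walk
        if robot.get("sees_tag"):
            robot["carrying"] = True
            return "RETURN"
        return "SEARCH"
    if state == "RETURN":
        robot["heading"] = robot["bearing_to_home"]    # head for the collection zone
        return "DROPOFF" if robot.get("at_home") else "RETURN"
    # DROPOFF: release the cube, optionally remember the find site, resume search
    robot["carrying"] = False
    return "SEARCH"
```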

  4. Development of the NASA MCAT Auxiliary Telescope for Orbital Debris Research

    NASA Technical Reports Server (NTRS)

    Frith, James; Lederer, Sue; Cowardin, Heather; Buckalew, Brent; Hickson, Paul; Anz-Meador, Phillip

    2016-01-01

    The National Aeronautics and Space Administration has deployed the Meter Class Autonomous Telescope (MCAT) to Ascension Island with plans for it to become fully operational by summer 2016. This telescope will be providing data in support of research being conducted by the Orbital Debris Program Office at the Johnson Space Center. In addition to the main observatory, a smaller, auxiliary telescope is being deployed to the same location to augment and support observations generated by MCAT. It will provide near-simultaneous photometry and astrometry of debris objects, independent measurements of the seeing conditions, and offload low-priority targets from MCAT's observing queue. Its hardware and software designs are presented here. The National Aeronautics and Space Administration (NASA) has recently deployed the Meter Class Autonomous Telescope (MCAT) to Ascension Island. MCAT will provide NASA with a dedicated optical sensor for observations of orbital debris with the goal of statistically sampling the orbital and photometric characteristics of the population from low Earth to Geosynchronous orbits. Additionally, a small auxiliary telescope, co-located with MCAT, is being deployed to augment its observations by providing near-simultaneous photometry and astrometry, as well as offloading low-priority targets from MCAT's observing queue. It will also serve to provide an independent measurement of the seeing conditions to help monitor the quality of the data being produced by the larger telescope. Comprised of off-the-shelf components, the MCAT Auxiliary Telescope will have a 16-inch optical tube assembly, Sloan g'r'i'z' and Johnson/Cousins BVRI filters, and a fast tracking mount to help facilitate the tracking of objects in low Earth orbit. Tracking modes and tasking will be similar to MCAT except an emphasis will be placed on observations that provide more accurate initial orbit determination for the objects detected by MCAT. The near-simultaneous observations will also provide the opportunity for multi-filter color information of the debris objects to be obtained. Color information can further distinguish the individual objects within the population and provide insight into the reflectance properties of their surface material. The specific hardware, software, and tasking methodology of the MCAT Auxiliary Telescope is presented here.

  5. Error Analysis System for Spacecraft Navigation Using the Global Positioning System (GPS)

    NASA Technical Reports Server (NTRS)

    Truong, S. H.; Hart, R. C.; Hartman, K. R.; Tomcsik, T. L.; Searl, J. E.; Bernstein, A.

    1997-01-01

    The Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA) Goddard Space Flight Center (GSFC) is currently developing improved space-navigation filtering algorithms to use the Global Positioning System (GPS) for autonomous real-time onboard orbit determination. In connection with a GPS technology demonstration on the Small Satellite Technology Initiative (SSTI)/Lewis spacecraft, FDD analysts and programmers have teamed with the GSFC Guidance, Navigation, and Control Branch to develop the GPS Enhanced Orbit Determination Experiment (GEODE) system. The GEODE system consists of a Kalman filter operating as a navigation tool for estimating the position, velocity, and additional states required to accurately navigate the orbiting Lewis spacecraft by using astrodynamic modeling and GPS measurements from the receiver. A parallel effort at the FDD is the development of a GPS Error Analysis System (GEAS) that will be used to analyze and improve navigation filtering algorithms during development phases and during in-flight calibration. For GEAS, the Kalman filter theory is extended to estimate the errors in position, velocity, and other error states of interest. The estimation of errors in physical variables at regular intervals will allow the time, cause, and effect of navigation system weaknesses to be identified. In addition, by modeling a sufficient set of navigation system errors, a system failure that causes an observed error anomaly can be traced and accounted for. The GEAS software is formulated using Object Oriented Design (OOD) techniques implemented in the C++ programming language on a Sun SPARC workstation. The Phase 1 of this effort is the development of a basic system to be used to evaluate navigation algorithms implemented in the GEODE system. This paper presents the GEAS mathematical methodology, systems and operations concepts, and software design and implementation. Results from the use of the basic system to evaluate navigation algorithms implemented on GEODE are also discussed. In addition, recommendations for generalization of GEAS functions and for new techniques to optimize the accuracy and control of the GPS autonomous onboard navigation are presented.

  6. UAV Research at NASA Langley: Towards Safe, Reliable, and Autonomous Operations

    NASA Technical Reports Server (NTRS)

    Davila, Carlos G.

    2016-01-01

    Unmanned Aerial Vehicles (UAVs) are fundamental components in several aspects of research at NASA Langley, such as flight dynamics, mission-driven airframe design, airspace integration demonstrations, atmospheric science projects, and more. In particular, NASA Langley Research Center (Langley) is using UAVs to develop and demonstrate innovative capabilities that meet the autonomy and robotics challenges that are anticipated in science, space exploration, and aeronautics. These capabilities will enable new NASA missions such as asteroid rendezvous and retrieval (ARRM), Mars exploration, in-situ resource utilization (ISRU), pollution measurements in historically inaccessible areas, and the integration of UAVs into our everyday lives: all missions of increasing complexity, distance, pace, and/or accessibility. Building on decades of NASA experience and success in the design, fabrication, and integration of robust and reliable automated systems for space and aeronautics, the Langley Autonomy Incubator seeks to bridge the gap between automation and autonomy by enabling safe autonomous operations via onboard sensing and perception systems in both data-rich and data-deprived environments. The Autonomy Incubator is focused on the challenge of mobility and manipulation in dynamic and unstructured environments by integrating technologies such as computer vision, visual odometry, real-time mapping, path planning, object detection and avoidance, object classification, adaptive control, sensor fusion, machine learning, and natural human-machine teaming. These technologies are implemented in an architectural framework developed in-house for easy integration and interoperability of cutting-edge hardware and software.

  7. Autonomous Mission Operations

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy; Spirkovska, Lilijana; McCann, Rob; Wang, Lui; Pohlkamp, Kara; Morin, Lee

    2012-01-01

    NASA's Advanced Exploration Systems Autonomous Mission Operations (AMO) project conducted an empirical investigation of the impact of time-delay on today's mission operations, and of the effect of processes and mission support tools designed to mitigate time-delay-related impacts. Mission operation scenarios were designed for NASA's Deep Space Habitat (DSH), an analog spacecraft habitat, covering a range of activities including nominal objectives, DSH system failures, and crew medical emergencies. The scenarios were simulated at time-delay values representative of Lunar (1.2-5 sec), Near Earth Object (NEO) (50 sec) and Mars (300 sec) missions. Each combination of operational scenario and time-delay was tested in a Baseline configuration, designed to reflect present-day operations of the International Space Station, and a Mitigation configuration in which a variety of software tools, information displays, and crew-ground communications protocols were employed to assist both crews and Flight Control Team (FCT) members with the long-delay conditions. Preliminary findings indicate: 1) Workload of both crew members and FCT members generally increased along with increasing time delay. 2) Advanced procedure execution viewers, caution and warning tools, and communications protocols such as text messaging decreased the workload of both flight controllers and crew, and decreased the difficulty of coordinating activities. 3) Whereas crew workload ratings increased between 50 sec and 300 sec of time-delay in the Baseline configuration, workload ratings decreased (or remained flat) in the Mitigation configuration.

  8. Orion Optical Navigation Progress Toward Exploration Mission 1

    NASA Technical Reports Server (NTRS)

    Holt, Greg N.; D'Souza, Christopher N.; Saley, David

    2018-01-01

    Optical navigation of human spacecraft was proposed on Gemini and implemented successfully on Apollo as a means of autonomously operating the vehicle in the event of lost communication with controllers on Earth. The Orion emergency return system utilizing optical navigation has matured in design over the last several years, and is currently undergoing the final implementation and test phase in preparation for Exploration Mission 1 (EM-1) in 2019. The software development is past its Critical Design Review, and is progressing through test and certification for human rating. The filter architecture uses a square-root-free UDU covariance factorization. Linear Covariance Analysis (LinCov) was used to analyze the measurement models and the measurement error models on a representative EM-1 trajectory. The Orion EM-1 flight camera was calibrated at the Johnson Space Center (JSC) electro-optics lab. To permanently stake the focal length of the camera a 500 mm focal length refractive collimator was used. Two Engineering Design Unit (EDU) cameras and an EDU star tracker were used for a live-sky test in Denver. In-space imagery with high-fidelity truth metadata is rare so these live-sky tests provide one of the closest real-world analogs to operational use. A hardware-in-the-loop test rig was developed in the Johnson Space Center Electro-Optics Lab to exercise the OpNav system prior to integrated testing on the Orion vehicle. The software is verified with synthetic images. Several hundred off-nominal images are also used to analyze robustness and fault detection in the software. These include effects such as stray light, excess radiation damage, and specular reflections, and are used to help verify the tuning parameters chosen for the algorithms such as earth atmosphere bias, minimum pixel intensity, and star detection thresholds.

  9. Design of a walking robot

    NASA Technical Reports Server (NTRS)

    Whittaker, William; Dowling, Kevin

    1994-01-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  10. Design of a walking robot

    NASA Astrophysics Data System (ADS)

    Whittaker, William; Dowling, Kevin

    1994-03-01

    Carnegie Mellon University's Autonomous Planetary Exploration Program (APEX) is currently building the Daedalus robot; a system capable of performing extended autonomous planetary exploration missions. Extended autonomy is an important capability because the continued exploration of the Moon, Mars and other solid bodies within the solar system will probably be carried out by autonomous robotic systems. There are a number of reasons for this - the most important of which are the high cost of placing a man in space, the high risk associated with human exploration and communication delays that make teleoperation infeasible. The Daedalus robot represents an evolutionary approach to robot mechanism design and software system architecture. Daedalus incorporates key features from a number of predecessor systems. Using previously proven technologies, the Apex project endeavors to encompass all of the capabilities necessary for robust planetary exploration. The Ambler, a six-legged walking machine was developed by CMU for demonstration of technologies required for planetary exploration. In its five years of life, the Ambler project brought major breakthroughs in various areas of robotic technology. Significant progress was made in: mechanism and control, by introducing a novel gait pattern (circulating gait) and use of orthogonal legs; perception, by developing sophisticated algorithms for map building; and planning, by developing and implementing the Task Control Architecture to coordinate tasks and control complex system functions. The APEX project is the successor of the Ambler project.

  11. Baseline tests of an autonomous telerobotic system for assembly of space truss structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Coung

    1994-01-01

    Several proposed space missions include precision reflectors that are larger in diameter than any current or proposed launch vehicle. Most of these reflectors will require a truss structure to accurately position the reflector panels and these reflectors will likely require assembly in orbit. A research program has been conducted at the NASA Langley Research Center to develop the technology required for the robotic assembly of truss structures. The focus of this research has been on hardware concepts, computer software control systems, and operator interfaces necessary to perform supervised autonomous assembly. A special facility was developed and four assembly and disassembly tests of a 102-strut tetrahedral truss have been conducted. The test procedures were developed around traditional 'pick-and-place' robotic techniques that rely on positioning repeatability for successful operation. The data from two of the four tests were evaluated and are presented in this report. All operations in the tests were controlled by predefined sequences stored in a command file, and the operator intervened only when the system paused because of the failure of an actuator command. The tests were successful in identifying potential pitfalls in a telerobotic system, many of which would not have been readily anticipated or incurred through simulation studies. Addressing the total integrated task, instead of bench testing the component parts, forced all aspects of the task to be evaluated. Although the test results indicate that additional developments should be pursued, no problems were encountered that would preclude automated assembly in space as a viable construction method.

  12. Optimized Autonomous Space In-situ Sensor-Web for volcano monitoring

    USGS Publications Warehouse

    Song, W.-Z.; Shirazi, B.; Kedar, S.; Chien, S.; Webb, F.; Tran, D.; Davis, A.; Pieri, D.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.

    2008-01-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)), is developing a prototype dynamic and scalable hazard monitoring sensor-web and applying it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) will have two-way communication capability between ground and space assets, use both space and ground data for optimal allocation of limited power and bandwidth resources on the ground, and use smart management of competing demands for limited space assets. It will also enable scalability and seamless infusion of future space and in-situ assets into the sensor-web. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been active since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) an efficient self-organization algorithm for sensor-web topology to support efficient data communication and command control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real-time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of autonomously tasking the other. Sensor-web data acquisition and dissemination will be accomplished through the use of the Open Geospatial Consortium Sensorweb Enablement protocols. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform. © 2008 IEEE.
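
    Item 3 above (nodes autonomously setting packet priorities against limited bandwidth) can be illustrated with a simple greedy selection; the scoring weights and packet fields below are illustrative assumptions, not the OASIS algorithm.

```python
# Illustrative greedy packet selection under a per-cycle bandwidth budget.
# Each node scores packets by mission priority and queue age, then sends the
# highest-value packets that fit; weights and fields are assumptions.

from dataclasses import dataclass

@dataclass
class Packet:
    size_bytes: int
    priority: float      # mission-assigned importance, e.g. event vs. routine data
    age_s: float         # time waiting in the node's queue

def score(p, age_weight=0.01):
    return p.priority + age_weight * p.age_s

def select_for_uplink(queue, budget_bytes):
    chosen, used = [], 0
    for p in sorted(queue, key=score, reverse=True):
        if used + p.size_bytes <= budget_bytes:
            chosen.append(p)
            used += p.size_bytes
    return chosen

queue = [Packet(512, 0.9, 3.0), Packet(2048, 0.2, 60.0), Packet(1024, 0.7, 5.0)]
print(select_for_uplink(queue, budget_bytes=2000))
```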

  13. Space station automation study: Autonomous systems and assembly, volume 2

    NASA Technical Reports Server (NTRS)

    Bradford, K. Z.

    1984-01-01

    This final report, prepared by Martin Marietta Denver Aerospace, provides the technical results of their input to the Space Station Automation Study, the purpose of which is to develop informed technical guidance in the use of autonomous systems to implement space station functions, many of which can be programmed in advance and are well suited for automated systems.

  14. AERCam Autonomy: Intelligent Software Architecture for Robotic Free Flying Nanosatellite Inspection Vehicles

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.; Duran, Steve G.; Braun, Angela N.; Straube, Timothy M.; Mitchell, Jennifer D.

    2006-01-01

    The NASA Johnson Space Center has developed a nanosatellite-class Free Flyer intended for future external inspection and remote viewing of human spacecraft. The Miniature Autonomous Extravehicular Robotic Camera (Mini AERCam) technology demonstration unit has been integrated into the approximate form and function of a flight system. The spherical Mini AERCam Free Flyer is 7.5 inches in diameter and weighs approximately 10 pounds, yet it incorporates significant additional capabilities compared to the 35-pound, 14-inch diameter AERCam Sprint that flew as a Shuttle flight experiment in 1997. Mini AERCam hosts a full suite of miniaturized avionics, instrumentation, communications, navigation, power, propulsion, and imaging subsystems, including digital video cameras and a high resolution still image camera. The vehicle is designed for either remotely piloted operations or supervised autonomous operations, including automatic stationkeeping, point-to-point maneuvering, and waypoint tracking. The Mini AERCam Free Flyer is accompanied by a sophisticated control station for command and control, as well as a docking system for automated deployment, docking, and recharge at a parent spacecraft. Free Flyer functional testing has been conducted successfully on both an airbearing table and in a six-degree-of-freedom closed-loop orbital simulation with avionics hardware in the loop. Mini AERCam aims to provide beneficial on-orbit views that cannot be obtained from fixed cameras, cameras on robotic manipulators, or cameras carried by crewmembers during extravehicular activities (EVAs). On Shuttle or International Space Station (ISS), for example, Mini AERCam could support external robotic operations by supplying orthogonal views to the intravehicular activity (IVA) robotic operator, supplying views of EVA operations to IVA and/or ground crews monitoring the EVA, and carrying out independent visual inspections of areas of interest around the spacecraft. To enable these future benefits with minimal impact on IVA operators and ground controllers, the Mini AERCam system architecture incorporates intelligent systems attributes that support various autonomous capabilities. 1) A robust command sequencer enables task-level command scripting. Command scripting is employed for operations such as automatic inspection scans over a region of interest, and operator-hands-off automated docking. 2) A system manager built on the same expert-system software as the command sequencer provides detection and smart-response capability for potential system-level anomalies, such as loss of communications between the Free Flyer and control station. 3) An AERCam dynamics manager provides nominal and off-nominal management of guidance, navigation, and control (GN&C) functions. It is employed for safe trajectory monitoring, contingency maneuvering, and related roles. This paper will describe these architectural components of Mini AERCam autonomy, as well as the interaction of these elements with a human operator during supervised autonomous control.
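
    As a hedged illustration of the system-manager behavior described in item 2 (smart response to loss of communications between the Free Flyer and the control station), the Python sketch below encodes one such rule; the timeout value and fallback mode are assumptions for illustration, not Mini AERCam flight values.

```python
# Sketch of a system-manager rule: if no command-link heartbeat is heard
# within a timeout, autonomously fall back to station-keeping. The timeout
# and mode names are illustrative, not the actual flight values.

import time

COMM_TIMEOUT_S = 10.0

class SystemManager:
    def __init__(self):
        self.last_heartbeat = time.monotonic()
        self.mode = "REMOTE_PILOTED"

    def heartbeat_received(self):
        self.last_heartbeat = time.monotonic()

    def step(self):
        if time.monotonic() - self.last_heartbeat > COMM_TIMEOUT_S:
            # Communications lost: hold position until the link returns.
            self.mode = "AUTO_STATIONKEEP"
        elif self.mode == "AUTO_STATIONKEEP":
            # Link restored: resume remotely piloted operations.
            self.mode = "REMOTE_PILOTED"
```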

  15. A Flight Deck Decision Support Tool for Autonomous Airborne Operations

    NASA Technical Reports Server (NTRS)

    Ballin, Mark G.; Sharma, Vivek; Vivona, Robert A.; Johnson, Edward J.; Ramiscal, Ermin

    2002-01-01

    NASA is developing a flight deck decision support tool to support research into autonomous operations in a future distributed air/ground traffic management environment. This interactive real-time decision aid, referred to as the Autonomous Operations Planner (AOP), will enable the flight crew to plan autonomously in the presence of dense traffic and complex flight management constraints. In assisting the flight crew, the AOP accounts for traffic flow management and airspace constraints, schedule requirements, weather hazards, aircraft operational limits, and crew or airline flight-planning goals. This paper describes the AOP and presents an overview of functional and implementation design considerations required for its development. Required AOP functionality is described, its application in autonomous operations research is discussed, and a prototype software architecture for the AOP is presented.

  16. A fault-tolerant intelligent robotic control system

    NASA Technical Reports Server (NTRS)

    Marzwell, Neville I.; Tso, Kam Sing

    1993-01-01

    This paper describes the concept, design, and features of a fault-tolerant intelligent robotic control system being developed for space and commercial applications that require high dependability. The comprehensive strategy integrates system level hardware/software fault tolerance with task level handling of uncertainties and unexpected events for robotic control. The underlying architecture for system level fault tolerance is the distributed recovery block, which protects against application software, system software, hardware, and network failures. Task level fault tolerance provisions are implemented in a knowledge-based system which utilizes advanced automation techniques such as rule-based and model-based reasoning to monitor, diagnose, and recover from unexpected events. The two-level design provides tolerance of two or more faults occurring serially at any level of command, control, sensing, or actuation. The potential benefits of such a fault-tolerant robotic control system include: (1) a minimized potential for damage to humans, the work site, and the robot itself; (2) continuous operation with a minimum of uncommanded motion in the presence of failures; and (3) more reliable autonomous operation providing increased efficiency in the execution of robotic tasks and decreased demand on human operators for controlling and monitoring the robotic servicing routines.
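
    The recovery block pattern underlying the system-level fault tolerance described above can be sketched generically: a primary routine runs, an acceptance test checks its output, and an alternate routine takes over if the test fails. The single-process Python sketch below shows only the pattern, not the paper's distributed implementation; the example routines and limit are assumptions.

```python
# Minimal recovery-block pattern: try the primary routine, validate its
# result with an acceptance test, and fall back to an alternate routine
# if the test fails or the primary raises. Single-process sketch only.

def recovery_block(primary, alternate, acceptance_test, *args):
    try:
        result = primary(*args)
        if acceptance_test(result):
            return result
    except Exception:
        pass
    result = alternate(*args)                 # simpler, more trusted fallback
    if not acceptance_test(result):
        raise RuntimeError("primary and alternate both failed the acceptance test")
    return result

# Example: a commanded joint rate must stay within actuator limits.
def primary_rate(error):
    return 2.5 * error                        # aggressive gain, may exceed limits

def alternate_rate(error):
    return max(-0.1, min(0.1, 0.5 * error))   # conservative, clamped

def within_limits(rate):
    return abs(rate) <= 0.1

print(recovery_block(primary_rate, alternate_rate, within_limits, 0.2))  # -> 0.1
```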

  17. Autonomous space target recognition and tracking approach using star sensors based on a Kalman filter.

    PubMed

    Ye, Tao; Zhou, Fuqiang

    2015-04-10

    When imaged by detectors, space targets (including satellites and debris) and background stars have similar point-spread functions, and both objects appear to change as detectors track targets. Therefore, traditional tracking methods cannot separate targets from stars and cannot directly recognize targets in 2D images. Consequently, we propose an autonomous space target recognition and tracking approach using a star sensor technique and a Kalman filter (KF). A two-step method for subpixel-scale detection of star objects (including stars and targets) is developed, and the combination of the star sensor technique and a KF is used to track targets. The experimental results show that the proposed method is adequate for autonomously recognizing and tracking space targets.
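
    A minimal constant-velocity Kalman filter over image coordinates (Python/NumPy) illustrates the predict-update cycle used to keep track of a detected target; the 2-D pixel-space state and the noise covariances are illustrative assumptions, not the filter design of the paper.

```python
# Constant-velocity Kalman filter over pixel coordinates: state is
# [x, y, vx, vy], measurement is the detected centroid [x, y].
# Noise covariances are illustrative assumptions.

import numpy as np

dt = 1.0
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0]], float)
Q = 0.01 * np.eye(4)      # process noise
R = 1.0 * np.eye(2)       # measurement noise (pixels^2)

x = np.zeros(4)           # initial state
P = 10.0 * np.eye(4)      # initial covariance

def kf_step(x, P, z):
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the detected target centroid z
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(4) - K @ H) @ P
    return x, P

for z in [np.array([10.0, 5.0]), np.array([11.1, 5.4]), np.array([12.0, 6.1])]:
    x, P = kf_step(x, P, z)
print(x)   # estimated position and velocity in pixel space
```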

  18. Evolution of the Hubble Space Telescope Safing Systems

    NASA Technical Reports Server (NTRS)

    Pepe, Joyce; Myslinski, Michael

    2006-01-01

    The Hubble Space Telescope (HST) was launched on April 24, 1990, with an expected lifespan of 15 years. Central to the spacecraft design was the concept of a series of on-orbit shuttle servicing missions permitting astronauts to replace failed equipment, update the scientific instruments, and keep the HST at the forefront of astronomical discoveries. One key to the success of the Hubble mission has been the robust Safing systems designed to monitor the performance of the observatory and to react to keep the spacecraft safe in the event of an equipment anomaly. The spacecraft Safing System consists of a range of software tests in the primary flight computer that evaluate the performance of mission critical hardware, safe modes that are activated when the primary control mode is deemed inadequate for protecting the vehicle, and special actions that the computer can take to autonomously reconfigure critical hardware. The HST Safing System was structured to autonomously detect electrical power system, data management system, and pointing control system malfunctions and to configure the vehicle to ensure safe operation without ground intervention for up to 72 hours. There is also a dedicated safe mode computer that constantly monitors a keep-alive signal from the primary computer. If this signal stops, the safe mode computer shuts down the primary computer and takes over control of the vehicle, putting it into a safe, low-power configuration. The HST Safing System has continued to evolve as equipment has aged, as new hardware has been installed on the vehicle, and as the operation modes have matured during the mission. Along with the continual refinement of the limits used in the safing tests, several new tests have been added to the monitoring system, and new safe modes have been added to the flight software. This paper will focus on the evolution of the HST Safing System and Safing tests, and the importance of this evolution to prolonging the science operations of the telescope.
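
    A simplified sketch of the kind of software safing test described above: a telemetry value is checked against limits in the flight computer, and a safe-mode entry is requested only after several consecutive violations. The telemetry name, limits, and persistence count below are assumptions for illustration, not actual HST values.

```python
# Simplified safing test: a telemetry value must stay inside limits; a safe-mode
# entry is requested only after several consecutive out-of-limit samples.
# Names, limits, and persistence counts are illustrative assumptions.

class SafingTest:
    def __init__(self, name, low, high, persistence):
        self.name, self.low, self.high = name, low, high
        self.persistence = persistence   # consecutive violations before tripping
        self.count = 0

    def evaluate(self, value):
        """Return True if the test trips (safe-mode entry should be requested)."""
        if self.low <= value <= self.high:
            self.count = 0
        else:
            self.count += 1
        return self.count >= self.persistence

bus_voltage_test = SafingTest("main_bus_voltage", low=26.0, high=32.0, persistence=3)
for sample in [28.1, 25.5, 25.2, 24.9]:
    if bus_voltage_test.evaluate(sample):
        print("entering safe mode: low-power configuration")
```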

  19. Analysis and design of a capsule landing system and surface vehicle control system for Mars exploration. [performance tests of remote control equipment for roving vehicles

    NASA Technical Reports Server (NTRS)

    Gisser, D. G.; Frederick, D. K.; Sandor, G. N.; Shen, C. N.; Yerazunis, S. W.

    1976-01-01

    Problems related to the design and control of an autonomous rover for the purpose of unmanned exploration of the planets were considered. Building on the basis of prior studies, a four-wheeled rover of unusual mobility and maneuverability was further refined and tested under both laboratory and field conditions. A second major effort was made to develop autonomous guidance. Path selection systems capable of dealing with relatively formidable hazards and terrain, involving short-range (1.0-3.0 meter) hazard detection systems using a triangulation detection concept, were simulated and evaluated. The mechanical/electronic systems required to implement such a scheme were constructed and tested. These systems include: laser transmitter, photodetectors, the necessary data handling/controlling systems, and a scanning mast. In addition, a telemetry system to interface the vehicle, the off-board computer, and a remote control module for operator intervention were developed. Software for the autonomous control concept was written. All of the systems required for complete autonomous control were shown to be satisfactory except for that portion of the software relating to the handling of interrupt commands.
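
    The short-range triangulation hazard detector can be illustrated with its basic geometry: a laser projected at a known angle from one end of a baseline is imaged from the other end, and the measured return angle gives range by the law of sines. The Python sketch below is generic; the baseline and angles are example values, not the vehicle's configuration.

```python
# Planar laser triangulation: a laser at one end of a baseline projects a
# spot, a detector at the other end measures the return angle, and range
# follows from the law of sines. Numbers are illustrative only.

import math

def triangulate(baseline_m, laser_angle_rad, detector_angle_rad):
    """Angles are measured from the baseline at the laser and detector ends."""
    apex = math.pi - laser_angle_rad - detector_angle_rad
    range_from_detector = baseline_m * math.sin(laser_angle_rad) / math.sin(apex)
    height_above_baseline = range_from_detector * math.sin(detector_angle_rad)
    return range_from_detector, height_above_baseline

# Example: 1 m mast baseline, laser set at 60 deg, detector sees the spot at 80 deg.
r, h = triangulate(1.0, math.radians(60), math.radians(80))
print(f"range {r:.2f} m, offset {h:.2f} m from the baseline")
```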

  20. Secure Autonomous Automated Scheduling (SAAS). Rev. 1.1

    NASA Technical Reports Server (NTRS)

    Walke, Jon G.; Dikeman, Larry; Sage, Stephen P.; Miller, Eric M.

    2010-01-01

    This report describes network-centric operations, where a virtual mission operations center autonomously receives sensor triggers, and schedules space and ground assets using Internet-based technologies and service-oriented architectures. For proof-of-concept purposes, sensor triggers are received from the United States Geological Survey (USGS) to determine targets for space-based sensors. The Surrey Satellite Technology Limited (SSTL) Disaster Monitoring Constellation satellite, the UK-DMC, is used as the space-based sensor. The UK-DMC's availability is determined via machine-to-machine communications using SSTL's mission planning system. Access to/from the UK-DMC for tasking and sensor data is via SSTL's and Universal Space Network's (USN) ground assets. The availability and scheduling of USN's assets can also be performed autonomously via machine-to-machine communications. All communication, both on the ground and between ground and space, uses open Internet standards.

  1. Active Control of NITINOL-Reinforced Structural Composites

    DTIC Science & Technology

    1992-10-12

    useful in many critical structures that are intended to operate autonomously for long durations in isolated environments such as defense vehicles, space structures and satellites. ACKNOWLEDGEMENTS: This work is funded by a grant from the US Army.

  2. Demonstration of Self-Training Autonomous Neural Networks in Space Vehicle Docking Simulations

    NASA Technical Reports Server (NTRS)

    Patrick, M. Clinton; Thaler, Stephen L.; Stevenson-Chavis, Katherine

    2006-01-01

    Neural Networks have been under examination for decades in many areas of research, with varying degrees of success and acceptance. Key goals of computer learning, rapid problem solution, and automatic adaptation have been elusive at best. This paper summarizes efforts at NASA's Marshall Space Flight Center harnessing such technology to autonomous space vehicle docking for the purpose of evaluating applicability to future missions.

  3. Utilization of the International Space Station for Crew Autonomous Scheduling Test (CAST)

    NASA Technical Reports Server (NTRS)

    Healy, Matthew; Marquez, Jesica; Hillenius, Steven; Korth, David; Bakalyar, Laure Rush; Woodbury, Neil; Larsen, Crystal M.; Bates, Shelby; Kockler, Mikayla; Rhodes, Brooke

    2017-01-01

    The United States space policy is evolving toward missions beyond low Earth orbit. In an effort to meet that policy, NASA has recognized Autonomous Mission Operations (AMO) as a valuable capability. Identified within AMO capabilities is the potential for autonomous planning and replanning during human spaceflight operations; that is, allowing crew members to collectively or individually participate in the development of their own schedules. Currently, dedicated mission operations planners collaborate with international partners to create daily plans for astronauts aboard the International Space Station (ISS), taking into account mission requirements, ground rules, and various vehicle and payload constraints. In future deep space operations, the crew will require more independence from ground support due to communication transmission delays. Furthermore, crew members who are provided with the capability to schedule their own activities are able to leverage direct experience operating in the space environment, and possibly maximize their efficiency. CAST (Crew Autonomous Scheduling Test) is an ISS investigation designed to analyze three important hypotheses about crew autonomous scheduling. First, given appropriate inputs, the crew is able to create and execute a plan in a reasonable period of time without impacts to mission success. Second, the proximity of the planner, in this case the crew, to the planned operations increases their operational efficiency. Third, crew members are more satisfied when given a role in plan development. This paper presents the results from a single astronaut test subject who participated in five CAST sessions. The details on the operational philosophy of CAST are discussed, including the approach to crew training, selection criteria for test days, and data collection methods. CAST is a technology demonstration payload sponsored by the ISS Research Science and Technology Office, and performed by experts in Mission Operations Planning from the Flight Operations Directorate at NASA Johnson Space Center, and researchers across multiple NASA centers. It is hoped the results of this investigation will guide NASA's implementation of autonomous mission operations for long duration human space missions to Mars and beyond.

  4. Autonomous biological system-an unique method of conducting long duration space flight experiments for pharmaceutical and gravitational biology research

    NASA Astrophysics Data System (ADS)

    Anderson, G. A.; MacCallum, T. K.; Poynter, J. E.; Klaus, D., Dr.

    1998-01-01

    Paragon Space Development Corporation (SDC) has developed an Autonomous Biological System (ABS) that can be flown in space to provide for long term growth and breeding of aquatic plants, animals, microbes and algae. The system functions autonomously and in isolation from the spacecraft life support systems and with no mandatory crew time required for function or observation. The ABS can also be used for long term plant and animal life support and breeding on a free-flyer spacecraft. The ABS units are a research tool for both pharmaceutical and basic space biological sciences. Development flights in May 1996 and from September 1996 through January 1997 were largely successful, showing that both the hardware and life systems performed well, though some surprises were still found. The two space flights, plus the current flight now on Mir, are expected to result in both a scientific and commercially usable system for breeding and propagation of animals and plants in space.

  5. Systems autonomy

    NASA Technical Reports Server (NTRS)

    Lum, Henry, Jr.

    1988-01-01

    Information on systems autonomy is given in viewgraph form. Information is given on space systems integration, intelligent autonomous systems, automated systems for in-flight mission operations, the Systems Autonomy Demonstration Project on the Space Station Thermal Control System, the architecture of an autonomous intelligent system, artificial intelligence research issues, machine learning, and real-time image processing.

  6. Autonomous Mission Manager for Rendezvous, Inspection and Mating

    NASA Technical Reports Server (NTRS)

    Zimpfer, Douglas J.

    2003-01-01

    To meet cost and safety objectives, space missions that involve proximity operations between two vehicles require a high level of autonomy to successfully complete their missions. The need for autonomy is primarily driven by the need to conduct complex operations outside of communication windows, and the communication time delays inherent in space missions. Autonomy also supports the goals of both NASA and the DOD to make space operations more routine, and lower operational costs by reducing the requirement for ground personnel. NASA and the DoD have several programs underway that require a much higher level of autonomy for space vehicles. NASA's Space Launch Initiative (SLI) program has ambitious goals of reducing costs by a factor of 10 and improving safety by a factor of 100. DARPA has recently begun its Orbital Express program to demonstrate key technologies to make satellite servicing routine. The Air Force's XSS-11 program is developing a protoflight demonstration of an autonomous satellite inspector. A common element in space operations for many NASA and DOD missions is the ability to rendezvous with, inspect, and/or dock with another spacecraft. For DARPA, this is required to service or refuel military satellites. For the Air Force, this is required to inspect uncooperative resident space objects. For NASA, this is needed to meet the primary SLI design reference mission of International Space Station re-supply. A common aspect for each of these programs is an Autonomous Mission Manager that provides highly autonomous planning, execution and monitoring of the rendezvous, inspection and docking operations. This paper provides an overview of the Autonomous Mission Manager (AMM) design being incorporated into many of these technology programs. This AMM provides a highly scalable level of autonomous operations, ranging from automatic execution of ground-derived plans to highly autonomous onboard planning to meet ground developed mission goals. The AMM provides the capability to automatically execute the plans and monitor the system performance. In the event of system dispersions or failures the AMM can modify plans or abort to assure overall system safety. This paper describes the design and functionality of Draper's AMM framework, presents concept of operations associated with the use of the AMM, and outlines the relevant features of the flight demonstrations.
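
    A hedged sketch of the execute-and-monitor behavior attributed to the AMM: each planned step is executed, the resulting dispersion from the plan is checked against a bound, and the sequence aborts to a retreat maneuver if the bound is exceeded. The waypoints, burns, dispersion limit, and stand-in dynamics below are illustrative assumptions only.

```python
# Execute a relative-motion plan step by step, monitoring dispersion from
# the planned position and aborting to a retreat if a bound is exceeded.
# Waypoints, burns, and the stand-in dynamics are illustrative assumptions.

def execute_plan(plan, get_relative_position, command_burn, abort_retreat,
                 dispersion_limit_m=5.0):
    for planned_pos_m, burn_mps in plan:
        command_burn(burn_mps)
        dispersion = abs(get_relative_position() - planned_pos_m)
        if dispersion > dispersion_limit_m:
            abort_retreat()                    # modify the plan: back away to safety
            return "ABORTED"
    return "COMPLETE"

plan = [(-100.0, 0.5), (-50.0, 0.3), (-10.0, 0.1)]   # approach waypoints and burns
state = {"pos_m": -150.0}

def command_burn(dv):
    state["pos_m"] += 50.0                     # stand-in for propagated dynamics

def get_relative_position():
    return state["pos_m"]

def abort_retreat():
    print("abort: retreating to a safe hold point")

print(execute_plan(plan, get_relative_position, command_burn, abort_retreat))
```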

  7. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs for evolving global space systems.

  8. Electronic and software subsystems for an autonomous roving vehicle. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Doig, G. A.

    1980-01-01

    The complete electronics packaging which controls the Mars roving vehicle is described in order to provide a broad overview of the systems that are part of that package. Some software debugging tools are also discussed. Particular emphasis is given to those systems that are controlled by the microprocessor. These include the laser mast, the telemetry system, the command link prime interface board, and the prime software.

  9. Advanced Automation for Ion Trap Mass Spectrometry-New Opportunities for Real-Time Autonomous Analysis

    NASA Technical Reports Server (NTRS)

    Palmer, Peter T.; Wong, C. M.; Salmonson, J. D.; Yost, R. A.; Griffin, T. P.; Yates, N. A.; Lawless, James G. (Technical Monitor)

    1994-01-01

    The utility of MS/MS for both target compound analysis and the structure elucidation of unknowns has been described in a number of references. A broader acceptance of this technique has not yet been realized as it requires large, complex, and costly instrumentation which has not been competitive with more conventional techniques. Recent advancements in ion trap mass spectrometry promise to change this situation. Although the ion trap's small size, sensitivity, and ability to perform multiple stages of mass spectrometry have made it eminently suitable for on-line, real-time monitoring applications, advanced automation techniques are required to make these capabilities more accessible to non-experts. Towards this end, we have developed custom software for the design and implementation of MS/MS experiments. This software allows the user to take full advantage of the ion trap's versatility with respect to ionization techniques, scan proxies, and ion accumulation/ejection methods. Additionally, expert system software has been developed for autonomous target compound analysis. This software has been linked to ion trap control software and a commercial data system to bring all of the steps in the analysis cycle under control of the expert system. These software development efforts and their utilization for a number of trace analysis applications will be described.

  10. Application of "FLUOR-P" device for analysis of the space flight effects on the intracellular level.

    NASA Astrophysics Data System (ADS)

    Grigorieva, Olga; Rudimov, Evgeny; Buravkova, Ludmila; Galchuk, Sergey

    The mechanisms of cellular gravisensitivity still remain unclear despite intensive research into the effects of hypogravity on cellular function. Most cell culture experiments on the unmanned vehicles "Bion" and "Photon", as well as on the ISS, allow only post-flight analysis of biological material, including fixed cells. Dynamic evaluation of cellular parameters over a prolonged period of time is not possible. Thus, a promising direction is the development of equipment for onboard autonomous experiments. For this purpose, the SSC RF IBMP RAS has developed the "FLUOR-P" device for measurement and recording of the dynamic differential fluorescent signal from nano- and microsized objects of organic and inorganic nature (human and animal cells, unicellular algae, bacteria, suspensions of cellular organelles) in hermetically sealed cuvettes. In addition, the device can record the main physical factors affecting the analyzed object (temperature and gravity loads: position in space, any vector acceleration, shock) in sync with the main measurements. The device is designed to perform long-term programmable autonomous experiments in space flight on biological satellites. The device software allows complex experiments using cells to be carried out. Permanent registration of data on built-in flash memory will give the opportunity to analyze the dynamics of the estimated parameters. FLUOR-P is designed as a monobloc (5.5 kg weight); 8 functional blocks are located in the inner space of the device. Each registration unit of the FLUOR-P has two channels of fluorescence intensity and an excitation light source with a wavelength range from 300 nm to 700 nm. During the biosatellite "Photon" flight, it is planned to conduct a full analysis of the dynamics of the most important intracellular parameters (mitochondrial activity and intracellular pH) under space flight factors and to assess the possible contribution of temperature to the effects of microgravity. Work is supported by Roskosmos and the Russian Academy of Sciences.

  11. Issues in the design of an executive controller shell for Space Station automation

    NASA Technical Reports Server (NTRS)

    Erickson, William K.; Cheeseman, Peter C.

    1986-01-01

    A major goal of NASA's Systems Autonomy Demonstration Project is to focus research in artificial intelligence, human factors, and dynamic control systems in support of Space Station automation. Another goal is to demonstrate the use of these technologies in real space systems, for both ground-based mission support and on-board operations. The design, construction, and evaluation of an intelligent autonomous system shell is recognized as an important part of the Systems Autonomy research program. This paper describes autonomous systems and executive controllers, outlines how these intelligent systems can be utilized within the Space Station, and discusses a number of key design issues that have been raised during some preliminary work to develop an autonomous executive controller shell at NASA Ames Research Center.

  12. Overview of Intelligent Power Controller Development for the Deep Space Gateway

    NASA Technical Reports Server (NTRS)

    Csank, Jeffrey

    2017-01-01

    Intelligent, or autonomous, control of a spacecraft is an enabling technology that must be developed for deep space human exploration. NASA's current long-term human space platform, the International Space Station, which is in Low Earth Orbit, is in almost continuous communication with ground based mission control. This allows near real-time control of all the vehicle core systems, including power, to be performed by the ground. As focus shifts away from Low Earth Orbit, communication time-lag and communication bandwidth limitations beyond geosynchronous orbit do not permit this type of operation. This presentation describes ongoing work at NASA to develop an architecture for autonomous power control and the vehicle manager, which monitors, coordinates, and delegates to all the on-board subsystems to enable autonomous control of the complete spacecraft.

  13. Autonomous interplanetary constellation design

    NASA Astrophysics Data System (ADS)

    Chow, Cornelius Channing, II

    According to NASA's integrated space technology roadmaps, space-based infrastructures are envisioned as necessary ingredients to a sustained effort in continuing space exploration. Whether it be for extra-terrestrial habitats, roving/cargo vehicles, or space tourism, autonomous space networks will provide a vital communications lifeline for both future robotic and human missions alike. Projecting that the Moon will be a bustling hub of activity within a few decades, a near-term opportunity for in-situ infrastructure development is within reach. This dissertation addresses the anticipated need for in-space infrastructure by investigating a general design methodology for autonomous interplanetary constellations; to illustrate the theory, this manuscript presents results from an application to the Earth-Moon neighborhood. The constellation design methodology is formulated as an optimization problem, involving a trajectory design step followed by a spacecraft placement sequence. Modeling the dynamics as a restricted 3-body problem, the investigated design space consists of families of periodic orbits which play host to the constellations, punctuated by arrangements of spacecraft autonomously guided by a navigation strategy called LiAISON (Linked Autonomous Interplanetary Satellite Orbit Navigation). Instead of more traditional exhaustive search methods, a numerical continuation approach is implemented to map the admissible configuration space. In particular, Keller's pseudo-arclength technique is used to follow folding/bifurcating solution manifolds, which are otherwise inaccessible with other parameter continuation schemes. A succinct characterization of the underlying structure of the local, as well as global, extrema is thus achievable with little a priori intuition of the solution space. Furthermore, the proposed design methodology offers benefits in computation speed plus the ability to handle mildly stochastic systems. An application of the constellation design methodology to the restricted Earth-Moon system reveals optimal pairwise configurations for various L1, L2, and L5 (halo, axial, and vertical) periodic orbit families. Navigation accuracies on the order of O(10^±1) meters in position space are obtained for the optimal Earth-Moon constellations, given measurement noise on the order of 1 meter.
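
    Keller's pseudo-arclength technique, named above, can be sketched on a toy fold problem: the solution branch of F(x, lambda) = x^2 + lambda - 1 = 0 turns back at lambda = 1, where naive continuation in lambda alone fails but the arclength-augmented system steps smoothly through the fold. The Python/SciPy sketch below is a generic illustration of the method, not the dissertation's implementation.

```python
# Pseudo-arclength continuation on F(x, lam) = x^2 + lam - 1 = 0, whose
# solution branch folds at lam = 1. The arclength constraint lets the
# predictor-corrector step continue around the fold.

import numpy as np
from scipy.optimize import fsolve

def F(x, lam):
    return x**2 + lam - 1.0

def continue_branch(x0, lam0, ds=0.1, steps=30):
    branch = [(x0, lam0)]
    tangent = np.array([0.0, 1.0])        # initial guess: move toward larger lam
    for _ in range(steps):
        x_prev, lam_prev = branch[-1]

        def augmented(u):
            x, lam = u
            arc = (x - x_prev) * tangent[0] + (lam - lam_prev) * tangent[1] - ds
            return [F(x, lam), arc]

        guess = np.array([x_prev, lam_prev]) + ds * tangent
        x_new, lam_new = fsolve(augmented, guess)
        # Update the tangent as the normalized secant direction.
        tangent = np.array([x_new - x_prev, lam_new - lam_prev])
        tangent /= np.linalg.norm(tangent)
        branch.append((x_new, lam_new))
    return branch

branch = continue_branch(x0=1.0, lam0=0.0)
print(branch[0], branch[len(branch) // 2], branch[-1])   # passes around the fold
```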

  14. Application of parallelized software architecture to an autonomous ground vehicle

    NASA Astrophysics Data System (ADS)

    Shakya, Rahul; Wright, Adam; Shin, Young Ho; Momin, Orko; Petkovsek, Steven; Wortman, Paul; Gautam, Prasanna; Norton, Adam

    2011-01-01

    This paper presents improvements made to Q, an autonomous ground vehicle designed to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2010 IGVC, Q was upgraded with a new parallelized software architecture and a new vision processor. Improvements were made to the power system reducing the number of batteries required for operation from six to one. In previous years, a single state machine was used to execute the bulk of processing activities including sensor interfacing, data processing, path planning, navigation algorithms and motor control. This inefficient approach led to poor software performance and made it difficult to maintain or modify. For IGVC 2010, the team implemented a modular parallel architecture using the National Instruments (NI) LabVIEW programming language. The new architecture divides all the necessary tasks - motor control, navigation, sensor data collection, etc. into well-organized components that execute in parallel, providing considerable flexibility and facilitating efficient use of processing power. Computer vision is used to detect white lines on the ground and determine their location relative to the robot. With the new vision processor and some optimization of the image processing algorithm used last year, two frames can be acquired and processed in 70ms. With all these improvements, Q placed 2nd in the autonomous challenge.
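
    The modular parallel structure described above (sensor interfacing, planning, and motor control running concurrently rather than in one state machine) is shown below as a structural analogy using Python threads and queues; the original system was built in NI LabVIEW, and the tasks and numbers here are illustrative only.

```python
# Structural analogy (Python threads + queues) to a modular parallel robot
# architecture: sensing, planning, and motor control run concurrently and
# exchange data through queues instead of one monolithic state machine.

import queue, threading, time

sensor_q, command_q = queue.Queue(), queue.Queue()
stop = threading.Event()

def sensor_task():
    heading = 0.0
    while not stop.is_set():
        sensor_q.put({"heading_deg": heading})   # stand-in for real sensor reads
        heading += 1.0
        time.sleep(0.05)

def planner_task():
    while not stop.is_set():
        reading = sensor_q.get()
        error = 90.0 - reading["heading_deg"]    # steer toward a 90 deg target
        command_q.put({"steer": 0.02 * error})

def motor_task():
    while not stop.is_set():
        cmd = command_q.get()
        # A real implementation would write to the motor controllers here.
        print(f"steer command {cmd['steer']:+.2f}")

threads = [threading.Thread(target=t, daemon=True)
           for t in (sensor_task, planner_task, motor_task)]
for t in threads:
    t.start()
time.sleep(0.5)
stop.set()
```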

  15. Reconfigurable Autonomy for Future Planetary Rovers

    NASA Astrophysics Data System (ADS)

    Burroughes, Guy

    Extra-terrestrial Planetary rover systems are uniquely remote, placing constraints in regard to communication, environmental uncertainty, and limited physical resources, and requiring a high level of fault tolerance and resistance to hardware degradation. This thesis presents a novel self-reconfiguring autonomous software architecture designed to meet the needs of extraterrestrial planetary environments. At runtime it can safely reconfigure low-level control systems, high-level decisional autonomy systems, and managed software architecture. The architecture can perform automatic Verification and Validation of self-reconfiguration at run-time, and enables a system to be self-optimising, self-protecting, and self-healing. A novel self-monitoring system, which is non-invasive, efficient, tunable, and autonomously deploying, is also presented. The architecture was validated through the use-case of a highly autonomous extra-terrestrial planetary exploration rover. Three major forms of reconfiguration were demonstrated and tested: first, high level adjustment of system internal architecture and goal; second, software module modification; and third, low level alteration of hardware control in response to degradation of hardware and environmental change. The architecture was demonstrated to be robust and effective in a Mars sample return mission use-case testing the operational aspects of a novel, reconfigurable guidance, navigation, and control system for a planetary rover, all operating in concert through a scenario that required reconfiguration of all elements of the system.

  16. Autonomous Navigation Using Celestial Objects

    NASA Technical Reports Server (NTRS)

    Folta, David; Gramling, Cheryl; Leung, Dominic; Belur, Sheela; Long, Anne

    1999-01-01

    In the twenty-first century, National Aeronautics and Space Administration (NASA) Enterprises envision frequent low-cost missions to explore the solar system, observe the universe, and study our planet. Satellite autonomy is a key technology required to reduce satellite operating costs. The Guidance, Navigation, and Control Center (GNCC) at the Goddard Space Flight Center (GSFC) currently sponsors several initiatives associated with the development of advanced spacecraft systems to provide autonomous navigation and control. Autonomous navigation has the potential both to increase spacecraft navigation system performance and to reduce total mission cost. By eliminating the need for routine ground-based orbit determination and special tracking services, autonomous navigation can streamline spacecraft ground systems. Autonomous navigation products can be included in the science telemetry and forwarded directly to the scientific investigators. In addition, autonomous navigation products are available onboard to enable other autonomous capabilities, such as attitude control, maneuver planning and orbit control, and communications signal acquisition. Autonomous navigation is required to support advanced mission concepts such as satellite formation flying. GNCC has successfully developed high-accuracy autonomous navigation systems for near-Earth spacecraft using NASA's space and ground communications systems and the Global Positioning System (GPS). Recently, GNCC has expanded its autonomous navigation initiative to include satellite orbits that are beyond the regime in which use of GPS is possible. Currently, GNCC is assessing the feasibility of using standard spacecraft attitude sensors and communication components to provide autonomous navigation for missions including: libration point, gravity assist, high-Earth, and interplanetary orbits. The concept being evaluated uses a combination of star, Sun, and Earth sensor measurements along with forward-link Doppler measurements from the command link carrier to autonomously estimate the spacecraft's orbit and reference oscillator's frequency. To support autonomous attitude determination and control and maneuver planning and control, the orbit determination accuracy should be on the order of kilometers in position and centimeters per second in velocity. A less accurate solution (one hundred kilometers in position) could be used for acquisition purposes for command and science downloads. This paper provides performance results for both libration point orbiting and high Earth orbiting satellites as a function of sensor measurement accuracy, measurement types, measurement frequency, initial state errors, and dynamic modeling errors.

  17. ANTS: A New Concept for Very Remote Exploration with Intelligent Software Agents

    NASA Astrophysics Data System (ADS)

    Clark, P. E.; Curtis, S.; Rilee, M.; Truszkowski, W.; Iyengar, J.; Crawford, H.

    2001-12-01

    ANTS (Autonomous Nano-Technology Swarm), a NASA advanced mission concept, is a large (100 to 1000 member) swarm of pico-class (1 kg) totally autonomous spacecraft that prospect the asteroid belt. As the capacity and complexity of hardware and software, and the sophistication of technical and scientific goals have increased, greater cost constraints have led to fewer resources and thus, the need to operate spacecraft with less frequent contact. At present, autonomous operation of spacecraft systems allows great capability of spacecraft to 'safe' themselves when conditions threaten spacecraft safety. To further develop spacecraft capability, NASA is at the forefront of Intelligent Software Agent (ISA) research, performing experiments in space and on the ground to advance deliberative and collaborative autonomous control techniques. Selected missions in current planning stages require small groups of spacecraft to cooperate at a tactical level to select and schedule measurements to be made by appropriate instruments to characterize rapidly unfolding real-time events on a routine basis. The next level of development, which we are considering here, is in the use of ISAs at a strategic level, to explore the final, remote frontiers of the solar system, potentially involving a large class of objects with only infrequent contact possible. Obvious mission candidates are mainbelt asteroids, a population consisting of more than a million small bodies. Although a large fraction of solar system objects are asteroids, little data is available for them because the vast majority of them are too small to be observed except in close proximity. Asteroids originated in the transitional region between the inner (rocky) and outer (solidified gases) solar system, have remained largely unmodified since formation, and thus have a more primitive composition which includes higher abundances of siderophile (metallic iron-associated) elements and volatiles than other planetary surfaces. As a result, there has been interest in asteroids as sources of exploitable resources. Far more reconnaissance is required before such a program is undertaken. A traditional mission approach (to explore larger asteroids sequentially) is not adequate for determining the systematic distribution of exploitable material in the asteroid population. Our approach involves the use of distributed intelligence in a swarm of tiny spacecraft, each with specialized instrument capability (e.g., advanced computing, imaging, spectrometry, etc.) to evaluate the resource potential of the entire population. Supervised clusters of spacecraft will operate simultaneously within a broadly defined framework of goals to select targets (>1000) from among available candidates and to develop scenarios for studying targets simultaneously. Spacecraft use solar sails to fly directly to asteroids 1 kilometer or greater in diameter. Selected swarm members return to Earth with data, replacements join the swarm as needed. We would like to acknowledge our students R. Watson, V. Cox, and F. Olukomo for their support of this work.

  18. Optimized Autonomous Space - In-situ Sensorweb: A new Tool for Monitoring Restless Volcanoes

    NASA Astrophysics Data System (ADS)

    Lahusen, R. G.; Kedar, S.; Song, W.; Chien, S.; Shirazi, B.; Davies, A.; Tran, D.; Pieri, D.

    2007-12-01

    An interagency team of earth scientists, space scientists and computer scientists are collaborating to develop a real-time monitoring system optimized for rapid deployment at restless volcanoes. The primary goals of this Optimized Autonomous Space In-situ Sensorweb (OASIS) are: 1) integrate complementary space and in-situ (ground-based) elements into an interactive, autonomous sensorweb; 2) advance sensorweb power and communication resource management technology; and 3) enable scalability for seamless infusion of future space and in-situ assets into the sensorweb. A prototype system will be deployed on Mount St. Helens by December 2009. Each node will include GPS, seismic, infrasonic and lightning (for ash plume detection) sensors plus autonomous decision-making capabilities and interaction with the EO-1 multi-spectral satellite. This three-year project is jointly funded by the NASA AIST program and the USGS Volcano Hazards Program. Work has begun with a rigorous multi-disciplinary discussion and has resulted in a system requirements document intended to guide the design of OASIS and future networks and to achieve the project's stated goals. In this presentation we will highlight the key OASIS system requirements, their rationale and the physical and technical challenges they pose. Preliminary design decisions will be presented.

  19. NASA Ames Sustainability Initiatives: Aeronautics, Space Exploration, and Sustainable Futures

    NASA Technical Reports Server (NTRS)

    Grymes, Rosalind A.

    2015-01-01

    In support of the mission-specific challenges of aeronautics and space exploration, NASA Ames produces a wealth of research and technology advancements with significant relevance to larger issues of planetary sustainability. NASA research on NextGen airspace solutions and its development of autonomous and intelligent technologies will both revolutionize the nation's air transportation systems and have applicability to the low altitude flight economy and to air and ground transportation more generally. NASA's understanding of the Earth as a complex of integrated systems contributes to humanity's perception of the sustainability of our home planet. Research at NASA Ames on closed environment life support systems produces directly applicable lessons on energy, water, and resource management in ground-based infrastructure. Moreover, every NASA campus is a 'city', including an urbanscape and a workplace of scientists, human relations specialists, plumbers, engineers, facility managers, construction trades, transportation managers, software developers, leaders, financial planners, technologists, electricians, students, accountants, and even lawyers. NASA is applying the lessons of our mission-related activities to our urbanscapes and infrastructure, and also anticipates a leadership role in developing future environments for living and working in space.

  20. In-Space Networking On NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David; Eddy, Wesley M.; Clark, Gilbert J., III; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios (SDRs) and a programmable flight computer. The purpose of the Testbed is to conduct in-space research in the areas of communication, navigation, and networking in support of NASA missions and communication infrastructure. Multiple reprogrammable elements in the end-to-end system, along with several communication paths and a semi-operational environment, provide a unique opportunity to explore networking concepts and protocols envisioned for the future Solar System Internet (SSI). This paper will provide a general description of the system's design and the networking protocols implemented and characterized on the testbed, including Encapsulation, IP over CCSDS, and Delay-Tolerant Networking (DTN). Due to the research nature of the implementation, flexibility and robustness are considered in the design to enable expansion for future adaptive and cognitive techniques. Following a detailed design discussion, lessons learned and suggestions for future missions and communication infrastructure elements will be provided. Plans for the evolving research on SCaN Testbed as it moves towards a more adaptive, autonomous system will be discussed.

  1. G-Guidance Interface Design for Small Body Mission Simulation

    NASA Technical Reports Server (NTRS)

    Acikmese, Behcet; Carson, John; Phan, Linh

    2008-01-01

    The G-Guidance software implements a guidance and control (G and C) algorithm for small-body, autonomous proximity operations, developed under the Small Body GN and C task at JPL. The software is written in Matlab and interfaces with G-OPT, a JPL-developed optimization package written in C that provides G-Guidance with guaranteed convergence to a solution in a finite computation time with a prescribed accuracy. The resulting program is computationally efficient and is a prototype of an onboard, real-time algorithm for autonomous guidance and control. Two thruster firing schemes are available in G-Guidance, allowing tailoring of the software for specific mission maneuvers. For example, descent, landing, or rendezvous benefit from a thruster firing at the maneuver termination to mitigate velocity errors. Conversely, ascent or separation maneuvers benefit from an immediate firing to avoid potential drift toward a second body. The guidance portion of this software explicitly enforces user-defined control constraints and thruster silence times while minimizing total fuel usage. This program is currently specialized to small-body proximity operations, but the underlying method can be generalized to other applications.
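
    The class of problem G-Guidance solves (minimize total fuel subject to dynamics, terminal, and control constraints) can be illustrated with a small convex program: a 1-D, discrete-time, double-integrator transfer posed as an L1-minimum-fuel linear program in SciPy. The dynamics, bounds, and numbers below are assumptions for illustration and are not the G-Guidance/G-OPT formulation.

```python
# Minimum-fuel (L1) impulsive maneuver plan for a 1-D double integrator,
# posed as a linear program: u_k = up_k - um_k with up, um >= 0 and the
# objective sum(up + um). Illustrative of the problem class only.

import numpy as np
from scipy.optimize import linprog

N, dt = 10, 10.0            # number of impulses, coast time between them [s]
p0, v0 = -100.0, 0.5        # initial relative position [m] and velocity [m/s]
u_max = 0.5                 # per-impulse delta-v limit [m/s]

# Terminal constraints (reach p_N = 0 and v_N = 0):
#   sum_k u_k                 = -v0
#   dt * sum_k (N - k) * u_k  = -(p0 + N*dt*v0)
coeff_v = np.ones(N)
coeff_p = dt * (N - np.arange(N))
A_eq = np.vstack([np.concatenate([coeff_v, -coeff_v]),
                  np.concatenate([coeff_p, -coeff_p])])
b_eq = np.array([-v0, -(p0 + N * dt * v0)])

c = np.ones(2 * N)                          # total |delta-v|
bounds = [(0.0, u_max)] * (2 * N)
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
if res.success:
    u = res.x[:N] - res.x[N:]
    print("total delta-v [m/s]:", np.abs(u).sum())
    print("impulse plan:", np.round(u, 3))
```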

  2. Optimized autonomous space in-situ sensor web for volcano monitoring

    USGS Publications Warehouse

    Song, W.-Z.; Shirazi, B.; Huang, R.; Xu, M.; Peterson, N.; LaHusen, R.; Pallister, J.; Dzurisin, D.; Moran, S.; Lisowski, M.; Kedar, S.; Chien, S.; Webb, F.; Kiely, A.; Doubleday, J.; Davies, A.; Pieri, D.

    2010-01-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, a multidisciplinary team involving sensor-network experts (Washington State University), space scientists (JPL), and Earth scientists (USGS Cascade Volcano Observatory (CVO)) has developed a prototype of a dynamic and scalable hazard monitoring sensor-web and applied it to volcano monitoring. The combined Optimized Autonomous Space In-situ Sensor-web (OASIS) has two-way communication capability between ground and space assets, uses both space and ground data for optimal allocation of limited bandwidth resources on the ground, and uses smart management of competing demands for limited space assets. It also enables scalability and seamless infusion of future space and in-situ assets into the sensor-web. The space and in-situ control components of the system are integrated such that each element is capable of autonomously tasking the other. The ground in-situ network was deployed into the craters and around the flanks of Mount St. Helens in July 2009, and linked to the command and control of the Earth Observing One (EO-1) satellite. © 2010 IEEE.

  3. ISS Mini AERCam Radio Frequency (RF) Coverage Analysis Using iCAT Development Tool

    NASA Technical Reports Server (NTRS)

    Bolen, Steve; Vazquez, Luis; Sham, Catherine; Fredrickson, Steven; Fink, Patrick; Cox, Jan; Phan, Chau; Panneton, Robert

    2003-01-01

    The long-term goals of the National Aeronautics and Space Administration's (NASA's) Human Exploration and Development of Space (HEDS) enterprise may require the development of autonomous free-flier (FF) robotic devices to operate within the vicinity of low-Earth orbiting spacecraft to supplement human extravehicular activities (EVAs) in space. Future missions could require external visual inspection of the spacecraft that would be difficult, or dangerous, for humans to perform. Under some circumstances, it may be necessary to employ an un-tethered communications link between the FF and the users. The interactive coverage analysis tool (iCAT) is a software tool that has been developed to perform critical analysis of the communications link performance for a FF operating in the vicinity of the International Space Station (ISS) external environment. The tool allows users to interactively change multiple communications link parameters to efficiently perform systems engineering trades on network performance. These trades can be directly translated into design and requirements specifications. This tool significantly reduces the development time in determining a communications network topology by allowing multiple parameters to be changed, and the results of link coverage to be statistically characterized and plotted interactively.
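
    The core link-budget arithmetic that such a coverage tool sweeps can be shown in a few lines: free-space path loss plus antenna gains and losses, compared against a required receive level. The frequencies, powers, and sensitivity below are placeholder assumptions, not Mini AERCam or ISS values.

```python
# Basic RF link-margin calculation: free-space path loss plus antenna gains
# and losses compared against the required receive level. All numbers are
# placeholder assumptions for illustration.

import math

def free_space_path_loss_db(distance_m, freq_hz):
    c = 299_792_458.0
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

def link_margin_db(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                   distance_m, freq_hz, misc_loss_db, rx_sensitivity_dbm):
    received = (tx_power_dbm + tx_gain_dbi + rx_gain_dbi
                - free_space_path_loss_db(distance_m, freq_hz) - misc_loss_db)
    return received - rx_sensitivity_dbm

# Example sweep over separation distances around a large structure.
for d in (10, 50, 100, 200):
    m = link_margin_db(tx_power_dbm=20, tx_gain_dbi=2, rx_gain_dbi=6,
                       distance_m=d, freq_hz=2.4e9, misc_loss_db=6,
                       rx_sensitivity_dbm=-90)
    print(f"{d:4d} m separation: margin {m:6.1f} dB")
```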

  4. Key Issues for Navigation and Time Dissemination in NASA's Space Exploration Program

    NASA Technical Reports Server (NTRS)

    Nelson, R. A.; Brodsky, B.; Oria, A. J.; Connolly, J. W.; Sands, O. S.; Welch, B. W.; Ely T.; Orr, R.; Schuchman, L.

    2006-01-01

    The renewed emphasis on robotic and human missions within NASA's space exploration program warrants a detailed consideration of how the positions of objects in space will be determined and tracked, whether they be spacecraft, human explorers, robots, surface vehicles, or science instrumentation. The Navigation Team within the NASA Space Communications Architecture Working Group (SCAWG) has addressed several key technical issues in this area and the principal findings are reported here. For navigation in the vicinity of the Moon, a variety of satellite constellations have been investigated that provide global or regional surface position determination and timing services analogous to those offered by GPS at Earth. In the vicinity of Mars, there are options for satellite constellations not available at the Moon due to the gravitational perturbations from Earth, such as two satellites in an aerostationary orbit. Alternate methods of radiometric navigation are considered, including one- and two-way signals, as well as autonomous navigation. The use of a software radio capable of receiving all available signal sources, such as GPS, pseudolites, and communication channels, is discussed. Methods of time transfer and dissemination are also considered in this paper.

  5. 2006 NASA Range Safety Annual Report

    NASA Technical Reports Server (NTRS)

    TenHaken, Ron; Daniels, B.; Becker, M.; Barnes, Zack; Donovan, Shawn; Manley, Brenda

    2007-01-01

    Throughout 2006, Range Safety was involved in a number of exciting and challenging activities and events, from developing, implementing, and supporting Range Safety policies and procedures-such as the Space Shuttle Launch and Landing Plans, the Range Safety Variance Process, and the Expendable Launch Vehicle Safety Program procedures-to evaluating new technologies. Range Safety training development is almost complete with the last course scheduled to go on line in mid-2007. Range Safety representatives took part in a number of panels and councils, including the newly formed Launch Constellation Range Safety Panel, the Range Commanders Council and its subgroups, the Space Shuttle Range Safety Panel, and the unmanned aircraft systems working group. Space based range safety demonstration and certification (formerly STARS) and the autonomous flight safety system were successfully tested. The enhanced flight termination system will be tested in early 2007 and the joint advanced range safety system mission analysis software tool is nearing operational status. New technologies being evaluated included a processor for real-time compensation in long range imaging, automated range surveillance using radio interferometry, and a space based range command and telemetry processor. Next year holds great promise as we continue ensuring safety while pursuing our quest beyond the Moon to Mars.

  6. Automation and robotics

    NASA Technical Reports Server (NTRS)

    Montemerlo, Melvin

    1988-01-01

    The Autonomous Systems focus on the automation of control systems for the Space Station and mission operations. Telerobotics focuses on automation for in-space servicing, assembly, and repair. The Autonomous Systems and Telerobotics each have a planned sequence of integrated demonstrations showing the evolutionary advance of the state-of-the-art. Progress is briefly described for each area of concern.

  7. Life Science Research in Outer Space: New Platform Technologies for Low-Cost, Autonomous Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Ricco, Antonio J.; Parra, Macarena P.; Niesel, David; McGinnis, Michael; Ehrenfreund, Pascale; Nicholson, Wayne; Mancinelli, Rocco; Piccini, Matthew E.; Beasley, Christopher C.; Timucin, Linda R.

    2009-01-01

    We develop integrated instruments and platforms suitable for economical, frequent space access for autonomous life science experiments and processes in outer space. The technologies represented by three of our recent free-flyer small-satellite missions are the basis of a rapidly growing toolbox of miniaturized biologically/biochemically-oriented instrumentation now enabling a new generation of in-situ space experiments. Autonomous small satellites (1-50 kg) are less expensive to develop and build than full-size spacecraft and not subject to the comparatively high costs and scheduling challenges of human-tended experimentation on the International Space Station, Space Shuttle, and comparable platforms. A growing number of commercial, government, military, and civilian space launches now carry small secondary science payloads at far lower cost than dedicated missions; the number of opportunities is particularly large for so-called cube-sat and multicube satellites in the 1-10 kg range. The recent explosion in nano-, micro-, and miniature technologies, spanning fields from telecommunications to materials to bio/chemical analysis, enables development of remarkably capable autonomous miniaturized instruments to accomplish remote biological experimentation. High-throughput drug discovery, point-of-care medical diagnostics, and genetic analysis are applications driving rapid progress in autonomous bioanalytical technology. Three of our recent missions exemplify the development of miniaturized analytical payload instrumentation: GeneSat-1 (launched: December 2006), PharmaSat (launched: May 2009), and O/OREOS (organism/organics exposure to orbital stresses; scheduled launch: May 2010). We will highlight the overall architecture and integration of fluidic, optical, sensor, thermal, and electronic technologies and subsystems to support and monitor the growth of microorganisms in culture in these small autonomous space satellites, including real-time tracking of their culture density, gene expression, and metabolic activity while in the space environment. Flight data and results will be presented from GeneSat-1, which tracked gene expression levels of GFP-labeled E. coli, and from PharmaSat, which monitored the dose dependency of an antifungal agent against S. cerevisiae. The O/OREOS SESLO instrument, which will study the effects of radiation and microgravity upon the viability and growth characteristics of B. subtilis and the halophile Halorubrum chaoviatoris for periods of 0-6 months in space, will be described as well. The ongoing expansion of the small satellite toolbox of biological technologies will be summarized.

  8. Intelligence algorithms for autonomous navigation in a ground vehicle

    NASA Astrophysics Data System (ADS)

    Petkovsek, Steve; Shakya, Rahul; Shin, Young Ho; Gautam, Prasanna; Norton, Adam; Ahlgren, David J.

    2012-01-01

    This paper will discuss the approach to autonomous navigation used by "Q," an unmanned ground vehicle designed by the Trinity College Robot Study Team to participate in the Intelligent Ground Vehicle Competition (IGVC). For the 2011 competition, Q's intelligence was upgraded in several different areas, resulting in a more robust decision-making process and a more reliable system. In 2010-2011, the software of Q was modified to operate in a modular parallel manner, with all subtasks (including motor control, data acquisition from sensors, image processing, and intelligence) running simultaneously in separate software processes using the National Instruments (NI) LabVIEW programming language. This eliminated processor bottlenecks and increased flexibility in the software architecture. Though overall throughput was increased, the long runtime of the image processing process (150 ms) reduced the precision of Q's realtime decisions. Q had slow reaction times to obstacles detected only by its cameras, such as white lines, and was limited to slow speeds on the course. To address this issue, the image processing software was simplified and also pipelined to increase the image processing throughput and minimize the robot's reaction times. The vision software was also modified to detect differences in the texture of the ground, so that specific surfaces (such as ramps and sand pits) could be identified. While previous iterations of Q failed to detect white lines that were not on a grassy surface, this new software allowed Q to dynamically alter its image processing state so that appropriate thresholds could be applied to detect white lines in changing conditions. In order to maintain an acceptable target heading, a path history algorithm was used to deal with local obstacle fields and GPS waypoints were added to provide a global target heading. These modifications resulted in Q placing 5th in the autonomous challenge and 4th in the navigation challenge at IGVC.
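
    The pipelining idea described above, decoupling frame capture from the slower vision step so that a 150 ms processing stage does not stall acquisition, can be sketched in a few lines. Q itself was implemented in LabVIEW; the Python below is only an illustration, and the frame rate, processing delay, and function names are invented.

        import queue, threading, time

        frames = queue.Queue(maxsize=2)       # tiny buffer: prefer fresh frames

        def capture_loop(n_frames=20):
            """Stage 1: acquire frames at camera rate, never waiting on the vision stage."""
            for frame_id in range(n_frames):
                try:
                    frames.put_nowait(("frame", frame_id))
                except queue.Full:
                    pass                       # drop stale work instead of stalling
                time.sleep(0.03)               # ~33 fps capture (illustrative)
            frames.put(None)                   # sentinel: shut the pipeline down

        def process_loop():
            """Stage 2: slower line/obstacle detection runs concurrently."""
            while (frame := frames.get()) is not None:
                time.sleep(0.15)               # simulated 150 ms vision step
                print("processed", frame[1])

        worker = threading.Thread(target=process_loop)
        worker.start()
        capture_loop()
        worker.join()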

  9. [Characteristics of communication systems of suspected occupational disease in the Autonomous Communities, Spain].

    PubMed

    García Gómez, Montserrat; Urbaneja Arrúe, Félix; García López, Vega; Estaban Buedo, Valentín; Rodríguez Suárez, Valentín; Miralles Martínez-Portillo, Lourdes; González García, Isabel; Egea Garcia, Josefa; Corraliza Infanzon, Emma; Ramírez Salvador, Laura; Briz Blázquez, Santiago; Armengol Rosell, Ricard; Cisnal Gredilla, José María; Correa Rodríguez, Juan Francisco; Coto Fernández, Juan Carlos; Díaz Peral, Mª Rosario; Elvira Espinosa, Mercedes; Fernández Fernández, Iñigo; García-Ramos Alonso, Eduardo; Martínez Arguisuelas, Nieves; Rivas Pérez, Ana Isabel

    2017-03-17

    There are several initiatives to develop systems for the notification of suspected occupational disease (OD) in different autonomous communities. The objective was to describe the status of development and characteristics of these systems implemented by the health authorities. A cross-sectional descriptive study was carried out on the existence of systems for the information and surveillance of suspected OD, their legal framework, responsible institution and availability of information. A specific meeting was held and a survey was designed and sent to all autonomous communities and autonomous cities (AACC). Information was collected on the existence of a regulatory standard, assigned human resources, notifiers, coverage and number of suspected OD received, processed and recognized. 18 of 19 AACC responded. 10 have developed a suspected OD notification system, 3 of them supported by specific regional (autonomous community) legislation. The notifiers were physicians of the public health services, physicians of the occupational health services and, in 2 cases, medical inspectors. 7 AACC had specific software to support the system. The OD recognition rate of suspected cases was 53% in the Basque Country; 41% in Castilla-La Mancha; 36% in Murcia; 32.6% in the Valencian Community and 31% in La Rioja. The study has revealed a heterogeneous development of suspected OD reporting systems in Spain. Although the trend is positive, only 55% of the AACC have some type of development and 39% have specific software supporting it. Therefore, unequal OD recognition rates have been obtained depending on the territory.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Hsien-Hsin S

    The overall objective of this research project is to develop novel architectural techniques as well as system software to achieve a highly secure and intrusion-tolerant computing system. Such a system will be autonomous, self-adapting, and introspective, with self-healing capability under improper operations, abnormal workloads, and malicious attacks. The scope of this research includes: (1) System-wide, unified introspection techniques for autonomic systems, (2) Secure information-flow microarchitecture, (3) Memory-centric security architecture, (4) Authentication control and its implications for security, (5) Digital rights management, (6) Microarchitectural denial-of-service attacks on shared resources. During the period of the project, we developed several architectural techniques and system software for achieving a robust, secure, and reliable computing system toward our goal.

  11. Galileo spacecraft power distribution and autonomous fault recovery

    NASA Technical Reports Server (NTRS)

    Detwiler, R. C.

    1982-01-01

    There is a trend in current spacecraft design to achieve greater fault tolerance through the implementation of on-board software dedicated to detecting and isolating failures. A combination of hardware and software is utilized in the Galileo power system for autonomous fault recovery. Galileo is a dual-spun spacecraft designed to carry a number of scientific instruments into a series of orbits around the planet Jupiter. In addition to its self-contained scientific payload, it will also carry a probe system which will be separated from the spacecraft some 150 days prior to Jupiter encounter. The Galileo spacecraft is scheduled to be launched in 1985. Attention is given to the power system, the fault protection requirements, and the power fault recovery implementation.

  12. NASA Tech Briefs, June 2013

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Topics include: Cloud Absorption Radiometer Autonomous Navigation System - CANS, Software Method for Computed Tomography Cylinder Data Unwrapping, Re-slicing, and Analysis, Discrete Data Qualification System and Method Comprising Noise Series Fault Detection, Simple Laser Communications Terminal for Downlink from Earth Orbit at Rates Exceeding 10 Gb/s, Application Program Interface for the Orion Aerodynamics Database, Hyperspectral Imager-Tracker, Web Application Software for Ground Operations Planning Database (GOPDb) Management, Software Defined Radio with Parallelized Software Architecture, Compact Radar Transceiver with Included Calibration, Phase Change Material Thermal Power Generator, The Thermal Hogan - A Means of Surviving the Lunar Night, Micromachined Active Magnetic Regenerator for Low-Temperature Magnetic Coolers, Nano-Ceramic Coated Plastics, Preparation of a Bimetal Using Mechanical Alloying for Environmental or Industrial Use, Phase Change Material for Temperature Control of Imager or Sounder on GOES Type Satellites in GEO, Dual-Compartment Inflatable Suitlock, Modular Connector Keying Concept, Genesis Ultrapure Water Megasonic Wafer Spin Cleaner, Piezoelectrically Initiated Pyrotechnic Igniter, Folding Elastic Thermal Surface - FETS, Multi-Pass Quadrupole Mass Analyzer, Lunar Sulfur Capture System, Environmental Qualification of a Single-Crystal Silicon Mirror for Spaceflight Use, Planar Superconducting Millimeter-Wave/Terahertz Channelizing Filter, Qualification of UHF Antenna for Extreme Martian Thermal Environments, Ensemble Eclipse: A Process for Prefab Development Environment for the Ensemble Project, ISS Live!, Space Operations Learning Center (SOLC) iPhone/iPad Application, Software to Compare NPP HDF5 Data Files, Planetary Data Systems (PDS) Imaging Node Atlas II, Automatic Calibration of an Airborne Imaging System to an Inertial Navigation Unit, Translating MAPGEN to ASPEN for MER, Support Routines for In Situ Image Processing, and Semi-Supervised Eigenbasis Novelty Detection.

  13. Autonomous Rendezvous and Docking Conference, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This document consists of the presentations submitted at the Autonomous Rendezvous and Docking (ARD) Conference. It contains three volumes: ARD hardware technology; ARD software technology; and ARD operations. The purpose of this conference is to identify the technologies required for an on orbit demonstration of the ARD, assess the maturity of these technologies, and provide the necessary insight for a quality assessment of the programmatic management, technical, schedule, and cost risks.

  14. Autonomous Rendezvous and Docking Conference, volume 3

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This document consists of the presentations submitted at the Autonomous Rendezvous and Docking (ARD) Conference. The document contains three volumes: ARD hardware technology; ARD software technology; and ARD operations. The purpose of this conference is to identify the technologies required for an on orbit demonstration of ARD, assess the maturity of these technologies, and provide the necessary insight for a quality assessment of programmatic management, technical, schedule, and cost risks.

  15. Advanced Communication and Networking Technologies for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Hayden, Jeff; Agre, Jonathan R.; Clare, Loren P.; Yan, Tsun-Yee

    2001-01-01

    Next-generation Mars communications networks will provide communications and navigation services to a wide variety of Mars science vehicles including: spacecraft that are arriving at Mars, spacecraft that are entering and descending in the Mars atmosphere, scientific orbiter spacecraft, spacecraft that return Mars samples to Earth, landers, rovers, aerobots, airplanes, and sensing pods. In the current architecture plans, the communication services will be provided using capabilities deployed on the science vehicles as well as dedicated communication satellites that will together make up the Mars network. This network will evolve as additional vehicles arrive, depart or end their useful missions. Cost savings and increased reliability will result from the ability to share communication services between missions. This paper discusses the basic architecture that is needed to support the Mars Communications Network part of NASA's Space Science Enterprise (SSE) communications architecture. The network may use various networking technologies such as those employed in the terrestrial Internet, as well as special purpose deep-space protocols to move data and commands autonomously between vehicles, at disparate Mars vicinity sites (on the surface or in near-Mars space) and between Mars vehicles and earthbound users. The architecture of the spacecraft on-board local communications is being reconsidered in light of these new networking requirements. The trend towards increasingly autonomous operation of the spacecraft is aimed at reducing the dependence on resource scheduling provided by Earth-based operators and increasing system fault tolerance. However, these benefits will result in increased communication and software development requirements. As a result, the envisioned Mars communications infrastructure requires both hardware and protocol technology advancements. This paper will describe a number of the critical technology needs and some of the ongoing research activities.

  16. Autonomous Assembly of Modular Structures in Space and on Extraterrestrial Locations

    NASA Technical Reports Server (NTRS)

    Alhorn, Dean C.

    2005-01-01

    The fulfillment of the new U.S. National Vision for Space Exploration requires many new enabling technologies to accomplish the goal of utilizing space for commercial activities and for returning humans to the Moon and extraterrestrial environments. Traditionally, flight structures are manufactured as complete systems and require humans to complete the integration and assembly in orbit. These structures are bulky and require the use of heavy launch vehicles to send the units to the desired location, e.g. International Space Station (ISS). This method requires a high degree of safety, numerous space walks and significant cost for the humans to perform the assembly in orbit. For example, for assembly and maintenance of the ISS, 52 Extravehicular Activities (EVAs) have been performed so far with a total EVA time of approximately 322 hours. Sixteen (16) shuttle flights have flown to the ISS to perform these activities with an approximate cost of $450M per mission. For future space missions, costs have to be reduced to reasonably achieve the exploration goals. One concept that has been proposed is the autonomous assembly of space structures. This concept is an affordable, reliable solution to in-space and extraterrestrial assembly operations. Assembly is autonomously performed when two components containing onboard electronics join after recognizing that the joint is appropriate and in the precise position and orientation required for assembly. The mechanism only activates when the specifications are correct and in a nominal range. After assembly, local sensors and electronics monitor the integrity of the joint for feedback to a master controller. Achieving this concept will require a shift in the methods for designing space structures. In addition, innovative techniques will be required to perform the assembly autonomously. Monitoring of the assembled joint will be necessary for safety and structural integrity. If a very large structure is to be assembled in orbit, then the number of integrity sensors will be significant. Thus simple, low cost sensors are integral to the success of this concept. This paper will address these issues and will propose a novel concept for assembling space structures autonomously. The paper will present several autonomous assembly methods. Core technologies required to achieve in-space assembly will be discussed and novel techniques for communicating, sensing, docking and assembly will be detailed. These core technologies are critical to the goal of utilizing space in a cost-efficient and safe manner. Finally, these technologies can also be applied to other systems both on Earth and in extraterrestrial environments.
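
    The activation rule sketched in the abstract (the joint engages only when relative position and orientation are within a nominal range, after which local sensors report integrity to a master controller) amounts to a simple gating check. The tolerances, names, and readings below are invented for illustration and are not values from the paper.

        import math

        # Illustrative tolerances (not from the paper)
        POS_TOL_M = 0.002              # 2 mm relative position error
        ANG_TOL_RAD = math.radians(0.5)

        def joint_ready(rel_pos, rel_ang):
            """Return True only if the mating components are within nominal range."""
            pos_err = math.sqrt(sum(c * c for c in rel_pos))
            return pos_err <= POS_TOL_M and abs(rel_ang) <= ANG_TOL_RAD

        # The latching mechanism would be commanded only when joint_ready(...) holds;
        # afterwards the joint sensors keep reporting integrity to the master controller.
        print(joint_ready((0.001, 0.0005, 0.0), math.radians(0.2)))   # True
        print(joint_ready((0.010, 0.0, 0.0), 0.0))                    # False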

  17. Autonomous assistance navigation for robotic wheelchairs in confined spaces.

    PubMed

    Cheein, Fernando Auat; Carelli, Ricardo; De la Cruz, Celso; Muller, Sandra; Bastos Filho, Teodiano F

    2010-01-01

    In this work, a visual interface for the assistance of a robotic wheelchair's navigation is presented. The visual interface is developed for navigation in confined spaces such as narrow corridors or corridor-ends. The interface performs two navigation modes: non-autonomous and autonomous. The non-autonomous driving of the robotic wheelchair is performed by means of a hand joystick. The joystick directs the motion of the vehicle within the environment. The autonomous driving is performed when the user of the wheelchair has to turn (90, -90, or 180 degrees) within the environment. The turning strategy is performed by a maneuverability algorithm compatible with the kinematics of the wheelchair and by the SLAM (Simultaneous Localization and Mapping) algorithm. The SLAM algorithm provides the interface with information concerning the layout of the environment and the pose (position and orientation) of the wheelchair within the environment. Experimental and statistical results of the interface are also shown in this work.

  18. Next Generation Remote Agent Planner

    NASA Technical Reports Server (NTRS)

    Jonsson, Ari K.; Muscettola, Nicola; Morris, Paul H.; Rajan, Kanna

    1999-01-01

    In May 1999, as part of a unique technology validation experiment onboard the Deep Space One spacecraft, the Remote Agent became the first complete autonomous spacecraft control architecture to run as flight software onboard an active spacecraft. As one of the three components of the architecture, the Remote Agent Planner had the task of laying out the course of action to be taken, which included activities such as turning, thrusting, data gathering, and communicating. Building on the successful approach developed for the Remote Agent Planner, the Next Generation Remote Agent Planner is a completely redesigned and reimplemented version of the planner. The new system provides all the key capabilities of the original planner, while adding functionality, improving performance and providing a modular and extendible implementation. The goal of this ongoing project is to develop a system that provides both a basis for future applications and a framework for further research in the area of autonomous planning for spacecraft. In this article, we present an introductory overview of the Next Generation Remote Agent Planner. We present a new and simplified definition of the planning problem, describe the basics of the planning process, lay out the new system design and examine the functionality of the core reasoning module.
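
    Planners in the Remote Agent family represent activities as tokens placed on timelines subject to temporal constraints. The toy check below conveys that flavor only; the class and field names are invented and do not correspond to the Next Generation Remote Agent Planner's actual data structures.

        from dataclasses import dataclass

        @dataclass
        class Token:
            """One activity placed on a timeline, e.g. a turn or a thrust arc."""
            timeline: str
            activity: str
            start: float
            end: float

        def consistent(tokens):
            """Very small check: activities on the same timeline must not overlap."""
            by_timeline = {}
            for t in tokens:
                by_timeline.setdefault(t.timeline, []).append(t)
            for seq in by_timeline.values():
                seq.sort(key=lambda t: t.start)
                for a, b in zip(seq, seq[1:]):
                    if b.start < a.end:
                        return False
            return True

        plan = [Token("attitude", "turn_to_target", 0, 10),
                Token("attitude", "hold", 10, 40),
                Token("engine", "thrust", 12, 30)]
        print(consistent(plan))   # True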

  19. Autonomous Telemetry Collection for Single-Processor Small Satellites

    NASA Technical Reports Server (NTRS)

    Speer, Dave

    2003-01-01

    For the Space Technology 5 mission, which is being developed under NASA's New Millennium Program, a single spacecraft processor will be required to do on-board real-time computations and operations associated with attitude control, up-link and down-link communications, science data processing, solid-state recorder management, power switching and battery charge management, experiment data collection, health and status data collection, etc. Much of the health and status information is in analog form, and each of the analog signals must be routed to the input of an analog-to-digital converter, converted to digital form, and then stored in memory. If the micro-operations of the analog data collection process are implemented in software, the processor may use up a lot of time either waiting for the analog signal to settle, waiting for the analog-to-digital conversion to complete, or servicing a large number of high frequency interrupts. In order to off-load a very busy processor, the collection and digitization of all analog spacecraft health and status data will be done autonomously by a field-programmable gate array that can configure the analog signal chain, control the analog-to-digital converter, and store the converted data in memory.

  20. Bi-Level Integrated System Synthesis (BLISS)

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Agte, Jeremy S.; Sandusky, Robert R., Jr.

    1998-01-01

    BLISS is a method for optimization of engineering systems by decomposition. It separates the system-level optimization, having a relatively small number of design variables, from the potentially numerous subsystem optimizations that may each have a large number of local design variables. The subsystem optimizations are autonomous and may be conducted concurrently. Subsystem and system optimizations alternate, linked by sensitivity data, producing a design improvement in each iteration. Starting from a best-guess initial design, the method improves that design in iterative cycles, each cycle comprising two steps. In step one, the system-level variables are frozen and the improvement is achieved by separate, concurrent, and autonomous optimizations in the local variable subdomains. In step two, further improvement is sought in the space of the system-level variables. Optimum sensitivity data link the second step to the first. The method prototype was implemented using MATLAB and iSIGHT programming software and tested on a simplified, conceptual-level supersonic business jet design, and a detailed design of an electronic device. Satisfactory convergence and favorable agreement with the benchmark results were observed. Modularity of the method is intended to fit the human organization and map well onto the computing technology of concurrent processing.
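
    The two-step cycle described above can be illustrated on a toy problem: freeze the system-level variable and let each subsystem optimize its local variable (independently, so they could run concurrently), then improve the system-level variable. The quadratic objective, the variable names, and the use of plain alternation in place of BLISS's optimum-sensitivity linkage are all illustrative assumptions, not the formulation from the paper.

        from scipy.optimize import minimize_scalar

        # Toy coupled objective with one system variable z and two local
        # subsystem variables x1, x2 (made-up problem, not from the paper).
        def f(z, x1, x2):
            return (z - 1.0)**2 + (x1 - z)**2 + (x2 + z)**2

        z, x1, x2 = 5.0, 0.0, 0.0          # best-guess initial design
        for cycle in range(10):
            # Step 1: system variable frozen; autonomous subsystem optimizations
            x1 = minimize_scalar(lambda v: f(z, v, x2)).x
            x2 = minimize_scalar(lambda v: f(z, x1, v)).x
            # Step 2: improve the design in the space of the system-level variable
            z = minimize_scalar(lambda v: f(v, x1, x2)).x
            print(f"cycle {cycle}: z={z:.4f} x1={x1:.4f} x2={x2:.4f} f={f(z, x1, x2):.6f}")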

  1. Navigation of military and space unmanned ground vehicles in unstructured terrains

    NASA Technical Reports Server (NTRS)

    Lescoe, Paul; Lavery, David; Bedard, Roger

    1991-01-01

    Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements and the path plan was transmitted to the vehicle which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step forward towards the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.

  2. Grasping objects autonomously in simulated KC-135 zero-g

    NASA Technical Reports Server (NTRS)

    Norsworthy, Robert S.

    1994-01-01

    The KC-135 aircraft was chosen for simulated zero-gravity testing of the Extravehicular Activity Helper/Retriever (EVAHR). A software simulation of the EVAHR hardware, KC-135 flight dynamics, collision detection and grasp impact dynamics has been developed to integrate and test the EVAHR software prior to flight testing on the KC-135. The EVAHR software will perform target pose estimation, tracking, and motion estimation for rigid, freely rotating, polyhedral objects. Manipulator grasp planning and trajectory control software has also been developed to grasp targets while avoiding collisions.

  3. Managing Complexity in Next Generation Robotic Spacecraft: From a Software Perspective

    NASA Technical Reports Server (NTRS)

    Reinholtz, Kirk

    2008-01-01

    This presentation highlights the challenges in the design of software to support robotic spacecraft. Robotic spacecraft offer a higher degree of autonomy; however, more capability is now required, primarily in the software, while the same or a higher degree of reliability must be maintained. The complexity of designing such an autonomous system is great, particularly while attempting to address the needs for increased capabilities and high reliability without increased needs for time or money. The efforts to develop programming models for the new hardware and the integration of software architecture are highlighted.

  4. Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue

    NASA Technical Reports Server (NTRS)

    Zornetzer, Steve; Gage, Douglas

    2005-01-01

    Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.

  5. Case Studies of Software Development Tools for Parallel Architectures

    DTIC Science & Technology

    1993-06-01

    ... autonomous entities, each with its own state and set of behaviors, as in simulation, tracking, or Battle Management. Because C2 applications are often ... simulation, that is used to help the developer solve the problems. The new tool/problem solution matrix is structured in terms of the software development ...

  6. Autonomous onboard crew operations: A review and developmental approach

    NASA Technical Reports Server (NTRS)

    Rogers, J. G.

    1982-01-01

    A review of the literature generated by an intercenter mission approach and consolidation team and their contractors was performed to obtain background information on the development of autonomous operations concepts for future space shuttle and space platform missions. The Boeing 757/767 flight management system was examined to determine its relevance for transferring the developmental approach and technology to the performance of the crew operations function. Specifically, the engine indication and crew alerting system was studied to determine the relevance of this display for the performance of crew operations onboard the vehicle. It was concluded that the developmental approach and technology utilized in the aeronautics industry would be appropriate for development of an autonomous operations concept for the space platform.

  7. Verification Test of Automated Robotic Assembly of Space Truss Structures

    NASA Technical Reports Server (NTRS)

    Rhodes, Marvin D.; Will, Ralph W.; Quach, Cuong C.

    1995-01-01

    A multidisciplinary program has been conducted at the Langley Research Center to develop operational procedures for supervised autonomous assembly of truss structures suitable for large-aperture antennas. The hardware and operations required to assemble a 102-member tetrahedral truss and attach 12 hexagonal panels were developed and evaluated. A brute-force automation approach was used to develop baseline assembly hardware and software techniques. However, as the system matured and operations were proven, upgrades were incorporated and assessed against the baseline test results. These upgrades included the use of distributed microprocessors to control dedicated end-effector operations, machine vision guidance for strut installation, and the use of an expert system-based executive-control program. This paper summarizes the developmental phases of the program, the results of several assembly tests, and a series of proposed enhancements. No problems that would preclude automated in-space assembly of truss structures have been encountered. The test system was developed at a breadboard level and continued development at an enhanced level is warranted.

  8. A Custom Robotic System for Inspecting HEPA Filters in the Payload Changeout Room at the NASA Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Spencer, James E., Jr.; Looney, Joe

    1994-01-01

    In this paper, the prime objective is to describe a custom 4-dof (degree-of-freedom) robotic arm capable of autonomously or telerobotically performing systematic HEPA filter inspection and certification in the Shuttle Launch Pad Payload Changeout Rooms (PCRs) on pads A and B at the Kennedy Space Center, Florida. This HEPA filter inspection robot (HFIR) has been designed to be easily deployable and is equipped with the necessary sensory devices, control hardware, software and man-machine interfaces needed to implement HEPA filter inspection reliably and efficiently without damaging the filters or colliding with existing PCR structures or filters. The main purpose of the HFIR is to implement an automated positioning system to move special inspection sensors in pre-defined or manual patterns for the purpose of verifying filter integrity and efficiency. This will ultimately relieve NASA Payload Operations of the significant time, cost, and personnel-safety impacts incurred during non-automated PCR HEPA filter certification.

  9. A Model-Based Expert System for Space Power Distribution Diagnostics

    NASA Technical Reports Server (NTRS)

    Quinn, Todd M.; Schlegelmilch, Richard F.

    1994-01-01

    When engineers diagnose system failures, they often use models to confirm system operation. This concept has produced a class of advanced expert systems that perform model-based diagnosis. A model-based diagnostic expert system for the Space Station Freedom electrical power distribution test bed is currently being developed at the NASA Lewis Research Center. The objective of this expert system is to autonomously detect and isolate electrical fault conditions. Marple, a software package developed at TRW, provides a model-based environment utilizing constraint suspension. Originally, constraint suspension techniques were developed for digital systems. However, Marple provides the mechanisms for applying this approach to analog systems such as the test bed, as well. The expert system was developed using Marple and Lucid Common Lisp running on a Sun Sparc-2 workstation. The Marple modeling environment has proved to be a useful tool for investigating the various aspects of model-based diagnostics. This report describes work completed to date and lessons learned while employing model-based diagnostics using constraint suspension within an analog system.
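
    Constraint suspension can be illustrated with a toy model: each component contributes a constraint relating observed quantities, and a single-fault candidate is a component whose constraint, once suspended, leaves the remaining constraints consistent with the observations. The two-resistor circuit, values, and names below are invented and are unrelated to the Space Station Freedom test bed or to Marple's actual modeling language.

        # Toy series circuit: source -> R1 -> node -> R2 -> ground.
        # A candidate diagnosis is the single component whose constraint,
        # when suspended, makes the remaining checks consistent.

        def consistent_with_suspended(suspend):
            V_SRC, R1, R2 = 10.0, 1.0, 1.0
            # Observations (simulated fault: R2 has drifted, so the node voltage is off)
            observed = {"i": 2.0, "v_node": 8.0}
            checks = {
                "source": abs(V_SRC - (observed["i"] * R1 + observed["v_node"])) < 1e-6,
                "R1": abs((V_SRC - observed["v_node"]) - observed["i"] * R1) < 1e-6,
                "R2": abs(observed["v_node"] - observed["i"] * R2) < 1e-6,
            }
            return all(ok for name, ok in checks.items() if name != suspend)

        for candidate in ["source", "R1", "R2"]:
            if consistent_with_suspended(candidate):
                print("observations consistent if we suspend:", candidate)   # -> R2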

  10. Towards Autonomous Inspection of Space Systems Using Mobile Robotic Sensor Platforms

    NASA Technical Reports Server (NTRS)

    Wong, Edmond; Saad, Ashraf; Litt, Jonathan S.

    2007-01-01

    The space transportation systems required to support NASA's Exploration Initiative will demand a high degree of reliability to ensure mission success. This reliability can be realized through autonomous fault/damage detection and repair capabilities. It is crucial that such capabilities are incorporated into these systems since it will be impractical to rely upon Extra-Vehicular Activity (EVA), visual inspection or tele-operation due to the costly, labor-intensive and time-consuming nature of these methods. One approach to achieving this capability is through the use of an autonomous inspection system comprised of miniature mobile sensor platforms that will cooperatively perform high confidence inspection of space vehicles and habitats. This paper will discuss the efforts to develop a small scale demonstration test-bed to investigate the feasibility of using autonomous mobile sensor platforms to perform inspection operations. Progress will be discussed in technology areas including: the hardware implementation and demonstration of robotic sensor platforms, the implementation of a hardware test-bed facility, and the investigation of collaborative control algorithms.

  11. Modular Autonomous Systems Technology Framework: A Distributed Solution for System Monitoring and Control

    NASA Technical Reports Server (NTRS)

    Badger, Julia M.; Claunch, Charles; Mathis, Frank

    2017-01-01

    The Modular Autonomous Systems Technology (MAST) framework is a tool for building distributed, hierarchical autonomous systems. Originally intended for the autonomous monitoring and control of spacecraft, this framework concept provides support for variable autonomy, assume-guarantee contracts, and efficient communication between subsystems and a centralized systems manager. MAST was developed at NASA's Johnson Space Center (JSC) and has been applied to an integrated spacecraft example scenario.
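
    The assume-guarantee contracts mentioned above pair an assumption about a subsystem's environment with a guarantee the subsystem must then uphold, which a centralized systems manager can monitor. The sketch below is a generic illustration of that idea; the class, field names, and power figures are invented and are not the MAST framework's API.

        from dataclasses import dataclass
        from typing import Callable, Dict

        @dataclass
        class Contract:
            """A subsystem promises its guarantee whenever its assumption holds."""
            assumption: Callable[[Dict], bool]
            guarantee: Callable[[Dict], bool]

        # Invented example: a power subsystem and a science instrument.
        power = Contract(assumption=lambda s: s["solar_input_w"] >= 200,
                         guarantee=lambda s: s["bus_power_w"] >= 150)
        instrument = Contract(assumption=lambda s: s["bus_power_w"] >= 150,
                              guarantee=lambda s: s["data_rate_kbps"] >= 10)

        def violations(state, contracts):
            """A systems manager can flag any subsystem whose contract is violated."""
            return [name for name, c in contracts.items()
                    if c.assumption(state) and not c.guarantee(state)]

        state = {"solar_input_w": 250, "bus_power_w": 120, "data_rate_kbps": 2}
        print(violations(state, {"power": power, "instrument": instrument}))  # ['power']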

  12. Integration of the Remote Agent for the NASA Deep Space One Autonomy Experiment

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Bernard, Douglas E.; Gamble, Edward B., Jr.; Kanefsky, Bob; Kurien, James; Muscettola, Nicola; Nayak, P. Pandurang; Rajan, Kanna; Lau, Sonie (Technical Monitor)

    1998-01-01

    This paper describes the integration of the Remote Agent (RA), a spacecraft autonomy system which is scheduled to control the Deep Space 1 spacecraft during a flight experiment in 1999. The RA is a reusable, model-based autonomy system that is quite different from software typically used to control an aerospace system. We describe the integration challenges we faced, how we addressed them, and the lessons learned. We focus on those aspects of integrating the RA that were either easier or more difficult than integrating a more traditional large software application because the RA is a model-based autonomous system. A number of characteristics of the RA made the integration process easier. One example is the model-based nature of RA. Since the RA is model-based, most of its behavior is not hard-coded into procedural program code. Instead, engineers specify high-level models of the spacecraft's components from which the Remote Agent automatically derives correct system-wide behavior on the fly. This high-level, modular, and declarative software description allowed some interfaces between RA components and between RA and the flight software to be automatically generated and tested for completeness against the Remote Agent's models. In addition, the Remote Agent's model-based diagnosis system automatically diagnoses when the RA models are not consistent with the behavior of the spacecraft. In flight, this feature is used to diagnose failures in the spacecraft hardware. During integration, it proved valuable in finding problems in the spacecraft simulator or flight software. In addition, when modifications are made to the spacecraft hardware or flight software, the RA models are easily changed because they only capture a description of the spacecraft; one does not have to maintain procedural code that implements the correct behavior for every expected situation. On the other hand, several features of the RA made it more difficult to integrate than typical flight software. For example, the definition of correct behavior is more difficult to specify for a system that is expected to reason about and flexibly react to its environment than for a traditional flight software system. Consequently, whenever a change is made to the RA, it is more time-consuming to determine if the resulting behavior is correct. We conclude the paper with a discussion of future work on the Remote Agent as well as recommendations to ease integration of similar autonomy projects.

  13. A Fundamental Mathematical Model of a Microbial Predenitrification System

    NASA Technical Reports Server (NTRS)

    Hoo, Karlene A.

    2005-01-01

    Space flight beyond Low Earth Orbit requires sophisticated systems to support all aspects of the mission (life support, real-time communications, etc.). A common concern that cuts across all these systems is the selection of information technology (IT) methodology, software and hardware architectures to provide robust monitoring, diagnosis, and control support. Another dimension of the problem space is that different systems must be integrated seamlessly so that communication speed and data handling appear as a continuum (uninterrupted). One such team investigating this problem is the Advanced Integration Matrix (AIM) team, whose role is to define the critical requirements expected of software and hardware to support an integrated approach to the command and control of Advanced Life Support (ALS) for future long-duration human space missions, including permanent human presence on the Moon and Mars. A goal of the AIM team is to set the foundation for testing criteria that will assist in specifying tasks, control schemes and test scenarios to validate and verify systems capabilities. This project contributes to the goals of the AIM team by assisting with controls planning for ALS. Control for ALS is an enormous problem: it involves air revitalization, water recovery, food production, solids processing and crew. In more general terms, these systems can be characterized as involving both continuous and discrete processes, dynamic interactions among the sub-systems, nonlinear behavior due to the complex operations, and a large number of multivariable interactions due to the dimension of the state space. It is imperative that a baseline approach from which to measure performance be established, especially when the expectation for the control system is complete autonomous control.

  14. Advances in Robotic, Human, and Autonomous Systems for Missions of Space Exploration

    NASA Technical Reports Server (NTRS)

    Gross, Anthony R.; Briggs, Geoffrey A.; Glass, Brian J.; Pedersen, Liam; Kortenkamp, David M.; Wettergreen, David S.; Nourbakhsh, I.; Clancy, Daniel J.; Zornetzer, Steven (Technical Monitor)

    2002-01-01

    Space exploration missions are evolving toward more complex architectures involving more capable robotic systems, new levels of human and robotic interaction, and increasingly autonomous systems. How this evolving mix of advanced capabilities will be utilized in the design of new missions is a subject of much current interest. Cost and risk constraints also play a key role in the development of new missions, resulting in a complex interplay of a broad range of factors in the mission development and planning of new missions. This paper will discuss how human, robotic, and autonomous systems could be used in advanced space exploration missions. In particular, a recently completed survey of the state of the art and the potential future of robotic systems, as well as new experiments utilizing human and robotic approaches will be described. Finally, there will be a discussion of how best to utilize these various approaches for meeting space exploration goals.

  15. Creating a Mobile Autonomous Robot Research System (MARRS)

    DTIC Science & Technology

    1984-12-01

    Laboratory was made possible through the energetic support of many individuals and organizations. In particular, we want to thank our thesis advisor Dr. ... subsystems. Computer Hardware: Until a few years ago, autonomous vehicles were unheard of in real life. The advent of the microcomputer has made ... Most software development efforts for MARRS-1 took advantage of the Virtual Devices Robo C compiler and Robo Assembler. The next best ...

  16. Galileo spacecraft autonomous attitude determination using a V-slit star scanner

    NASA Technical Reports Server (NTRS)

    Mobasser, Sohrab; Lin, Shuh-Ren

    1991-01-01

    The autonomous attitude determination system of the Galileo spacecraft, consisting of a radiation-hardened star scanner and a processing algorithm, is presented. The algorithms applied in this system are sequential star identification and attitude estimation. The star scanner model is reviewed in detail, and the flight software parameters that must be updated frequently during flight, due to degradation of the scanner response and changes in the star background, are identified.

  17. Autonomous, Decentralized Grid Architecture: Prosumer-Based Distributed Autonomous Cyber-Physical Architecture for Ultra-Reliable Green Electricity Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    None

    2012-01-11

    GENI Project: Georgia Tech is developing a decentralized, autonomous, internet-like control architecture and control software system for the electric power grid. Georgia Tech’s new architecture is based on the emerging concept of electricity prosumers—economically motivated actors that can produce, consume, or store electricity. Under Georgia Tech’s architecture, all of the actors in an energy system are empowered to offer associated energy services based on their capabilities. The actors achieve their sustainability, efficiency, reliability, and economic objectives, while contributing to system-wide reliability and efficiency goals. This is in marked contrast to the current one-way, centralized control paradigm.

  18. Evolution of Autonomous Self-Righting Behaviors for Articulated Nanorovers

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward

    1999-01-01

    Miniature rovers with articulated mobility mechanisms are being developed for planetary surface exploration on Mars and small solar system bodies. These vehicles are designed to be capable of autonomous recovery from overturning during surface operations. This paper describes a computational means of developing motion behaviors that achieve the autonomous recovery function. It proposes a control software design approach aimed at reducing the effort involved in developing self-righting behaviors. The approach is based on the integration of evolutionary computing with a dynamics simulation environment for evolving and evaluating motion behaviors. The automated behavior design approach is outlined and its underlying genetic programming infrastructure is described.
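
    The integration described above pairs an evolutionary search with a dynamics simulation that scores candidate motion behaviors. The skeleton below shows only the search loop; the "simulator" is a stand-in fitness function, and the behavior encoding (a short vector of commands), population sizes, and mutation scale are invented rather than taken from the paper.

        import random

        random.seed(0)

        def simulate(behavior):
            """Stand-in for the dynamics simulation: score a candidate self-righting
            behavior, encoded here as a short vector of joint commands."""
            target = [0.6, -0.2, 0.9, 0.1]          # invented 'good' command profile
            return -sum((b - t) ** 2 for b, t in zip(behavior, target))

        population = [[random.uniform(-1, 1) for _ in range(4)] for _ in range(20)]
        for generation in range(30):
            population.sort(key=simulate, reverse=True)
            parents = population[:5]                 # keep the fittest behaviors
            children = [[g + random.gauss(0, 0.1) for g in random.choice(parents)]
                        for _ in range(15)]          # mutate copies of the parents
            population = parents + children

        best = max(population, key=simulate)
        print("best fitness:", round(simulate(best), 4))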

  19. An Architecture to Enable Autonomous Control of Spacecraft

    NASA Technical Reports Server (NTRS)

    May, Ryan D.; Dever, Timothy P.; Soeder, James F.; George, Patrick J.; Morris, Paul H.; Colombano, Silvano P.; Frank, Jeremy D.; Schwabacher, Mark A.; Wang, Liu; LawLer, Dennis

    2014-01-01

    Autonomy is required for manned spacecraft missions distant enough that light-time communication delays make ground-based mission control infeasible. Presently, ground controllers develop a complete schedule of power modes for all spacecraft components based on a large number of factors. The proposed architecture is an early attempt to formalize and automate this process using on-vehicle computation resources. In order to demonstrate this architecture, an autonomous electrical power system controller and vehicle Mission Manager are constructed. These two components are designed to work together in order to plan upcoming load use as well as respond to unanticipated deviations from the plan. The communication protocol was developed using "paper" simulations prior to formally encoding the messages and developing software to implement the required functionality. These software routines exchange data via TCP/IP sockets with the Mission Manager operating at NASA Ames Research Center and the autonomous power controller running at NASA Glenn Research Center. The interconnected systems are tested and shown to be effective at planning the operation of a simulated quasi-steady state spacecraft power system and responding to unexpected disturbances.
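
    The exchange described above (a Mission Manager and an autonomous power controller trading plan and status messages over TCP/IP sockets) can be sketched as follows. The JSON message fields, port number, power figures, and acceptance rule are invented; the sketch shows only the style of interaction, not the protocol used in the NASA demonstration.

        import json, socket, threading, time

        HOST, PORT = "127.0.0.1", 5555          # invented local endpoint

        def power_controller():
            """Stands in for the autonomous power controller end of the link."""
            with socket.socket() as srv:
                srv.bind((HOST, PORT))
                srv.listen(1)
                conn, _ = srv.accept()
                with conn:
                    request = json.loads(conn.recv(4096).decode())
                    available_w = 800.0          # invented power budget
                    total_w = sum(load["watts"] for load in request["loads"])
                    reply = {"plan_id": request["plan_id"],
                             "accepted": total_w <= available_w,
                             "margin_w": available_w - total_w}
                    conn.sendall(json.dumps(reply).encode())

        threading.Thread(target=power_controller, daemon=True).start()
        time.sleep(0.2)                          # let the listener come up

        # Mission Manager side: propose an upcoming load plan and read the verdict.
        plan = {"plan_id": 42,
                "loads": [{"name": "heater", "watts": 300},
                          {"name": "pump", "watts": 250}]}
        with socket.create_connection((HOST, PORT)) as cli:
            cli.sendall(json.dumps(plan).encode())
            print(json.loads(cli.recv(4096).decode()))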

  20. ICAROUS: Integrated Configurable Architecture for Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.

    2016-01-01

    NASA's Unmanned Aerial System (UAS) Traffic Management (UTM) project aims at enabling near-term, safe operations of small UAS vehicles in uncontrolled airspace, i.e., Class G airspace. A far-term goal of UTM research and development is to accommodate the expected rise in small UAS traffic density throughout the National Airspace System (NAS) at low altitudes for beyond visual line-of-sight operations. This video describes a new capability referred to as ICAROUS (Integrated Configurable Algorithms for Reliable Operations of Unmanned Systems), which is being developed under the auspices of the UTM project. ICAROUS is a software architecture comprised of highly assured algorithms for building safety-centric, autonomous, unmanned aircraft applications. Central to the development of the ICAROUS algorithms is the use of well-established formal methods to guarantee higher levels of safety assurance by monitoring and bounding the behavior of autonomous systems. The core autonomy-enabling capabilities in ICAROUS include constraint conformance monitoring and autonomous detect and avoid functions. ICAROUS also provides a highly configurable user interface that enables the modular integration of mission-specific software components.
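
    One of the constraint-conformance checks an autonomy layer of this kind performs is whether the aircraft remains inside a keep-in geofence. The convex-polygon test below is a generic textbook check with invented coordinates; it is not the algorithm or code actually used in ICAROUS.

        def inside_convex_keep_in(point, polygon):
            """Toy keep-in check: the point must lie on the same side of every edge
            of a convex polygon listed counter-clockwise."""
            x, y = point
            for (x1, y1), (x2, y2) in zip(polygon, polygon[1:] + polygon[:1]):
                cross = (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1)
                if cross < 0:            # to the right of an edge -> outside
                    return False
            return True

        fence = [(0, 0), (100, 0), (100, 100), (0, 100)]     # CCW square, metres
        print(inside_convex_keep_in((50, 50), fence))        # True  -> conforming
        print(inside_convex_keep_in((150, 50), fence))       # False -> violation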

  1. An Optimized Autonomous Space In-situ Sensorweb (OASIS) for Volcano Monitoring

    NASA Astrophysics Data System (ADS)

    Song, W.; Shirazi, B.; Lahusen, R.; Chien, S.; Kedar, S.; Webb, F.

    2006-12-01

    In response to NASA's announced requirement for Earth hazard monitoring sensor-web technology, we are developing a prototype real-time Optimized Autonomous Space In-situ Sensorweb. The prototype will be focused on volcano hazard monitoring at Mount St. Helens, which has been in continuous eruption since October 2004. The system is designed to be flexible and easily configurable for many other applications as well. The primary goals of the project are: 1) integrating complementary space (i.e., Earth Observing One (EO-1) satellite) and in-situ (ground-based) elements into an interactive, autonomous sensor-web; 2) advancing sensor-web power and communication resource management technology; and 3) enabling scalability for seamless infusion of future space and in-situ assets into the sensor-web. To meet these goals, we are developing: 1) a test-bed in-situ array with smart sensor nodes capable of making autonomous data acquisition decisions; 2) efficient self-organization algorithms for sensor-web topology to support efficient data communication and command and control; 3) smart bandwidth allocation algorithms in which sensor nodes autonomously determine packet priorities based on mission needs and local bandwidth information in real time; and 4) remote network management and reprogramming tools. The space and in-situ control components of the system will be integrated such that each element is capable of triggering the other. Sensor-web data acquisition and dissemination will be accomplished through the use of SensorML language standards for geospatial information. The three-year project will demonstrate end-to-end system performance with the in-situ test-bed at Mount St. Helens and NASA's EO-1 platform.
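
    The bandwidth-allocation idea above (nodes autonomously rank queued packets by mission priority and send only what their current allocation allows) can be sketched as a greedy selection. The packet fields, priorities, sizes, and byte budget below are invented and are not taken from the OASIS design.

        def select_packets(packets, available_bytes):
            """Greedy, locally computed choice: highest mission priority first,
            until the node's current bandwidth allocation is used up."""
            chosen, budget = [], available_bytes
            for pkt in sorted(packets, key=lambda p: p["priority"], reverse=True):
                if pkt["size"] <= budget:
                    chosen.append(pkt["id"])
                    budget -= pkt["size"]
            return chosen

        queue = [{"id": "seismic-event", "priority": 9, "size": 1200},
                 {"id": "housekeeping", "priority": 2, "size": 400},
                 {"id": "gps-deformation", "priority": 7, "size": 900}]
        print(select_packets(queue, available_bytes=2200))
        # -> ['seismic-event', 'gps-deformation']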

  2. Monitoring and Correcting Autonomic Function Aboard Mir: NASA Technology Used in Space and on Earth to Facilitate Adaptation

    NASA Technical Reports Server (NTRS)

    Cowings, P.; Toscano, W.; Taylor, B.; DeRoshia, C.; Kornilova, L.; Koslovskaya, I.; Miller, N.

    1999-01-01

    The broad objective of the research was to study individual characteristics of human adaptation to long-duration spaceflight and possibilities of their correction using autonomic conditioning. The changes in autonomic state during adaptation to microgravity can have profound effects on the operational efficiency of crewmembers and may result in debilitating biomedical symptoms. Ground-based and inflight experiment results showed that certain responses of the autonomic nervous system were correlated with, or consistently preceded, reports of performance decrements or symptoms. Autogenic-Feedback-Training Exercise (AFTE) is a physiological conditioning method that has been used to train people to voluntarily control several of their own physiological responses. The specific objectives were: 1) To study human autonomic nervous system (ANS) responses to sustained exposure to microgravity; 2) To study human behavior/performance changes related to physiology; 3) To evaluate the effectiveness of preflight autonomic conditioning (AFTE) for facilitating adaptation to space and readaptation to Earth; and 4) To archive these data for the NASA Life Sciences Data Archive and thereby make this information available to the international scientific community.

  3. Evolution and advanced technology. [of Flight Telerobotic Servicer

    NASA Technical Reports Server (NTRS)

    Ollendorf, Stanford; Pennington, Jack E.; Hansen, Bert, III

    1990-01-01

    The NASREM architecture with its standard interfaces permits development and evolution of the Flight Telerobotic Servicer to greater autonomy. Technologies in control strategies for an arm with seven DOF, including a safety system containing skin sensors for obstacle avoidance, are being developed. Planning and robotic execution software includes symbolic task planning, world model data bases, and path planning algorithms. Research over the last five years has led to the development of laser scanning and ranging systems, which use coherent semiconductor laser diodes for short range sensing. The possibility of using a robot to autonomously assemble space structures is being investigated. A control framework compatible with NASREM is being developed that allows direct global control of the manipulator. Researchers are developing systems that permit an operator to quickly reconfigure the telerobot to do new tasks safely.

  4. Automated Attitude Sensor Calibration: Progress and Plans

    NASA Technical Reports Server (NTRS)

    Sedlak, Joseph; Hashmall, Joseph

    2004-01-01

    This paper describes ongoing work at NASA's Goddard Space Flight Center to improve the quality of spacecraft attitude sensor calibration and reduce costs by automating parts of the calibration process. The new calibration software can autonomously preview data quality over a given time span, select a subset of the data for processing, perform the requested calibration, and output a report. This level of automation is currently being implemented for two specific applications: inertial reference unit (IRU) calibration and sensor alignment calibration. The IRU calibration utility makes use of a sequential version of the Davenport algorithm. This utility has been successfully tested with simulated and actual flight data. The alignment calibration is still in the early testing stage. Both utilities will be incorporated into the institutional attitude ground support system.
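
    The Davenport algorithm mentioned above is the q-method solution to Wahba's problem: assemble a 4x4 K matrix from weighted vector-pair observations and take the eigenvector of its largest eigenvalue as the optimal attitude quaternion. The sketch below is the textbook batch form with an invented sanity check, not the sequential flight utility described in the paper.

        import numpy as np

        def davenport_q(body_vecs, ref_vecs, weights):
            """Batch Davenport q-method: return the optimal attitude quaternion
            (vector part first, scalar last) for weighted vector-pair observations."""
            B = sum(w * np.outer(b, r) for w, b, r in zip(weights, body_vecs, ref_vecs))
            S = B + B.T
            z = sum(w * np.cross(b, r) for w, b, r in zip(weights, body_vecs, ref_vecs))
            sigma = np.trace(B)
            K = np.zeros((4, 4))
            K[:3, :3] = S - sigma * np.eye(3)
            K[:3, 3] = z
            K[3, :3] = z
            K[3, 3] = sigma
            eigvals, eigvecs = np.linalg.eigh(K)
            return eigvecs[:, np.argmax(eigvals)]      # eigenvector of largest eigenvalue

        # Identity-attitude sanity check with two observation pairs
        refs = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
        print(davenport_q(refs, refs, weights=[0.5, 0.5]))   # ~[0, 0, 0, +/-1]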

  5. Flight Testing of Terrain-Relative Navigation and Large-Divert Guidance on a VTVL Rocket

    NASA Technical Reports Server (NTRS)

    Trawny, Nikolas; Benito, Joel; Tweddle, Brent; Bergh, Charles F.; Khanoyan, Garen; Vaughan, Geoffrey M.; Zheng, Jason X.; Villalpando, Carlos Y.; Cheng, Yang; Scharf, Daniel P.

    2015-01-01

    Since 2011, the Autonomous Descent and Ascent Powered-Flight Testbed (ADAPT) has been used to demonstrate advanced descent and landing technologies onboard the Masten Space Systems (MSS) Xombie vertical-takeoff, vertical-landing suborbital rocket. The current instantiation of ADAPT is a stand-alone payload comprising sensing and avionics for terrain-relative navigation and fuel-optimal onboard planning of large divert trajectories, thus providing complete pin-point landing capabilities needed for planetary landers. To this end, ADAPT combines two technologies developed at JPL, the Lander Vision System (LVS), and the Guidance for Fuel Optimal Large Diverts (G-FOLD) software. This paper describes the integration and testing of LVS and G-FOLD in the ADAPT payload, culminating in two successful free flight demonstrations on the Xombie vehicle conducted in December 2014.

  6. Negotiating the Traffic: Can Cognitive Science Help Make Autonomous Vehicles a Reality?

    PubMed

    Chater, Nick; Misyak, Jennifer; Watson, Derrick; Griffiths, Nathan; Mouzakitis, Alex

    2018-02-01

    To drive safely among human drivers, cyclists and pedestrians, autonomous vehicles will need to mimic, or ideally improve upon, humanlike driving. Yet, driving presents us with difficult problems of joint action: 'negotiating' with other users over shared road space. We argue that autonomous driving provides a test case for computational theories of social interaction, with fundamental implications for the development of autonomous vehicles. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Development of Methodology for Programming Autonomous Agents

    NASA Technical Reports Server (NTRS)

    Erol, Kutluhan; Levy, Renato; Lang, Lun

    2004-01-01

    A brief report discusses the rationale for, and the development of, a methodology for generating computer code for autonomous-agent-based systems. The methodology is characterized as enabling an increase in the reusability of the generated code among and within such systems, thereby making it possible to reduce the time and cost of development of the systems. The methodology is also characterized as enabling reduction of the incidence of those software errors that are attributable to the human failure to anticipate distributed behaviors caused by the software. A major conceptual problem said to be addressed in the development of the methodology was that of how to efficiently describe the interfaces between several layers of agent composition by use of a language that is both familiar to engineers and descriptive enough to specify such interfaces unambiguously.

  8. NASA Technology Transfer - Human Robot Teaming

    NASA Image and Video Library

    2016-12-23

    Produced for Intelligent Robotics Group to show at January 2017 Consumer Electronics Show (CES). Highlights development of VERVE (Visual Environment for Remote Virtual Exploration) software used on K-10, K-REX, SPHERES and AstroBee projects for 3D awareness. Also mentions transfer of software to Nissan for their development in their Autonomous Vehicle project. Video includes Nissan's self-driving car around NASA Ames.

  9. General Autonomic Components of Motion Sickness

    NASA Technical Reports Server (NTRS)

    Suter, S.; Toscano, W. B.; Kamiya, J.; Naifeh, K.

    1985-01-01

    A body of investigations performed in support of experiments aboard the space shuttle, and designed to counteract the symptoms of Space Adaptation Syndrome, which resemble those of motion sickness on Earth, is reviewed. For these supporting studies, the autonomic manifestations of Earth-based motion sickness were examined. Heart rate, respiration rate, finger pulse volume and basal skin resistance were measured on 127 men and women before, during and after exposure to nauseogenic rotating chair tests. Significant changes in all autonomic responses were observed across the tests. Significant differences in autonomic responses among groups divided according to motion sickness susceptibility were also observed. Results suggest that the examination of autonomic responses as an objective indicator of motion sickness malaise is warranted and may contribute to the overall understanding of the syndrome on Earth and in space.

  10. Data Reduction and Control Software for Meteor Observing Stations Based on CCD Video Systems

    NASA Technical Reports Server (NTRS)

    Madiedo, J. M.; Trigo-Rodriguez, J. M.; Lyytinen, E.

    2011-01-01

    The SPanish Meteor Network (SPMN) is performing a continuous monitoring of meteor activity over Spain and neighbouring countries. The huge amount of data obtained by the 25 video observing stations that this network is currently operating made it necessary to develop new software packages to accomplish some tasks, such as data reduction and remote operation of autonomous systems based on high-sensitivity CCD video devices. The main characteristics of this software are described here.

  11. Autonomous Space Shuttle

    NASA Technical Reports Server (NTRS)

    Siders, Jeffrey A.; Smith, Robert H.

    2004-01-01

    The continued assembly and operation of the International Space Station (ISS) is the cornerstone of NASA's overall Strategic Plan. As indicated in NASA's Integrated Space Transportation Plan (ISTP), the International Space Station requires the Shuttle to fly through at least the middle of the next decade to complete assembly of the Station, provide crew transport, and provide heavy-lift up- and down-mass capability. The ISTP reflects a tight coupling among the Station, Shuttle, and OSP programs to support our Nation's space goals. While the Shuttle is a critical component of this ISTP, there is a new emphasis on the need to achieve greater efficiency and safety in transporting crews to and from the Space Station. This need is being addressed through the Orbital Space Plane (OSP) Program. However, the OSP is being designed to "complement" the Shuttle as the primary means for crew transfer, and will not replace all the Shuttle's capabilities. The unique heavy-lift capabilities of the Space Shuttle are essential for the ISS as well as other potential missions extending beyond low Earth orbit. One concept under discussion to better fulfill this role of a heavy-lift carrier is the transformation of the Shuttle to an "un-piloted" autonomous system. This concept would eliminate the loss-of-crew risk while providing a substantial increase in payload-to-orbit capability. Using the guidelines reflected in the NASA ISTP, a simplified concept of operations for the autonomous Shuttle can be described as "re-supply of cargo to the ISS through the use of an un-piloted Shuttle vehicle from launch through landing". Although this is the primary mission profile, the other major consideration in developing an autonomous Shuttle is maintaining a crew transportation capability to the ISS as an assured human access to space.

  12. NASA Tech Briefs, January 2007

    NASA Technical Reports Server (NTRS)

    2007-01-01

    Topics covered include: Flexible Skins Containing Integrated Sensors and Circuitry; Artificial Hair Cells for Sensing Flows; Video Guidance Sensor and Time-of-Flight Rangefinder; Optical Beam-Shear Sensors; Multiple-Agent Air/Ground Autonomous Exploration Systems; A 640 × 512-Pixel Portable Long-Wavelength Infrared Camera; An Array of Optical Receivers for Deep-Space Communications; Microstrip Antenna Arrays on Multilayer LCP Substrates; Applications for Subvocal Speech; Multiloop Rapid-Rise/Rapid-Fall High-Voltage Power Supply; The PICWidget; Fusing Symbolic and Numerical Diagnostic Computations; Probabilistic Reasoning for Robustness in Automated Planning; Short-Term Forecasting of Radiation Belt and Ring Current; JMS Proxy and C/C++ Client SDK; XML Flight/Ground Data Dictionary Management; Cross-Compiler for Modeling Space-Flight Systems; Composite Elastic Skins for Shape-Changing Structures; Glass/Ceramic Composites for Sealing Solid Oxide Fuel Cells; Aligning Optical Fibers by Means of Actuated MEMS Wedges; Manufacturing Large Membrane Mirrors at Low Cost; Double-Vacuum-Bag Process for Making Resin-Matrix Composites; Surface Bacterial-Spore Assay Using Tb3+/DPA Luminescence; Simplified Microarray Technique for Identifying mRNA in Rare Samples; High-Resolution, Wide-Field-of-View Scanning Telescope; Multispectral Imager With Improved Filter Wheel and Optics; Integral Radiator and Storage Tank; Compensation for Phase Anisotropy of a Metal Reflector; Optical Characterization of Molecular Contaminant Films; Integrated Hardware and Software for No-Loss Computing; Decision-Tree Formulation With Order-1 Lateral Execution; GIS Methodology for Planning Planetary-Rover Operations; Optimal Calibration of the Spitzer Space Telescope; Automated Detection of Events of Scientific Interest; Representation-Independent Iteration of Sparse Data Arrays; Mission Operations of the Mars Exploration Rovers; and More About Software for No-Loss Computing.

  13. Automated Transfer Vehicle Proximity Flight Safety Overview

    NASA Astrophysics Data System (ADS)

    Cornier, Dominique; Berthelier, David; Requiston, Helene; Zekri, Eric; Chase, Richard

    2005-12-01

    The European Automated Transfer Vehicle (ATV) is an unmanned transportation spacecraft designed to contribute to the logistic servicing of the ISS. The ATV will be launched by ARIANE 5 and, after phasing and rendezvous maneuvers, it autonomously docks to the International Space Station (ISS). The ATV control is nominally handled by the Guidance, Navigation and Control (GNC) function using computers, software, sensors and actuators. During rendezvous operations, in order to cover the extreme situations where the GNC function fails to ensure a safe trajectory with respect to the ISS, a segregated Proximity Flight Safety (PFS) function is activated: this function will initiate a collision avoidance maneuver which will place the ATV on a trajectory ensuring safety with respect to the ISS. The PFS function relies on segregated computers, the Monitoring and Safing Units (MSUs) running specific software, on four dedicated thrusters, on dedicated batteries and on specific interfaces with ATV gyrometers. The PFS function being the ultimate protection to ensure ISS safety in case of ATV malfunction, specific rules have been applied to its implementation, in particular for the development of the MSU software, which is critical since any failure of this software may result in catastrophic consequences. This paper provides an overview of the ATV Proximity Flight Safety function. After a short description of the overall ATV avionics architecture and its rationale, the second part of the paper presents more details on the PFS function both in terms of hardware and software implementation. The third part of the paper is dedicated to the MSU software validation method that is specific considering its criticality. The last part of the paper provides information on the different operations related to the use of the PFS function during an ATV flight.
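    As a hedged illustration (not the MSU flight code, and with invented corridor geometry and thresholds), the sketch below shows the general monitoring-and-safing pattern described above: a segregated monitor checks the relative state against an approach corridor and, on violation, hands control to a pre-planned collision avoidance maneuver.

        import numpy as np

        def pfs_monitor(rel_pos, rel_vel, corridor_halfwidth_m=200.0, min_range_m=50.0):
            """Return True if a collision avoidance maneuver (CAM) should be triggered.
            rel_pos, rel_vel: ATV state relative to the ISS (illustrative frame)."""
            range_to_station = np.linalg.norm(rel_pos)
            lateral_offset = np.linalg.norm(rel_pos[1:])          # off-axis components
            closing_inside_keepout = (np.dot(rel_pos, rel_vel) < 0.0
                                      and range_to_station < min_range_m)
            outside_corridor = lateral_offset > corridor_halfwidth_m
            return outside_corridor or closing_inside_keepout

        def execute_cam(thrusters):
            # Hypothetical thruster interface: a pre-planned open-loop burn on the
            # dedicated CAM thrusters placing the vehicle on a safe departure trajectory.
            thrusters.fire(direction=(-1.0, 0.0, 0.0), duration_s=10.0)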

  14. Impact of space flight on cardiovascular autonomic control

    NASA Astrophysics Data System (ADS)

    Beckers, F.; Verheyden, B.; Morukov, B.; Aubert, Ae

    Introduction: Space flight alters the distribution of blood in the human body, leading to altered cardiovascular neurohumoral regulation with a blunted carotid-cardiac baroreflex. These changes contribute to the occurrence of orthostatic intolerance after space flight. Heart rate variability (HRV) and blood pressure variability (BPV) provide non-invasive means to study the autonomic modulation of the heart. Low frequency (LF) oscillations provide information about sympathetic modulation and the baroreflex, while high frequency (HF) modulation is an index of vagal heart rate modulation. Methods: ECG and continuous blood pressure were measured for at least 10 minutes in supine, sitting and standing position 45 days and 10 days (L-45, L-10) before launch, and at 1, 2, 4, 9, 15, 19 and 25 days after return to Earth (R+x). In space, ECG and continuous blood pressure were measured at day 5 (FD5) and day 8 (FD8). These measurements were performed in 3. HRV and BPV indices were calculated in the time and frequency domains. Results: Measurements in the supine and sitting positions did not show differences as large as the measurements in the standing position. During space flight heart rate was significantly lower compared to the pre- and post-flight measurements in standing position (RR-values: L-45: 837 ± 42 ms; FD5: 1004 ± 40 ms; FD8: 1038 ± 53 ms; R+1: 587 ± 21 ms; p<0.05). This was accompanied by a significant increase in the proportion of HF power during space flight and a decrease in LF power. Immediately after space flight both LF and HF modulation of heart rate were extremely depressed compared to the pre-flight conditions (p<0.005). A gradual recovery towards baseline conditions of both indices was observed up to 25 days after return from space (LF: L-45: 3297 ± 462 ms^2; FD5: 1251 ± 332 ms^2; FD8: 1322 ± 462 ms^2; R+1: 547 ± 188 ms^2; R+4: 1958 ± 709 ms^2; R+9: 1220 ± 148 ms^2; R+15: 1704 ± 497 ms^2; R+25: 2644 ± 573 ms^2). However, even 25 days after return, values were below the baseline condition. Mean systolic blood pressure did not differ significantly before, during, and after space flight. In space both LF and HF were decreased compared to the standing measurements pre- and post-flight. No evolution was present in BPV after return to Earth. Conclusion: During space flight autonomic modulation is characterised by a vagal predominance. Immediately after return to Earth overall autonomic modulation is extremely depressed. Vasomotor autonomic control is restored rather quickly after space flight, while the restoration of autonomic modulation of heart rate is very slow.
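    For readers unfamiliar with the LF/HF indices used above, the sketch below shows one common way such spectral powers are computed from an RR-interval series (uniform resampling followed by a Welch periodogram); it is a generic illustration, not the analysis pipeline used in the study.

        import numpy as np
        from scipy.interpolate import interp1d
        from scipy.signal import welch

        def lf_hf_power(rr_ms, fs=4.0):
            """Estimate LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) power from RR intervals [ms]."""
            beat_times = np.cumsum(rr_ms) / 1000.0                     # beat times [s]
            grid = np.arange(beat_times[0], beat_times[-1], 1.0 / fs)  # uniform resampling grid
            rr_even = interp1d(beat_times, rr_ms, kind='cubic')(grid)
            f, psd = welch(rr_even - rr_even.mean(), fs=fs, nperseg=min(256, len(rr_even)))
            df = f[1] - f[0]
            lf = psd[(f >= 0.04) & (f < 0.15)].sum() * df              # LF power [ms^2]
            hf = psd[(f >= 0.15) & (f < 0.40)].sum() * df              # HF power [ms^2]
            return lf, hf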

  15. Using Multimodal Input for Autonomous Decision Making for Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Neilan, James H.; Cross, Charles; Rothhaar, Paul; Tran, Loc; Motter, Mark; Qualls, Garry; Trujillo, Anna; Allen, B. Danette

    2016-01-01

    Autonomous decision making in the presence of uncertainty is a deeply studied problem space, particularly in the area of autonomous systems operations for land, air, sea, and space vehicles. Various techniques, ranging from single-algorithm solutions to complex ensemble classifier systems, have been utilized in a research context to solve mission-critical flight decisions. Realizing systems on actual autonomous hardware, however, is a difficult systems integration problem, constituting a majority of applied robotics development timelines. The ability to reliably and repeatedly classify objects during a vehicle's mission execution is vital for the vehicle to mitigate both static and dynamic environmental concerns, such that the mission may be completed successfully and the vehicle can operate and return safely. In this paper, the Autonomy Incubator proposes and discusses an ensemble learning and recognition system planned for our autonomous framework, AEON, in selected domains, which fuses decision criteria using prior experience on both the individual classifier layer and the ensemble layer to mitigate environmental uncertainty during operation.
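    A minimal sketch of the weighted decision fusion the abstract alludes to is given below; the class labels, weights, and probabilities are illustrative only, and the real AEON framework is far richer.

        import numpy as np

        def ensemble_predict(class_probs, prior_weights):
            """class_probs: one class-probability vector per classifier;
            prior_weights: per-classifier weights derived from prior experience (e.g., past accuracy)."""
            w = np.asarray(prior_weights, dtype=float)
            w /= w.sum()
            fused = sum(wi * np.asarray(p, dtype=float) for wi, p in zip(w, class_probs))
            return int(np.argmax(fused)), fused

        # Example: three classifiers voting over {obstacle, landing site, unknown}.
        label, fused = ensemble_predict(
            class_probs=[[0.7, 0.2, 0.1], [0.5, 0.4, 0.1], [0.2, 0.6, 0.2]],
            prior_weights=[0.5, 0.3, 0.2])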

  16. Data analysis-based autonomic bandwidth adjustment in software defined multi-vendor optical transport networks.

    PubMed

    Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Jing, Ruiquan

    2017-11-27

    Network operators generally provide dedicated lightpaths for customers to meet the demand for high-quality transmission. Considering the variation of traffic load, customers usually rent peak bandwidth that exceeds the practical average traffic requirement. In this case, bandwidth provisioning is unmetered and customers have to pay according to peak bandwidth. Supposing that network operators could keep track of traffic load and allocate bandwidth dynamically, bandwidth can be provided as a metered service and customers would pay for the bandwidth that they actually use. To achieve cost-effective bandwidth provisioning, this paper proposes an autonomic bandwidth adjustment scheme based on data analysis of traffic load. The scheme is implemented in a software defined networking (SDN) controller and is demonstrated in the field trial of multi-vendor optical transport networks. The field trial shows that the proposed scheme can track traffic load and realize autonomic bandwidth adjustment. In addition, a simulation experiment is conducted to evaluate the performance of the proposed scheme. We also investigate the impact of different parameters on autonomic bandwidth adjustment. Simulation results show that the step size and adjustment period have significant influences on bandwidth savings and packet loss. A small value of step size and adjustment period can bring more benefits by tracking traffic variation with high accuracy. For network operators, the scheme can serve as technical support of realizing bandwidth as metered service in the future.
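    To make the roles of the step size and adjustment period concrete, here is a hedged, toy version of such an autonomic adjustment loop; the parameter names and default values are ours, not the paper's.

        def adjust_bandwidth(allocated_mbps, measured_mbps, step_mbps=50.0,
                             headroom=1.2, floor_mbps=100.0, ceiling_mbps=10000.0):
            """Called once per adjustment period with the latest traffic measurement."""
            target = measured_mbps * headroom              # keep margin to limit packet loss
            if allocated_mbps < target:
                allocated_mbps += step_mbps                # ramp up toward the load
            elif allocated_mbps - step_mbps >= target:
                allocated_mbps -= step_mbps                # release unused capacity
            return min(max(allocated_mbps, floor_mbps), ceiling_mbps)

    A smaller step size and a shorter adjustment period track the load more closely, trading more frequent reconfiguration for larger bandwidth savings and lower packet loss, which is consistent with the trend reported in the simulation results.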
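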

  17. Guidance and Control System for an Autonomous Vehicle

    DTIC Science & Technology

    1990-06-01

    implementing an appropriate computer architecture in support of these goals is also discussed and detailed, along with the choice of associated computer hardware and real-time operating system software. (rh)

  18. Vision Based Autonomous Robotic Control for Advanced Inspection and Repair

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2014-01-01

    The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.

  19. NASA Tech Briefs, April 2012

    NASA Technical Reports Server (NTRS)

    2012-01-01

    Topics include: Computational Ghost Imaging for Remote Sensing; Digital Architecture for a Trace Gas Sensor Platform; Dispersed Fringe Sensing Analysis - DFSA; Indium Tin Oxide Resistor-Based Nitric Oxide Microsensors; Gas Composition Sensing Using Carbon Nanotube Arrays; Sensor for Boundary Shear Stress in Fluid Flow; Model-Based Method for Sensor Validation; Qualification of Engineering Camera for Long-Duration Deep Space Missions; Remotely Powered Reconfigurable Receiver for Extreme Environment Sensing Platforms; Bump Bonding Using Metal-Coated Carbon Nanotubes; In Situ Mosaic Brightness Correction; Simplex GPS and InSAR Inversion Software; Virtual Machine Language 2.1; Multi-Scale Three-Dimensional Variational Data Assimilation System for Coastal Ocean Prediction; Pandora Operation and Analysis Software; Fabrication of a Cryogenic Bias Filter for Ultrasensitive Focal Plane; Processing of Nanosensors Using a Sacrificial Template Approach; High-Temperature Shape Memory Polymers; Modular Flooring System; Non-Toxic, Low-Freezing, Drop-In Replacement Heat Transfer Fluids; Materials That Enhance Efficiency and Radiation Resistance of Solar Cells; Low-Cost, Rugged High-Vacuum System; Static Gas-Charging Plug; Floating Oil-Spill Containment Device; Stemless Ball Valve; Improving Balance Function Using Low Levels of Electrical Stimulation of the Balance Organs; Oxygen-Methane Thruster; Lunar Navigation Determination System - LaNDS; Launch Method for Kites in Low-Wind or No-Wind Conditions; Supercritical CO2 Cleaning System for Planetary Protection and Contamination Control Applications; Design and Performance of a Wideband Radio Telescope; Finite Element Models for Electron Beam Freeform Fabrication Process; Autonomous Information Unit for Fine-Grain Data Access Control and Information Protection in a Net-Centric System; Vehicle Detection for RCTA/ANS (Autonomous Navigation System); Image Mapping and Visual Attention on the Sensory Ego-Sphere; HyDE Framework for Stochastic and Hybrid Model-Based Diagnosis; and IMAGESEER - IMAGEs for Education and Research.

  20. Space Flight Software Development Software for Intelligent System Health Management

    NASA Technical Reports Server (NTRS)

    Trevino, Luis C.; Crumbley, Tim

    2004-01-01

    The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.

  1. Very fast motion planning for highly dexterous-articulated robots

    NASA Technical Reports Server (NTRS)

    Challou, Daniel J.; Gini, Maria; Kumar, Vipin

    1994-01-01

    Due to the inherent danger of space exploration, the need for greater use of teleoperated and autonomous robotic systems in space-based applications has long been apparent. Autonomous and semi-autonomous robotic devices have been proposed for carrying out routine functions associated with scientific experiments aboard the shuttle and space station. In addition, research into the use of such devices for planetary exploration continues. To accomplish their assigned tasks, all such autonomous and semi-autonomous devices will require the ability to move themselves through space without hitting themselves or the objects which surround them. In space, it is important to execute the necessary motions correctly when they are first attempted because repositioning is expensive in terms of both time and resources (e.g., fuel). Finally, such devices will have to function in a variety of different environments. Given these constraints, a means for fast motion planning to ensure the correct movement of robotic devices would be ideal. Unfortunately, motion planning algorithms are rarely used in practice because of their computational complexity. Fast methods have been developed for detecting imminent collisions, but the more general problem of motion planning remains computationally intractable. However, in this paper we show how the use of multicomputers and appropriate parallel algorithms can substantially reduce the time required to synthesize paths for dexterous articulated robots with a large number of joints. We have developed a parallel formulation of the Randomized Path Planner proposed by Barraquand and Latombe. We have shown that our parallel formulation is capable of formulating plans in a few seconds or less on various parallel architectures, including the nCUBE2 multicomputer with up to 1024 processors (nCUBE2 is a registered trademark of the nCUBE Corporation) and a network of workstations.
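    The Randomized Path Planner referenced above combines gradient descent on an artificial potential with random walks to escape local minima; a parallel formulation can run many such searches concurrently and keep the first success. The sketch below is a bare-bones, assumed rendering of that idea (the published algorithm and its parallel implementation are considerably more involved).

        import numpy as np

        def randomized_plan(start, goal, potential, step=0.05, walk_len=20,
                            max_iters=5000, tol=0.05, rng=None):
            """potential(q) should be small near the goal and large near obstacles."""
            rng = rng or np.random.default_rng()
            q, goal = np.asarray(start, dtype=float), np.asarray(goal, dtype=float)
            path = [q.copy()]
            for _ in range(max_iters):
                if np.linalg.norm(q - goal) < tol:
                    return path                                   # goal reached
                # finite-difference gradient of the potential at q
                grad = np.array([(potential(q + step * e) - potential(q - step * e)) / (2 * step)
                                 for e in np.eye(q.size)])
                if np.linalg.norm(grad) > 1e-6:
                    q = q - step * grad / np.linalg.norm(grad)    # descend the potential
                else:
                    for _ in range(walk_len):                     # random walk out of a local minimum
                        q = q + step * rng.normal(size=q.size)
                path.append(q.copy())
            return None                                           # failed within the iteration budget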

  2. Progress of Crew Autonomous Scheduling Test (CAST) On the ISS

    NASA Technical Reports Server (NTRS)

    Healy, Matthew; Marquez, Jessica; Hillenius, Steven; Korth, David; Bakalyar, Lauren Rush; Woodbury, Neil; Larsen, Crystal M.; Bates, Shelby; Kockler, Mikayla; Rhodes, Brooke

    2017-01-01

    United States space policy is evolving toward missions beyond low Earth orbit. In an effort to meet that policy, NASA has recognized Autonomous Mission Operations (AMO) as a valuable capability. Identified within AMO capabilities is the potential for autonomous planning and replanning during human spaceflight operations; that is, allowing crew members to collectively or individually participate in the development of their own schedules. Currently, dedicated mission operations planners collaborate with international partners to create daily plans for astronauts aboard the International Space Station (ISS), taking into account mission requirements, ground rules, and various vehicle and payload constraints. In future deep space operations, the crew will require more independence from ground support due to communication transmission delays. Furthermore, crew members who are provided with the capability to schedule their own activities are able to leverage direct experience operating in the space environment, and possibly maximize their efficiency. CAST (Crew Autonomous Scheduling Test) is an ISS investigation designed to analyze three important hypotheses about crew autonomous scheduling. First, given appropriate inputs, the crew is able to create and execute a plan in a reasonable period of time without impacts to mission success. Second, the proximity of the planner, in this case the crew, to the planned operations increases their operational efficiency. Third, crew members are more satisfied when given a role in plan development. This presentation shows the progress made in this study with a single astronaut test subject participating in five CAST sessions. CAST is a technology demonstration payload sponsored by the ISS Research Science and Technology Office, and performed by experts in Mission Operations Planning from the Flight Operations Directorate at NASA Johnson Space Center, and researchers across multiple NASA centers.

  3. An Overview of the Guided Parafoil System Derived from X-38 Experience

    NASA Technical Reports Server (NTRS)

    Stein, Jenny M.; Madsen, Chris M.; Strahan, Alan L.

    2005-01-01

    The NASA Johnson Space Center built a 4200 sq ft parafoil for the U.S. Army Natick Soldier Center to demonstrate autonomous flight using a guided parafoil system to deliver 10,000 lbs of usable payload. The parafoil's design was based upon that developed during the X-38 program. The drop test payload consisted of a standard 20-foot Type V airdrop platform, a standard 12-foot weight tub, a 60 ft drogue parachute, a 4200 sq ft parafoil, an instrumentation system, and a Guidance, Navigation, and Control (GN&C) system. Instrumentation installed on the load was used to gather data to validate simulation models and preflight loads predictions and to perform post-flight trajectory and performance reconstructions. The GN&C system, developed during NASA's X-38 program, consisted of a flight computer, modems for uplink commands and downlink data, a compass, laser altimeter, and two winches. The winches were used to steer the parafoil and to perform the dynamic flare maneuver for a soft landing. The laser was used to initiate the flare. The GN&C software was originally provided to NASA by the European Space Agency. NASA incorporated further software refinements based upon the X-38 flight test results. Three full-scale drop tests were conducted, with the third being performed during the Precision Airdrop Technology Conference and Demonstration (PATCAD) at the U.S. Army Yuma Proving Ground (YPG) in November of 2003. For the PATCAD demonstration, the parafoil and GN&C software and hardware performed well, concluding with a good flare and the smallest miss distance ever experienced in NASA's parafoil drop test program. This paper describes the 4200 sq ft parafoil system, simulation results, and the results of the drop tests.

  4. Autonomous celestial navigation based on Earth ultraviolet radiance and fast gradient statistic feature extraction

    NASA Astrophysics Data System (ADS)

    Lu, Shan; Zhang, Hanmo

    2016-01-01

    To meet the requirement of autonomous orbit determination, this paper proposes a fast curve-fitting method based on Earth ultraviolet features to obtain an accurate Earth vector direction, in order to achieve high-precision autonomous navigation. Firstly, combining the stable characteristics of Earth ultraviolet radiance with atmospheric radiation transmission modeling software, the paper simulates the Earth ultraviolet radiation model at different times and chooses the proper observation band. Then a fast, improved edge extraction method combining the Sobel operator and local binary patterns (LBP) is utilized, which can both eliminate noise efficiently and extract Earth ultraviolet limb features accurately. Earth centroid locations on simulated images are then estimated via least-squares fitting using part of the limb edges. Finally, taking advantage of the estimated Earth vector direction and Earth distance, an Extended Kalman Filter (EKF) is applied to realize autonomous navigation. Experimental results indicate the proposed method achieves sub-pixel Earth centroid location estimation and greatly enhances autonomous celestial navigation precision.
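    As an illustration of the least-squares limb-fitting step (our own algebraic formulation, not necessarily the authors'), a linear circle fit of the extracted limb pixels yields a sub-pixel estimate of the Earth disc centre on the image plane:

        import numpy as np

        def fit_limb_circle(xs, ys):
            """Algebraic (Kasa-style) least-squares circle fit; returns (xc, yc, radius)."""
            xs, ys = np.asarray(xs, dtype=float), np.asarray(ys, dtype=float)
            A = np.column_stack([xs, ys, np.ones_like(xs)])   # fit x^2+y^2 = a*x + b*y + c
            b = xs**2 + ys**2
            coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)
            xc, yc = coeffs[0] / 2.0, coeffs[1] / 2.0
            radius = np.sqrt(coeffs[2] + xc**2 + yc**2)
            return xc, yc, radius                             # centre estimate feeding the EKF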

  5. SAURON: The Wallace Observatory Small AUtonomous Robotic Optical Nightwatcher

    NASA Astrophysics Data System (ADS)

    Kosiarek, M.; Mansfield, M.; Brothers, T.; Bates, H.; Aviles, R.; Brode-Roger, O.; Person, M.; Russel, M.

    2017-07-01

    The Small AUtonomous Robotic Optical Nightwatcher (SAURON) is an autonomous telescope consisting of an 11-inch Celestron Nexstar telescope on a Software Bisque Paramount ME II in a Technical Innovations ProDome located at the MIT George R. Wallace, Jr. Astrophysical Observatory. This paper describes the construction of the telescope system and its first light data on T-And0-15785, an eclipsing binary star. The out-of-eclipse R magnitude of T-And0-15785 was found to be 13.3258 ± 0.0015, and the magnitude changes for the primary and secondary eclipses were found to be 0.7145 ± 0.0515 and 0.6085 ± 0.0165 R magnitudes, respectively.

  6. Ground crewmen help guide the alignment of the X-40A as the experimental craft is gently lowered to

    NASA Technical Reports Server (NTRS)

    2000-01-01

    Ground crewmen help guide the alignment of the X-40 technology demonstrator as the experimental craft is gently lowered to the ground by a U.S. Army CH-47 Chinook cargo helicopter following a captive-carry test flight at NASA's Dryden Flight Research Center, Edwards, California. The X-40 is an unpowered 82 percent scale version of the X-37, a Boeing-developed spaceplane designed to demonstrate various advanced technologies for development of future lower-cost access to space vehicles. The X-37 will be carried into space aboard a space shuttle and then released to perform various maneuvers and a controlled re-entry through the Earth's atmosphere to an airplane-style landing on a runway, controlled entirely by pre-programmed computer software. Following a series of captive-carry flights, the X-40 made several free flights from a launch altitude of about 15,000 feet above ground, gliding to a fully autonomous landing. The captive carry flights helped verify the X-40's navigation and control systems, rigging angles for its sling, and stability and control of the helicopter while carrying the X-40 on a tether.

  7. Navigation for the new millennium: Autonomous navigation for Deep Space 1

    NASA Technical Reports Server (NTRS)

    Reidel, J. E.; Bhaskaran, S.; Synnott, S. P.; Desai, S. D.; Bollman, W. E.; Dumont, P. J.; Halsell, C. A.; Han, D.; Kennedy, B. M.; Null, G. W.

    1997-01-01

    The autonomous optical navigation system technology for the Deep Space 1 (DS1) mission is reported on. The DS1 navigation system will be the first to use autonomous navigation in deep space. The system's tasks are to: perform interplanetary cruise orbit determination using images of distant asteroids; control and maintain the orbit of the spacecraft with an ion propulsion system and conventional thrusters; and perform late knowledge updates of target position during close flybys in order to facilitate high-quality data return from asteroid McAuliffe and comet West-Kohoutek-Ikemura. To accomplish these tasks, the following functions are required: picture planning; image processing; dynamical modeling and integration; planetary ephemeris and star catalog handling; orbit determination; data filtering and estimation; maneuver estimation; and spacecraft ephemeris updating. These systems and functions are described and preliminary performance data are presented.

  8. Virtual Engineering and Science Team - Reusable Autonomy for Spacecraft Subsystems

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.; Johnson, Michael A.; Rilee, Michael L.; Truszkowski, Walt; Thompson, Bryan; Day, John H. (Technical Monitor)

    2002-01-01

    In this paper we address the design, development, and evaluation of the Virtual Engineering and Science Team (VEST) tool - a revolutionary way to achieve onboard subsystem/instrument autonomy. VEST directly addresses the technology needed for advanced autonomy enablers for spacecraft subsystems. It will significantly support the efficient and cost-effective realization of on-board autonomy and contribute directly to realizing the concept of an intelligent autonomous spacecraft. VEST will support the evolution of a subsystem/instrument model that is provably correct and, from that model, the automatic generation of the code needed to support the autonomous operation of what was modeled. VEST will directly support the integration of the efforts of engineers, scientists, and software technologists. This integration of efforts will be a significant advancement over the way things are currently accomplished. The model, developed through the use of VEST, will be the basis for the physical construction of the subsystem/instrument, and the generated code will support its autonomous operation once in space. The close coupling between the model and the code, in the same tool environment, will help ensure that correct and reliable operational control of the subsystem/instrument is achieved. VEST will provide a thoroughly modern interface that will allow users to easily and intuitively input subsystem/instrument requirements and visually get back the system's reaction to the correctness and compatibility of the inputs as the model evolves. User interface/interaction, logic, theorem proving, rule-based and model-based reasoning, and automatic code generation are some of the basic technologies that will be brought into play in realizing VEST.

  9. Bio-inspired Autonomic Structures: a middleware for Telecommunications Ecosystems

    NASA Astrophysics Data System (ADS)

    Manzalini, Antonio; Minerva, Roberto; Moiso, Corrado

    Today, people are making use of several devices for communications, for accessing multi-media content services, for data/information retrieval, for processing, computing, etc.: examples are laptops, PDAs, mobile phones, digital cameras, mp3 players, smart cards and smart appliances. One of the most attractive service scenarios for future Telecommunications and Internet is the one where people will be able to browse any object in the environment they live in: communications, sensing and processing of data and services will be highly pervasive. In this vision, people, machines, artifacts and the surrounding space will create a kind of computational environment and, at the same time, the interfaces to the network resources. A challenging technological issue will be the interconnection and management of heterogeneous systems and a huge number of small devices tied together in networks of networks. Moreover, future network and service infrastructures should be able to provide Users and Application Developers (at different levels, e.g., residential Users but also SMEs, LEs, ASPs/Web2.0 Service Providers, ISPs, Content Providers, etc.) with the most appropriate "environment" according to their context and specific needs. Operators must be ready to manage this level of complexity by enabling their platforms with technological advances that allow network and service self-supervision and self-adaptation capabilities. Autonomic software solutions, enhanced with innovative bio-inspired mechanisms and algorithms, are promising areas of long-term research to face such challenges. This chapter proposes a bio-inspired autonomic middleware capable of leveraging the assets of the underlying network infrastructure whilst, at the same time, supporting the development of future Telecommunications and Internet Ecosystems.

  10. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  11. Terminal Homing for Autonomous Underwater Vehicle Docking

    DTIC Science & Technology

    2016-06-01

    The use of docking stations for autonomous underwater vehicles (AUV) provides the ability to keep a vehicle on ... In the underwater domain, accurate navigation ... Above the water, light and electromagnetic signals travel well through air and space, mediums that allow for a ...

  12. Autonomous Inspection of Electrical Transmission Structures with Airborne UV Sensors - NASA Report on Dominion Virginia Power Flights of November 2016

    NASA Technical Reports Server (NTRS)

    Moore, Andrew J.; Schubert, Matthew; Rymer, Nicholas

    2017-01-01

    The report details test and measurement flights to demonstrate autonomous UAV inspection of high voltage electrical transmission structures. A UAV built with commercial, off-the-shelf hardware and software, supplemented with custom sensor logging software, measured ultraviolet emissions from a test generator placed on a low-altitude substation and a medium-altitude switching tower. Since corona discharge precedes catastrophic electrical faults on high-voltage structures, detection and geolocation of ultraviolet emissions is needed to develop a UAV-based self-diagnosing power grid. Signal readings from an onboard ultraviolet sensor were validated during flight with a commercial corona camera. Geolocation was accomplished with onboard GPS; the UAV position was logged to a local ground station and transmitted in real time to a NASA server for tracking in the national airspace.

  13. Programmable and autonomous computing machine made of biomolecules

    PubMed Central

    Benenson, Yaakov; Paz-Elizur, Tamar; Adar, Rivka; Keinan, Ehud; Livneh, Zvi; Shapiro, Ehud

    2013-01-01

    Devices that convert information from one form into another according to a definite procedure are known as automata. One such hypothetical device is the universal Turing machine [1], which stimulated work leading to the development of modern computers. The Turing machine and its special cases [2], including finite automata [3], operate by scanning a data tape, whose striking analogy to information-encoding biopolymers inspired several designs for molecular DNA computers [4-8]. Laboratory-scale computing using DNA and human-assisted protocols has been demonstrated [9-15], but the realization of computing devices operating autonomously on the molecular scale remains rare [16-20]. Here we describe a programmable finite automaton comprising DNA and DNA-manipulating enzymes that solves computational problems autonomously. The automaton's hardware consists of a restriction nuclease and ligase, the software and input are encoded by double-stranded DNA, and programming amounts to choosing appropriate software molecules. Upon mixing solutions containing these components, the automaton processes the input molecule via a cascade of restriction, hybridization and ligation cycles, producing a detectable output molecule that encodes the automaton's final state, and thus the computational result. In our implementation 10^12 automata sharing the same software run independently and in parallel on inputs (which could, in principle, be distinct) in 120 μl solution at room temperature at a combined rate of 10^9 transitions per second with a transition fidelity greater than 99.8%, consuming less than 10^-10 W. PMID:11719800
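    A conventional software analogue of the kind of finite automaton realized molecularly may help fix ideas; here the "software" is just the transition table and the output is the final state (purely illustrative, with no claim about the paper's specific encoding).

        def run_automaton(transitions, start_state, tape):
            """Deterministic finite automaton: transitions maps (state, symbol) -> next state."""
            state = start_state
            for symbol in tape:
                state = transitions[(state, symbol)]
            return state

        # Example program: end in state S0 iff the input contains an even number of 'b' symbols.
        program = {('S0', 'a'): 'S0', ('S0', 'b'): 'S1',
                   ('S1', 'a'): 'S1', ('S1', 'b'): 'S0'}
        print(run_automaton(program, 'S0', 'abba'))   # -> S0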

  14. Autonomous Control Capabilities for Space Reactor Power Systems

    NASA Astrophysics Data System (ADS)

    Wood, Richard T.; Neal, John S.; Brittain, C. Ray; Mullens, James A.

    2004-02-01

    The National Aeronautics and Space Administration's (NASA's) Project Prometheus, the Nuclear Systems Program, is investigating a possible Jupiter Icy Moons Orbiter (JIMO) mission, which would conduct in-depth studies of three of the moons of Jupiter by using a space reactor power system (SRPS) to provide energy for propulsion and spacecraft power for more than a decade. Terrestrial nuclear power plants rely upon varying degrees of direct human control and interaction for operations and maintenance over a forty to sixty year lifetime. In contrast, an SRPS is intended to provide continuous, remote, unattended operation for up to fifteen years with no maintenance. Uncertainties, rare events, degradation, and communications delays with Earth are challenges that SRPS control must accommodate. Autonomous control is needed to address these challenges and optimize the reactor control design. In this paper, we describe an autonomous control concept for generic SRPS designs. The formulation of an autonomous control concept, which includes identification of high-level functional requirements and generation of a research and development plan for enabling technologies, is among the technical activities that are being conducted under the U.S. Department of Energy's Space Reactor Technology Program in support of the NASA's Project Prometheus. The findings from this program are intended to contribute to the successful realization of the JIMO mission.

  15. Evaluating the Medical Kit System for the International Space Station(ISS) - A Paradigm Revisited

    NASA Technical Reports Server (NTRS)

    Hailey, Melinda J.; Urbina, Michelle C.; Hughlett, Jessica L.; Gilmore, Stevan; Locke, James; Reyna, Baraquiel; Smith, Gwyn E.

    2010-01-01

    Medical capabilities aboard the International Space Station (ISS) have been packaged to help astronaut crew medical officers (CMO) mitigate both urgent and non-urgent medical issues during their 6-month expeditions. Two ISS crewmembers are designated as CMOs for each 3-crewmember mission and are typically not physicians. In addition, the ISS may have communication gaps of up to 45 minutes during each orbit, necessitating medical equipment that can be reliably operated autonomously during flight. The retirement of the space shuttle combined with ten years of manned ISS expeditions led the Space Medicine Division at the NASA Johnson Space Center to reassess the current ISS Medical Kit System. This reassessment led to the system being streamlined to meet future logistical considerations with current Russian space vehicles and future NASA/commercial space vehicle systems. Methods: The JSC Space Medicine Division coordinated the development of requirements, fabrication of prototypes, and usability testing for the new ISS Medical Kit System in concert with implementing updated versions of the ISS Medical Check List and associated in-flight software applications. The teams constructed a medical kit system with the flexibility for use on the ISS, and resupply on the Russian Progress space vehicle and future NASA/commercial space vehicles. Results: Prototype systems were developed, reviewed, and tested for implementation. Completion of Preliminary and Critical Design Reviews resulted in a streamlined ISS Medical Kit System that is being used for training by ISS crews starting with Expedition 27 (June 2011). Conclusions: The team will present the process for designing, developing, implementing, and training with this new ISS Medical Kit System.

  16. The Space Station Module Power Management and Distribution automation test bed

    NASA Technical Reports Server (NTRS)

    Lollar, Louis F.

    1991-01-01

    The Space Station Module Power Management And Distribution (SSM/PMAD) automation test bed project was begun at NASA/Marshall Space Flight Center (MSFC) in the mid-1980s to develop an autonomous, user-supportive power management and distribution test bed simulating the Space Station Freedom Hab/Lab modules. As the test bed has matured, many new technologies and projects have been added. The author focuses on three primary areas. The first area is the overall accomplishments of the test bed itself. These include a much-improved user interface, a more efficient expert system scheduler, improved communication among the three expert systems, and initial work on adding intermediate levels of autonomy. The second area is the addition of a more realistic power source to the SSM/PMAD test bed; this project is called the Large Autonomous Spacecraft Electrical Power System (LASEPS). The third area is the completion of a virtual link between the SSM/PMAD test bed at MSFC and the Autonomous Power Expert at Lewis Research Center.

  17. Variational estimate method for solving autonomous ordinary differential equations

    NASA Astrophysics Data System (ADS)

    Mungkasi, Sudi

    2018-04-01

    In this paper, we propose a method for solving first-order autonomous ordinary differential equation problems using a variational estimate formulation. The variational estimate is constructed with a Lagrange multiplier that is chosen optimally, so that the formulation leads to an accurate solution to the problem. The variational estimate is an integral form, which can be computed using computer software. As the variational estimate is an explicit formula, the solution is easy to compute. This is a great advantage of the variational estimate formulation.
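    For orientation, a standard variational-iteration form of such an estimate for a first-order autonomous problem u'(t) = f(u(t)) with u(0) = u_0 can be written as the correction functional below (a generic textbook form; the paper's specific multiplier choice may differ):

        u_{n+1}(t) = u_n(t) + \int_0^t \lambda(s)\,\bigl( u_n'(s) - f(u_n(s)) \bigr)\,\mathrm{d}s,

    where the Lagrange multiplier \lambda(s) is chosen to make the correction stationary. With the simplest admissible choice, \lambda(s) = -1, each iterate is an explicit integral of the previous one and can be evaluated numerically or symbolically in standard software.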

  18. Observing Active Volcanism on Earth and Beyond With an Autonomous Science Investigation Capability

    NASA Astrophysics Data System (ADS)

    Davies, A. G.; Mjolsness, E. D.; Fink, W.; Castano, R.; Park, H. G.; Zak, M.; Burl, M. C.

    2001-12-01

    Operational constraints imposed by restricted downlink and long communication delays make autonomous systems a necessity for exploring dynamic processes in the Solar System and beyond. Our objective is to develop an onboard, modular, automated science analysis tool that will autonomously detect unexpected events, identify rare events at predicted sites, quantify the processes under study, and prioritize the science data and analyses as they are collected. A primary target for this capability is terrestrial active volcanism. Our integrated, science-driven command and control package represents the next stage of the automatic monitoring of volcanic activity pioneered by GOES. The resulting system will maximize science return from day-to-day instrument use and provide immediate reaction to capture the fullest information from infrequent events. For example, a sensor suite consisting of a Galileo-like multi-filter visible wavelength camera and an infrared spectrometer can acquire high-spatial-resolution data of eruptions of lava and volcanic plumes and identify large concentrations of volcanic SO2. After image/spectrum formation, software capable of change detection (in the visible and infrared), feature identification (both in imagery and spectra), and novelty detection is applied to the data. In this particular case, the latter module detects change in the parameter space of an advanced multi-component black-body volcanic thermal emission model by means of a novel technique called the "Grey-Box" method, which analyzes time series data through a combination of deterministic and stochastic models. This approach can be demonstrated using data obtained by the Galileo spacecraft of Ionian volcanism. The system autonomously identifies the most scientifically important targets and prioritizes data and analyses for return. All of these techniques have been successfully demonstrated in laboratory experiments, and are ready to be tested in an operational environment. After identification of a target of interest, an onboard planner prioritizes resources to obtain the best possible dataset of the identified process. We emphasize that the software is modular. The change detection and feature identification modules can be applied to any imaged dataset, and are not confined to volcanic targets. Applications are therefore widespread, across all NASA Enterprises. Examples include detection and quantification of extraterrestrial volcanism (Io, Triton), the monitoring of features in planetary atmospheres (Earth, Gas Giants), the ebb and flow of ices (Earth, Mars), asteroid, comet, and supernova detection, change detection in magnetic fields, and identification of structure within radio outbursts.
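    A much-reduced sketch of the parameter-space novelty test described above (our illustration, not the Grey-Box implementation): fit the thermal-emission model to each new observation and flag an event when the fitted parameters depart from their recent history by more than k standard deviations.

        import numpy as np

        def novelty_flag(param_history, new_params, k=3.0):
            """param_history: past fitted model parameters, one row per observation."""
            hist = np.asarray(param_history, dtype=float)
            mu, sigma = hist.mean(axis=0), hist.std(axis=0) + 1e-12
            z = np.abs((np.asarray(new_params, dtype=float) - mu) / sigma)
            return bool(np.any(z > k)), z    # True -> prioritize this dataset for downlink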

  19. Automation study for space station subsystems and mission ground support

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An automation concept for the autonomous operation of space station subsystems, i.e., electric power, thermal control, and communications and tracking are discussed. To assure that functions essential for autonomous operations are not neglected, an operations function (systems monitoring and control) is included in the discussion. It is recommended that automated speech recognition and synthesis be considered a basic mode of man/machine interaction for space station command and control, and that the data management system (DMS) and other systems on the space station be designed to accommodate fully automated fault detection, isolation, and recovery within the system monitoring function of the DMS.

  20. Autonomous Multi-Sensor Coordination: The Science Goal Monitor

    NASA Technical Reports Server (NTRS)

    Koratkar, Anuradha; Grosvenor, Sandy; Jung, John; Hess, Melissa; Jones, Jeremy

    2004-01-01

    Many dramatic Earth phenomena are dynamic and coupled. In order to fully understand them, we need to obtain timely coordinated multi-sensor observations from widely dispersed instruments. Such a dynamic observing system must include the ability to schedule flexibly and react autonomously to science-user-driven events; understand the higher-level goals of a science-user-defined campaign; and coordinate various space-based and ground-based resources/sensors effectively and efficiently to achieve those goals. In order to capture transient events, such a 'sensor web' system must have an automated reactive capability built into its scientific operations. To do this, we must overcome a number of challenges inherent in infusing autonomy. The Science Goal Monitor (SGM) is a prototype software tool being developed to explore the nature of automation necessary to enable dynamic observing. The tools being developed in SGM improve our ability to autonomously monitor multiple independent sensors and coordinate reactions to better observe dynamic phenomena. The SGM system enables users to specify what to look for and how to react in descriptive rather than technical terms. The system monitors streams of data to identify occurrences of the key events previously specified by the scientist or user. When an event occurs, the system autonomously coordinates the execution of the users' desired reactions between different sensors. The information can be used to rapidly respond to a variety of fast temporal events. Investigators will no longer have to rely on after-the-fact data analysis to determine what happened. Our paper describes a series of prototype demonstrations that we have developed using SGM with NASA's Earth Observing-1 (EO-1) satellite and the MODIS instrument on the Earth Observing System Aqua and Terra spacecraft. Our demonstrations show the promise of coordinating data from different sources, analyzing the data for a relevant event, and autonomously updating and rapidly obtaining a follow-on relevant image. SGM was used to investigate forest fires, floods and volcanic eruptions. We are now identifying new Earth science scenarios that will have more complex SGM reasoning. By developing and testing a prototype in an operational environment, we are also establishing and gathering metrics to gauge the success of automating science campaigns.
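    The kind of user-specified "watch for X, then do Y" rule that SGM encodes can be illustrated with a hypothetical sketch; the event name, threshold, and reaction below are invented for illustration.

        def monitor(observations, rules):
            """rules: list of (name, predicate, reaction); predicates run on each observation."""
            for obs in observations:
                for name, predicate, reaction in rules:
                    if predicate(obs):
                        reaction(name, obs)

        # Example rule: when a hot spot is seen, request a follow-on high-resolution image.
        rules = [("hot-spot",
                  lambda obs: obs.get("brightness_temp_K", 0.0) > 360.0,
                  lambda name, obs: print(f"{name}: schedule follow-up at "
                                          f"({obs['lat']}, {obs['lon']})"))]
        monitor([{"brightness_temp_K": 372.0, "lat": 34.2, "lon": -118.1}], rules)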

  1. Autonomous Operations Mission Development Suite

    NASA Technical Reports Server (NTRS)

    Toro Medina, Jaime A.

    2016-01-01

    This is a presentation related to the development of Autonomous Operations Systems at NASA Kennedy Space Center. It provides a high-level description of the work performed in FY14, FY15, and FY16 for the AES IGODU and APL projects.

  2. Bilevel shared control for teleoperators

    NASA Technical Reports Server (NTRS)

    Hayati, Samad A. (Inventor); Venkataraman, Subramanian T. (Inventor)

    1992-01-01

    A shared system is disclosed for robot control that integrates human and autonomous input modalities for improved control. Autonomously planned motion trajectories are modified by a teleoperator to track unmodelled target motions, while nominal teleoperator motions are autonomously modified through compliance to accommodate geometric errors. A hierarchical shared system intelligently shares control over a remote robot between the autonomous and teleoperative portions of an overall control system. The architecture is hierarchical and consists of two levels: the top level represents the task level, while the bottom represents the execution level. In space applications, the performance of pure teleoperation systems depends significantly on the communication time delays between the local and remote sites. Selection/mixing matrices are provided with entries that reflect how each input modality's signals are weighted. The shared control minimizes the detrimental effects caused by these time delays between Earth and space.
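    The selection/mixing idea can be sketched in a few lines (a hypothetical, simplified rendering, not the patented implementation): per-axis weights decide how much of each input modality reaches the remote robot.

        import numpy as np

        def shared_command(u_autonomous, u_teleop, alpha):
            """alpha: per-axis weights in [0, 1]; 1.0 means that axis is fully autonomous."""
            W = np.diag(alpha)
            I = np.eye(len(alpha))
            return (W @ np.asarray(u_autonomous, dtype=float)
                    + (I - W) @ np.asarray(u_teleop, dtype=float))

        # Example: translation axes mostly autonomous, tool axis mostly under the operator.
        u = shared_command([0.2, 0.0, -0.1], [0.1, 0.3, 0.0], alpha=[0.9, 0.9, 0.2])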

  3. Comparison of three control methods for an autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Deshpande, Anup; Mathur, Kovid; Hall, Ernest

    2010-01-01

    The desirability and challenge of developing a completely autonomous vehicle and the rising need for more efficient use of energy by automobiles motivate this research: a study for an optimum solution to computer control of energy-efficient vehicles. The purpose of this paper is to compare three control methods - mechanical, hydraulic, and electric - that have been used to convert an experimental all-terrain vehicle to drive-by-wire, which would eventually act as a test bed for conducting research on various technologies for autonomous operation. Computer control of basic operations in a vehicle, namely steering, braking, and speed control, has been implemented and will be described in this paper. The output from a 3-axis motion controller is used for this purpose. The motion controller is interfaced with a software program using WSDK (Windows Servo Design Kit) as an intermediate layer for tuning and parameter settings in autonomous operation. The software program is developed in C++. The voltage signal sent to the motion controller can be varied through the control program to control the steering motor, activate the hydraulic brakes, and vary the vehicle's speed. The vehicle has been tested for its basic functionality, which includes testing of street-legal operations and a 1000-mile test while running in a hybrid mode. The vehicle has also been tested for control when interfaced with devices such as a keyboard, joystick, and sensors under full autonomous operation. The vehicle is currently being tested in various safety studies and is being used as a test bed for experiments in control courses and research studies. The significance of this research is in providing a greater understanding of conventional driving controls and the possibility of improving automobile safety by removing human error in control of a motor vehicle.

  4. Toward autonomous rotorcraft flight in degraded visual environments: experiments and lessons learned

    NASA Astrophysics Data System (ADS)

    Stambler, Adam; Spiker, Spencer; Bergerman, Marcel; Singh, Sanjiv

    2016-05-01

    Unmanned cargo delivery to combat outposts will inevitably involve operations in degraded visual environments (DVE). When DVE occurs, the aircraft autonomy system needs to be able to function regardless of the obscurant level. In 2014, Near Earth Autonomy established a baseline perception system for autonomous rotorcraft operating in clear air conditions, when its m3 sensor suite and perception software enabled autonomous, no-hover landings onto unprepared sites populated with obstacles. The m3's long-range lidar scanned the helicopter's path and the perception software detected obstacles and found safe locations for the helicopter to land. This paper presents the results of initial tests with the Near Earth perception system in a variety of DVE conditions and analyzes them from the perspective of mission performance and risk. Tests were conducted with the m3's lidar and a lightweight synthetic aperture radar in rain, smoke, snow, and controlled brownout experiments. These experiments showed the capability to penetrate through mild DVE but the perceptual capabilities became degraded with the densest brownouts. The results highlight the need for not only improved ability to see through DVE, but also for improved algorithms to monitor and report DVE conditions.

  5. The Carl Sagan solar and stellar observatories as remote observatories

    NASA Astrophysics Data System (ADS)

    Saucedo-Morales, J.; Loera-Gonzalez, P.

    In this work we summarize recent efforts made by the University of Sonora, with the goal of expanding the capability for remote operation of the Carl Sagan Solar and Stellar Observatories, as well as the first steps that have been taken in order to achieve autonomous robotic operation in the near future. The solar observatory was established in 2007 on the university campus by our late colleague A. Sánchez-Ibarra. It consists of four solar telescopes mounted on a single equatorial mount. On the other hand, the stellar observatory, which saw first light on 16 February 2010, is located 21 km away from Hermosillo, Sonora, at the site of the School of Agriculture of the University of Sonora. Both observatories can now be remotely controlled, and to some extent are able to operate autonomously. In this paper we discuss how this has been accomplished in terms of the use of software as well as the instruments under control. We also briefly discuss the main scientific and educational objectives, the future plans to improve the control software and to construct an autonomous observatory on a mountain site, as well as the opportunities for collaborations.

  6. Systems, methods and apparatus for modeling, specifying and deploying policies in autonomous and autonomic systems using agent-oriented software engineering

    NASA Technical Reports Server (NTRS)

    Sterritt, Roy (Inventor); Hinchey, Michael G. (Inventor); Penn, Joaquin (Inventor)

    2011-01-01

    Systems, methods and apparatus are provided through which, in some embodiments, an agent-oriented specification modeled with MaCMAS is analyzed, flaws in the agent-oriented specification modeled with MaCMAS are corrected, and an implementation is derived from the corrected agent-oriented specification. Described herein are systems, methods and apparatus that produce fully (mathematically) tractable development of agent-oriented specification(s) modeled with the methodology fragment for analyzing complex multiagent systems (MaCMAS) and policies for autonomic systems from requirements through to code generation. The systems, methods and apparatus described herein are illustrated through an example showing how user-formulated policies can be translated into a formal model, which can then be converted to code. The requirements-based programming systems, methods and apparatus described herein may provide faster, higher-quality development and maintenance of autonomic systems based on user formulation of policies.

  7. Draper Laboratory small autonomous aerial vehicle

    NASA Astrophysics Data System (ADS)

    DeBitetto, Paul A.; Johnson, Eric N.; Bosse, Michael C.; Trott, Christian A.

    1997-06-01

    The Charles Stark Draper Laboratory, Inc. and students from the Massachusetts Institute of Technology and Boston University have cooperated to develop an autonomous aerial vehicle that won the 1996 International Aerial Robotics Competition. This paper describes the approach, system architecture and subsystem designs for the entry. This entry represents a combination of many technology areas: navigation, guidance, control, vision processing, human factors, packaging, power, real-time software, and others. The aerial vehicle, an autonomous helicopter, performs navigation and control functions using multiple sensors: differential GPS, an inertial measurement unit, a sonar altimeter, and a flux compass. The aerial vehicle transmits video imagery to the ground. A ground-based vision processor converts the image data into target position and classification estimates. The system was designed, built, and flown in less than one year and has provided many lessons about autonomous vehicle systems, several of which are discussed. In an appendix, our current research in augmenting the navigation system with vision-based estimates is presented.
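
    The abstract lists the navigation sensors but not the fusion scheme. As an illustration only (not the Draper design), the sketch below shows a complementary filter, one lightweight way such a vehicle can blend a fast but drifting inertial estimate with a slower, noisy but unbiased sonar altimeter reading; the gain and signal names are assumptions.

      def complementary_altitude(alt_est, vz_est, accel_z, sonar_alt, dt, k=0.1):
          """One step of a complementary filter blending integrated vertical
          acceleration (fast, drifts) with a sonar altimeter (slow, noisy).
          Gains and structure are illustrative, not the flight design."""
          vz_est += accel_z * dt                           # integrate inertial acceleration
          alt_pred = alt_est + vz_est * dt                 # predict altitude inertially
          alt_est = alt_pred + k * (sonar_alt - alt_pred)  # pull estimate toward altimeter
          return alt_est, vz_est

      # Hover at 10 m with a small accelerometer bias: the altimeter keeps the
      # estimate from drifting away as pure integration would.
      alt, vz = 10.0, 0.0
      for _ in range(500):
          alt, vz = complementary_altitude(alt, vz, accel_z=-0.02, sonar_alt=10.0, dt=0.02)
      print(round(alt, 3))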

  8. Informed maintenance for next generation space transportation systems

    NASA Astrophysics Data System (ADS)

    Fox, Jack J.

    2001-02-01

    Perhaps the most substantial single obstacle to progress of space exploration and utilization of space for human benefit is the safety & reliability and the inherent cost of launching to, and returning from, space. The primary influence in the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives: maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautic enterprises such as the on-board autonomous Remote Agent Architecture recently flown on NASA's Deep Space 1 Probe, as well as commercial industries that employ quick turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's Commercial 777 Aircraft and Lockheed-Martin's F-22 Fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operations, and locomotive monitoring system software. This paper will summarize NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next generation RLVs. This will be done through a convergence of the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 will be provided, concluding with an overview of near-term work that is being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.

  9. System Engineering and Integration of Controls for Advanced Life Support

    NASA Technical Reports Server (NTRS)

    Overland, David; Hoo, Karlene; Ciskowski, Marvin

    2006-01-01

    The Advanced Integration Matrix (AIM) project at the Johnson Space Center (JSC) was chartered to study and solve systems-level integration issues for exploration missions. One of the first issues identified was an inability to conduct trade studies on control system architectures due to the absence of mature evaluation criteria. Such architectures are necessary to enable integration of regenerative life support systems. A team was formed to address issues concerning software and hardware architectures and system controls. The team has investigated what is required to integrate controls for the types of non-linear dynamic systems encountered in advanced life support. To this end, a water processing bioreactor testbed is being developed which will enable prototyping and testing of integration strategies and technologies. Although systems such as the water bioreactors exhibit the complexities of interactions between control schemes most vividly, it is apparent that this behavior and its attendant risks will manifest themselves in any set of interdependent autonomous control systems. A methodology for developing integration requirements for interdependent and autonomous systems is a goal of this team and this testbed. This paper is a high-level summary of the current status of the investigation, the issues encountered, some tentative conclusions, and the direction expected for further research.

  10. Reactive Sequencing for Autonomous Navigation Evolving from Phoenix Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Grasso, Christopher A.; Riedel, Joseph E.; Vaughan, Andrew T.

    2010-01-01

    Virtual Machine Language (VML) is an award-winning advanced procedural sequencing language in use on NASA deep-space missions since 1997, and was used for the successful entry, descent, and landing (EDL) of the Phoenix spacecraft onto the surface of Mars. Phoenix EDL utilized a state-oriented operations architecture which executed within the constraints of the existing VML 2.0 flight capability, compatible with the linear "land or die" nature of the mission. The intricacies of Phoenix EDL included the planned discarding of portions of the vehicle, the complex communications management for relay through on-orbit assets, the presence of temporally indeterminate physical events, and the need to rapidly catch up four days of sequencing should a reboot of the spacecraft flight computer occur shortly before atmospheric entry. These formidable operational challenges led to new techniques for packaging and coordinating reusable sequences called blocks using one-way synchronization via VML sequencing global variable events. The coordinated blocks acted as an ensemble to land the spacecraft, while individually managing various elements in as simple a fashion as possible. This paper outlines prototype VML 2.1 flight capabilities that have evolved from the one-way synchronization techniques in order to implement even more ambitious autonomous mission capabilities. Target missions for these new capabilities include autonomous touch-and-go sampling of cometary and asteroidal bodies, lunar landing of robotic missions, and ultimately landing of crewed lunar vehicles. Close proximity guidance, navigation, and control operations, on-orbit rendezvous, and descent and landing events featured in these missions require elaborate abort capability, manifesting highly non-linear scenarios that are so complex as to overtax traditional sequencing, or even the sort of one-way coordinated sequencing used during EDL. Foreseeing advanced command and control needs for small body and lunar landing guidance, navigation and control scenarios, work began three years ago on substantial upgrades to VML that are now being exercised in scenarios for lunar landing and comet/asteroid rendezvous. The advanced state-based approach includes coordinated state transition machines with distributed decision-making logic. These state machines are not merely sequences - they are reactive logic constructs capable of autonomous decision making within a well-defined domain. Combined with the JPL's AutoNav software used on Deep Space 1 and Deep Impact, the system allows spacecraft to autonomously navigate to an unmapped surface, soft-contact, and either land or ascend. The state machine architecture enabled by VML 2.1 has successfully performed sampling missions and lunar descent missions in a simulated environment, and is progressing toward flight capability. The authors are also investigating using the VML 2.1 flight director architecture to perform autonomous activities like rendezvous with a passive hypothetical Mars sample return capsule. The approach being pursued is similar to the touch-and-go sampling state machines, with the added complications associated with the search for, physical capture of, and securing of a separate spacecraft. Complications include optically finding and tracking the Orbiting Sample Capsule (OSC), keeping the OSC illuminated, making orbital adjustments, and physically capturing the OSC. Other applications could include autonomous science collection and fault compensation.
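
    VML syntax is proprietary and not reproduced in the abstract, so the sketch below uses Python generators purely to illustrate the coordination idea described: independently executing blocks advance through their own states and synchronize one-way by waiting on a shared global event variable that another block sets. The block names, states, and event are hypothetical.

      # Illustrative only: two cooperating state machines coordinated by a
      # one-way "global variable event", loosely analogous to the VML blocks
      # described above (names and states are hypothetical).
      events = {"chute_deployed": False}

      def descent_block():
          yield "entry"
          events["chute_deployed"] = True          # signal the other block, one-way
          yield "terminal_descent"

      def comm_block():
          yield "cruise_comm"
          while not events["chute_deployed"]:      # wait on the global event
              yield "waiting"
          yield "relay_through_orbiter"

      comm, descent = comm_block(), descent_block()
      for _ in range(4):
          print(next(comm, "done"), "|", next(descent, "done"))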

  11. Space Station module Power Management And Distribution (PMAD) system

    NASA Technical Reports Server (NTRS)

    Walls, Bryan

    1990-01-01

    This project consists of several tasks which are unified toward experimentally demonstrating the operation of a highly autonomous, user-supportive power management and distribution system for Space Station Freedom (SSF) habitation/laboratory modules. This goal will be extended to a demonstration of autonomous, cooperative power system operation for the whole SSF power system through a joint effort with NASA's Lewis Research Center, using their Autonomous Power System. Short term goals for the space station module power management and distribution include having an operational breadboard reflecting current plans for SSF, improving performance of the system communications, and improving the organization and mutability of the artificial intelligence (AI) systems. In the middle term, intermediate levels of autonomy will be added, user interfaces will be modified, and enhanced modeling capabilities will be integrated into the system. Long term goals involve conversion of all software into Ada, vigorous verification and validation efforts and, finally, seeing an impact of this research on the operation of SSF. Conversion of the system to a DC Star configuration is now in progress, and should be completed by the end of October 1989. This configuration reflects the latest SSF module architecture. Hardware is now being procured which will improve system communications significantly. The Knowledge-Based Management System (KBMS) has been initially developed, and the rules from FRAMES have been implemented in the KBMS. Rules in the other two AI systems are also being grouped modularly, making them more tractable and easier to eventually move into the KBMS. Adding an intermediate level of autonomy will require development of a planning utility, which will also be built using the KBMS. These changes will require having the user interface for the whole system available from one interface. An Enhanced Model will be developed, which will allow exercise of the system through the interface without requiring all of the power hardware to be operational. The functionality of the AI systems will continue to be advanced, including incipient failure detection. Ada conversion will begin with the lowest level processor (LLP) code. Then selected pieces of the higher level functionality will be recoded in Ada and, where possible, moved to the LLP level. Validation and verification will be done on the Ada code, and will be completed sometime after completion of the Ada conversion.

  12. Health monitoring of Japanese payload specialist: Autonomic nervous and cardiovascular responses under reduced gravity condition (L-0)

    NASA Technical Reports Server (NTRS)

    Sekiguchi, Chiharu

    1993-01-01

    In addition to health monitoring of the Japanese Payload Specialists (PS) during the flight, this investigation also focuses on the changes in cardiovascular hemodynamics during flight, which will be studied under a science collaboration with the Lower Body Negative Pressure (LBNP) Experiment of NASA. For the Japanese, this is an opportunity to examine firsthand the effects of microgravity on human physiology. We are particularly interested in the adaptation process and how it relates to space motion sickness and cardiovascular deconditioning. By comparing data from our own experiment to data collected by others, we hope to understand the processes involved and find ways to avoid these problems for future Japanese astronauts onboard Space Station Freedom and other Japanese space ventures. The primary objective of this experiment is to monitor the health condition of the Japanese Payload Specialists to maintain good health status during and after space flight. The second purpose is to investigate the autonomic nervous system's response to space motion sickness. To achieve this, the function of the autonomic nervous system will be monitored using non-invasive techniques. Data obtained will be employed to evaluate the role of the autonomic nervous system in space motion sickness and to predict susceptibility to space motion sickness. The third objective is evaluation of the adaptation process of the cardiovascular system to microgravity. By observing the hemodynamics using an echocardiogram we will gain insight into cardiovascular deconditioning. The last objective is to create a database for use in the health care of Japanese astronauts by obtaining control data in experiment L-0 in the SL-J mission.

  13. Multisensor robotic system for autonomous space maintenance and repair

    NASA Technical Reports Server (NTRS)

    Abidi, M. A.; Green, W. L.; Chandra, T.; Spears, J.

    1988-01-01

    The feasibility of realistic autonomous space manipulation tasks using multisensory information is demonstrated. The system is capable of acquiring, integrating, and interpreting multisensory data to locate, mate, and demate a Fluid Interchange System (FIS) and a Module Interchange System (MIS). In both cases, autonomous location of a guiding light target, mating, and demating of the system are performed. The implemented vision-driven techniques are used to determine the arbitrary two-dimensional position and orientation of the mating elements as well as the arbitrary three-dimensional position and orientation of the light targets. A force/torque sensor continuously monitors the six components of force and torque exerted on the end-effector. Both FIS and MIS experiments were successfully accomplished on mock-ups built for this purpose. The method is immune to variations in ambient light, in particular because of the 90-minute day-night cycle in space.

  14. Trajectory generation for an on-road autonomous vehicle

    NASA Astrophysics Data System (ADS)

    Horst, John; Barbera, Anthony

    2006-05-01

    We describe an algorithm that generates a smooth trajectory (position, velocity, and acceleration at uniformly sampled instants of time) for a car-like vehicle autonomously navigating within the constraints of lanes in a road. The technique models both vehicle paths and lane segments as straight line segments and circular arcs for mathematical simplicity and elegance, which we contrast with cubic spline approaches. We develop the path in an idealized space, warp the path into real space and compute path length, generate a one-dimensional trajectory along the path length that achieves target speeds and positions, and finally, warp, translate, and rotate the one-dimensional trajectory points onto the path in real space. The algorithm moves a vehicle in lane safely and efficiently within speed and acceleration maximums. The algorithm functions in the context of other autonomous driving functions within a carefully designed vehicle control hierarchy.
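
    The abstract separates the one-dimensional trajectory along the path length from the geometric warping back onto the path. As a sketch of that one-dimensional stage only (details assumed, not the authors' code), the snippet below samples a trapezoidal speed profile at uniform time steps, producing the distance/velocity/acceleration triples that a path-warping step could then map into real space.

      def trapezoid_profile(length, v_max, a_max, dt=0.1):
          """Sample (distance, speed, acceleration) along a 1-D path of the given
          length: accelerate at a_max up to v_max, cruise, then decelerate to rest.
          Illustrative sketch of the 1-D trajectory stage described above."""
          samples, s, v = [], 0.0, 0.0
          while s < length - 1e-6:
              stop_dist = v * v / (2.0 * a_max)        # distance needed to stop from v
              if length - s <= stop_dist + 0.5 * v * dt:
                  a = -a_max                           # start braking for the endpoint
              elif v < v_max:
                  a = a_max                            # speed up toward the limit
              else:
                  a = 0.0                              # cruise at v_max
              samples.append((s, v, a))
              v = min(v_max, v + a * dt)
              if v <= 0.0:                             # discretization guard near the goal
                  break
              s += v * dt
          samples.append((length, 0.0, 0.0))           # snap the final sample to the end
          return samples

      traj = trapezoid_profile(length=50.0, v_max=5.0, a_max=1.0)
      print(len(traj), "samples, peak speed", max(v for _, v, _ in traj))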

  15. The EO-1 autonomous sciencecraft and prospects for future autonomous space exploration

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    2005-01-01

    This paper describes the revolutionary new science enabled by onboard autonomy, its impact on extended missions such as the Mars Exploration Rovers and Mars Odyssey, and future missions in development.

  16. Research into command, control, and communications in space construction

    NASA Technical Reports Server (NTRS)

    Davis, Randal

    1990-01-01

    Coordinating and controlling large numbers of autonomous or semi-autonomous robot elements in a space construction activity will present problems that are very different from most command and control problems encountered in the space business. As part of our research into the feasibility of robot constructors in space, the CSC Operations Group is examining a variety of command, control, and communications (C3) issues. Two major questions being asked are: can we apply C3 techniques and technologies already developed for use in space; and are there suitable terrestrial solutions for extraterrestrial C3 problems? An overview of the control architectures, command strategies, and communications technologies that we are examining is provided and plans for simulations and demonstrations of our concepts are described.

  17. Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David C.; Hawkins, Albin; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    NASA's first autonomous formation flying mission completed its primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center (GSFC) implemented a universal 3-axis formation flying algorithm in an autonomous executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard flight design and presents the validation results of this unique system. Results from functionality assessment through fully autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a standalone algorithm.
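
    The onboard algorithm itself (AutoCon) is not reproduced here. As background only, the sketch below evaluates the Clohessy-Wiltshire (Hill) closed-form solution for linearized relative motion about a circular reference orbit, the kind of relative dynamics that formation-flying maneuver targeting is typically built on; the orbit and initial conditions are illustrative.

      import math

      def cw_relative_position(x0, y0, z0, vx0, vy0, vz0, n, t):
          """Clohessy-Wiltshire (Hill) solution for relative motion about a circular
          reference orbit: x radial, y along-track, z cross-track; n is the mean
          motion in rad/s. Background illustration, not the EO-1 flight algorithm."""
          s, c = math.sin(n * t), math.cos(n * t)
          x = (4 - 3 * c) * x0 + (s / n) * vx0 + 2 * (1 - c) / n * vy0
          y = 6 * (s - n * t) * x0 + y0 + 2 * (c - 1) / n * vx0 + (4 * s - 3 * n * t) / n * vy0
          z = c * z0 + (s / n) * vz0
          return x, y, z

      n = 2 * math.pi / 5928.0   # mean motion for a ~98.8-minute orbit (illustrative)
      print(cw_relative_position(100.0, 0.0, 0.0, 0.0, 0.0, 0.0, n, t=600.0))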

  18. A New Simulation Framework for Autonomy in Robotic Missions

    NASA Technical Reports Server (NTRS)

    Flueckiger, Lorenzo; Neukom, Christian

    2003-01-01

    Autonomy is a key factor in remote robotic exploration and there is significant activity addressing the application of autonomy to remote robots. It has become increasingly important to have simulation tools available to test the autonomy algorithms. While industrial robotics benefits from a variety of high quality simulation tools, researchers developing autonomous software are still dependent primarily on block-world simulations. The Mission Simulation Facility (MSF) project addresses this shortcoming with a simulation toolkit that will enable developers of autonomous control systems to test their system's performance against a set of integrated, standardized simulations of NASA mission scenarios. MSF provides a distributed architecture that connects the autonomous system to a set of simulated components replacing the robot hardware and its environment.

  19. An experiment in vision based autonomous grasping within a reduced gravity environment

    NASA Technical Reports Server (NTRS)

    Grimm, K. A.; Erickson, J. D.; Anderson, G.; Chien, C. H.; Hewgill, L.; Littlefield, M.; Norsworthy, R.

    1992-01-01

    The National Aeronautics and Space Administration's Reduced Gravity Program (RGP) offers opportunities for experimentation in gravities of less than one-g. The Extravehicular Activity Helper/Retriever (EVAHR) robot project of the Automation and Robotics Division at the Lyndon B. Johnson Space Center in Houston, Texas, is undertaking a task that will culminate in a series of tests in simulated zero-g using this facility. A subset of the final robot hardware consisting of a three-dimensional laser mapper, a Robotics Research 807 arm, a Jameson JH-5 hand, and the appropriate interconnect hardware/software will be used. This equipment will be flown on the RGP's KC-135 aircraft. This aircraft will fly a series of parabolas creating the effect of zero-g. During the periods of zero-g, a number of objects will be released in front of the fixed-base robot hardware in both static and dynamic configurations. The system will then inspect the object, determine the object's pose, plan a grasp strategy, and execute the grasp. This must all be accomplished in the approximately 27 seconds of zero-g.

  20. Using Existing NASA Satellites as Orbiting Testbeds to Accelerate Technology Infusion into Future Missions

    NASA Technical Reports Server (NTRS)

    Mandl, Daniel; Ly, Vuong; Frye, Stuart

    2006-01-01

    One of the shared problems for new space mission developers is that it is extremely difficult to infuse new technology into new missions unless that technology has been flight validated. Therefore, the issue is that new technology is required to fly on a successful mission for flight validation. We have been experimenting with new technology on existing satellites by retrofitting primarily the flight software while the missions are on-orbit to experiment with new operations concepts. Experiments have been using Earth Observing 1 (EO-1), which is part of the New Millennium Program at NASA. EO-1 finished its prime mission one year after its launch on November 21, 2000. From November 21, 2001 until the present, EO-1 has been used in parallel with additional science data gathering to test out various sensor web concepts. Similarly, the Cosmic Hot Interstellar Plasma Spectrometer (CHIPS) satellite was also a one-year mission flown by the University of California, Berkeley, sponsored by NASA, and its prime mission ended August 30, 2005. Presently, CHIPS is being used to experiment with a seamless space-to-ground interface by installing the Core Flight System (cFS), a "plug-and-play" architecture developed by the Flight Software Branch at NASA/GSFC, on top of the existing space-to-ground Internet Protocol (IP) interface that CHIPS implemented. For example, one targeted experiment is to connect CHIPS to a rover via this interface and the Internet, and trigger autonomous actions on CHIPS, the rover, or both. Thus far, having satellites to experiment with new concepts has turned out to be an inexpensive way to infuse new technology for future missions. Relevant experiences thus far and future plans will be discussed in this presentation.

  1. Development and Execution of Autonomous Procedures Onboard the International Space Station to Support the Next Phase of Human Space Exploration

    NASA Technical Reports Server (NTRS)

    Beisert, Susan; Rodriggs, Michael; Moreno, Francisco; Korth, David; Gibson, Stephen; Lee, Young H.; Eagles, Donald E.

    2013-01-01

    Now that major assembly of the International Space Station (ISS) is complete, NASA's focus has turned to using this high fidelity in-space research testbed to not only advance fundamental science research, but also demonstrate and mature technologies and develop operational concepts that will enable future human exploration missions beyond low Earth orbit. The ISS as a Testbed for Analog Research (ISTAR) project was established to reduce risks for manned missions to exploration destinations by utilizing ISS as a high fidelity micro-g laboratory to demonstrate technologies, operations concepts, and techniques associated with crew autonomous operations. One of these focus areas is the development and execution of ISS Testbed for Analog Research (ISTAR) autonomous flight crew procedures intended to increase crew autonomy that will be required for long duration human exploration missions. Due to increasing communications delays and reduced logistics resupply, autonomous procedures are expected to help reduce crew reliance on the ground flight control team, increase crew performance, and enable the crew to become more subject-matter experts on both the exploration space vehicle systems and the scientific investigation operations that will be conducted on a long duration human space exploration mission. These tests make use of previous or ongoing projects tested in ground analogs such as Research and Technology Studies (RATS) and NASA Extreme Environment Mission Operations (NEEMO). Since the latter half of 2012, selected non-critical ISS systems crew procedures have been used to develop techniques for building ISTAR autonomous procedures, and ISS flight crews have successfully executed them without flight controller involvement. Although the main focus has been preparing for exploration, the ISS has been a beneficiary of this synergistic effort and is considering modifying additional standard ISS procedures that may increase crew efficiency, reduce operational costs, and raise the amount of crew time available for scientific research. The next phase of autonomous procedure development is expected to include payload science and human research investigations. Additionally, ISS International Partners have expressed interest in participating in this effort. The recently approved one-year crew expedition starting in 2015, consisting of one Russian and one U.S. Operating Segment (USOS) crewmember, will be used not only for long duration human research investigations but also for the testing of exploration operations concepts, including crew autonomy.

  2. Simulation of the outdoor energy efficiency of an autonomous solar kit based on meteorological data for a site in Central Europa

    NASA Astrophysics Data System (ADS)

    Bouzaki, Mohammed Moustafa; Chadel, Meriem; Benyoucef, Boumediene; Petit, Pierre; Aillerie, Michel

    2016-07-01

    This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m), using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider in this study the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the solar kit provides an energy yield that varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level to satisfy the use. Thus, for the other months, some of the available renewable energy is lost if no storage system is associated.
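
    PVSYST's internal panel model is not given in the abstract. As a generic illustration of how temperature and irradiance enter an I-V characteristic (a highly simplified single-diode model with made-up parameters, not the panel studied), consider the sketch below.

      import math

      def pv_current(v, g=1000.0, t_cell=25.0):
          """Very simplified single-diode I-V model of a PV module.
          g: irradiance in W/m^2, t_cell: cell temperature in deg C.
          All parameters are illustrative, not those of the studied panel."""
          k, q = 1.380649e-23, 1.602176634e-19
          n_cells, ideality = 60, 1.3
          i_ph = 8.0 * (g / 1000.0) * (1 + 0.0005 * (t_cell - 25.0))   # photocurrent
          i_0 = 5e-8 * (1 + 0.05 * (t_cell - 25.0))                    # diode saturation current
          vt = n_cells * ideality * k * (t_cell + 273.15) / q          # module thermal voltage
          return i_ph - i_0 * math.expm1(v / vt)

      # Crude scan of the I-V curve for the maximum power point at two operating points
      for g, t in [(1000.0, 25.0), (600.0, 45.0)]:
          p_max = max(v * max(pv_current(v, g, t), 0.0) for v in [0.1 * i for i in range(400)])
          print("G = %.0f W/m2, Tcell = %.0f C -> Pmax ~ %.0f W" % (g, t, p_max))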

  3. The Standard Autonomous File Server, A Customized, Off-the-Shelf Success Story

    NASA Technical Reports Server (NTRS)

    Semancik, Susan K.; Conger, Annette M.; Obenschain, Arthur F. (Technical Monitor)

    2001-01-01

    The Standard Autonomous File Server (SAFS), which includes both off-the-shelf hardware and software, uses an improved automated file transfer process to provide a quicker, more reliable, prioritized file distribution for customers of near real-time data without interfering with the assets involved in the acquisition and processing of the data. It operates as a stand-alone solution, monitoring itself, and providing an automated fail-over process to enhance reliability. This paper describes the unique problems and lessons learned both during the COTS selection and integration into SAFS, and during the system's first year of operation in support of NASA's satellite ground network. COTS was the key factor in allowing the two-person development team to deploy systems in less than a year, meeting the required launch schedule. The SAFS system has been so successful that it is becoming a NASA standard resource, leading to its nomination for NASA's Software of the Year Award in 1999.

  4. Autonomous power management and distribution

    NASA Technical Reports Server (NTRS)

    Dolce, Jim; Kish, Jim

    1990-01-01

    The goal of the Autonomous Power System program is to develop and apply intelligent problem solving and control to the Space Station Freedom's electric power testbed being developed at NASA's Lewis Research Center. Objectives are to establish artificial intelligence technology paths, craft knowledge-based tools and products for power systems, and integrate knowledge-based and conventional controllers. This program represents a joint effort between the Space Station and Office of Aeronautics and Space Technology to develop and demonstrate space electric power automation technology capable of: (1) detection and classification of system operating status, (2) diagnosis of failure causes, and (3) cooperative problem solving for power scheduling and failure recovery. Program details, status, and plans will be presented.

  5. Autonomous Monitoring of Radiation Environment and Personal Systems for Crew Enhanced SPE Protection (AMORE and PSYCHE)

    NASA Astrophysics Data System (ADS)

    Narici, L.; Baiocco, G.; Berrilli, F.; Giraudo, M.; Ottolenghi, A.; Rizzo, A.; Salina, G.

    2018-02-01

    The objective is to understand the relationship between SPE precursors, the related SPE radiation inside the Deep Space Gateway, and the associated risk levels, validating existing models and proposing countermeasure actions via a real-time, autonomous, intelligent system.

  6. Autonomous Commanding of the WIRE Spacecraft

    NASA Technical Reports Server (NTRS)

    Prior, Mike; Walyus, Keith; Saylor, Rick

    1999-01-01

    This paper presents the end-to-end design architecture for an autonomous commanding capability to be used on the Wide Field Infrared Explorer (WIRE) mission for the uplink of command loads during unattended station contacts. The WIRE mission is the fifth and final mission of NASA's Goddard Space Flight Center Small Explorer (SMEX) series to be launched in March of 1999. Its primary mission is the targeting of deep space fields using an ultra-cooled infrared telescope. Due to its mission design WIRE command loads are large (approximately 40 Kbytes per 24 hours) and must be performed daily. To reduce the cost of mission operations support that would be required in order to uplink command loads, the WIRE Flight Operations Team has implemented an autonomous command loading capability. This capability allows completely unattended operations over a typical two- day weekend period. The key factors driving design and implementation of this capability were: 1) Integration with already existing ground system autonomous capabilities and systems, 2) The desire to evolve autonomous operations capabilities based upon previous SMEX operations experience 3) Integration with ground station operations - both autonomous and man-tended, 4) Low cost and quick implementation, and 5) End-to-end system robustness. A trade-off study was performed to examine these factors in light of the low-cost, higher-risk SMEX mission philosophy. The study concluded that a STOL (Spacecraft Test and Operations Language) based script, highly integrated with other scripts used to perform autonomous operations, was best suited given the budget and goals of the mission. Each of these factors is discussed to provide an overview of the autonomous operations capabilities implemented for the mission. The capabilities implemented on the WIRE mission are an example of a low-cost, robust, and efficient method for autonomous command loading when implemented with other autonomous features of the ground system. They can be used as a design and implementation template by other small satellite missions interested in evolving toward autonomous and lower cost operations.

  7. Autonomic Management of Space Missions. Chapter 12

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walt; Rouff, Christopher A.; Sterritt, Roy

    2006-01-01

    With NASA's renewed commitment to outer space exploration, greater emphasis is being placed on both human and robotic exploration. Even when humans are involved in the exploration, human tending of assets becomes cost-prohibitive or in many cases is simply not feasible. In addition, certain exploration missions will require spacecraft that will be capable of venturing where humans cannot be sent. Early space missions were operated manually from ground control centers with little or no automated operations. In the mid-1980s, the high costs of satellite operations prompted NASA, and others, to begin automating as many functions as possible. In our context, a system is autonomous if it can achieve its goals without human intervention. A number of more-or-less automated ground systems exist today, but work continues with the goal being to reduce operations costs to even lower levels. Cost reductions can be achieved in a number of areas. Ground control and spacecraft operations are two such areas where greater autonomy can reduce costs. As a consequence, autonomy is increasingly seen as a critical approach for robotic missions and for some aspects of manned missions. Although autonomy will be critical for the success of future missions (and indeed will enable certain kinds of science data gathering approaches), missions imbued with autonomy must also exhibit autonomic properties. Exploitation of autonomy alone, without emphasis on autonomic properties, will leave spacecraft vulnerable to the dangerous environments in which they must operate. Without autonomic properties, a spacecraft may be unable to recognize negative environmental effects on its components and subsystems, or may be unable to take any action to ameliorate the effects. The spacecraft, though operating autonomously, may then sustain a degradation of performance of components or subsystems, and consequently may have a reduced potential for achieving mission objectives. In extreme cases, lack of autonomic properties could leave the spacecraft unable to recover from faults. Ensuring that exploration spacecraft have autonomic properties will increase the survivability and therefore the likelihood of success of these missions. In fact, over time, as mission requirements increased demands on spacecraft capabilities and longevity, designers have gradually built more autonomicity into spacecraft. For example, a spacecraft in low-earth orbit may experience an out-of-bounds perturbation of its attitude (orientation) due to increased drag caused by increased atmospheric density at its altitude as a result of a sufficiently large solar flare. If the spacecraft was designed to recognize the excessive attitude perturbation, it could decide to protect itself by going into a safe-hold mode where its internal configuration and operation are altered to conserve power and its coarse attitude is adjusted to point its solar panels toward the Sun to maximize power generation. This is an example of a simple type of autonomic behavior that has actually occurred. Future mission concepts will be increasingly dependent on space system survivability enabled by more advanced types of autonomic behaviors.

  8. Integrating small satellite communication in an autonomous vehicle network: A case for oceanography

    NASA Astrophysics Data System (ADS)

    Guerra, André G. C.; Ferreira, António Sérgio; Costa, Maria; Nodar-López, Diego; Aguado Agelet, Fernando

    2018-04-01

    Small satellites and autonomous vehicles have greatly evolved in the last few decades. Hundreds of small satellites have been launched with increasing functionalities, in the last few years. Likewise, numerous autonomous vehicles have been built, with decreasing costs and form-factor payloads. Here we focus on combining these two multifaceted assets in an incremental way, with an ultimate goal of alleviating the logistical expenses in remote oceanographic operations. The first goal is to create a highly reliable and constantly available communication link for a network of autonomous vehicles, taking advantage of the small satellite lower cost, with respect to conventional spacecraft, and its higher flexibility. We have developed a test platform as a proving ground for this network, by integrating a satellite software defined radio on an unmanned air vehicle, creating a system of systems, and several tests have been run successfully, over land. As soon as the satellite is fully operational, we will start to move towards a cooperative network of autonomous vehicles and small satellites, with application in maritime operations, both in-situ and remote sensing.

  9. Aeroelastic Scaling of a Joined Wing Aircraft Concept

    DTIC Science & Technology

    2010-01-11

    ... waxed and then peel ply is laid down; next the layers of fabric are laid down (outermost first) with an outer layer of light glass scrim used as the ... A parametric model is developed using Phoenix Integration's Model Center software (MC). This model includes the vortex lattice software AVL that ... piece of real-time footage taken from the on-board, gimbaled camera. [Figure 35: initial autonomous flight]

  10. Intelligent Hardware-Enabled Sensor and Software Safety and Health Management for Autonomous UAS

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Y.; Schumann, Johann; Ippolito, Corey

    2015-01-01

    Unmanned Aerial Systems (UAS) can only be deployed if they can effectively complete their mission and respond to failures and uncertain environmental conditions while maintaining safety with respect to other aircraft as well as humans and property on the ground. We propose to design a real-time, onboard system health management (SHM) capability to continuously monitor essential system components such as sensors, software, and hardware systems for detection and diagnosis of failures and violations of safety or performance rules during the flight of a UAS. Our approach to SHM is three-pronged, providing: (1) real-time monitoring of sensor and software signals; (2) signal analysis, preprocessing, and advanced on-the-fly temporal and Bayesian probabilistic fault diagnosis; (3) an unobtrusive, lightweight, read-only, low-power hardware realization using Field Programmable Gate Arrays (FPGAs) in order to avoid overburdening limited computing resources or costly re-certification of flight software due to instrumentation. No currently available SHM capabilities (or combinations of currently existing SHM capabilities) come anywhere close to satisfying these three criteria, yet NASA will require such intelligent, hardware-enabled sensor and software safety and health management for introducing autonomous UAS into the National Airspace System (NAS). We propose a novel approach of creating modular building blocks for combining responsive runtime monitoring of temporal logic system safety requirements with model-based diagnosis and Bayesian network-based probabilistic analysis. Our proposed research program includes both developing this novel approach and demonstrating its capabilities using the NASA Swift UAS as a demonstration platform.
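
    The proposed monitors themselves are not specified in the abstract. The sketch below is an assumed illustration of the kind of bounded-time temporal safety rule such a runtime monitor evaluates over a sampled signal, here "whenever the fault flag rises, it must clear again within k samples"; the rule and names are hypothetical.

      def monitor_bounded_recovery(stream, k=3):
          """Check the bounded-response property: whenever fault is truthy it must
          become falsy again within k samples. Returns the index of the first
          violation, or None if the trace satisfies the rule. Illustrative sketch."""
          countdown = None
          for i, fault in enumerate(stream):
              if fault:
                  countdown = k if countdown is None else countdown - 1
                  if countdown <= 0:
                      return i                 # fault persisted longer than k samples
              else:
                  countdown = None             # recovered: reset the obligation
          return None

      print(monitor_bounded_recovery([0, 1, 1, 0, 0, 1, 1, 1, 1, 1], k=3))   # -> 8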

  11. Experiments in teleoperator and autonomous control of space robotic vehicles

    NASA Technical Reports Server (NTRS)

    Alexander, Harold L.

    1990-01-01

    A research program and strategy are described which include fundamental teleoperation issues and autonomous-control issues of sensing and navigation for satellite robots. The program consists of developing interfaces for visual operation and studying the consequences of interface designs as well as developing navigation and control technologies based on visual interaction. A space-robot-vehicle simulator is under development for use in virtual-environment teleoperation experiments and neutral-buoyancy investigations. These technologies can be utilized in a study of visual interfaces to address tradeoffs between head-tracking and manual remote cameras, panel-mounted and helmet-mounted displays, and stereoscopic and monoscopic display systems. The present program can provide significant data for the development of control experiments for autonomously controlled satellite robots.

  12. Autonomous Modal Identification of the Space Shuttle Tail Rudder

    NASA Technical Reports Server (NTRS)

    Pappa, Richard S.; James, George H., III; Zimmerman, David C.

    1997-01-01

    Autonomous modal identification automates the calculation of natural vibration frequencies, damping, and mode shapes of a structure from experimental data. This technology complements damage detection techniques that use continuous or periodic monitoring of vibration characteristics. The approach shown in the paper incorporates the Eigensystem Realization Algorithm (ERA) as a data analysis engine and an autonomous supervisor to condense multiple estimates of modal parameters using ERA's Consistent-Mode Indicator and correlation of mode shapes. The procedure was applied to free-decay responses of a Space Shuttle tail rudder and successfully identified the seven modes of the structure below 250 Hz. The final modal parameters are a condensed set of results for 87 individual ERA cases requiring approximately five minutes of CPU time on a DEC Alpha computer.
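
    The autonomous supervisor logic is not reproduced here. As a sketch of the standard ERA data-analysis engine it wraps (illustrative dimensions and synthetic data, not the Shuttle rudder measurements), the code below forms shifted Hankel matrices from a free-decay record, takes an SVD, and converts the identified discrete-time eigenvalues into natural frequencies and damping ratios.

      import numpy as np

      def era_modes(y, dt, n_modes=2, rows=30, cols=60):
          """Minimal Eigensystem Realization Algorithm on a single free-decay
          response y (1-D array). Returns sorted (frequency_hz, damping_ratio)
          pairs. Dimensions are illustrative."""
          H0 = np.array([[y[i + j] for j in range(cols)] for i in range(rows)])
          H1 = np.array([[y[i + j + 1] for j in range(cols)] for i in range(rows)])
          U, s, Vt = np.linalg.svd(H0, full_matrices=False)
          r = 2 * n_modes                                       # model order: 2 states per mode
          Sr_inv = np.diag(1.0 / np.sqrt(s[:r]))
          A = Sr_inv @ U[:, :r].T @ H1 @ Vt[:r, :].T @ Sr_inv   # identified state matrix
          mu = np.log(np.linalg.eigvals(A)) / dt                # continuous-time poles
          return sorted((float(abs(m)) / (2 * np.pi), float(-m.real / abs(m)))
                        for m in mu if m.imag > 0)

      # Synthetic two-mode free decay: 5 Hz at 2% damping plus 12 Hz at 1% damping
      dt = 0.005
      t = np.arange(0.0, 2.0, dt)
      y = (np.exp(-0.02 * 2 * np.pi * 5 * t) * np.sin(2 * np.pi * 5 * t)
           + 0.5 * np.exp(-0.01 * 2 * np.pi * 12 * t) * np.sin(2 * np.pi * 12 * t))
      for freq, zeta in era_modes(y, dt):
          print("f = %.2f Hz, zeta = %.3f" % (freq, zeta))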

  13. Visual Odometry for Autonomous Deep-Space Navigation

    NASA Technical Reports Server (NTRS)

    Robinson, Shane; Pedrotty, Sam

    2016-01-01

    Visual Odometry fills two critical needs shared by all future exploration architectures considered by NASA: Autonomous Rendezvous and Docking (AR&D), and autonomous navigation during loss of communications. To do this, a camera is combined with cutting-edge algorithms (called Visual Odometry) into a unit that provides an accurate relative pose between the camera and the object in the imagery. Recent simulation analyses have demonstrated the ability of this new technology to reliably, accurately, and quickly compute a relative pose. This project advances this technology by both preparing the system to process flight imagery and creating an activity to capture said imagery. This technology can provide a pioneering optical navigation platform capable of supporting a wide variety of future mission scenarios: deep-space rendezvous, asteroid exploration, and loss-of-communications navigation.
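
    The project's flight algorithms are not described in detail in the abstract. The sketch below (an assumed pipeline using OpenCV, not the project's code) shows the core relative-pose step that monocular visual odometry typically performs: match features between two frames, estimate the essential matrix, and recover rotation plus a unit-scale translation.

      import cv2
      import numpy as np

      def relative_pose(img1, img2, K):
          """Estimate camera rotation R and unit-length translation t between two
          grayscale frames. Standard monocular VO step; illustrative only."""
          orb = cv2.ORB_create(2000)
          kp1, des1 = orb.detectAndCompute(img1, None)
          kp2, des2 = orb.detectAndCompute(img2, None)
          matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
          matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
          pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
          pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
          E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                         prob=0.999, threshold=1.0)
          _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
          return R, t      # t is known only up to scale for a single camera

      # Usage (file names and intrinsics are placeholders):
      # K = np.array([[700., 0., 320.], [0., 700., 240.], [0., 0., 1.]])
      # R, t = relative_pose(cv2.imread("frame0.png", 0), cv2.imread("frame1.png", 0), K)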

  14. AirSTAR Hardware and Software Design for Beyond Visual Range Flight Research

    NASA Technical Reports Server (NTRS)

    Laughter, Sean; Cox, David

    2016-01-01

    The National Aeronautics and Space Administration (NASA) Airborne Subscale Transport Aircraft Research (AirSTAR) Unmanned Aerial System (UAS) is a facility developed to study the flight dynamics of vehicles in emergency conditions, in support of aviation safety research. The system was upgraded to have its operational range significantly expanded, going beyond the line of sight of a ground-based pilot. A redesign of the airborne flight hardware was undertaken, as well as significant changes to the software base, in order to provide appropriate autonomous behavior in response to a number of potential failures and hazards. Ground hardware and system monitors were also upgraded to include redundant communication links, including ADS-B based position displays and an independent flight termination system. The design included both custom and commercially available avionics, combined to allow flexibility in flight experiment design while still benefiting from tested configurations in reversionary flight modes. A similar hierarchy was employed in the software architecture, to allow research codes to be tested, with a fallback to more thoroughly validated flight controls. As a remotely piloted facility, ground systems were also developed to ensure the flight modes and system state were communicated to ground operations personnel in real-time. Presented in this paper is a general overview of the concept of operations for beyond visual range flight, and a detailed review of the airborne hardware and software design. This discussion is held in the context of the safety and procedural requirements that drove many of the design decisions for the AirSTAR UAS Beyond Visual Range capability.

  15. Autonomous support for microorganism research in space

    NASA Astrophysics Data System (ADS)

    Fleet, M. L.; Smith, J. D.; Klaus, D. M.; Luttges, M. W.

    1993-02-01

    A preliminary design for performing on-orbit, autonomous research on microorganisms and cultured cells/tissues is presented. The payload is designed to be compatible with the COMmercial Experiment Transporter (COMET), an orbiter middeck locker interface, and with Space Station Freedom. Uplink/downlink capabilities and sample return through controlled reentry are available for all carriers. Autonomous testing activities are preprogrammed with in-flight reprogrammability. Sensors for monitoring temperature, pH, light, gravity levels, vibrations, and radiation are provided for environmental regulation and experimental data collection. Additional data acquisition includes optical density measurement, microscopy, video, and film photography. On-board data storage capabilities are provided. A fluid transfer mechanism is utilized for inoculation, sampling, and nutrient replenishment of experiment cultures. In addition to payload design, research opportunities are explored to illustrate hardware versatility and function. The project is defined to provide biological data pertinent to extended duration crewed space flight, including crew health issues and development of a Controlled Ecological Life Support System (CELSS). In addition, opportunities are opened for investigations leading to commercial applications of space, such as pharmaceutical development, modeling of terrestrial diseases, and material processing.

  16. Simulation of the communication system between an AUV group and a surface station

    NASA Astrophysics Data System (ADS)

    Burtovaya, D.; Demin, A.; Demeshko, M.; Moiseev, A.; Kudryashova, A.

    2017-01-01

    An object model for simulation of the communications system of an autonomous underwater vehicles (AUV) group with a surface station is proposed in the paper. Implementation of the model is made on the basis of the software package “Object Distribution Simulation”. All structural relationships and behavior details are described. The application was developed on the basis of the proposed model and is now used for computational experiments on the simulation of the communications system between the autonomous underwater vehicles group and a surface station.

  17. Preliminary Results of NASA's First Autonomous Formation Flying Experiment: Earth Observing-1 (EO-1)

    NASA Technical Reports Server (NTRS)

    Folta, David; Hawkins, Albin

    2001-01-01

    NASA's first autonomous formation flying mission is completing a primary goal of demonstrating an advanced technology called enhanced formation flying. To enable this technology, the Guidance, Navigation, and Control center at the Goddard Space Flight Center has implemented an autonomous universal three-axis formation flying algorithm in executive flight code onboard the New Millennium Program's (NMP) Earth Observing-1 (EO-1) spacecraft. This paper describes the mathematical background of the autonomous formation flying algorithm and the onboard design and presents the preliminary validation results of this unique system. Results from functionality assessment and autonomous maneuver control are presented as comparisons between the onboard EO-1 operational autonomous control system called AutoCon(tm), its ground-based predecessor, and a stand-alone algorithm.

  18. Advancing Autonomous Operations Technologies for NASA Missions

    NASA Technical Reports Server (NTRS)

    Cruzen, Craig; Thompson, Jerry Todd

    2013-01-01

    This paper discusses the importance of implementing advanced autonomous technologies supporting operations of future NASA missions. The ability for crewed, uncrewed and even ground support systems to be capable of mission support without external interaction or control has become essential as space exploration moves further out into the solar system. The push to develop and utilize autonomous technologies for NASA mission operations stems in part from the need to reduce operations cost while improving and increasing capability and safety. This paper will provide examples of autonomous technologies currently in use at NASA and will identify opportunities to advance existing autonomous technologies that will enhance mission success by reducing operations cost, ameliorating inefficiencies, and mitigating catastrophic anomalies.

  19. The promises and perils of hospital autonomy: reform by decree in Viet Nam.

    PubMed

    London, Jonathan D

    2013-11-01

    This article investigates impacts of hospital autonomization in Viet Nam employing a "decision-space" framework that examines how hospitals have used their increased discretion and to what effect. Analysis suggests autonomization is associated with increased revenue, increasing staff pay, and greater investment in infrastructure and equipment. But autonomization is also associated with more costly and intensive treatment methods of uncertain contribution to the Vietnamese government's stated goal of quality healthcare for all. Impacts of autonomization in district hospitals are less striking. Despite certain limitations, the analysis generates key insights into early stages of hospital autonomization in Viet Nam. Copyright © 2013 The Author. Published by Elsevier Ltd. All rights reserved.

  20. Advancing Autonomous Operations Technologies for NASA Missions

    NASA Technical Reports Server (NTRS)

    Cruzen, Craig; Thompson, Jerry T.

    2013-01-01

    This paper discusses the importance of implementing advanced autonomous technologies supporting operations of future NASA missions. The ability for crewed, uncrewed and even ground support systems to be capable of mission support without external interaction or control has become essential as space exploration moves further out into the solar system. The push to develop and utilize autonomous technologies for NASA mission operations stems in part from the need to reduce cost while improving and increasing capability and safety. This paper will provide examples of autonomous technologies currently in use at NASA and will identify opportunities to advance existing autonomous technologies that will enhance mission success by reducing cost, ameliorating inefficiencies, and mitigating catastrophic anomalies.

  1. Tele-Supervised Adaptive Ocean Sensor Fleet

    NASA Technical Reports Server (NTRS)

    Elfes, Alberto; Podnar, Gregg W.; Dolan, John M.; Hosler, Jeffrey C.; Ames, Troy J.

    2009-01-01

    The Tele-supervised Adaptive Ocean Sensor Fleet (TAOSF) is a multi-robot science exploration architecture and system that uses a group of robotic boats (the Ocean-Atmosphere Sensor Integration System, or OASIS) to enable in-situ study of ocean surface and subsurface characteristics and the dynamics of such ocean phenomena as coastal pollutants, oil spills, hurricanes, or harmful algal blooms (HABs). The OASIS boats are extended-deployment, autonomous ocean surface vehicles. The TAOSF architecture provides an integrated approach to multi-vehicle coordination and sliding human-vehicle autonomy. One feature of TAOSF is the adaptive re-planning of the activities of the OASIS vessels based on sensor input (smart sensing) and sensorial coordination among multiple assets. The architecture also incorporates Web-based communications that permit control of the assets over long distances and the sharing of data with remote experts. Autonomous hazard and assistance detection allows the automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response. Also, the architecture is designed for science analysis of acquired data in order to perform an initial onboard assessment of the presence of specific science signatures of immediate interest. TAOSF integrates and extends five subsystems developed by the participating institutions: Emergent Space Technologies, Wallops Flight Facility, NASA's Goddard Space Flight Center (GSFC), Carnegie Mellon University, and Jet Propulsion Laboratory (JPL). The OASIS Autonomous Surface Vehicle (ASV) system, which includes the vessels as well as the land-based control and communications infrastructure developed for them, controls the hardware of each platform (sensors, actuators, etc.), and also provides a low-level waypoint navigation capability. The Multi-Platform Simulation Environment from GSFC is a surrogate for the OASIS ASV system and allows for independent development and testing of higher-level software components. The Platform Communicator acts as a proxy for both actual and simulated platforms. It translates platform-independent messages from the higher control systems to the device-dependent communication protocols. This enables the higher-level control systems to interact identically with heterogeneous actual or simulated platforms.

  2. Informed maintenance for next generation reusable launch systems

    NASA Astrophysics Data System (ADS)

    Fox, Jack J.; Gormley, Thomas J.

    2001-03-01

    Perhaps the most substantial single obstacle to progress of space exploration and utilization of space for human benefit is the safety & reliability and the inherent cost of launching to, and returning from, space. The primary influence in the high costs of current launch systems (the same is true for commercial and military aircraft and most other reusable systems) is the operations, maintenance and infrastructure portion of the program's total life cycle costs. Reusable Launch Vehicle (RLV) maintenance and design have traditionally been two separate engineering disciplines with often conflicting objectives - maximizing ease of maintenance versus optimizing performance, size and cost. Testability analysis, an element of Informed Maintenance (IM), has been an ad hoc, manual effort, in which maintenance engineers attempt to identify an efficient method of troubleshooting for the given product, with little or no control over product design. Therefore, testability deficiencies in the design cannot be rectified. It is now widely recognized that IM must be engineered into the product at the design stage itself, so that an optimal compromise is achieved between system maintainability and performance. The elements of IM include testability analysis, diagnostics/prognostics, automated maintenance scheduling, automated logistics coordination, paperless documentation and data mining. IM derives its heritage from complementary NASA science, space and aeronautic enterprises such as the on-board autonomous Remote Agent Architecture recently flown on NASA's Deep Space 1 Probe, as well as commercial industries that employ quick turnaround operations. Commercial technologies and processes supporting NASA's IM initiatives include condition-based maintenance technologies from Boeing's Commercial 777 Aircraft and Lockheed-Martin's F-22 Fighter, automotive computer diagnostics and autonomous controllers that enable 100,000-mile maintenance-free operations, and locomotive monitoring system software. This paper will summarize NASA's long-term strategy, development, and implementation plans for Informed Maintenance for next generation RLVs. This will be done through a convergence of the work being performed throughout NASA, industry and academia into a single IM vision. Additionally, a current status of IM development throughout NASA programs such as the Space Shuttle, X-33, X-34 and X-37 will be provided and will conclude with an overview of near-term work that is being initiated in FY00 to support NASA's 2nd Generation Reusable Launch Vehicle Program.

  3. First Image from a Mars Rover Choosing a Target, False Color

    NASA Image and Video Library

    2010-03-23

    This image is the result of the first observation of a target selected autonomously by NASA's Mars rover Opportunity using newly developed and uploaded software called AEGIS. The false color makes some differences between materials easier to see.

  4. Vision requirements for Space Station applications

    NASA Technical Reports Server (NTRS)

    Crouse, K. R.

    1985-01-01

    Problems which will be encountered by computer vision systems in Space Station operations are discussed, along with solutions being examined at the Johnson Space Center. Lighting cannot be controlled in space, nor can the random presence of reflective surfaces. Task-oriented capabilities are to include docking to moving objects, identification of unexpected objects during autonomous flights to different orbits, and diagnoses of damage and repair requirements by autonomous Space Station inspection robots. The approaches being examined to provide these and other capabilities include television and IR sensors, advanced pattern recognition programs fed by data from laser probes, laser radar for robot eyesight, and arrays of SMART sensors for automated location and tracking of target objects. Attention is also being given to liquid crystal light valves for optical processing of images for comparison with on-board electronic libraries of images.

  5. A new software on TUG-T60 autonomous telescope for astronomical transient events

    NASA Astrophysics Data System (ADS)

    Dindar, Murat; Helhel, Selçuk; Esenoğlu, Hasan; Parmaksızoğlu, Murat

    2015-03-01

    Robotic telescopes usually run under the control of a scheduler, which provides high-level control by selecting astronomical targets for observation. The TÜBİTAK (Scientific and Technological Research Council of Turkey) National Observatory (TUG) T60 Robotic Telescope is controlled by the open-source OCAAS software, now known as Talon. This study introduces new software designed for Talon to catch GRB, GAIA, and other transient alerts. The new alert software module (daemon process), alertd, runs alongside the other Talon modules such as telescoped, focus, dome, camerad, and telrun. The maximum slew velocity and acceleration limits of the T60 telescope are fast enough for GRB and transient observations.
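
    As a rough illustration of the daemon pattern described above (the Talon/alertd internals are not given in this record, so the queue contents, field names, and scheduler call below are assumptions), an alert daemon typically listens for incoming alert notices, filters them by type, and hands high-priority targets to the scheduler:

```python
# Illustrative sketch of an alert-handling daemon loop; queue contents,
# field names, and the scheduler interface are hypothetical.
import queue

alert_queue: "queue.Queue[dict]" = queue.Queue()

def submit_observation(target: dict) -> None:
    # Stand-in for handing a high-priority request to the telescope scheduler.
    print(f"Scheduling ToO observation: {target['name']} "
          f"RA={target['ra']:.4f} Dec={target['dec']:.4f}")

def alertd(poll_seconds: float = 1.0, max_cycles: int = 3) -> None:
    """Poll for transient alerts and preempt the schedule with them."""
    for _ in range(max_cycles):          # a real daemon would loop forever
        try:
            alert = alert_queue.get(timeout=poll_seconds)
        except queue.Empty:
            continue
        if alert.get("type") in {"GRB", "GAIA", "TRANSIENT"}:
            submit_observation(alert)

alert_queue.put({"type": "GRB", "name": "GRB 150301A", "ra": 244.30, "dec": -48.71})
alertd()
```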

  6. Effects of a Passive Online Software Application on Heart Rate Variability and Autonomic Nervous System Balance.

    PubMed

    Rubik, Beverly

    2017-01-01

    This study investigated whether short-term exposure to a passive online software application of purported subtle energy technology would affect heart rate variability (HRV) and associated autonomic nervous system measures. This was a randomized, double-blinded, sham-controlled clinical trial (RCT). The study took place in a nonprofit laboratory in Emeryville, California. Twenty healthy, nonsmoking subjects (16 females), aged 40-75 years, participated. Quantum Code Technology™ (QCT), a purported subtle energy technology, was delivered through a passive software application (Heart+ App) on a smartphone placed <1 m from subjects who were seated and reading a catalog. HRV was measured for 5 min in triplicate for each condition via finger plethysmography using a Food and Drug Administration medically approved HRV measurement device. Measurements were made at baseline and 35 min following exposure to the software applications. The following parameters were calculated and analyzed: heart rate, total power, standard deviation of normal-to-normal intervals, root mean square of successive differences, low frequency to high frequency ratio (LF/HF), low frequency (LF), and high frequency (HF). Paired samples t-tests showed that for the Heart+ App, mean LF/HF decreased (p = 9.5 × 10⁻⁴), while mean LF decreased in a trend (p = 0.06), indicating reduced sympathetic dominance. The root mean square of successive differences increased for the Heart+ App, showing a possible trend (p = 0.09). Post-pre differences in LF/HF for sham compared with the Heart+ App were also significant (p < 0.008) by independent t-test, indicating clinical relevance. Significant beneficial changes in mean LF/HF, along with possible trends in mean LF and the root mean square of successive differences, were observed in subjects following 35 min of exposure to the Heart+ App working in the background on an active smartphone untouched by the subjects. This may be the first RCT to show that specific frequencies of a purported non-Hertzian type of subtle energy conveyed by software applications broadcast from personal electronic devices can be bioactive and beneficially impact autonomic nervous system balance.
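
    For readers unfamiliar with the HRV measures named above, the following is standard HRV arithmetic rather than the study's software: RMSSD computed from successive RR-interval differences, and an LF/HF ratio computed from band powers of a spectrum (the example data are invented).

```python
# Standard HRV arithmetic (not the study's software): RMSSD from successive
# RR-interval differences, and an LF/HF ratio from band powers of a spectrum.
import numpy as np

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

def lf_hf_ratio(freqs: np.ndarray, psd: np.ndarray) -> float:
    """Ratio of power in the LF (0.04-0.15 Hz) and HF (0.15-0.40 Hz) bands."""
    lf_band = (freqs >= 0.04) & (freqs < 0.15)
    hf_band = (freqs >= 0.15) & (freqs < 0.40)
    lf = np.trapz(psd[lf_band], freqs[lf_band])
    hf = np.trapz(psd[hf_band], freqs[hf_band])
    return float(lf / hf)

rr = np.array([812, 790, 805, 798, 820, 815], dtype=float)  # example RR intervals, ms
print(f"RMSSD = {rmssd(rr):.1f} ms")
```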

  7. Use of Semi-Autonomous Tools for ISS Commanding and Monitoring

    NASA Technical Reports Server (NTRS)

    Brzezinski, Amy S.

    2014-01-01

    As the International Space Station (ISS) has moved into a utilization phase, operations have shifted to become more ground-based, with fewer mission control personnel monitoring and commanding multiple ISS systems. This shift to fewer people monitoring more systems has prompted use of semi-autonomous console tools in the ISS Mission Control Center (MCC) to help flight controllers command and monitor the ISS. These console tools perform routine operational procedures while keeping the human operator "in the loop" to monitor and intervene when off-nominal events arise. Two such tools, the Pre-positioned Load (PPL) Loader and the Automatic Operations Recorder Manager (AutoORM), are used by the ISS Communications RF Onboard Networks Utilization Specialist (CRONUS) flight control position. CRONUS is responsible for simultaneously commanding and monitoring the ISS Command & Data Handling (C&DH) and Communications and Tracking (C&T) systems. PPL Loader is used to uplink small pieces of frequently changed software data tables, called PPLs, to ISS computers to support different ISS operations. In order to uplink a PPL, a data load command must be built that contains multiple user-input fields. Next, a multi-step commanding and verification procedure must be performed to enable an onboard computer for software uplink, uplink the PPL, verify that the PPL has been incorporated correctly, and disable the computer for software uplink. PPL Loader provides different levels of automation in both building and uplinking these commands. In its manual mode, PPL Loader automatically builds the PPL data load commands but allows the flight controller to verify and save the commands for future uplink. In its auto mode, PPL Loader automatically builds the PPL data load commands for flight controller verification, but automatically performs the PPL uplink procedure by sending commands and performing verification checks while notifying CRONUS of procedure step completion. If an off-nominal condition occurs during procedure execution, PPL Loader notifies CRONUS through popup messages, allowing CRONUS to examine the situation and choose how PPL Loader should proceed with the procedure. The use of PPL Loader to perform frequent, routine PPL uplinks frees CRONUS to better monitor the two ISS systems. It also reduces procedure performance time and decreases the risk of command errors. AutoORM identifies ISS communication outage periods and builds commands to lock, playback, and unlock ISS Operations Recorder files. Operations Recorder files are circular-buffer files of continually recorded ISS telemetry data. Sections of these files can be locked from further writing, played back to capture telemetry data recorded during an ISS loss of signal (LOS) period, and then unlocked for future recording use. Downlinked Operations Recorder files are used by mission support teams for data analysis, especially if failures occur during LOS. The commands to lock, playback, and unlock Operations Recorder files are encompassed in three different operational procedures and contain multiple user-input fields. AutoORM provides different levels of automation for building and uplinking the commands to lock, playback, and unlock Operations Recorder files. In its automatic mode, AutoORM automatically detects ISS LOS periods, then generates and uplinks the commands to lock, playback, and unlock Operations Recorder files when MCC regains signal with the ISS. AutoORM also features semi-autonomous and manual modes, which integrate CRONUS more into the command verification and uplink process. AutoORM's ability to automatically detect ISS LOS periods and build the necessary commands to preserve, play back, and release recorded telemetry data frees CRONUS to perform more high-level cognitive tasks, such as mission planning and anomaly troubleshooting. Additionally, since Operations Recorder commands contain numerical time input fields that are tedious for a human to build manually, AutoORM's ability to automatically build commands reduces operational command errors. PPL Loader and AutoORM demonstrate principles of semi-autonomous operational tools that will benefit future space mission operations. Both tools employ different levels of automation to perform simple and routine procedures, thereby freeing human operators to perform higher-level cognitive tasks. Because both tools provide procedure execution status and highlight off-nominal indications, the flight controller is able to intervene during procedure execution if needed. Semi-autonomous tools and systems that can perform routine procedures, yet keep human operators informed of execution, will be essential in future long-duration missions where the onboard crew will be solely responsible for spacecraft monitoring and control.
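
    The "levels of automation" pattern shared by PPL Loader and AutoORM can be sketched generically. The command names, fields, and procedure below are hypothetical, not the actual MCC tools: the tool always builds the commands, while only the auto mode executes the multi-step procedure, pausing for the operator on an off-nominal event.

```python
# Hypothetical sketch of the "levels of automation" pattern described above:
# the tool always builds the commands, but only the auto mode executes the
# multi-step uplink procedure, pausing for the operator on off-nominal events.
from enum import Enum

class Mode(Enum):
    MANUAL = "manual"   # operator reviews, saves, and uplinks the commands
    AUTO = "auto"       # tool executes the procedure, operator monitors

def build_ppl_commands(ppl_name: str, fields: dict) -> list[str]:
    # Stand-in for assembling the data-load commands from user-input fields.
    return [f"ENABLE_UPLINK {fields['computer']}",
            f"LOAD {ppl_name}",
            f"VERIFY {ppl_name}",
            f"DISABLE_UPLINK {fields['computer']}"]

def run_procedure(commands: list[str], mode: Mode,
                  verify=lambda cmd: True, notify=print) -> None:
    if mode is Mode.MANUAL:
        notify("Commands built; operator will review and uplink manually:")
        for cmd in commands:
            notify(f"  {cmd}")
        return
    for cmd in commands:
        notify(f"Sending: {cmd}")
        if not verify(cmd):                      # off-nominal condition
            notify(f"Off-nominal at '{cmd}' -- pausing for operator decision")
            return
    notify("Procedure complete")

cmds = build_ppl_commands("PPL_042", {"computer": "MDM-1"})
run_procedure(cmds, Mode.AUTO)
```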

  8. Synchronization of autonomous objects in discrete event simulation

    NASA Technical Reports Server (NTRS)

    Rogers, Ralph V.

    1990-01-01

    Autonomous objects in event-driven discrete event simulation offer the potential to combine the freedom of unrestricted movement and positional accuracy through Euclidean space of time-driven models with the computational efficiency of event-driven simulation. The principal challenge to autonomous object implementation is object synchronization. The concept of a spatial blackboard is offered as a potential methodology for synchronization. The issues facing implementation of a spatial blackboard are outlined and discussed.
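
    One minimal way to picture a spatial blackboard (an interpretation offered here, not the paper's implementation) is a shared structure to which autonomous objects post predicted positions, with a synchronization event flagged whenever two postings fall within an interaction radius:

```python
# Minimal, hypothetical interpretation of a "spatial blackboard": autonomous
# objects post their predicted positions to a shared board, and a potential
# interaction event is flagged whenever two postings fall within a threshold.
import math
from itertools import combinations

class SpatialBlackboard:
    def __init__(self, interaction_radius: float):
        self.radius = interaction_radius
        self.postings: dict[str, tuple[float, float]] = {}

    def post(self, obj_id: str, x: float, y: float) -> None:
        self.postings[obj_id] = (x, y)

    def pending_interactions(self) -> list[tuple[str, str]]:
        hits = []
        for (a, pa), (b, pb) in combinations(self.postings.items(), 2):
            if math.dist(pa, pb) <= self.radius:
                hits.append((a, b))   # a synchronization event would be scheduled here
        return hits

board = SpatialBlackboard(interaction_radius=5.0)
board.post("vehicle-1", 0.0, 0.0)
board.post("vehicle-2", 3.0, 4.0)   # distance 5.0 -> flagged interaction
print(board.pending_interactions())
```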

  9. Using ANTS to explore small body populations in the solar system.

    NASA Astrophysics Data System (ADS)

    Clark, P. E.; Rilee, M.; Truszkowski, W.; Curtis, S.; Marr, G.; Chapman, C.

    2001-11-01

    ANTS (Autonomous Nano-Technology Swarm), a NASA advanced mission concept, is a large (100 to 1000 member) swarm of pico-class (1 kg) totally autonomous spacecraft that prospect the asteroid belt. Little data is available for asteroids because the vast majority are too small to be observed except in close proximity. Light curves are available for thousands of asteroids, confirmed trajectories for tens of thousands, and detailed shape models for approximately ten. Asteroids originated in the transitional region between the inner (rocky) and outer (solidified gases) solar system. Many have remained largely unmodified since formation, and thus have more primitive composition than planetary surfaces. Determination of the systematic distribution of physical and compositional properties within the asteroid population is crucial to the understanding of solar system formation. The traditional approach of using a few large spacecraft for sequential exploration could be improved. Our far more cost-effective approach utilizes distributed intelligence in a swarm of tiny, highly maneuverable spacecraft, each with specialized instrument capability (e.g., advanced computing, imaging, spectrometry). NASA is at the forefront of Intelligent Software Agents (ISAs) research, performing experiments in space and on the ground to advance deliberative and collaborative autonomous control techniques. The advanced development under consideration here is in the use of ISAs at a strategic level, to explore remote frontiers of the solar system, potentially involving a large class of objects such as asteroids. Supervised clusters of spacecraft operate simultaneously within a broadly defined framework of goals to select targets (> 1000) from among available candidates while developing scenarios for studying targets. Swarm members use solar sails to fly directly to asteroids > 1 kilometer in diameter, and then perform maneuvers appropriate for the instrument carried, ranging from hovering to orbiting. Selected members return with data and are replaced as needed.

  10. System Engineering Paper

    NASA Technical Reports Server (NTRS)

    Heise, James; Hull, Bethanne J.; Bauer, Jonathan; Beougher, Nathan G.; Boe, Caleb; Canahui, Ricardo; Charles, John P.; Cooper, Zachary Davis Job; DeShaw, Mark A.; Fontanella, Luan Gasparetto; hide

    2012-01-01

    The Iowa State University team, Team LunaCY, is composed of the following sub-teams: the main student organization, the Lunabotics Club; a senior mechanical engineering design course, ME 415; a senior multidisciplinary design course, ENGR 466; and a senior design course from Wartburg College in Waverly, Iowa. Team LunaCY designed and fabricated ART-E III, Astra Robotic Tractor-Excavator the Third, for the team's third appearance in the NASA Lunabotics Mining Competition. While designing ART-E III, the team had four main goals for this year's competition: to reduce the total weight of the robot, to increase the amount of regolith simulant mined, to reduce dust, and to make ART-E III autonomous. After extensive design work and research, a final robot design was chosen that met all four of Team LunaCY's goals. One change Team LunaCY made this year was to attend the electrical, computer, and software engineering club fest at Iowa State University to recruit engineering students for the task of making ART-E III autonomous. Team LunaCY chose LabVIEW to program the robot, and various sensors were installed to measure the distance between the robot and its surroundings so that ART-E III could maneuver autonomously. Team LunaCY also built a testing arena in which to test prototypes and ART-E III. To best replicate the competition arena at the Kennedy Space Center, a regolith simulant was made from sand, QuickCrete, and fly ash to cover the floor of the arena. Team LunaCY also installed fans to ventilate the arena and used proper safety attire when working in it. With the additional practice in the testing arena and an innovative robot design, Team LunaCY expects to make a strong appearance at the 2012 NASA Lunabotics Mining Competition.

  11. Issues Regarding the Future Application of Autonomous Systems to Command and Control (C2)

    DTIC Science & Technology

    2015-06-01

    Only fragmentary excerpts of this report's text and reference list are preserved in the record; they mention a Lockheed Martin fleet of land and air drones, the Deep Space 1 mission, and published work on autonomous forex trading agents and algorithmic trading platforms.

  12. Autonomous Mechanical Assembly on the Space Shuttle: An Overview

    NASA Technical Reports Server (NTRS)

    Raibert, M. H.

    1979-01-01

    The space shuttle will be equipped with a pair of 50 ft. manipulators used to handle payloads and to perform mechanical assembly operations. Although current plans call for these manipulators to be operated by a human teleoperator, the possibility of using results from robotics and machine intelligence to automate this shuttle assembly system was investigated. The major components of an autonomous mechanical assembly system are examined, along with the technology base upon which they depend. The state of the art in advanced automation is also assessed.

  13. Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.

    2007-01-01

    This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and the secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability among integrated, intelligent sensor webs, both terrestrial and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.

  14. An AI Approach to Ground Station Autonomy for Deep Space Communications

    NASA Technical Reports Server (NTRS)

    Fisher, Forest; Estlin, Tara; Mutz, Darren; Paal, Leslie; Law, Emily; Stockett, Mike; Golshan, Nasser; Chien, Steve

    1998-01-01

    This paper describes an architecture for an autonomous deep space tracking station (DS-T). The architecture targets fully automated routine operations encompassing scheduling and resource allocation, antenna and receiver predict generation, track procedure generation from service requests, and closed-loop control and error recovery for the station subsystems. This architecture has been validated by the construction of a prototype DS-T station, which has performed a series of demonstrations of autonomous ground station control for downlink services with NASA's Mars Global Surveyor (MGS).

  15. An autonomous payload controller for the Space Shuttle

    NASA Technical Reports Server (NTRS)

    Hudgins, J. I.

    1979-01-01

    The Autonomous Payload Control (APC) system discussed in the present paper was designed on the basis of such criteria as minimal cost of implementation, minimal space required in the flight-deck area, simple operation with verification of the results, minimal additional weight, minimal impact on Orbiter design, and minimal impact on Orbiter payload integration. In its present configuration, the APC provides a means for the Orbiter crew to control as many as 31 autonomous payloads. The avionics and human engineering aspects of the system are discussed.

  16. First Results from a Hardware-in-the-Loop Demonstration of Closed-Loop Autonomous Formation Flying

    NASA Technical Reports Server (NTRS)

    Gill, E.; Naasz, Bo; Ebinuma, T.

    2003-01-01

    A closed-loop system for the demonstration of formation flying technologies has been developed at NASA's Goddard Space Flight Center. Making use of a GPS signal simulator with a dual radio frequency outlet, the system includes two GPS space receivers as well as a powerful onboard navigation processor dedicated to the GPS-based guidance, navigation, and control of a satellite formation in real time. The closed-loop system allows realistic simulations of autonomous formation flying scenarios, enabling research in the fields of tracking and orbit control strategies for a wide range of applications. A sample scenario has been set up in which the autonomous transition of a satellite formation from an initial along-track separation of 800 m to a final distance of 100 m has been demonstrated. As a result, a typical control accuracy of about 5 m has been achieved, which demonstrates the applicability of autonomous formation flying techniques to formations of satellites as close as 50 m.
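
    Although not taken from the paper, a standard relation helps frame the maneuver: for near-circular orbits a small semi-major-axis offset Δa produces an along-track drift of roughly 3πΔa per revolution, so a target closure distance and time fix the offset a controller must command. A back-of-the-envelope sketch (illustrative values only):

```python
# Back-of-the-envelope drift arithmetic (illustrative, not the paper's GN&C
# algorithm): along-track drift per orbit ~= 3*pi*delta_a for near-circular
# orbits, so a target closure distance and time fix the commanded offset.
import math

MU = 3.986004418e14          # Earth's gravitational parameter, m^3/s^2

def delta_a_for_closure(gap_m: float, orbits: float) -> float:
    """Semi-major-axis offset needed to close an along-track gap in N orbits."""
    return gap_m / (3.0 * math.pi * orbits)

def orbit_period(a_m: float) -> float:
    return 2.0 * math.pi * math.sqrt(a_m ** 3 / MU)

a = 6_378_137.0 + 500e3                   # ~500 km altitude circular orbit
gap = 800.0 - 100.0                       # along-track separation to remove, m
da = delta_a_for_closure(gap, orbits=10)
print(f"Required semi-major-axis offset: {da:.1f} m")
print(f"Closure time: {10 * orbit_period(a) / 3600:.1f} h")
```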

  17. Science, technology and the future of small autonomous drones.

    PubMed

    Floreano, Dario; Wood, Robert J

    2015-05-28

    We are witnessing the advent of a new era of robots - drones - that can autonomously fly in natural and man-made environments. These robots, often associated with defence applications, could have a major impact on civilian tasks, including transportation, communication, agriculture, disaster mitigation and environment preservation. Autonomous flight in confined spaces presents great scientific and technical challenges owing to the energetic cost of staying airborne and to the perceptual intelligence required to negotiate complex environments. We identify scientific and technological advances that are expected to translate, within appropriate regulatory frameworks, into pervasive use of autonomous drones for civilian applications.

  18. Autonomous cloud based site monitoring through hydro geophysical data assimilation, processing and result delivery

    NASA Astrophysics Data System (ADS)

    Versteeg, R.; Johnson, D. V.; Rodzianko, A.; Zhou, H.; Dafflon, B.; Leger, E.; de Kleine, M.

    2017-12-01

    Understanding of processes in the shallow subsurface requires that geophysical, biogeochemical, hydrological, and remote sensing datasets be assimilated, processed, and interpreted. Multiple enabling software capabilities for process understanding have been developed by the science community. These include information models (ODM2), reactive transport modeling (PFLOTRAN, Modflow, CLM, Landlab), geophysical inversion (E4D, BERT), parameter estimation (PEST, DAKOTA), and visualization (VisIt, Paraview, D3, QGIS), as well as numerous tools written in Python and R for petrophysical mapping, stochastic modeling, data analysis, and so on. These capabilities use data collected with sensors and analytical tools from multiple manufacturers, which produce many different measurements. While scientists obviously leverage tools, capabilities, and lessons learned from one site at other sites, the current approach to site characterization and monitoring is very labor intensive and does not scale well. Our objective is to be able to monitor many (hundreds to thousands of) sites. This requires that monitoring can be done in a near-real-time, affordable, auditable, and essentially autonomous manner. For this we have developed a modular, vertically integrated, cloud-based software framework designed from the ground up for effective site and process monitoring. This software framework (PAF - Predictive Assimilation Framework) is multitenant software and provides automation of the ingestion, processing, and visualization of hydrological, geochemical, and geophysical (ERT/DTS) data. The core organizational element of PAF is the project/user pair, in which the capabilities available to users are controlled by a combination of available data and access permissions. All PAF capabilities are exposed through APIs, making it easy to quickly add new components. PAF is fully integrated with newly developed autonomous electrical geophysical hardware and thus allows automation of electrical geophysical ingestion and processing, as well as co-analysis and visualization of the raw and processed data with other data of interest (e.g., soil temperature, soil moisture, precipitation). We will demonstrate current PAF capabilities and discuss future efforts.

  19. [Micron]ADS-B Detect and Avoid Flight Tests on Phantom 4 Unmanned Aircraft System

    NASA Technical Reports Server (NTRS)

    Arteaga, Ricardo; Dandachy, Mike; Truong, Hong; Aruljothi, Arun; Vedantam, Mihir; Epperson, Kraettli; McCartney, Reed

    2018-01-01

    Researchers at the National Aeronautics and Space Administration Armstrong Flight Research Center in Edwards, California and Vigilant Aerospace Systems collaborated for the flight-test demonstration of an Automatic Dependent Surveillance-Broadcast based collision avoidance technology on a small unmanned aircraft system equipped with the uAvionix Automatic Dependent Surveillance-Broadcast transponder. The purpose of the testing was to demonstrate that National Aeronautics and Space Administration / Vigilant software and algorithms, commercialized as FlightHorizon UAS™, are compatible with uAvionix hardware systems and the DJI Phantom 4 small unmanned aircraft system. The testing and demonstrations were necessary for both parties to further develop and certify the technology in three key areas: flights beyond visual line of sight, collision avoidance, and autonomous operations. The National Aeronautics and Space Administration and Vigilant Aerospace Systems have developed and successfully flight-tested an Automatic Dependent Surveillance-Broadcast Detect and Avoid system on the Phantom 4 small unmanned aircraft system. The Automatic Dependent Surveillance-Broadcast Detect and Avoid system architecture is especially suited for small unmanned aircraft systems because it integrates: 1) miniaturized Automatic Dependent Surveillance-Broadcast hardware; 2) radio data-link communications; 3) software algorithms for real-time Automatic Dependent Surveillance-Broadcast data integration, conflict detection, and alerting; and 4) a synthetic vision display using a fully integrated National Aeronautics and Space Administration geobrowser for three-dimensional graphical representations for ownship and air traffic situational awareness. The flight-test objectives were to evaluate the performance of the Automatic Dependent Surveillance-Broadcast Detect and Avoid collision avoidance technology as installed on two small unmanned aircraft systems. In December 2016, four flight tests were conducted at Edwards Air Force Base. Researchers in the ground control station looking at displays were able to verify the Automatic Dependent Surveillance-Broadcast target detection and collision avoidance resolutions.
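
    A common textbook form of conflict detection from ADS-B tracks is a closest-point-of-approach (CPA) test on straight-line extrapolations. The sketch below illustrates that generic test with invented thresholds; it is not the FlightHorizon algorithm.

```python
# Generic closest-point-of-approach (CPA) conflict test on two straight-line
# tracks -- an illustration of ADS-B-based conflict detection, not the
# FlightHorizon algorithm itself.
import numpy as np

def cpa(p_own, v_own, p_intruder, v_intruder):
    """Return (time_to_cpa_s, miss_distance_m) for constant-velocity tracks."""
    dp = np.asarray(p_intruder, float) - np.asarray(p_own, float)
    dv = np.asarray(v_intruder, float) - np.asarray(v_own, float)
    dv2 = float(np.dot(dv, dv))
    t = 0.0 if dv2 == 0.0 else max(0.0, -float(np.dot(dp, dv)) / dv2)
    miss = float(np.linalg.norm(dp + dv * t))
    return t, miss

# Ownship heading east at 15 m/s; intruder 400 m north heading south at 20 m/s.
t, miss = cpa([0, 0], [15, 0], [0, 400], [0, -20])
alert = miss < 150.0 and t < 30.0   # hypothetical well-clear thresholds
print(f"CPA in {t:.1f} s, miss distance {miss:.1f} m, alert={alert}")
```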

  20. Flight Test Results from the Low Power Transceiver Communications and Navigation Demonstration on Shuttle (CANDOS)

    NASA Technical Reports Server (NTRS)

    Rush, John; Israel, David; Harlacher, Marc; Haas, Lin

    2003-01-01

    The Low Power Transceiver (LPT) is an advanced signal processing platform that offers a configurable and reprogrammable capability for supporting communications, navigation, and sensor functions for mission applications ranging from spacecraft TT&C and autonomous orbit determination to sophisticated networks that use crosslinks to support communications and real-time relative navigation for formation flying. The LPT is the result of extensive collaborative research under NASA GSFC's Advanced Technology Program and ITT Industries' internal research and development efforts. Its modular, multi-channel design currently enables transmitting and receiving communication signals on L- or S-band frequencies and processing GPS L-band signals for precision navigation. The LPT flew as part of the GSFC Hitchhiker payload named Fast Reaction Experiments Enabling Science Technology And Research (FREESTAR) on board Space Shuttle Columbia's final mission. The experiment demonstrated functionality in GPS-based navigation and orbit determination, NASA STDN Ground Network communications, space relay communications via the NASA TDRSS, on-orbit reconfiguration of the software radio, the use of the Internet Protocol (IP) for TT&C, and communication concepts for space-based range safety. All data from the experiment were recovered and, as a result, all primary and secondary objectives of the experiment were met. This paper presents the results of the LPT's maiden space flight as part of STS-107.

  1. An Autonomous Autopilot Control System Design for Small-Scale UAVs

    NASA Technical Reports Server (NTRS)

    Ippolito, Corey; Pai, Ganeshmadhav J.; Denney, Ewen W.

    2012-01-01

    This paper describes the design and implementation of a fully autonomous and programmable autopilot system for small-scale autonomous unmanned aerial vehicle (UAV) aircraft. This system was implemented in Reflection and has flown on the Exploration Aerial Vehicle (EAV) platform at NASA Ames Research Center, currently only as a safety backup for an experimental autopilot. The EAV and ground station are built on a component-based architecture called the Reflection Architecture. The Reflection Architecture is a prototype for a real-time embedded plug-and-play avionics system architecture which provides a transport layer for real-time communications between hardware and software components, allowing each component to focus solely on its implementation. The autopilot module described here, although developed in Reflection, contains no design elements dependent on this architecture.
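
    The transport-layer idea described above can be illustrated with a minimal publish/subscribe sketch. Component, topic, and field names are hypothetical and do not reflect the Reflection Architecture's actual API.

```python
# Minimal publish/subscribe "transport layer" sketch of the plug-and-play idea
# described above; component and topic names are hypothetical, not the
# Reflection Architecture's actual API.
from collections import defaultdict
from typing import Callable

class Transport:
    def __init__(self):
        self._subs: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, msg: dict) -> None:
        for handler in self._subs[topic]:
            handler(msg)

bus = Transport()

# An "autopilot" component that only knows topics, not other components.
def autopilot_on_state(msg: dict) -> None:
    error = msg["altitude_cmd"] - msg["altitude"]
    bus.publish("elevator_cmd", {"deflection_deg": 0.05 * error})

bus.subscribe("vehicle_state", autopilot_on_state)
bus.subscribe("elevator_cmd", lambda m: print(f"servo <- {m['deflection_deg']:.2f} deg"))
bus.publish("vehicle_state", {"altitude": 95.0, "altitude_cmd": 120.0})
```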

  2. Using Planning, Scheduling and Execution for Autonomous Mars Rover Operations

    NASA Technical Reports Server (NTRS)

    Estlin, Tara A.; Gaines, Daniel M.; Chouinard, Caroline M.; Fisher, Forest W.; Castano, Rebecca; Judd, Michele J.; Nesnas, Issa A.

    2006-01-01

    With each new rover mission to Mars, rovers are traveling significantly longer distances. This increase in distance not only raises the opportunities for science data collection, but also amplifies the amount of environment and rover state uncertainty that must be handled in rover operations. This paper describes how planning, scheduling, and execution techniques can be used onboard a rover to autonomously generate and execute rover activities, and in particular to handle new science opportunities that have been identified dynamically. We also discuss some of the particular challenges we face in supporting autonomous rover decision-making. These include interaction with rover navigation and path-planning software and handling large amounts of uncertainty in state and resource estimations. Finally, we describe our experiences in testing this work using several Mars rover prototypes in a realistic environment.
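
    The opportunistic-replanning idea can be reduced to a toy example: accept a dynamically identified science activity only if it fits the remaining resource budget, otherwise trade it against the lowest-priority planned activity. The numbers and activity names below are invented; this is not the onboard planner described in the paper.

```python
# Toy sketch of opportunistic onboard replanning: a new science activity is
# accepted only if it fits within the remaining energy budget, otherwise the
# lowest-priority planned activity is dropped to make room. Hypothetical
# numbers; this is not the mission planner described in the paper.
from dataclasses import dataclass

@dataclass
class Activity:
    name: str
    energy_wh: float
    priority: int          # higher = more important

def replan(plan: list[Activity], new: Activity, budget_wh: float) -> list[Activity]:
    used = sum(a.energy_wh for a in plan)
    if used + new.energy_wh <= budget_wh:
        return plan + [new]
    victim = min(plan, key=lambda a: a.priority)
    if victim.priority < new.priority and \
       used - victim.energy_wh + new.energy_wh <= budget_wh:
        return [a for a in plan if a is not victim] + [new]
    return plan            # opportunity rejected

plan = [Activity("drive_to_waypoint", 120, 5), Activity("atmospheric_obs", 40, 2)]
plan = replan(plan, Activity("image_novel_rock", 35, 4), budget_wh=170)
print([a.name for a in plan])
```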

  3. Autonomous intelligent assembly systems LDRD 105746 final report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Robert J.

    2013-04-01

    This report documents a three-year effort to develop technology that enables mobile robots to perform autonomous assembly tasks in unstructured outdoor environments. This is a multi-tier problem that requires the integration of a large number of different software technologies, including command and control, estimation and localization, distributed communications, object recognition, pose estimation, real-time scanning, and scene interpretation. Although ultimately unsuccessful in achieving the target brick-stacking task autonomously, the project nevertheless developed numerous important component technologies. These include a patent-pending polygon snake algorithm for robust feature tracking, a color grid algorithm for unique identification and calibration, a command and control framework for abstracting robot commands, a scanning capability that utilizes a compact robot-portable scanner, and more. This report describes the project and these developed technologies.

  4. Closing the Gap between the Inside and the Outside: Interoceptive Sensitivity and Social Distances

    PubMed Central

    Ambrosecchia, Marianna; Gallese, Vittorio

    2013-01-01

    Humans’ ability to represent their body state from within through interoception has been proposed to predict different aspects of human cognition and behaviour. We focused on the possible contribution of interoceptive sensitivity to social behaviour as mediated by adaptive modulation of autonomic response. We, thus, investigated whether interoceptive sensitivity to one's heartbeat predicts participants' autonomic response at different social distances. We measured respiratory sinus arrhythmia (RSA) during either a Social or a Non-social task. In the Social task each participant viewed an experimenter performing a caress-like movement at different distances from their hand. In the Non-social task a metal stick was moved at the same distances from the participant's hand. We found a positive association between interoceptive sensitivity and autonomic response only for the social setting. Moreover, only good heartbeat perceivers showed higher autonomic response 1) in the social compared to the non-social setting, 2) specifically, when the experimenter's hand was moving at boundary of their peripersonal space (20 cm from the participant's hand). Our findings suggest that interoceptive sensitivity might contribute to interindividual differences concerning social attitudes and interpersonal space representation via recruitment of different adaptive autonomic response strategies. PMID:24098397

  5. About possibilities of clearing near-Earth space from dangerous debris by a spaceborne laser system with an autonomous cw chemical HF laser

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Avdeev, A V; Bashkin, A S; Katorgin, Boris I

    2011-07-31

    The possibility of clearing hazardous near-Earth space debris using a spaceborne laser station with a large autonomous cw chemical HF laser is substantiated, and the requirements on its characteristics (i.e., the power and divergence of the laser radiation, the pulse duration in the repetitively pulsed regime, and the repetition rate and total time of laser action on space debris necessary to remove it from the orbits of the protected spacecraft) are determined. The possibility of launching the proposed spaceborne laser station into orbit with the help of a 'Proton-M' carrier rocket is considered.

  6. Building intelligent systems: Artificial intelligence research at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Friedland, P.; Lum, H.

    1987-01-01

    The basic components that make up the goal of building autonomous intelligent systems are discussed, and ongoing work at the NASA Ames Research Center is described. It is noted that a clear progression of systems can be seen through research settings (both within and external to NASA) to Space Station testbeds to systems which actually fly on the Space Station. The starting point for the discussion is a truly autonomous Space Station intelligent system, responsible for a major portion of Space Station control. Attention is given to research in fiscal 1987, including reasoning under uncertainty, machine learning, causal modeling and simulation, knowledge from design through operations, advanced planning work, validation methodologies, and hierarchical control of and distributed cooperation among multiple knowledge-based systems.

  7. Building intelligent systems - Artificial intelligence research at NASA Ames Research Center

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Lum, Henry

    1987-01-01

    The basic components that make up the goal of building autonomous intelligent systems are discussed, and ongoing work at the NASA Ames Research Center is described. It is noted that a clear progression of systems can be seen through research settings (both within and external to NASA) to Space Station testbeds to systems which actually fly on the Space Station. The starting point for the discussion is a 'truly' autonomous Space Station intelligent system, responsible for a major portion of Space Station control. Attention is given to research in fiscal 1987, including reasoning under uncertainty, machine learning, causal modeling and simulation, knowledge from design through operations, advanced planning work, validation methodologies, and hierarchical control of and distributed cooperation among multiple knowledge-based systems.

  8. Teaching practice and effect of the curriculum design and simulation courses under the support of professional optical software

    NASA Astrophysics Data System (ADS)

    Lin, YuanFang; Zheng, XiaoDong; Huang, YuJia

    2017-08-01

    Curriculum design and simulation courses are bridges connecting specialty theories, engineering practice, and experimental skills. In order to help students develop the computer-aided optical system design ability demanded by the times, a professional optical software package, the Advanced Systems Analysis Program (ASAP), was used in the research-oriented teaching of curriculum design and simulation courses. Conducting ASAP tutorials, exercises that both complement and supplement the lectures, hands-on practice in class, and autonomous learning and independent design after class were bridged organically to guide students in "learning while doing, learning by doing," paying more attention to the process than to the results. Several years of teaching practice of the curriculum design and simulation courses shows that project-based learning meets society's need for training personnel with knowledge, ability, and quality. Students have obtained not only skills in using professional software, but also skills in finding and proposing questions in engineering practice and the scientific method of analyzing and solving questions with specialty knowledge, in addition to autonomous learning ability, teamwork spirit, innovation consciousness, a scientific attitude toward facing failure, and the scientific spirit of admitting deficiency in the process of independent design and exploration.

  9. Development of autonomous gamma dose logger for environmental monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jisha, N. V.; Krishnakumar, D. N.; Surya Prakash, G.

    2012-03-15

    Continuous monitoring and archiving of background radiation levels in and around a nuclear installation is essential, and the data would be of immense use during analysis of any untoward incidents. A portable Geiger-Muller detector based autonomous gamma dose logger (AGDL) for environmental monitoring was indigenously designed and developed. The system operations are controlled by a microcontroller (AT89S52), and the main features of the system are software data acquisition, real-time LCD display of the radiation level, and data archiving on a removable compact flash card. The complete system operates on a 12 V battery backed up by a solar panel and hence is totally portable and ideal for field use. The system has been calibrated with a Co-60 source (8.1 MBq) at various source-detector distances. The system has been field tested and a performance evaluation carried out. This paper covers the design considerations of the hardware and the software architecture of the system, along with details of the front-end operation of the autonomous gamma dose logger and the data file formats. The data gathered during field testing and intercomparison with a GammaTRACER are also presented in the paper. The AGDL has shown excellent correlation with an energy fluence monitor tuned to identify 41Ar, proving its utility for real-time plume tracking and source term estimation.

  10. Development of autonomous gamma dose logger for environmental monitoring

    NASA Astrophysics Data System (ADS)

    Jisha, N. V.; Krishnakumar, D. N.; Surya Prakash, G.; Kumari, Anju; Baskaran, R.; Venkatraman, B.

    2012-03-01

    Continuous monitoring and archiving of background radiation levels in and around a nuclear installation is essential, and the data would be of immense use during analysis of any untoward incidents. A portable Geiger-Muller detector based autonomous gamma dose logger (AGDL) for environmental monitoring was indigenously designed and developed. The system operations are controlled by a microcontroller (AT89S52), and the main features of the system are software data acquisition, real-time LCD display of the radiation level, and data archiving on a removable compact flash card. The complete system operates on a 12 V battery backed up by a solar panel and hence is totally portable and ideal for field use. The system has been calibrated with a Co-60 source (8.1 MBq) at various source-detector distances. The system has been field tested and a performance evaluation carried out. This paper covers the design considerations of the hardware and the software architecture of the system, along with details of the front-end operation of the autonomous gamma dose logger and the data file formats. The data gathered during field testing and intercomparison with a GammaTRACER are also presented in the paper. The AGDL has shown excellent correlation with an energy fluence monitor tuned to identify 41Ar, proving its utility for real-time plume tracking and source term estimation.

  11. Minimal support technology and in situ resource utilization for risk management of planetary spaceflight missions

    NASA Astrophysics Data System (ADS)

    Murphy, K. L.; Rygalov, V. Ye.; Johnson, S. B.

    2009-04-01

    All artificial systems and components in space degrade at higher rates than on Earth, depending in part on environmental conditions, design approach, assembly technologies, and the materials used. This degradation involves not only the hardware and software systems but the humans that interact with those systems. All technological functions and systems can be expressed through the functional dependence [Function] ~ ([ERU] × [RUIS] × [ISR]) / [DR], where [ERU] is the efficiency (rate) of environmental resource utilization, [RUIS] the resource utilization infrastructure, [ISR] the in situ resources, and [DR] the degradation rate. The limited resources of spaceflight and open space for autonomous missions require a high reliability (the maximum possible, approaching 100%) for system functioning and operation, and must minimize the rate of any system degradation. To date, only a continuous human presence with a system in the spaceflight environment can absolutely mitigate those degradations. This mitigation is based on environmental amelioration for both the technology systems, such as repairs and spare parts, and the humans, such as exercise and psychological support. Such maintenance now requires huge infrastructures, including research and development complexes and management agencies, which currently cannot move beyond the Earth. When considering what is required to move manned spaceflight from near-Earth stations to remote locations such as Mars, what are the minimal technologies and infrastructures necessary for autonomous restoration of a degrading system in space? In all of the known system factors of a mission to Mars that reduce the mass load, increase the reliability, and reduce the mission's overall risk, the current common denominator is the use of undeveloped or untested technologies. None of the technologies required to significantly reduce the risk for critical systems are currently available at acceptable readiness levels. Long-term interplanetary missions require that space programs produce a craft with all systems integrated so that they are of the highest reliability. Right now, with current technologies, we cannot guarantee this reliability for a crew of six for 1000 days to Mars and back. Investigation of the technologies to answer this need and a focus of resources and research on their advancement would significantly improve the chances for a safe and successful mission.

  12. A unifying framework for systems modeling, control systems design, and system operation

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel L.; Indictor, Mark B.; Ingham, Michel D.; Rasmussen, Robert D.; Stringfellow, Margaret V.

    2005-01-01

    Current engineering practice in the analysis and design of large-scale multi-disciplinary control systems is typified by some form of decomposition, whether functional, physical, or discipline-based, that enables multiple teams to work in parallel and in relative isolation. Too often, the resulting system after integration is an awkward marriage of different control and data mechanisms with poor end-to-end accountability. System of systems engineering, which faces this problem on a large scale, cries out for a unifying framework to guide analysis, design, and operation. This paper describes such a framework based on a state-, model-, and goal-based architecture for semi-autonomous control systems that guides analysis and modeling, shapes control system software design, and directly specifies operational intent. This paper illustrates the key concepts in the context of a large-scale, concurrent, globally distributed system of systems: NASA's proposed Array-based Deep Space Network.

  13. Complex scenes and situations visualization in hierarchical learning algorithm with dynamic 3D NeoAxis engine

    NASA Astrophysics Data System (ADS)

    Graham, James; Ternovskiy, Igor V.

    2013-06-01

    We applied a two-stage unsupervised hierarchical learning system to model complex dynamic surveillance and cyberspace monitoring systems using a non-commercial version of the NeoAxis visualization software. The hierarchical scene learning and recognition approach is based on hierarchical expectation maximization, and was linked to a 3D graphics engine for validation of learning and classification results and for understanding the human - autonomous system relationship. Scene recognition is performed by taking synthetically generated data and feeding it to a dynamic logic algorithm. The algorithm performs hierarchical recognition of the scene by first examining the features of the objects to determine which objects are present, and then determining the scene based on the objects present. This paper presents a framework within which low-level data linked to higher-level visualization can provide support to a human operator and be evaluated in a detailed and systematic way.
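
    A much-simplified sketch of the two-stage idea (features determine objects, objects determine the scene) is given below with invented rules; the paper's dynamic logic and expectation-maximization machinery are not reproduced.

```python
# Simplified two-stage sketch of the hierarchical idea (features -> objects,
# objects -> scene). The rules are hypothetical; the paper's dynamic logic /
# expectation-maximization machinery is not reproduced here.
OBJECT_RULES = {
    ("wings", "rotor"): "uav",
    ("wheels", "turret"): "vehicle",
    ("legs", "torso"): "person",
}
SCENE_RULES = {
    frozenset({"uav", "vehicle"}): "perimeter_surveillance",
    frozenset({"person", "vehicle"}): "checkpoint",
}

def recognize_objects(feature_sets):
    objects = set()
    for features in feature_sets:
        for key, label in OBJECT_RULES.items():
            if set(key) <= set(features):
                objects.add(label)
    return objects

def recognize_scene(objects):
    return SCENE_RULES.get(frozenset(objects), "unknown")

detections = [{"wings", "rotor", "camera"}, {"wheels", "turret"}]
objs = recognize_objects(detections)
print(objs, "->", recognize_scene(objs))
```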

  14. National space transportation systems planning

    NASA Technical Reports Server (NTRS)

    Lucas, W. R.

    1985-01-01

    In the fall of 1984, the DOD and NASA had been asked to identify launch vehicle technologies which could be made available for use in 1995 to 2010. The results of the studies of the two groups were integrated, and a report, dated December 1984, was forwarded to the President. Aspects of mission planning and analysis are discussed, along with a combined mission model, future launch system requirements, a launch vehicle planning background, Shuttle derivative vehicle program options, payload modularization, launch vehicle technology implications, and a new engine program for the mid-1990s. Future launch system goals are to achieve an order-of-magnitude reduction in future launch cost and to meet the lift requirements and launch rates. Attention is given to an advanced cryogenic engine, an advanced LOX/hydrocarbon engine, advanced power systems, aerodynamics/flight mechanics, reentry/recovery systems, avionics/software, advanced manufacturing techniques, autonomous ground and mission operations, advanced structures/materials, and air-breathing propulsion.

  15. Expanding the KATE toolbox

    NASA Technical Reports Server (NTRS)

    Thomas, Stan J.

    1993-01-01

    KATE (Knowledge-based Autonomous Test Engineer) is a model-based software system developed in the Artificial Intelligence Laboratory at the Kennedy Space Center for monitoring, fault detection, and control of launch vehicles and ground support systems. In order to bring KATE to the level of performance, functionality, and integrability needed for firing room applications, efforts are underway to implement KATE in the C++ programming language using an X-windows interface. Two programs designed and added to the collection of tools that comprise the KATE toolbox are described. The first tool, called the schematic viewer, gives the KATE user the capability to view digitized schematic drawings in the KATE environment. The second tool, called the model editor, gives the KATE model builder a tool for creating and editing knowledge base files. Design and implementation issues having to do with these two tools are discussed. This discussion will be useful to anyone maintaining or extending either the schematic viewer or the model editor.

  16. Space Station Software Recommendations

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor)

    1985-01-01

    Four panels of invited experts and NASA representatives focused on the following topics: software management, software development environment, languages, and software standards. Each panel deliberated in private, held two open sessions with audience participation, and developed recommendations for the NASA Space Station Program. The major thrusts of the recommendations were as follows: (1) The software management plan should establish policies, responsibilities, and decision points for software acquisition; (2) NASA should furnish a uniform modular software support environment and require its use for all space station software acquired (or developed); (3) The language Ada should be selected for space station software, and NASA should begin to address issues related to the effective use of Ada; and (4) The space station software standards should be selected (based upon existing standards where possible), and an organization should be identified to promulgate and enforce them. These and related recommendations are described in detail in the conference proceedings.

  17. Space Station Software Issues

    NASA Technical Reports Server (NTRS)

    Voigt, S. (Editor); Beskenis, S. (Editor)

    1985-01-01

    Issues in the development of software for the Space Station are discussed. Software acquisition and management, software development environment, standards, information system support for software developers, and a future software advisory board are addressed.

  18. Multidisciplinary Concurrent Design Optimization via the Internet

    NASA Technical Reports Server (NTRS)

    Woodard, Stanley E.; Kelkar, Atul G.; Koganti, Gopichand

    2001-01-01

    A methodology is presented which uses commercial design and analysis software and the Internet to perform concurrent multidisciplinary optimization. The methodology provides a means to develop multidisciplinary designs without requiring that all software be accessible from the same local network. The procedures are amenable to design and development teams whose members, expertise and respective software are not geographically located together. This methodology facilitates multidisciplinary teams working concurrently on a design problem of common interest. Partition of design software to different machines allows each constituent software to be used on the machine that provides the most economy and efficiency. The methodology is demonstrated on the concurrent design of a spacecraft structure and attitude control system. Results are compared to those derived from performing the design with an autonomous FORTRAN program.

  19. A hazard control system for robot manipulators

    NASA Technical Reports Server (NTRS)

    Carter, Ruth Chiang; Rad, Adrian

    1991-01-01

    A robot for space applications will be required to complete a variety of tasks in an uncertain, harsh environment. This fact presents unusual and highly difficult challenges to ensuring the safety of astronauts and keeping the equipment they depend on from becoming damaged. The systematic approach being taken to control hazards that could result from introducing robotics technology in the space environment is described. First, system safety management and engineering principles, techniques, and requirements are discussed as they relate to Shuttle payload design and operation in general. The concepts of hazard, hazard category, and hazard control, as defined by the Shuttle payload safety requirements, are explained. Next, it is shown how these general safety management and engineering principles are being implemented on an actual project. An example is presented of a hazard control system for controlling one of the hazards identified for the Development Test Flight (DTF-1) of NASA's Flight Telerobotic Servicer, a teleoperated space robot. How these schemes can be applied to terrestrial robots is discussed as well. The same software monitoring and control approach will ensure the safe operation of a slave manipulator under teleoperated or autonomous control in undersea, nuclear, or manufacturing applications where the manipulator is working in the vicinity of humans or critical hardware.

  20. Augmentation of the space station module power management and distribution breadboard

    NASA Technical Reports Server (NTRS)

    Walls, Bryan; Hall, David K.; Lollar, Louis F.

    1991-01-01

    The space station module power management and distribution (SSM/PMAD) breadboard models power distribution and management, including scheduling, load prioritization, and a fault detection, identification, and recovery (FDIR) system within a Space Station Freedom habitation or laboratory module. This 120 VDC system is capable of distributing up to 30 kW of power among more than 25 loads. In addition to the power distribution hardware, the system includes computer control through a hierarchy of processes. The lowest level consists of fast, simple (from a computing standpoint) switchgear that is capable of quickly safing the system. At the next level are local load center processors (LLPs), which execute load scheduling, perform redundant switching, and shed loads which use more than scheduled power. Above the LLPs are three cooperating artificial intelligence (AI) systems which manage load prioritizations, load scheduling, load shedding, and fault recovery and management. Recent upgrades to hardware and modifications to software at both the LLP and AI system levels promise a drastic increase in speed, a significant increase in functionality and reliability, and potential for further examination of advanced automation techniques. The background, the SSM/PMAD, the interface to the Lewis Research Center test bed, the large autonomous spacecraft electrical power system, and future plans are discussed.
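
    The load-shedding behavior attributed to the LLPs can be illustrated with a small priority-based routine. Load names, powers, and priorities are hypothetical; the actual SSM/PMAD software is not reproduced here.

```python
# Hypothetical priority-based load-shedding sketch of the LLP behavior
# described above: when demand exceeds the scheduled allocation, the
# lowest-priority loads are switched off first.
from dataclasses import dataclass

@dataclass
class Load:
    name: str
    power_w: float
    priority: int           # higher = keep longer
    on: bool = True

def shed_loads(loads: list[Load], allocation_w: float) -> None:
    demand = sum(l.power_w for l in loads if l.on)
    for load in sorted(loads, key=lambda l: l.priority):
        if demand <= allocation_w:
            break
        if load.on:
            load.on = False
            demand -= load.power_w
            print(f"Shedding {load.name} ({load.power_w} W)")

bank = [Load("life_support_fan", 300, 9), Load("experiment_rack", 800, 4),
        Load("galley_heater", 500, 2)]
shed_loads(bank, allocation_w=1100)
print([l.name for l in bank if l.on])
```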

  1. Ground crewmen help guide the alignment of the X-40A as the experimental craft is gently lowered to the ground by a U.S. Army CH-47 Chinook helicopter following a captive-carry test flight

    NASA Image and Video Library

    2000-12-08

    Ground crewmen help guide the alignment of the X-40 technology demonstrator as the experimental craft is gently lowered to the ground by a U.S. Army CH-47 Chinook cargo helicopter following a captive-carry test flight at NASA's Dryden Flight Research Center, Edwards, California. The X-40 is an unpowered 82 percent scale version of the X-37, a Boeing-developed spaceplane designed to demonstrate various advanced technologies for development of future lower-cost access to space vehicles. The X-37 will be carried into space aboard a space shuttle and then released to perform various maneuvers and a controlled re-entry through the Earth's atmosphere to an airplane-style landing on a runway, controlled entirely by pre-programmed computer software. Following a series of captive-carry flights, the X-40 made several free flights from a launch altitude of about 15,000 feet above ground, gliding to a fully autonomous landing. The captive carry flights helped verify the X-40's navigation and control systems, rigging angles for its sling, and stability and control of the helicopter while carrying the X-40 on a tether.

  2. Supervisory autonomous local-remote control system design: Near-term and far-term applications

    NASA Technical Reports Server (NTRS)

    Zimmerman, Wayne; Backes, Paul

    1993-01-01

    The JPL Supervisory Telerobotics Laboratory (STELER) has developed a unique local-remote robot control architecture which enables management of intermittent bus latencies and communication delays such as those expected for ground-remote operation of Space Station robotic systems via the TDRSS communication platform. At the local site, the operator updates the work site world model using stereo video feedback and a model overlay/fitting algorithm which outputs the location and orientation of the object in free space. That information is relayed to the robot User Macro Interface (UMI) to enable programming of the robot control macros. The operator can then employ either manual teleoperation, shared control, or supervised autonomous control to manipulate the object under any degree of time-delay. The remote site performs the closed loop force/torque control, task monitoring, and reflex action. This paper describes the STELER local-remote robot control system, and further describes the near-term planned Space Station applications, along with potential far-term applications such as telescience, autonomous docking, and Lunar/Mars rovers.
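
    The essential local-remote split described above is that the local site sends goal-level macros over a delayed link while the remote site runs the tight closed-loop control, so the link delay affects only goal updates. The following is an illustrative sketch with invented names and gains, not the STELER code.

```python
# Illustrative local-remote split (not the STELER implementation): the local
# site sends a goal-level macro over a delayed link; the remote site runs the
# tight closed-loop control and reports back a summary.
import time

LINK_DELAY_S = 0.2     # stand-in for a TDRSS-like one-way delay

def remote_execute(macro: dict) -> dict:
    """Remote site: closed-loop servoing toward the commanded pose."""
    pose = 0.0
    for _ in range(50):                       # 50 control cycles on the remote side
        error = macro["target_pose"] - pose
        pose += 0.2 * error                   # simple proportional step
    return {"final_pose": pose, "residual_error": macro["target_pose"] - pose}

def local_supervise(target_pose: float) -> None:
    macro = {"name": "grasp_object", "target_pose": target_pose}
    time.sleep(LINK_DELAY_S)                  # uplink delay
    result = remote_execute(macro)
    time.sleep(LINK_DELAY_S)                  # downlink delay
    print(f"Remote reports pose {result['final_pose']:.4f} "
          f"(residual {result['residual_error']:.2e})")

local_supervise(target_pose=1.0)
```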

  3. Applications of graphics to support a testbed for autonomous space vehicle operations

    NASA Technical Reports Server (NTRS)

    Schmeckpeper, K. R.; Aldridge, J. P.; Benson, S.; Horner, S.; Kullman, A.; Mulder, T.; Parrott, W.; Roman, D.; Watts, G.; Bochsler, Daniel C.

    1989-01-01

    Researchers describe their experience using graphics tools and utilities while building an application, AUTOPS, that uses a graphical Macintosh™-like interface for the input and display of data, and animation graphics to enhance the presentation of results of autonomous space vehicle operations simulations. AUTOPS is a test bed for evaluating decisions for intelligent control systems for autonomous vehicles. Decisions made by an intelligent control system, e.g., a revised mission plan, might be displayed to the user in textual format, or the user can witness the effects of those decisions via out-the-window graphics animations. Although a textual description conveys essentials, a graphics animation conveys the replanning results in a more convincing way. Similarly, iconic and menu-driven screen interfaces provide the user with more meaningful options and displays. Presented here are experiences with the SunView and TAE Plus graphics tools used for interface design, and the Johnson Space Center Interactive Graphics Laboratory animation graphics tools used for generating the out-the-window graphics.

  4. Sandia National Laboratories proof-of-concept robotic security vehicle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrington, J.J.; Jones, D.P.; Klarer, P.R.

    1989-01-01

    Several years ago Sandia National Laboratories developed a prototype interior robot that could navigate autonomously inside a large complex building to aid and test interior intrusion detection systems. Recently the Department of Energy Office of Safeguards and Security has supported the development of a vehicle that will perform limited security functions autonomously in a structured exterior environment. The goal of the first phase of this project was to demonstrate the feasibility of an exterior robotic vehicle for security applications by using converted interior robot technology, if applicable. An existing teleoperational test bed vehicle with remote driving controls was modified and integrated with a newly developed command driving station and navigation system hardware and software to form the Robotic Security Vehicle (RSV) system. The RSV, also called the Sandia Mobile Autonomous Navigator (SANDMAN), has been successfully used to demonstrate that teleoperated security vehicles which can perform limited autonomous functions are viable and have the potential to decrease security manpower requirements and improve system capabilities. 2 refs., 3 figs.

  5. Solar Thermal Utility-Scale Joint Venture Program (USJVP) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANCINI,THOMAS R.

    2001-04-01


  6. Autonomous and Autonomic Swarms

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Rash, James L.; Truszkowski, Walter F.; Rouff, Christopher A.; Sterritt, Roy

    2005-01-01

    A watershed in systems engineering is represented by the advent of swarm-based systems that accomplish missions through cooperative action by a (large) group of autonomous individuals each having simple capabilities and no global knowledge of the group's objective. Such systems, with individuals capable of surviving in hostile environments, pose unprecedented challenges to system developers. Design, testing, and verification at much higher levels will be required, together with the corresponding tools, to bring such systems to fruition. Concepts for possible future NASA space exploration missions include autonomous, autonomic swarms. Engineering swarm-based missions begins with understanding autonomy and autonomicity and how to design, test, and verify systems that have those properties and, simultaneously, the capability to accomplish prescribed mission goals. Formal methods-based technologies, both projected and in development, are described in terms of their potential utility to swarm-based system developers.
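
    Purely as an illustration, and not part of the cited work, the toy Python sketch below shows the basic swarm idea: agents following one simple local rule, with no global knowledge of the objective, still produce a useful group behavior, here spreading out to cover a line. The rule, parameters, and scenario are invented for this sketch.

      # Toy sketch of the swarm idea in the abstract above: each agent follows one
      # simple local rule (move away from its nearest neighbour) with no global
      # knowledge, yet the group ends up roughly evenly spread along a line.
      # Purely illustrative; not any specific NASA swarm design.
      import random

      def step(positions, spacing=1.0):
          """One update: every agent reacts only to its nearest neighbour."""
          updated = []
          for i, x in enumerate(positions):
              others = [p for j, p in enumerate(positions) if j != i]
              nearest = min(others, key=lambda p: abs(p - x))
              if abs(nearest - x) < spacing:          # too close: nudge away a little
                  x += 0.1 if x >= nearest else -0.1
              updated.append(x)
          return updated

      agents = [random.uniform(0.0, 1.0) for _ in range(10)]
      for _ in range(200):
          agents = step(agents)
      print(sorted(round(a, 2) for a in agents))      # roughly even spacing emerges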

  7. Preliminary Results from a Model-Driven Architecture Methodology for Development of an Event-Driven Space Communications Service Concept

    NASA Technical Reports Server (NTRS)

    Roberts, Christopher J.; Morgenstern, Robert M.; Israel, David J.; Borky, John M.; Bradley, Thomas H.

    2017-01-01

    NASA's next generation space communications network will involve dynamic and autonomous services analogous to services provided by current terrestrial wireless networks. This architecture concept, known as the Space Mobile Network (SMN), is enabled by several technologies now in development. A pillar of the SMN architecture is the establishment and utilization of a continuous bidirectional control plane space link channel and a new User Initiated Service (UIS) protocol to enable more dynamic and autonomous mission operations concepts, reduced user space communications planning burden, and more efficient and effective provider network resource utilization. This paper provides preliminary results from the application of model-driven architecture methodology to develop UIS. Such an approach is necessary to ensure systematic investigation of several open questions concerning the efficiency, robustness, interoperability, scalability and security of the control plane space link and UIS protocol.
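
    The sketch below is illustrative only: the UIS protocol's actual message formats are not given in this record, so every field name is an assumption. It simply suggests the kind of information a user-initiated, event-driven service request sent over a control-plane space link would need to carry.

      # Hypothetical sketch only: the UIS protocol's actual message formats are not
      # given in the abstract, so every field name here is an assumption. It simply
      # shows the kind of information a user-initiated, event-driven service request
      # sent over a control-plane space link would need to carry.
      from dataclasses import dataclass, asdict
      import json

      @dataclass
      class ServiceRequest:
          spacecraft_id: str        # requesting user spacecraft
          service_type: str         # e.g. "return-link-data" or "forward-link-command"
          earliest_start_utc: str   # window in which the provider may schedule a contact
          latest_end_utc: str
          data_volume_mbit: float   # how much data the user expects to move
          priority: int             # used by the provider when resolving contention

      request = ServiceRequest("DEMO-SAT-1", "return-link-data",
                               "2025-01-01T00:00:00Z", "2025-01-01T06:00:00Z",
                               500.0, priority=2)
      print(json.dumps(asdict(request)))   # what might flow over the control-plane link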

  8. Control of a free-flying robot manipulator system

    NASA Technical Reports Server (NTRS)

    Alexander, H.

    1986-01-01

    The development and testing of control strategies for self-contained, autonomous free-flying space robots are discussed. Such a robot would perform operations in space similar to those currently handled by astronauts during extravehicular activity (EVA). Use of robots should reduce the expense and danger attending EVA both by providing assistance to astronauts and in many cases by eliminating altogether the need for human EVA, thus greatly enhancing the scope and flexibility of space assembly and repair activities. The focus of the work is to develop and carry out a program of research with a series of physical Satellite Robot Simulator Vehicles (SRSV's), two-dimensionally freely mobile laboratory models of autonomous free-flying space robots such as might perform extravehicular functions associated with operation of a space station or repair of orbiting satellites. It is planned, in a later phase, to extend the research to three dimensions by carrying out experiments in the Space Shuttle cargo bay.

  9. Virtual Mission Operations of Remote Sensors With Rapid Access To and From Space

    NASA Technical Reports Server (NTRS)

    Ivancic, William D.; Stewart, Dave; Walke, Jon; Dikeman, Larry; Sage, Steven; Miller, Eric; Northam, James; Jackson, Chris; Taylor, John; Lynch, Scott

    2010-01-01

    This paper describes network-centric operations, where a virtual mission operations center autonomously receives sensor triggers, and schedules space and ground assets using Internet-based technologies and service-oriented architectures. For proof-of-concept purposes, sensor triggers are received from the United States Geological Survey (USGS) to determine targets for space-based sensors. The Surrey Satellite Technology Limited (SSTL) Disaster Monitoring Constellation satellite, the United Kingdom Disaster Monitoring Constellation (UK-DMC), is used as the space-based sensor. The UK-DMC's availability is determined via machine-to-machine communications using SSTL's mission planning system. Access to/from the UK-DMC for tasking and sensor data is via SSTL's and Universal Space Network's (USN) ground assets. The availability and scheduling of USN's assets can also be performed autonomously via machine-to-machine communications. All communication, both on the ground and between ground and space, uses open Internet standards.
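
    A compressed, illustration-only Python sketch of the trigger-to-tasking flow described above follows; the endpoints and function names are invented, and the real SSTL and USN machine-to-machine interfaces are not documented in this record.

      # Compressed sketch of the trigger-to-tasking flow described above, with every
      # endpoint and function name invented for illustration; the real SSTL and USN
      # machine-to-machine interfaces are not documented in this record.
      def query_spacecraft_availability(satellite, target):
          """Stand-in for a machine-to-machine call to the mission planning system."""
          return {"start": "2025-01-01T02:10Z", "end": "2025-01-01T02:18Z"}

      def reserve_ground_asset(network, pass_window):
          """Stand-in for a machine-to-machine call to a ground network scheduler."""
          return {"station": "example-station", "window": pass_window}

      def on_sensor_trigger(event):
          """Virtual mission operations center: runs autonomously when a trigger arrives."""
          target = {"lat": event["lat"], "lon": event["lon"]}
          pass_window = query_spacecraft_availability("UK-DMC", target)
          ground_station = reserve_ground_asset("USN", pass_window)
          return {"target": target, "pass": pass_window, "downlink": ground_station}

      print(on_sensor_trigger({"lat": 37.7, "lon": -122.4}))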

  10. Satellite Servicing's Autonomous Rendezvous and Docking Testbed on the International Space Station

    NASA Technical Reports Server (NTRS)

    Naasz, Bo J.; Strube, Matthew; Van Eepoel, John; Barbee, Brent W.; Getzandanner, Kenneth M.

    2011-01-01

    The Space Servicing Capabilities Project (SSCP) at NASA's Goddard Space Flight Center (GSFC) has been tasked with developing systems for servicing space assets. Starting in 2009, the SSCP completed a study documenting potential customers and the business case for servicing, as well as defining several notional missions and required technologies. In 2010, SSCP moved to the implementation stage by completing several ground demonstrations and commencing development of two International Space Station (ISS) payloads, the Robotic Refueling Mission (RRM) and the Dextre Pointing Package (DPP), to mitigate new technology risks for a robotic mission to service existing assets in geosynchronous orbit. This paper introduces the DPP, scheduled to fly in July of 2012 on the third operational SpaceX Dragon mission, and its Autonomous Rendezvous and Docking (AR&D) instruments. The combination of sensors and advanced avionics provides valuable on-orbit demonstrations of essential technologies for servicing existing vehicles, both cooperative and non-cooperative.

  11. End-to-End Data System Architecture for the Space Station Biological Research Project

    NASA Technical Reports Server (NTRS)

    Mian, Arshad; Scimemi, Sam; Adeni, Kaiser; Picinich, Lou; Ramos, Rubin (Technical Monitor)

    1998-01-01

    The Space Station Biological Research Project (SSBRP) is developing hardware referred to as the "facility" for providing life sciences research capability on the International Space Station. This hardware includes several biological specimen habitats, habitat holding racks, a centrifuge and a glovebox. An SSBRP end-to-end data system architecture has been developed to allow command and control of the facility from the ground, either with crew assistance or autonomously. The data system will be capable of handling commands, sensor data, and video from multiple cameras. The data will traverse through several onboard and ground networks and processing entities including the SSBRP and Space Station onboard and ground data systems. A large number of onboard and ground entities of the data system are being developed by the Space Station Program, other NASA centers and the International Partners. The SSBRP part of the system, which includes the habitats, holding racks, and the ground operations center, the User Operations Facility (UOF), will be developed by a multitude of geographically distributed development organizations. The SSBRP has the responsibility to define the end-to-end data and communications systems to make the interfaces manageable and verifiable with multiple contractors with widely varying development constraints and schedules. This paper provides an overview of the SSBRP end-to-end data system. Specifically, it describes the hardware, software and functional interactions of individual systems, and interface requirements among various entities of the end-to-end system.

  12. An Architecture, System Engineering, and Acquisition Approach for Space System Software Resiliency

    NASA Astrophysics Data System (ADS)

    Phillips, Dewanne Marie

    Software intensive space systems can harbor defects and vulnerabilities that may enable external adversaries or malicious insiders to disrupt or disable system functions, risking mission compromise or loss. Mitigating this risk demands a sustained focus on the security and resiliency of the system architecture including software, hardware, and other components. Robust software engineering practices contribute to the foundation of a resilient system so that the system "can take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". Software resiliency must be a priority and addressed early in life cycle development to contribute to a secure and dependable space system. Those who develop, implement, and operate software intensive space systems must determine the factors and systems engineering practices to address when investing in software resiliency. This dissertation offers methodical approaches for improving space system resiliency through software architecture design, systems engineering, and increased software security, thereby reducing the risk of latent software defects and vulnerabilities. By providing greater attention to the early life cycle phases of development, we can alter the engineering process to help detect, eliminate, and avoid vulnerabilities before space systems are delivered. To achieve this objective, this dissertation will identify knowledge, techniques, and tools that engineers and managers can utilize to help them recognize how vulnerabilities are produced and discovered so that they can learn to circumvent them in future efforts. We conducted a systematic review of existing architectural practices, standards, security and coding practices, various threats, defects, and vulnerabilities that impact space systems from hundreds of relevant publications and interviews of subject matter experts. We expanded on the system-level body of knowledge for resiliency and identified a new software architecture framework and acquisition methodology to improve the resiliency of space systems from a software perspective, with an emphasis on the early phases of the systems engineering life cycle. This methodology involves seven steps: 1) Define technical resiliency requirements, 1a) Identify standards/policy for software resiliency, 2) Develop a request for proposal (RFP)/statement of work (SOW) for resilient space systems software, 3) Define software resiliency goals for space systems, 4) Establish software resiliency quality attributes, 5) Perform architectural tradeoffs and identify risks, 6) Conduct architecture assessments as part of the procurement process, and 7) Ascertain space system software architecture resiliency metrics. Data illustrates that software vulnerabilities can lead to opportunities for malicious cyber activities, which could degrade the space mission capability for the user community. Reducing the number of vulnerabilities by improving architecture and software system engineering practices can contribute to making space systems more resilient. Since cyber-attacks are enabled by shortfalls in software, robust software engineering practices and an architectural design are foundational to resiliency, which is a quality that allows the system to "take a hit to a critical component and recover in a known, bounded, and generally acceptable period of time". To achieve software resiliency for space systems, acquirers and suppliers must identify relevant factors and systems engineering practices to apply across the lifecycle, in software requirements analysis, architecture development, design, implementation, verification and validation, and maintenance phases.

  13. Autonomous support for microorganism research in space

    NASA Technical Reports Server (NTRS)

    Fleet, Mary L.; Miller, Mark S.; Shipley, Derek E.; Smith, Jeff D.

    1992-01-01

    A preliminary design for performing on orbit, autonomous research on microorganisms and cultured cells/tissues is presented. An understanding of gravity and its effects on cells is crucial for space exploration as well as for terrestrial applications. The payload is designed to be compatible with the Commercial Experiment Transporter (COMET) launch vehicle, an orbiter middeck locker interface, and with Space Station Freedom. Uplink/downlink capabilities and sample return through controlled reentry are available for all carriers. Autonomous testing activities are preprogrammed with in-flight reprogrammability. Sensors for monitoring temperature, pH, light, gravity levels, vibrations, and radiation are provided for environmental regulation and experimental data collection. Additional experimental data acquisition includes optical density measurement, microscopy, video, and film photography. On-board full data storage capabilities are provided. A fluid transfer mechanism is utilized for inoculation, sampling, and nutrient replenishment of experiment cultures. In addition to payload design, representative experiments were developed to ensure scientific objectives remained compatible with hardware capabilities. The project is defined to provide biological data pertinent to extended duration crewed space flight including crew health issues and development of a Controlled Ecological Life Support System (CELSS). In addition, opportunities are opened for investigations leading to commercial applications of space, such as pharmaceutical development, modeling of terrestrial diseases, and material processing.
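
    To make the autonomous regulation concrete, the following small Python loop is an illustration only (not drawn from the cited record) of the kind of preprogrammed, reprogrammable environmental regulation such a payload implies; the sensor channels and setpoints are assumptions, not the payload's actual values.

      # Illustrative-only control loop for the kind of preprogrammed, reprogrammable
      # environmental regulation described above; the sensor channels and setpoints
      # are assumptions, not the payload's actual values.
      SETPOINTS = {"temperature_c": (36.0, 38.0), "ph": (6.8, 7.4)}

      def regulate(readings, setpoints=SETPOINTS):
          """Return the corrective actions an autonomous payload might log and apply."""
          actions = []
          for channel, value in readings.items():
              low, high = setpoints[channel]
              if value < low:
                  actions.append(f"raise {channel}")
              elif value > high:
                  actions.append(f"lower {channel}")
          return actions

      print(regulate({"temperature_c": 35.1, "ph": 7.0}))   # -> ['raise temperature_c']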

  14. Autonomous support for microorganism research in space

    NASA Technical Reports Server (NTRS)

    Luttges, M. W.; Klaus, D. M.; Fleet, M. L.; Miller, M. S.; Shipley, D. E.; Smith, J. D.

    1992-01-01

    A preliminary design for performing on-orbit, autonomous research on microorganisms and cultured cells/tissues is presented. An understanding of gravity and its effects on cells is crucial for space exploration as well as for terrestrial applications. The payload is designed to be compatible with the COMmercial Experiment Transporter (COMET) launch vehicle, an orbiter middeck locker interface, and with Space Station Freedom. Uplink/downlink capabilities and sample return through controlled reentry are available for all carriers. Autonomous testing activities are preprogrammed with inflight reprogrammability. Sensors for monitoring temperature, pH, light, gravity levels, vibration, and radiation are provided for environmental regulation and experimental data collection. Additional experiment data acquisition includes optical density measurement, microscopy, video, and film photography. Onboard full data storage capabilities are provided. A fluid transfer mechanism is utilized for inoculation, sampling, and nutrient replenishment of experiment cultures. In addition to payload design, representative experiments were developed to ensure scientific objectives remained compatible with hardware capabilities. The project is defined to provide biological data pertinent to extended duration crewed space flight including crew health issues and development of a Controlled Ecological Life Support System (CELSS). In addition, opportunities are opened for investigations leading to commercial applications of space, such as pharmaceutical development, modeling of terrestrial diseases, and material processing.

  15. Integrating the autonomous subsystems management process

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry R.

    1992-01-01

    Ways in which the ranking of the Space Station Module Power Management and Distribution testbed may be achieved and an individual subsystem's internal priorities may be managed within the complete system are examined. The application of these results in the integration and performance leveling of the autonomously managed system is discussed.

  16. Learning for autonomous navigation

    NASA Technical Reports Server (NTRS)

    Angelova, Anelia; Howard, Andrew; Matthies, Larry; Tang, Benyang; Turmon, Michael; Mjolsness, Eric

    2005-01-01

    Autonomous off-road navigation of robotic ground vehicles has important applications on Earth and in space exploration. Progress in this domain has been retarded by the limited lookahead range of 3-D sensors and by the difficulty of preprogramming systems to understand the traversability of the wide variety of terrain they can encounter.
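
    One way to picture the learning approach hinted at here, offered purely as an illustration and not drawn from the cited record, is a "near-to-far" scheme: terrain patches that near-range 3-D sensing can label geometrically train an appearance model, which then scores terrain beyond sensor range. The tiny perceptron, features, and data below are invented for this sketch.

      # Tiny "near-to-far" learning sketch: terrain patches that near-range stereo can
      # label geometrically (traversable or not) train an appearance model, which then
      # scores far-range patches the 3-D sensor cannot reach. Features, labels, and the
      # perceptron itself are invented for illustration.
      def train(samples, epochs=200, lr=0.1):
          """samples: list of (feature_vector, label) with label 1 = traversable."""
          w = [0.0] * len(samples[0][0])
          b = 0.0
          for _ in range(epochs):
              for x, y in samples:
                  pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
                  err = y - pred                      # simple perceptron update
                  w = [wi + lr * err * xi for wi, xi in zip(w, x)]
                  b += lr * err
          return w, b

      near_field = [([0.2, 0.8], 1), ([0.9, 0.1], 0), ([0.3, 0.7], 1), ([0.8, 0.2], 0)]
      w, b = train(near_field)
      far_patch = [0.25, 0.75]                        # appearance features of distant terrain
      score = sum(wi * xi for wi, xi in zip(w, far_patch)) + b
      print("traversable" if score > 0 else "obstacle")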

  17. Reducing cost with autonomous operations of the Deep Space Network radio science receiver

    NASA Technical Reports Server (NTRS)

    Asmar, S.; Anabtawi, A.; Connally, M.; Jongeling, A.

    2003-01-01

    This paper describes the Radio Science Receiver system and the savings it has brought to mission operations. The design and implementation of remote and autonomous operations will be discussed along with the process of including user feedback along the way and lessons learned and procedures avoided.

  18. Neural Network Substorm Identification: Enabling TREx Sensor Web Modes

    NASA Astrophysics Data System (ADS)

    Chaddock, D.; Spanswick, E.; Arnason, K. M.; Donovan, E.; Liang, J.; Ahmad, S.; Jackel, B. J.

    2017-12-01

    Transition Region Explorer (TREx) is a ground-based sensor web of optical and radio instruments that is presently being deployed across central Canada. The project consists of an array of co-located blue-line, full-colour, and near-infrared all-sky imagers, imaging riometers, proton aurora spectrographs, and GNSS systems. A key goal of the TREx project is to create the world's first (artificial) intelligent sensor web for remote sensing space weather. The sensor web will autonomously control and coordinate instrument operations in real-time. To accomplish this, we will use real-time in-line analytics of TREx and other data to dynamically switch between operational modes. An operating mode could be, for example, to have a blue-line imager gather data at a cadence one or two orders of magnitude higher than its 'baseline' mode. The software decision to increase the imaging cadence would be in response to an anticipated increase in auroral activity or other programmatic requirements. Our first test for TREx's sensor web technologies is to develop the capacity to autonomously alter the TREx operating mode prior to a substorm expansion phase onset. In this paper, we present our neural network analysis of historical optical and riometer data and our ability to predict an optical onset. We explore the preliminary insights into using a neural network to pick out trends and features which it deems are similar among substorms.
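
    As an illustration only (not drawn from the cited record), the mode-switch logic described above could be pictured as in the sketch below; the cadences, threshold, feature set, and the stand-in for the trained neural network are all assumptions made for this sketch.

      # Sketch of the sensor-web mode switch described above: a stand-in onset model
      # scores the latest measurements, and the imager cadence is raised by an order
      # of magnitude when an onset looks imminent. The cadences, threshold, feature
      # set, and the stand-in for the trained neural network are all assumptions.
      BASELINE_CADENCE_HZ = 0.33     # assumed baseline imaging rate
      BURST_CADENCE_HZ = 10.0        # assumed high-rate mode

      def onset_probability(features):
          """Stand-in for the trained neural network; here just a weighted sum."""
          weights = [0.6, 0.4]
          score = sum(w * f for w, f in zip(weights, features))
          return max(0.0, min(1.0, score))

      def select_cadence(features, threshold=0.7):
          return BURST_CADENCE_HZ if onset_probability(features) >= threshold else BASELINE_CADENCE_HZ

      # e.g. normalized riometer absorption and auroral-intensity trend
      print(select_cadence([0.9, 0.8]))   # -> 10.0, switch to high-rate imaging
      print(select_cadence([0.2, 0.1]))   # -> 0.33, stay in baseline mode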

  19. Obtaining the Greatest Scientific Benefit from Observational Platforms by Consideration of the Relative Benefit of Observations

    NASA Technical Reports Server (NTRS)

    Chelberg, David; Drews, Frank; Fleeman, David; Welch, Lonnie; Marquart, Jane; Pfarr, Barbara

    2003-01-01

    One of the current trends in spacecraft software design is to increase the autonomy of onboard flight and science software. This is especially true when real-time observations may affect the observation schedule of a mission. For many science missions, such as those conducted by the Swift Burst Alert Telescope, the ability of the spacecraft to autonomously respond in real-time to unpredicted science events is crucial for mission success. We apply utility theory within resource management middleware to optimize the real-time performance of application software and achieve maximum system level benefit. We then explore how this methodology can be extended to manage both software and observational resources onboard a spacecraft to achieve the best possible observations.
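
    A toy Python sketch of utility-based allocation in this spirit follows, as an illustration only and not drawn from the cited work: each candidate activity reports the benefit it delivers and the resource it costs, and the middleware funds the highest benefit-per-cost activities first. The task names and numbers are invented; the middleware's actual utility functions are not given here.

      # Toy sketch of utility-based allocation in this spirit: each candidate activity
      # reports the benefit it delivers and the resource it costs, and the middleware
      # funds the highest benefit-per-cost activities first. Names and numbers are
      # invented; the middleware's actual utility functions are not given here.
      def allocate(tasks, budget):
          """tasks: {name: (benefit, cost)}; greedily fund the best benefit/cost first."""
          plan, remaining = [], budget
          ranked = sorted(tasks.items(), key=lambda kv: kv[1][0] / kv[1][1], reverse=True)
          for name, (benefit, cost) in ranked:
              if cost <= remaining:
                  plan.append(name)
                  remaining -= cost
          return plan

      tasks = {"gamma-burst-followup":   (10.0, 40.0),   # high science benefit, expensive
               "routine-survey":          (3.0, 20.0),
               "housekeeping-telemetry":  (1.0,  5.0)}
      print(allocate(tasks, budget=60.0))   # follow-up and housekeeping fit; survey is dropped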

  20. RoBlock: a prototype autonomous manufacturing cell

    NASA Astrophysics Data System (ADS)

    Baekdal, Lars K.; Balslev, Ivar; Eriksen, Rene D.; Jensen, Soren P.; Jorgensen, Bo N.; Kirstein, Brian; Kristensen, Bent B.; Olsen, Martin M.; Perram, John W.; Petersen, Henrik G.; Petersen, Morten L.; Ruhoff, Peter T.; Skjolstrup, Carl E.; Sorensen, Anders S.; Wagenaar, Jeroen M.

    2000-10-01

    RoBlock is the first phase of an internally financed project at the Institute aimed at building a system in which two industrial robots suspended from a gantry cooperate to perform a task specified by an external user, in this case assembling an unstructured collection of colored wooden blocks into a specified 3D pattern. The blocks are identified and localized using computer vision and grasped with a suction cup mechanism. Future phases of the project will involve other processes such as grasping and lifting, as well as other types of robots such as autonomous vehicles or variable geometry trusses. Innovative features of the control software system include: the use of an advanced trajectory planning system which ensures collision avoidance based on a generalization of the method of artificial potential fields; the use of a generic model-based controller which learns the values of parameters, including static and kinetic friction, of a detailed mechanical model of itself by comparing actual with planned movements; the use of fast, flexible, and robust pattern recognition and 3D-interpretation strategies; and the integration of trajectory planning and control with the sensor systems in a distributed Java application running on a network of PCs attached to the individual physical components. In designing this first stage, the aim was to build in the minimum complexity necessary to make the system non-trivially autonomous and to minimize the technological risks. The aims of this project, which is planned to be operational during 2000, are as follows: to provide a platform for carrying out experimental research in multi-agent systems and autonomous manufacturing systems; to test the interdisciplinary cooperation architecture of the Maersk Institute, in which researchers in the fields of applied mathematics (modeling the physical world), software engineering (modeling the system), and sensor/actuator technology (relating the virtual and real worlds) collaborate with systems integrators to construct intelligent, autonomous systems; and to provide a showpiece demonstrator in the entrance hall of the Institute's new building.
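
    A minimal 2-D Python sketch of the artificial-potential-field idea mentioned above follows, as an illustration only and not drawn from the cited work: the end effector is attracted to the goal and repelled by nearby obstacles, and the trajectory follows the combined gradient. Gains, geometry, and step size are made up; RoBlock's generalized planner is considerably more elaborate.

      # Minimal 2-D sketch of the artificial-potential-field idea mentioned above: the
      # end effector is attracted to the goal and repelled by nearby obstacles, and the
      # trajectory follows the combined gradient. Gains, geometry, and step size are
      # made up; RoBlock's generalized planner is considerably more elaborate.
      import math

      def potential_step(pos, goal, obstacles, k_att=1.0, k_rep=0.5, influence=1.0, step=0.05):
          gx = k_att * (goal[0] - pos[0])
          gy = k_att * (goal[1] - pos[1])                 # attraction toward the goal
          for ox, oy in obstacles:
              d = math.hypot(pos[0] - ox, pos[1] - oy)
              if 1e-6 < d < influence:                    # repulsion only near obstacles
                  gx += k_rep * (pos[0] - ox) / d**3
                  gy += k_rep * (pos[1] - oy) / d**3
          norm = math.hypot(gx, gy) or 1.0
          return (pos[0] + step * gx / norm, pos[1] + step * gy / norm)

      pos, goal, obstacles = (0.0, 0.0), (2.0, 2.0), [(1.0, 1.1)]
      for _ in range(100):
          pos = potential_step(pos, goal, obstacles)
      print(tuple(round(c, 2) for c in pos))    # ends near the goal, having skirted the block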
