Sample records for mission level model

  1. Ontological Problem-Solving Framework for Assigning Sensor Systems and Algorithms to High-Level Missions

    PubMed Central

    Qualls, Joseph; Russomanno, David J.

    2011-01-01

    The lack of knowledge models to represent sensor systems, algorithms, and missions makes opportunistically discovering a synthesis of systems and algorithms that can satisfy high-level mission specifications impractical. A novel ontological problem-solving framework has been designed that leverages knowledge models describing sensors, algorithms, and high-level missions to facilitate automated inference of assigning systems to subtasks that may satisfy a given mission specification. To demonstrate the efficacy of the ontological problem-solving architecture, a family of persistence surveillance sensor systems and algorithms has been instantiated in a prototype environment to demonstrate the assignment of systems to subtasks of high-level missions. PMID:22164081

  2. An integrated radar model solution for mission level performance and cost trades

    NASA Astrophysics Data System (ADS)

    Hodge, John; Duncan, Kerron; Zimmerman, Madeline; Drupp, Rob; Manno, Mike; Barrett, Donald; Smith, Amelia

    2017-05-01

    A fully integrated Mission-Level Radar model is in development as part of a multi-year effort under the Northrop Grumman Mission Systems (NGMS) sector's Model Based Engineering (MBE) initiative to digitally interconnect and unify previously separate performance and cost models. In 2016, an NGMS internal research and development (IR&D) funded multidisciplinary team integrated radio frequency (RF), power, control, size, weight, thermal, and cost models using the commercial off-the-shelf software ModelCenter for an Active Electronically Scanned Array (AESA) radar system. Each represented model was digitally connected with standard interfaces and unified to allow end-to-end mission system optimization and trade studies. The radar model was then linked to the Air Force's own mission modeling framework (AFSIM). The team first had to identify the necessary models, and with the aid of subject matter experts (SMEs) understand and document the inputs, outputs, and behaviors of the component models. This agile development process and collaboration enabled rapid integration of disparate models and the validation of their combined system performance. This MBE framework will allow NGMS to design systems more efficiently and affordably, optimize architectures, and provide increased value to the customer. The model integrates detailed component models that validate cost and performance at the physics level with high-level models that provide visualization of a platform mission. This connectivity of component to mission models allows hardware and software design solutions to be better optimized to meet mission needs, creating cost-optimal solutions for the customer, while reducing design cycle time through risk mitigation and early validation of design decisions.
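The digital interconnection of component models through standard interfaces that this abstract describes can be illustrated with a toy pipeline. The model functions, variable names, and scaling relations below are entirely hypothetical stand-ins for the actual RF, power, and cost models:

```python
def power_model(state):
    # power draw scales with array element count (illustrative relation only)
    state["power_w"] = 4.0 * state["n_elements"]
    return state

def cost_model(state):
    # cost grows with element count and power draw (illustrative relation only)
    state["cost_usd"] = 2000.0 * state["n_elements"] + 50.0 * state["power_w"]
    return state

def run_pipeline(state, models):
    """Chain component models through a shared, named-variable interface."""
    for model in models:
        state = model(state)
    return state

design = run_pipeline({"n_elements": 1000}, [power_model, cost_model])
print(design["cost_usd"])  # 2200000.0
```

Because every model reads and writes the same named-variable interface, models can be reordered or swapped without changing their neighbors, which is the property that enables end-to-end trade studies.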

  3. Analytical basis for planetary quarantine.

    NASA Technical Reports Server (NTRS)

    Schalkowsky, S.; Kline, R. C., Jr.

    1971-01-01

    The attempt is made to investigate quarantine constraints, and alternatives for meeting them, in sufficient detail for identifying those courses of action which compromise neither the quarantine nor the space mission objectives. Mathematical models pertinent to this goal are formulated at three distinct levels. The first level of mission constraint models pertains to the quarantine goals considered necessary by the international scientific community. The principal emphasis of modeling at this level is to quantify international considerations and to produce well-defined mission constraints. Such constraints must be translated into explicit implementation requirements by the operational agency of the launching nation. This produces the second level of implementation system modeling. However, because of the multitude of factors entering into the implementation models, it is convenient to consider these factors at the third level of implementation parameter models. These models are intentionally limited to the inclusion of only those factors which can be quantified realistically, either now or in the near future.

  4. Deep Impact Sequence Planning Using Multi-Mission Adaptable Planning Tools With Integrated Spacecraft Models

    NASA Technical Reports Server (NTRS)

    Wissler, Steven S.; Maldague, Pierre; Rocca, Jennifer; Seybold, Calina

    2006-01-01

    The Deep Impact mission was ambitious and challenging. JPL's well proven, easily adaptable multi-mission sequence planning tools combined with integrated spacecraft subsystem models enabled a small operations team to develop, validate, and execute extremely complex sequence-based activities within very short development times. This paper focuses on the core planning tool used in the mission, APGEN. It shows how the multi-mission design and adaptability of APGEN made it possible to model spacecraft subsystems as well as ground assets throughout the lifecycle of the Deep Impact project, starting with models of initial, high-level mission objectives, and culminating in detailed predictions of spacecraft behavior during mission-critical activities.

  5. Summary of a Modeling and Simulation Framework for High-Fidelity Weapon Models in Joint Semi-Automated Forces (JSAF) and Other Mission-Simulation Software

    DTIC Science & Technology

    2008-05-01

    ...communicate with other weapon models in a mission-level simulation; (3) introduces the four configuration levels of the M&S framework; and (4) presents a cost ...

  6. NASA Instrument Cost Model for Explorer-Like Mission Instruments (NICM-E)

    NASA Technical Reports Server (NTRS)

    Habib-Agahi, Hamid; Fox, George; Mrozinski, Joe; Ball, Gary

    2013-01-01

    NICM-E is a cost estimating relationship (CER) that supplements the traditional NICM System Level CERs for instruments flown on NASA Explorer-like missions that have the following three characteristics: 1) they fly on Class C missions; 2) their major development is led and performed by universities or research foundations; and 3) they have a significant level of inheritance.

  7. Multiagent Modeling and Simulation in Human-Robot Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Sims, Michael H.; Shafto, Michael (Technical Monitor)

    2001-01-01

    This paper describes a collaborative multiagent modeling and simulation approach for designing work systems. The Brahms environment is used to model mission operations for a semi-autonomous robot mission to the Moon at the work practice level. It shows the impact of human-decision making on the activities and energy consumption of a robot. A collaborative work systems design methodology is described that allows informal models, created with users and stakeholders, to be used as input to the development of formal computational models.

  8. Principles to Products: Toward Realizing MOS 2.0

    NASA Technical Reports Server (NTRS)

    Bindschadler, Duane L.; Delp, Christopher L.

    2012-01-01

    This is a report on the Operations Revitalization Initiative, part of the ongoing NASA-funded Advanced Multi-Mission Operations Systems (AMMOS) program. We are implementing products that significantly improve efficiency and effectiveness of Mission Operations Systems (MOS) for deep-space missions. We take a multi-mission approach, in keeping with our organization's charter to "provide multi-mission tools and services that enable mission customers to operate at a lower total cost to NASA." Focusing first on architectural fundamentals of the MOS, we review the effort's progress. In particular, we note the use of stakeholder interactions and consideration of past lessons learned to motivate a set of Principles that guide the evolution of the AMMOS. Thus guided, we have created essential patterns and connections (detailed in companion papers) that are explicitly modeled and support elaboration at multiple levels of detail (system, sub-system, element...) throughout a MOS. This architecture is realized in design and implementation products that provide lifecycle support to a Mission at the system and subsystem level. The products include adaptable multi-mission engineering documentation that describes essentials such as operational concepts and scenarios, requirements, interfaces and agreements, information models, and mission operations processes. Because we have adopted a model-based system engineering method, these documents and their contents are meaningfully related to one another and to the system model. This means they are both more rigorous and reusable (from mission to mission) than standard system engineering products. The use of models also enables detailed, early (e.g., formulation phase) insight into the impact of changes (e.g., to interfaces or to software) that is rigorous and complete, allowing better decisions on cost or technical trades. 
Finally, our work provides clear and rigorous specification of operations needs to software developers, further enabling significant gains in productivity.

  9. MPD Thruster Performance Analytic Models

    NASA Technical Reports Server (NTRS)

    Gilland, James; Johnston, Geoffrey

    2003-01-01

    Magnetoplasmadynamic (MPD) thrusters are capable of accelerating quasi-neutral plasmas to high exhaust velocities using megawatts (MW) of electric power. These characteristics make such devices worthy of consideration for demanding, far-term missions such as the human exploration of Mars or beyond. Assessment of MPD thrusters at the system and mission level is often difficult due to their status as ongoing experimental research topics rather than developed thrusters. However, in order to assess MPD thrusters' utility in later missions, some adequate characterization of performance (or, more exactly, projected performance) and a system-level definition are required for use in analyses. The most recent physical models of self-field MPD thrusters have been examined, assessed, and reconfigured for use by systems and mission analysts. The physical models allow for rational projections of thruster performance based on physical parameters that can be measured in the laboratory. The models and their implications for the design of future MPD thrusters are presented.
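For self-field MPD thrusters like those discussed here, the best-known analytic performance relation is the Maecker thrust approximation. The sketch below implements that classic formula (not necessarily the reconfigured models the abstract refers to), with an invented operating point:

```python
import math

MU0 = 4e-7 * math.pi  # vacuum permeability, H/m

def self_field_thrust(current_a, r_anode_m, r_cathode_m):
    """Maecker approximation: T = (mu0 / 4*pi) * (ln(ra/rc) + 3/4) * I**2."""
    b = MU0 / (4.0 * math.pi) * (math.log(r_anode_m / r_cathode_m) + 0.75)
    return b * current_a ** 2

# Illustrative operating point (not from the paper): a 20 kA discharge with
# a 5 cm anode radius and 1 cm cathode radius yields roughly 94 N of thrust.
thrust = self_field_thrust(20e3, 0.05, 0.01)
```

The quadratic dependence on current is why self-field MPD thrusters only become attractive at megawatt power levels.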

  10. MPD Thruster Performance Analytic Models

    NASA Technical Reports Server (NTRS)

    Gilland, James; Johnston, Geoffrey

    2007-01-01

    Magnetoplasmadynamic (MPD) thrusters are capable of accelerating quasi-neutral plasmas to high exhaust velocities using megawatts (MW) of electric power. These characteristics make such devices worthy of consideration for demanding, far-term missions such as the human exploration of Mars or beyond. Assessment of MPD thrusters at the system and mission level is often difficult due to their status as ongoing experimental research topics rather than developed thrusters. However, in order to assess MPD thrusters' utility in later missions, some adequate characterization of performance (or, more exactly, projected performance) and a system-level definition are required for use in analyses. The most recent physical models of self-field MPD thrusters have been examined, assessed, and reconfigured for use by systems and mission analysts. The physical models allow for rational projections of thruster performance based on physical parameters that can be measured in the laboratory. The models and their implications for the design of future MPD thrusters are presented.

  11. Development of a non-contextual model for determining the autonomy level of intelligent unmanned systems

    NASA Astrophysics Data System (ADS)

    Durst, Phillip J.; Gray, Wendell; Trentini, Michael

    2013-05-01

    A simple, quantitative measure for encapsulating the autonomous capabilities of unmanned systems (UMS) has yet to be established. Current models for measuring a UMS's autonomy level require extensive, operational level testing, and provide a means for assessing the autonomy level for a specific mission/task and operational environment. A more elegant technique for quantifying autonomy using component level testing of the robot platform alone, outside of mission and environment contexts, is desirable. Using a high level framework for UMS architectures, such a model for determining a level of autonomy has been developed. The model uses a combination of developmental and component level testing for each aspect of the UMS architecture to define a non-contextual autonomous potential (NCAP). The NCAP provides an autonomy level, ranging from fully non-autonomous to fully autonomous, in the form of a single numeric parameter describing the UMS's performance capabilities when operating at that level of autonomy.
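A single numeric autonomy parameter of the kind the NCAP provides could, in the simplest case, be a weighted aggregate of normalized component-level test scores. The aggregation scheme and scores below are purely illustrative, not the paper's actual formulation:

```python
def autonomy_score(component_scores, weights=None):
    """Collapse per-component scores (each in [0, 1]) into one value in [0, 1]
    via a weighted mean; equal weights are used when none are given."""
    if weights is None:
        weights = [1.0] * len(component_scores)
    return sum(w * s for w, s in zip(weights, component_scores)) / sum(weights)

# Hypothetical normalized test results for perception, planning, and actuation:
level = autonomy_score([0.8, 0.6, 0.9])
```

The appeal of such a collapse is exactly what the abstract argues: the inputs come from component-level bench testing, so no mission or environment context is needed.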

  12. Probabilistic Solar Energetic Particle Models

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provides the worst-case differential proton spectrum. This model is based on data from IMP-8 and GOES spacecraft that provide a database extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.
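One common way to turn event statistics into a worst-case environment at a user-specified confidence level, in the spirit of this abstract though not its actual fitted model, is to assume Poisson-distributed events whose per-event fluences follow a fitted distribution, then solve for the fluence that no event exceeds with the desired probability. The lognormal parameters below are invented for illustration:

```python
import math
from statistics import NormalDist

def worst_case_level(confidence, events_per_year, mission_years, quantile_fn):
    """Level not exceeded by any event during the mission with the given
    confidence, assuming Poisson event arrivals:
        P(no exceedance) = exp(-N * (1 - F(x))) = confidence
    """
    n = events_per_year * mission_years   # expected number of events, N
    tail = -math.log(confidence) / n      # required tail probability 1 - F(x)
    return quantile_fn(1.0 - tail)

def lognormal_quantile(p, mu=10.0, sigma=1.0):
    # inverse CDF of an (invented) lognormal per-event fluence distribution
    return math.exp(mu + sigma * NormalDist().inv_cdf(p))

fluence = worst_case_level(0.95, 6.0, 2.0, lognormal_quantile)
```

Note how the worst-case fluence for the whole mission sits well above the per-event 95th percentile, since many events get a chance to exceed it.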

  13. A Subjective Assessment of Alternative Mission Architecture Operations Concepts for the Human Exploration of Mars at NASA Using a Three-Dimensional Multi-Criteria Decision Making Model

    NASA Technical Reports Server (NTRS)

    Tavana, Madjid

    2003-01-01

    The primary driver for developing missions to send humans to other planets is to generate significant scientific return. NASA plans human planetary explorations with an acceptable level of risk consistent with other manned operations. Space exploration risks cannot be completely eliminated. Therefore, an acceptable level of cost, technical, safety, schedule, and political risks and benefits must be established for exploratory missions. This study uses a three-dimensional multi-criteria decision making model to identify the risks and benefits associated with three alternative mission architecture operations concepts for the human exploration of Mars identified by the Mission Operations Directorate at Johnson Space Center. The three alternatives considered in this study include split, combo lander, and dual scenarios. The model considers the seven phases of the mission including: 1) Earth Vicinity/Departure; 2) Mars Transfer; 3) Mars Arrival; 4) Planetary Surface; 5) Mars Vicinity/Departure; 6) Earth Transfer; and 7) Earth Arrival. Analytic Hierarchy Process (AHP) and subjective probability estimation are used to capture the experts' beliefs concerning the risks and benefits of the three alternative scenarios through a series of sequential, rational, and analytical processes.
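The AHP step of such a model reduces pairwise comparison judgments to priority weights. A minimal sketch using the geometric-mean (row) approximation follows, with a wholly hypothetical comparison matrix for the split, combo lander, and dual scenarios on a single criterion:

```python
import math

def ahp_priorities(pairwise):
    """AHP priority vector via the geometric-mean (row) approximation of the
    principal eigenvector of a pairwise comparison matrix."""
    n = len(pairwise)
    gms = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gms)
    return [g / total for g in gms]  # normalized priority weights, summing to 1

# Illustrative judgments on Saaty's 1-9 scale, not the study's data: the first
# scenario is moderately preferred to the second and strongly to the third.
weights = ahp_priorities([[1, 3, 5],
                          [1 / 3, 1, 2],
                          [1 / 5, 1 / 2, 1]])
```

A full AHP model would repeat this for every criterion and mission phase, then combine the criterion-level weights into overall scenario scores.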

  14. Time Exceedances for High Intensity Solar Proton Fluxes

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adam, James H., Jr.; Dietrich, William F.

    2011-01-01

    A model is presented for times during a space mission that specified solar proton flux levels are exceeded. This includes both total time and continuous time periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.

  15. Update - Concept of Operations for Integrated Model-Centric Engineering at JPL

    NASA Technical Reports Server (NTRS)

    Bayer, Todd J.; Bennett, Matthew; Delp, Christopher L.; Dvorak, Daniel; Jenkins, Steven J.; Mandutianu, Sanda

    2011-01-01

    The increasingly ambitious requirements levied on JPL's space science missions, and the development pace of such missions, challenge our current engineering practices. All the engineering disciplines face this growth in complexity to some degree, but the challenges are greatest in systems engineering where numerous competing interests must be reconciled and where complex system level interactions must be identified and managed. Undesired system-level interactions are increasingly a major risk factor that cannot be reliably exposed by testing, and natural-language single-viewpoint specifications are inadequate to capture and expose system level interactions and characteristics. Systems engineering practices must improve to meet these challenges, and the most promising approach today is the movement toward a more integrated and model-centric approach to mission conception, design, implementation and operations. This approach elevates engineering models to a principal role in systems engineering, gradually replacing traditional document-centric engineering practices.

  16. Prediction of LDEF exposure to the ionizing radiation environment

    NASA Technical Reports Server (NTRS)

    Watts, J. W.; Armstrong, T. W.; Colborn, B. L.

    1996-01-01

    Predictions of the LDEF mission's trapped proton and electron and galactic cosmic ray proton exposures have been made using the currently accepted models, with improved resolution near mission end and better modeling of solar cycle effects. Extending previous calculations to provide a more definitive description of the LDEF exposure to ionizing radiation, trapped proton and electron fluxes are presented as functions of mission time, accounting for the altitude and solar activity variation during the mission and the change in galactic cosmic ray proton flux over the mission. Modifications of the AP8MAX and AP8MIN fluences led to a 20% reduction in fluence. A modified interpolation model developed by Daly and Evans resulted in 30% higher dose and activation levels, which agreed better with measured values than results predicted using the Vette model.

  17. Terrain Modelling for Immersive Visualization for the Mars Exploration Rovers

    NASA Technical Reports Server (NTRS)

    Wright, J.; Hartman, F.; Cooper, B.; Maxwell, S.; Yen, J.; Morrison, J.

    2004-01-01

    Immersive environments are being used to support mission operations at the Jet Propulsion Laboratory. This technology contributed to the Mars Pathfinder Mission in planning sorties for the Sojourner rover and is being used for the Mars Exploration Rover (MER) missions. The stereo imagery captured by the rovers is used to create 3D terrain models, which can be viewed from any angle, to provide a powerful and information rich immersive visualization experience. These technologies contributed heavily to both the mission success and the phenomenal level of public outreach achieved by Mars Pathfinder and MER. This paper will review the utilization of terrain modelling for immersive environments in support of MER.

  18. Model for Cumulative Solar Heavy Ion Energy and LET Spectra

    NASA Technical Reports Server (NTRS)

    Xapsos, Mike; Barth, Janet; Stauffer, Craig; Jordan, Tom; Mewaldt, Richard

    2007-01-01

    A probabilistic model of cumulative solar heavy ion energy and linear energy transfer (LET) spectra is developed for spacecraft design applications. Spectra are given as a function of confidence level, mission time period during solar maximum and shielding thickness. It is shown that long-term solar heavy ion fluxes exceed galactic cosmic ray fluxes during solar maximum for shielding levels of interest. Cumulative solar heavy ion fluences should therefore be accounted for in single event effects rate calculations and in the planning of space missions.

  19. Picometer Level Modeling of a Shared Vertex Double Corner Cube in the Space Interferometry Mission Kite Testbed

    NASA Technical Reports Server (NTRS)

    Kuan, Gary M.; Dekens, Frank G.

    2006-01-01

    The Space Interferometry Mission (SIM) is a microarcsecond interferometric space telescope that requires picometer level precision measurements of its truss and interferometer baselines. Single-gauge metrology errors due to non-ideal physical characteristics of corner cubes reduce the angular measurement capability of the science instrument. Specifically, the non-common vertex error (NCVE) of a shared vertex, double corner cube introduces micrometer level single-gauge errors in addition to errors due to dihedral angles and reflection phase shifts. A modified SIM Kite Testbed containing an articulating double corner cube is modeled and the results are compared to the experimental testbed data. The results confirm modeling capability and viability of calibration techniques.

  20. Galactic cosmic ray radiation levels in spacecraft on interplanetary missions

    NASA Technical Reports Server (NTRS)

    Shinn, J. L.; Nealy, J. E.; Townsend, L. W.; Wilson, J. W.; Wood, J.S.

    1994-01-01

    Using the Langley Research Center Galactic Cosmic Ray (GCR) transport computer code (HZETRN) and the Computerized Anatomical Man (CAM) model, crew radiation levels inside manned spacecraft on interplanetary missions are estimated. These radiation-level estimates include particle fluxes, LET (Linear Energy Transfer) spectra, absorbed dose, and dose equivalent within various organs of interest in GCR protection studies. Changes in these radiation levels resulting from the use of various different types of shield materials are presented.

  21. Mission Simulation Toolkit

    NASA Technical Reports Server (NTRS)

    Pisaich, Gregory; Flueckiger, Lorenzo; Neukom, Christian; Wagner, Mike; Buchanan, Eric; Plice, Laura

    2007-01-01

    The Mission Simulation Toolkit (MST) is a flexible software system for autonomy research. It was developed as part of the Mission Simulation Facility (MSF) project that was started in 2001 to facilitate the development of autonomous planetary robotic missions. Autonomy is a key enabling factor for robotic exploration. There has been a large gap between autonomy software (at the research level), and software that is ready for insertion into near-term space missions. The MST bridges this gap by providing a simulation framework and a suite of tools for supporting research and maturation of autonomy. MST uses a distributed framework based on the High Level Architecture (HLA) standard. A key feature of the MST framework is the ability to plug in new models to replace existing ones with the same services. This enables significant simulation flexibility, particularly the mixing and control of fidelity level. In addition, the MST provides automatic code generation from robot interfaces defined with the Unified Modeling Language (UML), methods for maintaining synchronization across distributed simulation systems, XML-based robot description, and an environment server. Finally, the MSF supports a number of third-party products including dynamic models and terrain databases. Although the communication objects and some of the simulation components that are provided with this toolkit are specifically designed for terrestrial surface rovers, the MST can be applied to any other domain, such as aerial, aquatic, or space.

  22. Radiation Transport Modeling and Assessment to Better Predict Radiation Exposure, Dose, and Toxicological Effects to Human Organs on Long Duration Space Flights

    NASA Technical Reports Server (NTRS)

    Denkins, Pamela; Badhwar, Gautam; Obot, Victor

    2000-01-01

    NASA's long-range plans include possible human exploratory missions to the moon and Mars within the next quarter century. Such missions beyond low Earth orbit will expose crews to transient radiation from solar particle events, which include high-energy protons, as well as to high-energy galactic cosmic rays. Because the radiation levels in space are high and the missions long, adequate shielding is needed to minimize the deleterious health effects of exposure to radiation. The focus of this study is radiation exposure to the blood-forming organs of the NASA astronauts. NASA/JSC developed the Phantom Torso Experiment for Organ Dose Measurements, which housed active and passive dosimeters that would monitor and record absorbed radiation levels at vital organ locations. This experiment was conducted during the STS-91 mission in May '98 and provided the necessary space radiation data for correlation with results obtained from the current analytical models used to predict exposure to the blood-forming organs. Numerous models (i.e., BRYNTRN and HZETRN) have been developed and used to predict radiation exposure. However, new models are continually being developed and evaluated. The Space Environment Information Systems (SPENVIS) modeling program, developed by the Belgian Institute for Space Aeronomy, is to be used and evaluated as a part of the research activity. It is the intent of this research effort to compare the modeled data to the findings from the STS-91 mission; assess the accuracy and efficiency of this model; and determine its usefulness for predicting radiation exposure and developing better guidelines for shielding requirements for long duration manned missions.

  23. Low Thrust Orbital Maneuvers Using Ion Propulsion

    NASA Astrophysics Data System (ADS)

    Ramesh, Eric

    2011-10-01

    Low-thrust maneuver options, such as electric propulsion, offer specific challenges within mission-level Modeling, Simulation, and Analysis (MS&A) tools. This project seeks to transition techniques for simulating low-thrust maneuvers from detailed engineering-level simulations such as AGI's Satellite ToolKit (STK) Astrogator to mission-level simulations such as the System Effectiveness Analysis Simulation (SEAS). Our project goals are as follows: A) Assess different low-thrust options to achieve various orbital changes; B) Compare such approaches to more conventional, high-thrust profiles; C) Compare computational cost and accuracy of various approaches to calculate and simulate low-thrust maneuvers; D) Recommend methods for implementing low-thrust maneuvers in high-level mission simulations; E) Prototype recommended solutions.

  24. Modeling and simulation of satellite subsystems for end-to-end spacecraft modeling

    NASA Astrophysics Data System (ADS)

    Schum, William K.; Doolittle, Christina M.; Boyarko, George A.

    2006-05-01

    During the past ten years, the Air Force Research Laboratory (AFRL) has been simultaneously developing high-fidelity spacecraft payload models as well as a robust distributed simulation environment for modeling spacecraft subsystems. Much of this research has occurred in the Distributed Architecture Simulation Laboratory (DASL). AFRL developers working in the DASL have effectively combined satellite power, attitude pointing, and communication link analysis subsystem models with robust satellite sensor models to create a first-order end-to-end satellite simulation capability. The merging of these two simulation areas has advanced the field of spacecraft simulation, design, and analysis, and enabled more in-depth mission and satellite utility analyses. A core capability of the DASL is the support of a variety of modeling and analysis efforts, ranging from physics and engineering-level modeling to mission and campaign-level analysis. The flexibility and agility of this simulation architecture will be used to support space mission analysis, military utility analysis, and various integrated exercises with other military and space organizations via direct integration, or through DOD standards such as Distributed Interactive Simulation. This paper discusses the results and lessons learned in modeling satellite communication link analysis, power, and attitude control subsystems for an end-to-end satellite simulation. It also discusses how these spacecraft subsystem simulations feed into and support military utility and space mission analyses.

  25. The Importance of Technology Readiness in NASA Earth Venture Missions

    NASA Technical Reports Server (NTRS)

    Wells, James E.; Komar, George J.

    2009-01-01

    The first set of Venture-class investigations share the requirement that all investigations must use mature technology that has been modeled or demonstrated in a relevant environment (Technology Readiness Level (TRL) > 5). Technology Readiness Levels are a systematic metric/measurement system that supports assessments of the maturity of a particular technology and the consistent comparison of maturity between different types of technology. The TRL is used in NASA technology planning. A major step in the level of fidelity of the technology demonstration follows the completion of TRL 5. At TRL 6, a representative model or prototype system or subsystem must be demonstrated in a relevant environment (ground or space), which goes well beyond ad hoc, "patch-cord," or discrete component-level breadboarding. These TRLs are chosen as target objectives for the Program. The challenge for offerors is that they must identify key aspects (uncertainty, multi-subsystem complexity, etc.) of the TRL estimate that should be properly explained in a submitted proposal. Risk minimization is a key component of the Earth Venture missions. Experiences of prior airborne missions will be shared. The discussion will address aspects of uncertainty and issues surrounding three areas of airborne earth science missions: (1) Aircraft or proposed flight platform -- Expressing the capability of the aircraft in terms of the supporting mission requirements. These issues include airplane performance characteristics (duration, range, altitude, among others) and multiship complexities. (2) Instruments -- Establishing that the instruments have been demonstrated in a relevant environment. Instruments with heritage in prior space missions meet this requirement, as do instruments tested on the ground. Evidence that the instruments have demonstrated the ability to collect data as advertised will be described.
The complexity of the integration of multiple subsystems will also be addressed. Issues associated with tailoring the instrument to meet the specific Venture mission objectives must be thoroughly explained and justified. (3) Aircraft/Instrument Integration -- Explicitly defining what development may be required to harden the instrument and integrate the instrument. The challenges associated with this key aspect of major airborne earth science investigations will be presented.

  26. A manned Mars mission concept with artificial gravity

    NASA Technical Reports Server (NTRS)

    Davis, Hubert P.

    1986-01-01

    A series of simulated manned Mars missions was analyzed by a computer model. Numerous mission opportunities and mission modes were investigated. Sensitivity trade studies of the vehicle all-up mass and propulsion stage sizes were performed as a function of various levels of conservatism in mission velocity-increment margins, payload mass, and propulsive stage characteristics. The longer-duration but less energetic conjunction-class mission is emphasized. The specific mission opportunity reviewed was a 1997 departure. From the trade study results, a three-and-one-half-stage vehicle concept evolved, utilizing a Trans-Mars Injection (TMI) first stage derived from the Space Shuttle External Tank (ET).

  7. Probability of Causation for Space Radiation Carcinogenesis Following International Space Station, Near Earth Asteroid, and Mars Missions

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Chappell, Lori J.

    2012-01-01

    Cancer risk is an important concern for International Space Station (ISS) missions and future exploration missions. An important question concerns the likelihood of a causal association between a crew member's radiation exposure and the occurrence of cancer. The probability of causation (PC), also denoted as attributable risk, is used to make such an estimate. This report summarizes the NASA model of space radiation cancer risks and uncertainties, including improvements to represent uncertainties in tissue-specific cancer incidence models for never-smokers and the U.S. average population. We report on tissue-specific cancer incidence estimates and PC for different post-mission times for ISS and exploration missions. An important conclusion from our analysis is that the NASA policy to limit the risk of exposure-induced death to 3% at the 95% confidence level largely ensures that estimates of the PC for most cancer types would not reach a level of significance. Reducing uncertainties through radiobiological research remains the most efficient method to extend mission length and establish effective mitigators for cancer risks. Efforts to establish biomarkers of space radiation-induced tumors and to estimate PC for rarer tumor types are briefly discussed.
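    As a quantitative aside: in radioepidemiology the probability of causation is conventionally obtained from the excess relative risk (ERR) attributable to the exposure, PC = ERR / (1 + ERR). The function below is a minimal sketch of that textbook relation only; it is not the NASA model, which folds in tissue-specific incidence rates and their uncertainties.

```python
def probability_of_causation(excess_relative_risk: float) -> float:
    """PC = ERR / (1 + ERR): the standard radioepidemiology relation
    (illustrative only; not the NASA space radiation risk model)."""
    if excess_relative_risk < 0:
        raise ValueError("ERR must be non-negative")
    return excess_relative_risk / (1.0 + excess_relative_risk)
```

    For example, an exposure that doubles an individual's baseline cancer risk (ERR = 1) yields PC = 0.5.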

  8. Performability evaluation of the SIFT computer

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.; Furchtgott, D. G.; Wu, L. T.

    1979-01-01

    Performability modeling and evaluation techniques are applied to the SIFT computer as it might operate in the computational environment of an air transport mission. User-visible performance of the total system (SIFT plus its environment) is modeled as a random variable taking values in a set of levels of accomplishment. These levels are defined in terms of four attributes of total system behavior: safety, no change in mission profile, no operational penalties, and no economic penalties. The base model is a stochastic process whose states describe the internal structure of SIFT as well as relevant conditions of the environment. Base model state trajectories are related to accomplishment levels via a capability function which is formulated in terms of a 3-level model hierarchy. Performability evaluation algorithms are then applied to determine the performability of the total system for various choices of computer and environment parameter values. Numerical results of those evaluations are presented and, in conclusion, some implications of this effort are discussed.
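    The performability formulation can be illustrated with a toy base model: a small Markov chain whose state trajectories are mapped to accomplishment levels by a capability function, so that performability is the probability of each level. Everything below (states, transition probabilities, the worst-state-visited capability function) is hypothetical, chosen only to show the mechanics.

```python
import itertools

# Hypothetical 3-state base model: 0 = nominal, 1 = degraded, 2 = failed.
P = [[0.90, 0.08, 0.02],
     [0.00, 0.85, 0.15],
     [0.00, 0.00, 1.00]]
p0 = [1.0, 0.0, 0.0]  # start in the nominal state

def capability(trajectory):
    # Toy capability function: accomplishment level = worst state visited.
    return max(trajectory)

def performability(steps):
    """Probability of each accomplishment level after `steps` transitions,
    found by enumerating base-model state trajectories."""
    levels = {0: 0.0, 1: 0.0, 2: 0.0}
    for traj in itertools.product(range(3), repeat=steps + 1):
        prob = p0[traj[0]]
        for a, b in zip(traj, traj[1:]):
            prob *= P[a][b]
        if prob > 0.0:
            levels[capability(traj)] += prob
    return levels
```

    Trajectory enumeration is exponential in mission length; the paper's evaluation algorithms exist precisely to avoid it, but the small case shows what is being computed.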

  9. A data assimilation system combining CryoSat-2 data and hydrodynamic river models

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Ridler, Marc-Etienne; Godiksen, Peter Nygaard; Madsen, Henrik; Bauer-Gottwein, Peter

    2018-02-01

    There are numerous hydrologic studies using satellite altimetry data from repeat-orbit missions such as Envisat or Jason over rivers. This study is one of the first examples of combining altimetry from a drifting-ground-track satellite mission, namely CryoSat-2, with a river model. CryoSat-2 SARIn Level 2 data are used to improve a 1D hydrodynamic model of the Brahmaputra River in South Asia, which is based on the Saint-Venant equations for unsteady flow and set up in the MIKE HYDRO River software. After calibration of discharge and water level, the hydrodynamic model represents the spatio-temporal variations of water levels accurately and without bias. A data assimilation framework has been developed and linked with the model. It is a flexible framework that can assimilate water level data that are arbitrarily distributed in time and space. The setup has been used to assimilate CryoSat-2 water level observations over the Assam valley for the years 2010-2015, using an Ensemble Transform Kalman Filter (ETKF). Performance improvement in terms of discharge forecasting skill was then evaluated. For experiments with synthetic CryoSat-2 data the continuous ranked probability score (CRPS) was improved by up to 32%, whilst for experiments assimilating real data it improved by up to 10%. The developed methods are expected to be transferable to other rivers and altimeter missions. The model setup and calibration are based almost entirely on globally available remote sensing data.
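    The CRPS cited above has a standard sample-based estimator for an ensemble forecast, CRPS = E|X - y| - (1/2)E|X - X'|, which reduces to absolute error for a single member. A minimal sketch (not the study's implementation):

```python
def crps_ensemble(members, obs):
    """Sample-based continuous ranked probability score for an ensemble
    forecast: E|X - obs| - 0.5 * E|X - X'| (lower is better)."""
    n = len(members)
    term1 = sum(abs(x - obs) for x in members) / n
    term2 = sum(abs(a - b) for a in members for b in members) / (2 * n * n)
    return term1 - term2
```

    A skill improvement such as the 32% quoted above is then typically reported as 1 - CRPS_assimilated / CRPS_open_loop, averaged over verification times.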

  10. Space Environment Effects: Model for Emission of Solar Protons (ESP): Cumulative and Worst Case Event Fluences

    NASA Technical Reports Server (NTRS)

    Xapsos, M. A.; Barth, J. L.; Stassinopoulos, E. G.; Burke, E. A.; Gee, G. B.

    1999-01-01

    The effects that solar proton events have on microelectronics and solar arrays are important considerations for spacecraft in geostationary and polar orbits and for interplanetary missions. Designers of spacecraft and mission planners are required to assess the performance of microelectronic systems under a variety of conditions. A number of useful approaches exist for predicting information about solar proton event fluences and, to a lesser extent, peak fluxes. This includes the cumulative fluence over the course of a mission, the fluence of a worst-case event during a mission, the frequency distribution of event fluences, and the frequency distribution of large peak fluxes. Naval Research Laboratory (NRL) and NASA Goddard Space Flight Center, under the sponsorship of NASA's Space Environments and Effects (SEE) Program, have developed a new model for predicting cumulative solar proton fluences and worst-case solar proton events as functions of mission duration and user confidence level. This model is called the Emission of Solar Protons (ESP) model.
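    The model's central output, a design fluence as a function of mission duration and user confidence level, amounts to evaluating a fitted distribution's quantile. The sketch below substitutes an assumed lognormal distribution for illustration; the published ESP model derives its distributions from a maximum entropy analysis, and the mu/sigma values here are hypothetical.

```python
from statistics import NormalDist

def fluence_at_confidence(mu_log10, sigma_log10, confidence):
    """Cumulative proton fluence not exceeded with the given confidence,
    under an assumed lognormal distribution (an illustrative stand-in
    for the ESP model's fitted distributions)."""
    z = NormalDist().inv_cdf(confidence)      # standard normal quantile
    return 10.0 ** (mu_log10 + sigma_log10 * z)
```

    Designing to a higher confidence level always means designing to a larger fluence.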

  12. Assimilation of CryoSat-2 altimetry to a hydrodynamic model of the Brahmaputra river

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Nygaard Godiksen, Peter; Ridler, Marc-Etienne; Madsen, Henrik; Bauer-Gottwein, Peter

    2016-04-01

    Remote sensing provides valuable data for parameterization and updating of hydrological models, for example water level measurements of inland water bodies from satellite radar altimeters. Satellite altimetry data from repeat-orbit missions such as Envisat, ERS or Jason has been used in many studies, as has synthetic wide-swath altimetry data of the kind expected from the SWOT mission. This study is one of the first hydrologic applications of altimetry data from a drifting-orbit satellite mission, namely CryoSat-2. CryoSat-2 is equipped with the SIRAL instrument, a new type of radar altimeter similar to SRAL on Sentinel-3. CryoSat-2 SARIn Level 2 data is used to improve a 1D hydrodynamic model of the Brahmaputra river basin in South Asia set up in the DHI MIKE 11 software. CryoSat-2 water levels were extracted over river masks derived from Landsat imagery. After discharge calibration, simulated water levels were fitted to the CryoSat-2 data along the Assam valley by adapting cross section shapes and datums. The resulting hydrodynamic model shows accurate spatio-temporal representation of water levels, which is a prerequisite for real-time model updating by assimilation of CryoSat-2 altimetry or multi-mission data in general. For this task, a data assimilation framework has been developed and linked with the MIKE 11 model. It is a flexible framework that can assimilate water level data which are arbitrarily distributed in time and space. Different types of error models, data assimilation methods, etc. can easily be used and tested. Furthermore, it is not only possible to update the water level of the hydrodynamic model, but also the states of the rainfall-runoff models providing the forcing of the hydrodynamic model. The setup has been used to assimilate CryoSat-2 observations over the Assam valley for the years 2010 to 2013. Different data assimilation methods and localizations were tested, together with different model error representations. 
Furthermore, the impact of different filtering and clustering methods and error descriptions of the CryoSat-2 observations was evaluated. Performance improvement in terms of discharge and water level forecasts due to the assimilation of satellite altimetry data was then assessed. The model forecasts were also compared to climatology and persistence forecasts. Using ensemble-based filters, the evaluation was done not only based on performance criteria for the central forecast such as root-mean-square error (RMSE) and Nash-Sutcliffe model efficiency (NSE), but also based on sharpness, reliability and continuous ranked probability score (CRPS) of the ensemble of probabilistic forecasts.
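    For a single water-level state, the ensemble-filter update at the heart of such a framework can be written in the deterministic square-root form: shift the ensemble mean by the Kalman gain, then rescale the anomalies so the analysis variance is (1 - K) times the forecast variance. This is a sketch of the idea, not the study's MIKE 11 implementation.

```python
import math

def ensemble_sqrt_update(ensemble, obs, obs_var):
    """Scalar square-root ensemble Kalman update (ETKF-like sketch)."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_var)              # Kalman gain
    new_mean = mean + gain * (obs - mean)     # analysis mean
    scale = math.sqrt(1.0 - gain)             # anomaly rescaling
    return [new_mean + scale * (x - mean) for x in ensemble]
```

    Localization and observation-error modeling, which the study experiments with, decide how much of this scalar update each model state actually receives.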

  13. Study of a tracking and data acquisition system for the 1990's. Volume 3: TDAS Communication Mission Model

    NASA Technical Reports Server (NTRS)

    Mccreary, T.

    1983-01-01

    A parametric description of the communication channels required between the user spacecraft to be supported and the user ground data systems is developed. Scenarios of mission models are covered that reflect a range of free-flyer versus space-platform usage as well as levels of NASA activity and potential support for military missions, together with potential channel requirements that identify: (1) bounds on TDAS forward and return link data communication demand, and (2) the additional demand for providing navigation/tracking support.

  14. Multi-level Operational C2 Holonic Reference Architecture Modeling for MHQ with MOC

    DTIC Science & Technology

    2009-06-01

    x), x(k), uj(k)) is defined as the task success probability, based on the asset allocation and task execution activities at the tactical level...on outcomes of asset-task allocation at the tactical level. We employ semi-Markov decision process (SMDP) approach to decide on missions to be...AGA) graph for addressing the mission monitoring/planning issues related to task sequencing and asset allocation at the OLC-TLC layer (coordination

  15. Multi-Mission Power Analysis Tool (MMPAT) Version 3

    NASA Technical Reports Server (NTRS)

    Wood, Eric G.; Chang, George W.; Chen, Fannie C.

    2012-01-01

    The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Because it is parametrically driven and offers user-programmable features, the need for software modifications when configuring it for a particular spacecraft is reduced or even eliminated. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
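    The core of any such power simulation is an energy balance marched through time: array power minus loads, integrated into battery state of charge with charge/discharge losses and capacity limits. The sketch below is a generic illustration with hypothetical names and parameters, not MMPAT's interface.

```python
def simulate_soc(array_power_w, load_w, dt_h, capacity_wh, soc0_wh, eff=0.95):
    """March battery state of charge (Wh) through per-step array power and
    load profiles (W); `eff` is a hypothetical one-way conversion efficiency."""
    soc = soc0_wh
    history = []
    for p_array, p_load in zip(array_power_w, load_w):
        net_w = p_array - p_load
        if net_w >= 0.0:
            soc = min(capacity_wh, soc + eff * net_w * dt_h)   # charging
        else:
            soc = max(0.0, soc + net_w * dt_h / eff)           # discharging
        history.append(soc)
    return history
```

    An eclipse step (zero array power) drains the battery; a sunlit step with surplus array power recharges it, saturating at capacity.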

  16. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z > 2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst-case spectrum as a function of confidence level. The spectral representation that best fits these worst-case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.
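    The model-selection step described, fit each candidate spectral form to a measured spectrum and keep the best, can be sketched with just two candidate forms fit by least squares on the log of the fluence; the survey itself compares eight published models, so this is an illustration of the procedure, not of those models.

```python
import math

def _linfit(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

def best_spectral_form(energies, fluences):
    """Return whichever candidate form, power law F = A*E**-g or
    exponential F = A*exp(-E/E0), gives the smaller squared error in ln(F)."""
    lnf = [math.log(f) for f in fluences]
    sse = {}
    for name, xs in (("power_law", [math.log(e) for e in energies]),
                     ("exponential", energies)):
        a, b = _linfit(xs, lnf)
        sse[name] = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, lnf))
    return min(sse, key=sse.get)
```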

  17. HANFORD RIVER PROTECTION PROJECT ENHANCED MISSION PLANNING THROUGH INNOVATIVE TOOLS LIFECYCLE COST MODELING AND AQUEOUS THERMODYNAMIC MODELING - 12134

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    PIERSON KL; MEINERT FL

    2012-01-26

    Two notable modeling efforts within the Hanford Tank Waste Operations Simulator (HTWOS) are currently underway to (1) increase the robustness of the underlying chemistry approximations through the development and implementation of an aqueous thermodynamic model, and (2) add enhanced planning capabilities to the HTWOS model through development and incorporation of the lifecycle cost model (LCM). Since even seemingly small changes in apparent waste composition or treatment parameters can result in large changes in quantities of high-level waste (HLW) and low-activity waste (LAW) glass, mission duration or lifecycle cost, a solubility model that more accurately depicts the phases and concentrations of constituents in tank waste is required. The LCM enables evaluation of the interactions of proposed changes on lifecycle mission costs, which is critical for decision makers.

  18. Solar Sail Models and Test Measurements Correspondence for Validation Requirements Definition

    NASA Technical Reports Server (NTRS)

    Ewing, Anthony; Adams, Charles

    2004-01-01

    Solar sails are being developed as a mission-enabling technology in support of future NASA science missions. Current efforts have advanced solar sail technology sufficient to justify a flight validation program. A primary objective of this activity is to test and validate solar sail models that are currently under development so that they may be used with confidence in future science mission development (e.g., scalable to larger sails). Both system and model validation requirements must be defined early in the program to guide design cycles and to ensure that relevant and sufficient test data will be obtained to conduct model validation to the level required. A process of model identification, model input/output documentation, model sensitivity analyses, and test measurement correspondence is required so that decisions can be made to satisfy validation requirements within program constraints.

  19. Campaign-level dynamic network modelling for spaceflight logistics for the flexible path concept

    NASA Astrophysics Data System (ADS)

    Ho, Koki; de Weck, Olivier L.; Hoffman, Jeffrey A.; Shishko, Robert

    2016-06-01

    This paper develops a network optimization formulation for dynamic campaign-level space mission planning. Although many past space missions have been designed mainly from a mission-level perspective, a campaign-level perspective will be important for future space exploration. In order to find the optimal campaign-level space transportation architecture, a mixed-integer linear programming (MILP) formulation with a generalized multi-commodity flow and a time-expanded network is developed. Particularly, a new heuristics-based method, a partially static time-expanded network, is developed to provide a solution quickly. The developed method is applied to a case study containing human exploration of a near-Earth object (NEO) and Mars, related to the concept of the Flexible Path. The numerical results show that using the specific combinations of propulsion technologies, in-situ resource utilization (ISRU), and other space infrastructure elements can reduce the initial mass in low-Earth orbit (IMLEO) significantly. In addition, the case study results also show that we can achieve large IMLEO reduction by designing NEO and Mars missions together as a campaign compared with designing them separately owing to their common space infrastructure pre-deployment. This research will be an important step toward efficient and flexible campaign-level space mission planning.
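    The time-expanded network idea can be shown in miniature: nodes are (location, time) pairs, arcs are wait or transport legs, and arc costs stand in for mass expended. A shortest-path search over this toy graph (names and costs are invented) illustrates the structure that the paper optimizes with a full multi-commodity MILP:

```python
import heapq

def min_cost_plan(arcs, source, target):
    """Dijkstra over a time-expanded graph; arcs maps node -> [(next, cost)]."""
    pq, best = [(0.0, source)], {source: 0.0}
    while pq:
        cost, node = heapq.heappop(pq)
        if node == target:
            return cost
        if cost > best.get(node, float("inf")):
            continue                           # stale heap entry
        for nxt, c in arcs.get(node, ()):
            if cost + c < best.get(nxt, float("inf")):
                best[nxt] = cost + c
                heapq.heappush(pq, (cost + c, nxt))
    return float("inf")

# Invented example: LEO at t=0 to Mars at t=2, with a staging option via a NEO.
ARCS = {
    ("LEO", 0): [(("LEO", 1), 0.1), (("Mars", 2), 10.0), (("NEO", 1), 4.0)],
    ("LEO", 1): [(("Mars", 2), 9.0)],
    ("NEO", 1): [(("Mars", 2), 3.0)],
}
```

    In this toy graph the staged route through the NEO (cost 7.0) beats the direct departure (10.0), loosely mirroring the campaign-level finding that shared infrastructure can outperform separately designed missions.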

  20. Combining Envisat type and CryoSat-2 altimetry to inform hydrodynamic models

    NASA Astrophysics Data System (ADS)

    Schneider, Raphael; Nygaard Godiksen, Peter; Villadsen, Heidi; Madsen, Henrik; Bauer-Gottwein, Peter

    2015-04-01

    Hydrological models are developed and used for flood forecasting and water resources management. Such models rely on a variety of input and calibration data. In general, and especially in data-scarce areas, remote sensing provides valuable data for the parameterization and updating of such models. Satellite radar altimeters provide water level measurements of inland water bodies. So far, many studies making use of satellite altimeters have been based on data from repeat-orbit missions such as Envisat, ERS or Jason, or on synthetic wide-swath altimetry data as expected from the SWOT mission. This work represents one of the first hydrologic applications of altimetry data from a drifting-orbit satellite mission, using data from CryoSat-2. We present an application where CryoSat-2 data is used to improve a hydrodynamic model of the Ganges and Brahmaputra river basins in South Asia set up in the DHI MIKE 11 software. The model's parameterization and forcing is mainly based on remote sensing data, for example the TRMM 3B42 precipitation product and the SRTM DEM for river and subcatchment delineation. CryoSat-2 water levels were extracted over a river mask derived from Landsat 7 and 8 imagery. After calibrating the hydrological-hydrodynamic model against observed discharge, simulated water levels were fitted to the CryoSat-2 data, with a focus on the Brahmaputra river in the Assam valley: The average simulated water level in the hydrodynamic model was fitted to the average water level along the river's course as observed by CryoSat-2 over the years 2011-2013 by adjusting the river bed elevation. In a second step, the cross section shapes were adjusted so that the simulated water level dynamics matched those obtained from Envisat virtual station time series. The discharge calibration resulted in Nash-Sutcliffe coefficients of 0.86 and 0.94 for the Ganges and Brahmaputra. 
Using the Landsat river mask, the CryoSat-2 water levels show consistency along the river and are in good accordance with other products, such as the SRTM DEM. The adjusted hydrodynamic model reproduced the average water level profile along the river channel with a higher accuracy than a model based on the SRTM DEM. Furthermore, the amplitudes as observed in Envisat virtual station time series could be reproduced by fitting simple triangular cross section shapes. A hydrodynamic model prepared in such a way provides water levels at any point along the river and any point in time, which are consistent with the multi-mission altimetric dataset. This means it can, for example, be updated by assimilation of near real-time water level measurements from CryoSat-2, improving its flood forecasting capability.
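    The Nash-Sutcliffe coefficients quoted above (0.86 and 0.94) follow the standard definition: one minus the ratio of squared model error to the variance of the observations, so 1 is a perfect fit and 0 is no better than predicting the observed mean.

```python
def nash_sutcliffe(simulated, observed):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var
```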

  1. Goal-Driven Autonomy and Robust Architecture for Long-Duration Missions (Year 1: 1 July 2013 - 31 July 2014)

    DTIC Science & Technology

    2014-09-30

    Mental Domain = Ω Goal Management goal change goal input World =Ψ Memory Mission & Goals( ) World Model (-Ψ) Episodic Memory Semantic Memory ...Activations Trace Meta-Level Control Introspective Monitoring Memory Reasoning Trace ( ) Strategies Episodic Memory Metaknowledge Self Model...it is from incorrect or missing memory associations (i.e., indices). Similarly, correct information may exist in the input stream, but may not be

  2. Technology Transition a Model for Infusion and Commercialization

    NASA Technical Reports Server (NTRS)

    McMillan, Vernotto C.

    2006-01-01

    The National Aeronautics and Space Administration has as part of its charter the mission of transferring technologies developed for the space program into the private sector for the purpose of affording back to the American people the economic and improved quality-of-life benefits associated with the technologies developed. In recent years considerable effort has been made to use this program not only for transitioning technologies out of the NASA Mission Directorate Programs, but also to transfer technologies into the Mission Directorate Programs and leverage the impact of government and private sector innovation. The objective of this paper is to outline an approach and the creation of a model that brings together industry, government, and commercialization strategies. When these elements are integrated, the probability of successful technology development, technology infusion into the Mission Programs, and commercialization into the private sector is increased. This model primarily addresses technology readiness levels between TRL 3 and TRL 6. This is typically a gap area known as the valley of death: too immature for commercial entities to invest in heavily, yet not developed enough for major programs to actively pursue. This model has shown promise for increasing the probability of TRL advancement to a level at which NASA programs and/or commercial entities can afford large investments toward either commercialization or infusion.

  3. Periods of High Intensity Solar Proton Flux

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.; Stauffer, Craig A.; Jordan, Thomas M.; Adams, James H.; Dietrich, William F.

    2012-01-01

    Analysis is presented for times during a space mission that specified solar proton flux levels are exceeded. This includes both total time and continuous time periods during missions. Results for the solar maximum and solar minimum phases of the solar cycle are presented and compared for a broad range of proton energies and shielding levels. This type of approach is more amenable to reliability analysis for spacecraft systems and instrumentation than standard statistical models.
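    The two quantities analyzed, total time above a flux level and the longest continuous period above it, are straightforward to compute from a flux series sampled at a fixed cadence. The sketch below is a generic illustration of that bookkeeping, not the paper's statistical machinery.

```python
def exceedance_times(flux, threshold, dt):
    """Return (total time, longest continuous time) with flux > threshold,
    for samples spaced dt apart."""
    total = longest = run = 0.0
    for f in flux:
        if f > threshold:
            run += dt
            total += dt
            longest = max(longest, run)
        else:
            run = 0.0
    return total, longest
```

    The continuous-period statistic is what matters for systems that can ride out brief spikes but not sustained high flux, which is why this framing suits reliability analysis.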

  4. Assessment of Current Estimates of Global and Regional Mean Sea Level from the TOPEX/Poseidon, Jason-1, and OSTM 17-Year Record

    NASA Technical Reports Server (NTRS)

    Beckley, Brian D.; Ray, Richard D.; Lemoine, Frank G.; Zelensky, N. P.; Holmes, S. A.; Desal, Shailen D.; Brown, Shannon; Mitchum, G. T.; Jacob, Samuel; Luthcke, Scott B.

    2010-01-01

    The science value of satellite altimeter observations has grown dramatically over time as enabling models and technologies have increased the value of data acquired on both past and present missions. With the prospect of an observational time series extending into several decades from TOPEX/Poseidon through Jason-1 and the Ocean Surface Topography Mission (OSTM), and further in time with a future set of operational altimeters, researchers are pushing the bounds of current technology and modeling capability in order to monitor global sea level rate at an accuracy of a few tenths of a mm/yr. The measurement of mean sea-level change from satellite altimetry requires an extreme stability of the altimeter measurement system since the signal being measured is at the level of a few mm/yr. This means that the orbit and reference frame within which the altimeter measurements are situated, and the associated altimeter corrections, must be stable and accurate enough to permit a robust MSL estimate. Foremost, orbit quality and consistency are critical to satellite altimeter measurement accuracy. The orbit defines the altimeter reference frame, and orbit error directly affects the altimeter measurement. Orbit error remains a major component in the error budget of all past and present altimeter missions. For example, inconsistencies in the International Terrestrial Reference Frame (ITRF) used to produce the precision orbits at different times cause systematic inconsistencies to appear in the multimission time-frame between TOPEX and Jason-1, and can affect the intermission calibration of these data. In an effort to adhere to cross mission consistency, we have generated the full time series of orbits for TOPEX/Poseidon (TP), Jason-1, and OSTM based on recent improvements in the satellite force models, reference systems, and modeling strategies. 
The recent release of the entire revised Jason-1 Geophysical Data Records, and the recalibration of the microwave radiometer correction, also require further re-examination of inter-mission consistency issues. Here we present an assessment of these recent improvements to the accuracy of the 17-year sea surface height time series, and evaluate the subsequent impact on global and regional mean sea level estimates.
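    At its simplest, the quantity being monitored is the slope of a least-squares line through the sea-surface-height time series; any systematic drift in the orbits, reference frame, or corrections aliases directly into that slope, which is why stability at a few tenths of a mm/yr is so demanding. A minimal sketch of the rate estimate:

```python
def sea_level_trend(times_yr, heights_mm):
    """Least-squares linear rate (mm/yr) of a sea-surface-height series."""
    n = len(times_yr)
    mt = sum(times_yr) / n
    mh = sum(heights_mm) / n
    num = sum((t - mt) * (h - mh) for t, h in zip(times_yr, heights_mm))
    den = sum((t - mt) ** 2 for t in times_yr)
    return num / den
```

    Real estimates additionally remove seasonal cycles and weight by area, but the sensitivity to inter-mission biases is already visible in this simple form: a step offset between missions tilts the fitted line.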

  5. Analysis of satellite servicing cost benefits

    NASA Technical Reports Server (NTRS)

    Builteman, H. O.

    1982-01-01

    Under the auspices of NASA/JSC, a methodology was developed to estimate the value of satellite servicing to the user community. Time and funding precluded the development of an exhaustive computer model; instead, the concept of Design Reference Missions was employed. In this approach, three space programs were analyzed for various levels of servicing. The programs selected fall into broad categories which include 80 to 90% of the missions planned between now and the end of the century. Of necessity, the extrapolation of the three program analyses to the user community as a whole depends on an average mission model and equivalency projections. The value of the estimated cost benefits based on this approach depends largely on how well the equivalency assumptions and the mission model match the real world. A careful definition of all assumptions permits the analysis to be extended to conditions beyond the scope of this study.

  6. Parametric Cost Modeling of Space Missions Using the Develop New Projects (DNP) Implementation Process

    NASA Technical Reports Server (NTRS)

    Rosenberg, Leigh; Hihn, Jairus; Roust, Kevin; Warfield, Keith

    2000-01-01

    This paper presents an overview of a parametric cost model that has been built at JPL to estimate costs of future, deep space, robotic science missions. Due to the recent dramatic changes in JPL business practices brought about by an internal reengineering effort known as Develop New Products (DNP), high-level historic cost data is no longer considered analogous to future missions. Therefore, the historic data is of little value in forecasting costs for projects developed using the DNP process. This has led to the development of an approach for obtaining expert opinion and for combining actual data with expert opinion to provide a cost database for future missions. In addition, the DNP cost model relies on a maximum of objective cost drivers, which reduces the likelihood of model input error. Version 2 is now under development; it expands the model capabilities, links the model more tightly with key design technical parameters, and is grounded in more rigorous statistical techniques. The challenges faced in building this model will be discussed, as well as its background, development approach, status, validation, and future plans.
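    Parametric models of this kind are assembled from cost estimating relationships (CERs) regressed on objective drivers. The power-law CER below, fit in log-log space, is a generic textbook form with invented data, not the DNP model's actual relationships:

```python
import math

def fit_cer(drivers, costs):
    """Fit cost = a * driver**b by least squares in log-log space."""
    xs = [math.log(d) for d in drivers]
    ys = [math.log(c) for c in costs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def estimate_cost(a, b, driver):
    """Evaluate the fitted CER for a new value of the cost driver."""
    return a * driver ** b
```

    Using objective drivers (mass, power, data rate) as the regression inputs is exactly what limits the opportunity for model input error noted above.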

  7. The radiation environment of OSO missions from 1974 to 1978

    NASA Technical Reports Server (NTRS)

    Stassinopoulos, E. G.

    1973-01-01

    Trapped particle radiation levels on several OSO missions were calculated for nominal trajectories using improved computational methods and new electron environment models. Temporal variations of the electron fluxes were considered and partially accounted for. Magnetic field calculations were performed with a current field model and extrapolated to a later epoch with linear time terms. Orbital flux integration results, which are presented in graphical and tabular form, are analyzed, explained, and discussed.

  8. Electric propulsion for near-Earth space missions

    NASA Technical Reports Server (NTRS)

    Terwilliger, C. H.; Smith, W. W.

    1980-01-01

    A set of missions was postulated that was considered to be representative of those likely to be desirable/feasible over the next three decades. The characteristics of these missions, and of their payloads, that most impact the choice/design of the requisite propulsion system were determined. A system-level model of the near-Earth transportation process was constructed, which incorporated these mission/system characteristics as well as the fundamental parameters describing the technology/performance of an ion-bombardment-based electric propulsion system. The model was used for sensitivity studies to determine the interactions between the technology descriptors and program costs, and to establish the most cost-effective directions for technology advancement. The most important factor was seen to be the costs associated with the duration of the mission, and this in turn makes the development of advanced electric propulsion systems having moderate to high efficiencies (approximately 50 percent) at intermediate ranges of specific impulse (approximately 1000 seconds) very desirable.
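    The concluding trade can be made concrete with two standard relations: the Tsiolkovsky rocket equation (propellant fraction falls as specific impulse rises) and the power-limited thrust relation T = 2*eff*P/(Isp*g0) (at fixed power, thrust, and hence trip speed, falls as specific impulse rises). The sketch below uses invented numbers and holds vehicle mass fixed for simplicity.

```python
import math

G0 = 9.81  # standard gravity, m/s^2

def propellant_fraction(delta_v, isp):
    """Tsiolkovsky: fraction of initial mass expended as propellant."""
    return 1.0 - math.exp(-delta_v / (isp * G0))

def trip_time_days(delta_v, isp, eff, power_w, mass_kg):
    """Rough power-limited trip time: thrust = 2*eff*P/(Isp*g0), with
    the vehicle mass held constant (a simplification)."""
    thrust = 2.0 * eff * power_w / (isp * G0)
    accel = thrust / mass_kg
    return delta_v / accel / 86400.0
```

    Tripling Isp cuts the propellant fraction but triples the trip time at fixed power, which is the mission-duration cost driver the study identifies.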

  9. SCRL-Model for Human Space Flight Operations Enterprise Supply Chain

    NASA Technical Reports Server (NTRS)

    Tucker, Brian; Paxton, Joseph

    2010-01-01

    This paper will present a Supply Chain Readiness Level (SCRL) model that can be used to evaluate and configure adaptable and sustainable program and mission supply chains at an enterprise level. It will also show that using SCRL in conjunction with Technology Readiness Levels (TRLs), Manufacturing Readiness Levels (MRLs), and the National Aeronautics and Space Administration's (NASA's) Project Lifecycle Process will provide a more complete means of developing and evaluating a robust, sustainable supply chain that encompasses the entire product, system, and mission lifecycle. In addition, it will be shown that by implementing the SCRL model, NASA can additionally define supplier requirements to enable effective supply chain management (SCM). Developing and evaluating overall supply chain readiness for any product, system, and mission lifecycle is critical for mission success. Readiness levels are presently being used to evaluate the maturity of technology and manufacturing capability during development and deployment phases of products and systems. For example, TRLs are used to support the assessment of the maturity of a particular technology and compare maturity of different types of technologies. MRLs are designed to assess the maturity and risk of a given technology from a manufacturing perspective. In addition, when these measurement systems are used collectively they can offer a more comprehensive view of the maturity of the system. While some aspects of the supply chain and supply chain planning are considered in these familiar metric systems, certain characteristics of an effective supply chain, when evaluated in more detail, will provide an improved insight into the readiness and risk throughout the supply chain. Therefore, a system that concentrates particularly on supply chain attributes is required to better assess enterprise supply chain readiness.
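    One way the combined use of TRL, MRL, and SCRL scores might be operationalized is a weakest-link roll-up, where the least mature dimension gates each subsystem. A hypothetical sketch (scores, subsystem names, and the aggregation rule are all invented for illustration, not taken from the paper):

    ```python
    def overall_readiness(assessments):
        """Weakest-link roll-up: a subsystem is only as mature as its least
        mature readiness dimension (TRL, MRL, or SCRL)."""
        return {name: min(scores.values()) for name, scores in assessments.items()}

    # Hypothetical subsystem scores; the supply chain gates the avionics here.
    ratings = overall_readiness({
        "avionics": {"TRL": 7, "MRL": 6, "SCRL": 4},
        "propulsion": {"TRL": 8, "MRL": 7, "SCRL": 7},
    })
    ```

    The point of such a roll-up is that a high TRL alone can mask an immature supply chain, which is the gap the SCRL metric is meant to expose.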

  10. Advanced Autonomous Systems for Space Operations

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Muscettola, N.; Barrett, A.; Mjolssness, E.; Clancy, D. J.

    2002-01-01

    New missions of exploration and space operations will require unprecedented levels of autonomy to successfully accomplish their objectives. Inherently high levels of complexity, cost, and communication distances will preclude the degree of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of not only meeting the greatly increased space exploration requirements, but simultaneously dramatically reducing the design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health management capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of advanced space operations, since the science and operational requirements specified by such missions, as well as the budgetary constraints will limit the current practice of monitoring and controlling missions by a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such on-board systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having to have such commands transmitted from Earth. 
This enables missions of such complexity and communication distances as are not otherwise possible, as well as many more efficient and low-cost applications. In addition, utilizing component and system modeling and reasoning capabilities, autonomous systems will play an increasing role in ground operations for space missions, where they will both reduce the human workload and provide greater levels of monitoring and system safety. This paper will focus specifically on new and innovative software for remote, autonomous space systems flight operations. Topics to be presented include a brief description of key autonomous control concepts, the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New on-board software for autonomous science data acquisition for planetary exploration will be described, as well as advanced systems for safe planetary landings. A new multi-agent architecture that addresses some of the challenges of autonomous systems will be presented. Autonomous operation of ground systems will also be considered, including software for autonomous in-situ propellant production and management, and closed-loop ecological life support systems (CELSS). Finally, plans and directions for the future will be discussed.

  11. Estimating the Need for Medical Intervention due to Sleep Disruption on the International Space Station

    NASA Technical Reports Server (NTRS)

    Myers, Jerry G.; Lewandowski, Beth E.; Brooker, John E.; Hurst, S. R.; Mallis, Melissa M.; Caldwell, J. Lynn

    2008-01-01

    During ISS and shuttle missions, difficulties with sleep affect more than half of all US crews. Mitigation strategies to help astronauts cope with the challenges of disrupted sleep patterns can negatively impact both mission planning and vehicle design. The methods for addressing known detrimental impacts for some mission scenarios may have a substantial impact on vehicle-specific consumable mass or volume or on the mission timeline. As part of the Integrated Medical Model (IMM) task, NASA Glenn Research Center is leading the development of a Monte Carlo based forecasting tool designed to determine the consumables required to address risks related to sleep disruption. The model currently focuses on the International Space Station and uses an algorithm that assembles representative mission schedules and feeds them into a well-validated model that predicts relative levels of performance and need for sleep (SAFTE model, IBR Inc.). Correlation of the resulting output to self-diagnosed needs for hypnotics, stimulants, and other pharmaceutical countermeasures allows prediction of pharmaceutical use and the uncertainty of that prediction. This paper outlines a conceptual model for determining a rate of pharmaceutical utilization that can be used in the IMM for comparison and optimization of mitigation methods with respect to all other significant medical needs and interventions.
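    The Monte Carlo forecasting approach can be sketched as follows; the sleep-disruption and medication-use probabilities below are placeholders for illustration, not values from the IMM or the SAFTE model:

    ```python
    import random
    import statistics

    def simulate_doses(n_nights, p_disrupted, p_med_given_disrupted,
                       n_trials=10000, seed=1):
        """Monte Carlo forecast of sleep-medication doses consumed in a mission:
        each night may be disrupted, and a disrupted night may prompt a dose."""
        rng = random.Random(seed)
        totals = []
        for _ in range(n_trials):
            doses = sum(
                1
                for _ in range(n_nights)
                if rng.random() < p_disrupted and rng.random() < p_med_given_disrupted
            )
            totals.append(doses)
        totals.sort()
        return statistics.mean(totals), totals[int(0.95 * n_trials)]

    # Placeholder probabilities for a 180-day increment (assumed, not IMM values).
    mean_doses, p95_doses = simulate_doses(180, p_disrupted=0.5,
                                           p_med_given_disrupted=0.2)
    ```

    For manifesting consumables, an upper percentile of the simulated dose count, rather than the mean, would typically drive how many doses to fly.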

  12. An Evaluation of the High Level Architecture (HLA) as a Framework for NASA Modeling and Simulation

    NASA Technical Reports Server (NTRS)

    Reid, Michael R.; Powers, Edward I. (Technical Monitor)

    2000-01-01

    The High Level Architecture (HLA) is a current US Department of Defense and an industry (IEEE-1516) standard architecture for modeling and simulations. It provides a framework and set of functional rules and common interfaces for integrating separate and disparate simulators into a larger simulation. The goal of the HLA is to reduce software costs by facilitating the reuse of simulation components and by providing a runtime infrastructure to manage the simulations. In order to evaluate the applicability of the HLA as a technology for NASA space mission simulations, a Simulations Group at Goddard Space Flight Center (GSFC) conducted a study of the HLA and developed a simple prototype HLA-compliant space mission simulator. This paper summarizes the prototyping effort and discusses the potential usefulness of the HLA in the design and planning of future NASA space missions with a focus on risk mitigation and cost reduction.

  13. Missions to the Outer Solar System and Beyond - Concept Study for a Kuiper Belt Sample-Return

    NASA Astrophysics Data System (ADS)

    Ganapathy, Rohan M.

    The exploration of Kuiper belt objects (KBOs) might deliver crucial data for answering questions about the evolution of the solar system and the origin of life. Whereas the current New Horizons mission performs flybys of KBOs, an in-depth exploration of the Kuiper belt requires an orbiter, a lander, or even a sample return. In this paper, we present a range of potential mission architectures for a Kuiper belt sample-return mission. We use the Systems Modeling Language (SysML) for the necessary modeling and the systems engineering tool MagicDraw. A process similar to the NASA Rapid Mission Architecture approach was used. We start with a rationale for a KBO sample return, define science objectives and high-level requirements, and select a strawman payload. From a key trade matrix, mission architecture options are generated. Finally, necessary technologies and prerequisites for the mission are identified. We conclude that one of the dwarf planets Pluto, Haumea, Orcus, or Quaoar and their moons should be considered as a target for the mission. The samples should be collected from the dwarf planet of choice or from its moon(s), the latter avoiding the rather high velocity requirements for landing on and departing from the dwarf planet itself. Attractive mission architectures include radioisotopic electric propulsion-based missions, missions with a combination of a solar electric propulsion stage and radioisotopic electric propulsion, or missions using nuclear electric propulsion.

  14. International Space Station as a Platform for Exploration Beyond Low Earth Orbit

    NASA Technical Reports Server (NTRS)

    Raftery, Michael; Woodcock, Gordon

    2010-01-01

    The International Space Station (ISS) has established a new model for the achievement of the most difficult engineering goals in space: international collaboration at the program level with competition at the level of technology. This strategic shift in management approach provides long term program stability while still allowing for the flexible evolution of technology needs and capabilities. Both commercial and government sponsored technology developments are well supported in this management model. ISS also provides a physical platform for development and demonstration of the systems needed for missions beyond low earth orbit. These new systems at the leading edge of technology require operational exercise in the unforgiving environment of space before they can be trusted for long duration missions. Systems and resources needed for expeditions can be aggregated and thoroughly tested at ISS before departure thus providing wide operational flexibility and the best assurance of mission success. We will describe representative mission profiles showing how ISS can support exploration missions to the Moon, Mars, asteroids and other potential destinations. Example missions would include humans to lunar surface and return, and humans to Mars orbit as well as Mars surface and return. ISS benefits include: international access from all major launch sites; an assembly location with crew and tools that could help prepare departing expeditions that involve more than one launch; a parking place for reusable vehicles; and the potential to add a propellant depot.

  15. A Facility and Architecture for Autonomy Research

    NASA Technical Reports Server (NTRS)

    Pisanich, Greg; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Autonomy is a key enabling factor in the advancement of remote robotic exploration. There is currently a large gap between autonomy software at the research level and software that is ready for insertion into near-term space missions. The Mission Simulation Facility (MSF) will bridge this gap by providing a simulation framework and a suite of simulation tools to support research in autonomy for remote exploration. This system will allow developers of autonomy software to test their models in a high-fidelity simulation and evaluate their system's performance against a set of integrated, standardized simulations. The Mission Simulation Toolkit (MST) uses a distributed architecture with a communication layer built on top of the standardized High Level Architecture (HLA). This architecture enables the use of existing high-fidelity models, allows mixing of simulation components from various computing platforms, and enforces the use of a standardized high-level interface among components. The components needed to achieve a realistic simulation can be grouped into four categories: environment generation (terrain, environmental features), robotic platform behavior (robot dynamics), instrument models (camera, spectrometer, etc.), and data analysis. The MST will provide basic components in these areas but allows users to easily plug in any refined model by means of a communication protocol. Finally, a description file defines the robot and environment parameters for easy configuration and ensures that all the simulation models share the same information.

  16. Biogeography-based combinatorial strategy for efficient autonomous underwater vehicle motion planning and task-time management

    NASA Astrophysics Data System (ADS)

    Zadeh, S. M.; Powers, D. M. W.; Sammut, K.; Yazdani, A. M.

    2016-12-01

    Autonomous Underwater Vehicles (AUVs) are capable of spending long periods of time carrying out various underwater missions and marine tasks. In this paper, a novel conflict-free motion planning framework is introduced to enhance the vehicle's mission performance by completing the maximum number of highest-priority tasks in a limited time across a large-scale, waypoint-cluttered operating field, while ensuring safe deployment during the mission. The proposed combinatorial route-path planner model takes advantage of the Biogeography-Based Optimization (BBO) algorithm to satisfy the objectives of both the higher- and lower-level motion planners and guarantees maximization of mission productivity for a single-vehicle operation. The performance of the model is investigated under different scenarios, including particular cost constraints in time-varying operating fields. To show the reliability of the proposed model, the performance of each motion planner is assessed separately, and statistical analysis is then undertaken to evaluate the total performance of the entire model. The simulation results indicate the stability of the contributed model and its feasibility for real experiments.
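    The flavor of BBO-based task-time management can be conveyed with a toy version: select the subset of prioritized tasks that maximizes total priority within a time budget, using BBO's rank-based immigration/emigration migration operator. This is a simplified sketch of the general technique, not the paper's planner; all task values are invented:

    ```python
    import random

    def bbo_knapsack(values, times, budget, pop=30, gens=60, seed=3):
        """Toy Biogeography-Based Optimization: pick tasks (bits) maximizing
        total priority subject to a mission time budget."""
        rng = random.Random(seed)
        n = len(values)

        def fitness(x):
            t = sum(ti for ti, xi in zip(times, x) if xi)
            return sum(v for v, xi in zip(values, x) if xi) if t <= budget else 0.0

        habitats = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
        best, best_fit = habitats[0][:], fitness(habitats[0])
        for _ in range(gens):
            habitats.sort(key=fitness, reverse=True)
            if fitness(habitats[0]) > best_fit:
                best, best_fit = habitats[0][:], fitness(habitats[0])
            mu = [(pop - i) / pop for i in range(pop)]   # emigration rate by rank
            lam = [1.0 - m for m in mu]                  # immigration rate by rank
            new = [h[:] for h in habitats]
            for i in range(pop):
                for j in range(n):
                    if rng.random() < lam[i]:
                        # copy feature j from a habitat chosen by emigration rate
                        k = rng.choices(range(pop), weights=mu)[0]
                        new[i][j] = habitats[k][j]
                    if rng.random() < 0.01:              # light mutation
                        new[i][j] = 1 - new[i][j]
            habitats = new
        return best, best_fit

    values, times = [10, 6, 3, 8], [5, 4, 2, 6]
    best, best_fit = bbo_knapsack(values, times, budget=10)
    ```

    In BBO, good habitats (high fitness) tend to export features while poor ones import them, which is what the rank-based `mu`/`lam` rates encode.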

  17. Calculation of Operations Efficiency Factors for Mars Surface Missions

    NASA Technical Reports Server (NTRS)

    Laubach, Sharon

    2014-01-01

    The duration of a mission--and subsequently, the minimum spacecraft lifetime--is a key component in designing the capabilities of a spacecraft during mission formulation. However, determining the duration is not simply a function of how long it will take the spacecraft to execute the activities needed to achieve mission objectives. Instead, the effects of the interaction between the spacecraft and ground operators must also be taken into account. This paper describes a method, using "operations efficiency factors", to account for these effects for Mars surface missions. Typically, this level of analysis has not been performed until much later in the mission development cycle, and has not been able to influence mission or spacecraft design. Further, the notion of moving to sustainable operations during Prime Mission--and the effect that change would have on operations productivity and mission objective choices--has not been encountered until the most recent rover missions (MSL, the (now-cancelled) joint NASA-ESA 2018 Mars rover, and the proposed rover for Mars 2020). Since MSL had a single control center and sun-synchronous relay assets (like MER), estimates of productivity derived from MER prime and extended missions were used. However, Mars 2018's anticipated complexity (there would have been control centers in California and Italy, and a non-sun-synchronous relay asset) required the development of an explicit model of operations efficiency that could handle these complexities. In the case of the proposed Mars 2018 mission, the model was employed to assess the mission return of competing operations concepts, and as an input to component lifetime requirements. In this paper we provide examples of how to calculate the operations efficiency factor for a given operational configuration, and how to apply the factors to surface mission scenarios. 
This model can be applied to future missions to enable early effective trades between operations design, science mission planning, and spacecraft design.
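    A toy version of an operations-efficiency calculation might look like the following; the planning-cadence model and all numbers are invented for illustration, not taken from the MSL or Mars 2018 analyses:

    ```python
    def operations_efficiency(planning_days, restricted_fraction):
        """Fraction of sols that receive a freshly tailored plan: daily planning
        on unrestricted sols, one plan per `planning_days` sols otherwise."""
        return (1.0 - restricted_fraction) + restricted_fraction / planning_days

    def required_duration(productive_sols_needed, efficiency):
        """Surface mission length needed to accumulate the required productive sols."""
        return productive_sols_needed / efficiency

    eff = operations_efficiency(planning_days=2, restricted_fraction=0.4)
    dur = required_duration(productive_sols_needed=300, efficiency=eff)
    ```

    Applied this way, the efficiency factor converts science-driven productive-sol requirements into a mission duration, and thus into component lifetime requirements.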

  18. High Altitude Long Endurance UAV Analysis Model Development and Application Study Comparing Solar Powered Airplane and Airship Station-Keeping Capabilities

    NASA Technical Reports Server (NTRS)

    Ozoroski, Thomas A.; Nickol, Craig L.; Guynn, Mark D.

    2015-01-01

    There have been ongoing efforts in the Aeronautics Systems Analysis Branch at NASA Langley Research Center to develop a suite of integrated physics-based computational utilities suitable for modeling and analyzing extended-duration missions carried out using solar powered aircraft. From these efforts, SolFlyte has emerged as a state-of-the-art vehicle analysis and mission simulation tool capable of modeling both heavier-than-air (HTA) and lighter-than-air (LTA) vehicle concepts. This study compares solar powered airplane and airship station-keeping capability during a variety of high altitude missions, using SolFlyte as the primary analysis component. Three Unmanned Aerial Vehicle (UAV) concepts were designed for this study: an airplane (Operating Empty Weight (OEW) = 3285 kilograms, span = 127 meters, array area = 450 square meters), a small airship (OEW = 3790 kilograms, length = 115 meters, array area = 570 square meters), and a large airship (OEW = 6250 kilograms, length = 135 meters, array area = 1080 square meters). All the vehicles were sized for payload weight and power requirements of 454 kilograms and 5 kilowatts, respectively. Seven mission sites distributed throughout the United States were selected to provide a basis for assessing the vehicle energy budgets and site-persistent operational availability. Seasonal, 30-day duration missions were simulated at each of the sites during March, June, September, and December; one-year duration missions were simulated at three of the sites. Atmospheric conditions during the simulated missions were correlated to National Climatic Data Center (NCDC) historical data measurements at each mission site, at four flight levels. Unique features of the SolFlyte model are described, including methods for calculating recoverable and energy-optimal flight trajectories and the effects of shadows on solar energy collection. 
Results of this study indicate that: 1) the airplane concept attained longer periods of on-site capability than either airship concept, and 2) the airship concepts can attain higher levels of energy collection and storage than the airplane concept; however, attaining these energy benefits requires adverse design trades of reduced performance (small airship) or excessive solar array area (large airship).

  19. Performance Assessment of the Spare Parts for the Activation of Relocated Systems (SPARES) Forecasting Model

    DTIC Science & Technology

    1991-09-01

    constant data into the gaining base’s computer records. Among the data elements to be loaded, the 1XT434 image contains the level detail effective date...the mission support effective date, and the PBR override (19:19-203). In conjunction with the 1XT434, the Mission Change Parameter Image (Constant...the gaining base (19:19-208). The level detail effective date establishes the date the MCDDFR and MCDDR "are considered by the requirements computation

  20. Relating Resources to Personnel Readiness. Use of Army Strength Management Models,

    DTIC Science & Technology

    1997-01-01

    far from the resources. In fact, they do not consider them. What they do consider is the historical performance of response variables, using a...hierarchy is the readiness of the force or the ability of the overall force to perform a given mission successfully. A force is composed of units...at the unit level. SORTS pro- duces unit "C-levels" that characterize the proportion of the wartime mission the unit can perform .5 Separate ratings

  1. A temporal forecast of radiation environments for future space exploration missions.

    PubMed

    Kim, Myung-Hee Y; Cucinotta, Francis A; Wilson, John W

    2007-06-01

    The understanding of future space radiation environments is an important goal for space mission operations, design, and risk assessment. We have developed a solar cycle statistical model in which sunspot number is coupled to space-related quantities, such as the galactic cosmic radiation (GCR) deceleration potential (phi) and the mean occurrence frequency of solar particle events (SPEs). Future GCR fluxes were derived from a predictive model, in which the temporal dependence represented by phi was derived from GCR flux and ground-based Climax neutron monitor rate measurements over the last four decades. These results showed that the point dose equivalent inside a typical spacecraft in interplanetary space was influenced by solar modulation by up to a factor of three. It also has been shown that a strong relationship exists between large SPE occurrences and phi. For future space exploration missions, cumulative probabilities of SPEs at various integral fluence levels during short-period missions were defined using a database of proton fluences of past SPEs. Analytic energy spectra of SPEs at different ranks of the integral fluences for energies greater than 30 MeV were constructed over broad energy ranges extending out to GeV for the analysis of representative exposure levels at those fluences. Results will guide the design of protection systems for astronauts during future space exploration missions.
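    For the cumulative SPE probabilities discussed above, a common starting point is a Poisson occurrence model. A minimal sketch (the event rates below are assumed for illustration, not fitted values from the paper):

    ```python
    import math

    def prob_at_least_one_spe(events_per_year, mission_years):
        """Poisson occurrence model: P(at least one SPE) = 1 - exp(-rate * t)."""
        return 1.0 - math.exp(-events_per_year * mission_years)

    # Assumed rates: large SPEs far more frequent near solar maximum than minimum.
    p_solar_max = prob_at_least_one_spe(6.0, 0.5)
    p_solar_min = prob_at_least_one_spe(0.5, 0.5)
    ```

    Evaluating such probabilities at a range of integral fluence thresholds, each with its own occurrence rate, yields the kind of cumulative-probability curves used to size protection systems.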

  2. Probabilistic Models for Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Adams, James H., Jr.; Xapsos, Michael

    2009-01-01

    Probabilistic models of Solar Particle Events (SPEs) are used in space mission design studies to describe the radiation environment that can be expected at a specified confidence level. The task of the designer is then to choose a design that will operate in the model radiation environment. Probabilistic models have already been developed for solar proton events that describe the peak flux, event-integrated fluence, and mission-integrated fluence. In addition, a probabilistic model has been developed that describes the mission-integrated fluence for the Z>2 elemental spectra. This talk will focus on completing this suite of models by developing models for the peak flux and event-integrated fluence elemental spectra for the Z>2 elements.

  3. Design and validation of a GNC system for missions to asteroids: the AIM scenario

    NASA Astrophysics Data System (ADS)

    Pellacani, A.; Kicman, P.; Suatoni, M.; Casasco, M.; Gil, J.; Carnelli, I.

    2017-12-01

    Deep space missions, and in particular missions to asteroids, impose a certain level of autonomy that depends on the mission objectives. If the mission requires the spacecraft to perform close approaches to the target body (the extreme case being a landing scenario), the autonomy level must be increased to guarantee the fast, reactive response required in both nominal and contingency operations. The GNC system must be designed in accordance with the required level of autonomy. The GNC system designed and tested in the frame of ESA's Asteroid Impact Mission (AIM) system studies (Phase A/B1 and Consolidation Phase) is an example of an autonomous GNC system that meets the challenging objectives of AIM. The paper reports the design of this GNC system and its validation through a DDVV plan that includes Model-in-the-Loop and Hardware-in-the-Loop testing. The main focus is translational navigation, which provides online relative state estimation with respect to the target body using only cameras as relative navigation sensors. The relative navigation outputs are used for nominal spacecraft trajectory corrections as well as to estimate the collision risk with the asteroid and, if needed, to command the execution of a collision avoidance manoeuvre to guarantee spacecraft safety.
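    The collision-risk check mentioned above can be illustrated with a straight-line closest-approach test on the relative state (a simplified sketch with toy numbers, not the AIM GNC algorithm):

    ```python
    import math

    def miss_distance(rel_pos, rel_vel):
        """Closest-approach distance for straight-line relative motion."""
        dot = lambda a, b: sum(x * y for x, y in zip(a, b))
        v2 = dot(rel_vel, rel_vel)
        t_star = max(0.0, -dot(rel_pos, rel_vel) / v2) if v2 > 0.0 else 0.0
        closest = [r + v * t_star for r, v in zip(rel_pos, rel_vel)]
        return math.sqrt(dot(closest, closest))

    def cam_required(rel_pos, rel_vel, safety_radius_m):
        """Command a collision avoidance manoeuvre if predicted miss is too small."""
        return miss_distance(rel_pos, rel_vel) < safety_radius_m

    # Approaching the asteroid at 2 m/s with a 50 m lateral offset (toy numbers).
    d = miss_distance((1000.0, 50.0, 0.0), (-2.0, 0.0, 0.0))
    need_cam = cam_required((1000.0, 50.0, 0.0), (-2.0, 0.0, 0.0),
                            safety_radius_m=100.0)
    ```

    An autonomous system performs this kind of check onboard because the light-time delay to a deep space target rules out ground-in-the-loop reactions.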

  4. Human System Drivers for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Kundrot, Craig E.; Steinberg, Susan; Charles, John B.

    2010-01-01

    Evaluation of DRM4 in terms of the human system includes the ability to meet NASA standards, the inclusion of the human system in the design trade space, preparation for future missions, and consideration of a robotic precursor mission. Ensuring both the safety and the performance capability of the human system depends upon satisfying NASA Space Flight Human System Standards. These standards in turn drive the development of program-specific requirements for Near-Earth Object (NEO) missions. In evaluating DRM4 against these human system standards, the currently existing risk models, technologies, and biological countermeasures were used. A summary of this evaluation is provided below in a structure that supports mission architecture planning activities. 1. Unacceptable level of risk: The duration of the DRM4 mission leads to an unacceptable level of risk for two aspects of human system health: (A) the permissible exposure limit for space flight radiation exposure (a human system standard) would be exceeded by DRM4; (B) the risk of visual alterations and abnormally high intracranial pressure would be too high.

  5. Models and techniques for evaluating the effectiveness of aircraft computing systems

    NASA Technical Reports Server (NTRS)

    Meyer, J. F.

    1977-01-01

    Models, measures, and techniques were developed for evaluating the effectiveness of aircraft computing systems. The concept of effectiveness involves aspects of system performance, reliability, and worth. Specifically, a detailed model hierarchy was developed at the mission, functional task, and computational task levels. An appropriate class of stochastic models was investigated to serve as bottom-level models in the hierarchical scheme. A unified measure of effectiveness called 'performability' was defined and formulated.
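    In its simplest form, a performability-style measure is the expected reward accumulated over the mission, with each structural state of the computing system earning reward at its own rate. A minimal sketch with invented state probabilities and reward rates:

    ```python
    def performability(state_probs, reward_rates, mission_hours):
        """Expected reward accumulated over the mission: sum_i p_i * r_i * T."""
        assert abs(sum(state_probs) - 1.0) < 1e-9, "probabilities must sum to 1"
        return mission_hours * sum(p * r for p, r in zip(state_probs, reward_rates))

    # Three structural states: fully up, degraded, failed (illustrative numbers).
    y = performability([0.90, 0.08, 0.02], [1.0, 0.5, 0.0], mission_hours=10.0)
    ```

    The appeal of such a measure is that it unifies performance and reliability: a degraded-but-working state contributes partial reward rather than being counted as simply "up" or "down".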

  6. Lunar Reconnaissance Orbiter Orbit Determination Accuracy Analysis

    NASA Technical Reports Server (NTRS)

    Slojkowski, Steven E.

    2014-01-01

    Results from operational orbit determination (OD) produced by the NASA Goddard Flight Dynamics Facility for the LRO nominal and extended missions are presented. During the LRO nominal mission, when LRO flew in a low circular orbit, orbit determination requirements were met nearly 100% of the time. When the extended mission began, LRO returned to a more elliptical frozen orbit, where gravity and other modeling errors caused numerous violations of mission accuracy requirements. Prediction accuracy is particularly challenged during periods when LRO is in full Sun. A series of improvements to LRO orbit determination are presented, including implementation of new lunar gravity models, improved spacecraft solar radiation pressure modeling using a dynamic multi-plate area model, a shorter orbit determination arc length, and a constrained plane method for estimation. The analysis presented in this paper shows that updated lunar gravity models improved accuracy in the frozen orbit, and a multi-plate dynamic area model improves prediction accuracy during full-Sun orbit periods. Implementation of a 36-hour tracking data arc and plane constraints during edge-on orbit geometry also provide benefits. A comparison of the operational solutions to precision orbit determination solutions shows agreement at the 100- to 250-meter level in definitive accuracy.
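    The multi-plate area model can be sketched as follows: the Sun-facing cross-section is the sum over plates of area times the (non-negative) cosine of the angle between the plate normal and the Sun direction. Plate sizes and orientations below are illustrative only, not LRO values:

    ```python
    import math

    def effective_area(plates, sun_dir):
        """Sun-facing cross-section of a multi-plate model: sum over plates of
        area * max(0, n . s), with plate normals n and Sun unit vector s."""
        dot = lambda a, b: sum(x * y for x, y in zip(a, b))
        return sum(a * max(0.0, dot(n, sun_dir)) for a, n in plates)

    # Toy two-plate spacecraft: bus face plus a slightly canted solar array.
    plates = [(4.0, (1.0, 0.0, 0.0)),
              (10.0, (math.cos(0.3), math.sin(0.3), 0.0))]
    area = effective_area(plates, (1.0, 0.0, 0.0))
    ```

    Because this area changes with attitude and Sun geometry, a dynamic multi-plate model tracks the solar radiation pressure force far better than a fixed "cannonball" cross-section, which is why it helps during full-Sun periods.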

  7. Small Spacecraft System-Level Design and Optimization for Interplanetary Trajectories

    NASA Technical Reports Server (NTRS)

    Spangelo, Sara; Dalle, Derek; Longmier, Ben

    2014-01-01

    The feasibility of an interplanetary mission for a CubeSat, a type of miniaturized spacecraft, using an emerging technology, the CubeSat Ambipolar Thruster (CAT), is investigated. CAT is a large delta-V propulsion system that uses a high-density plasma source miniaturized for small spacecraft applications. An initial feasibility assessment demonstrating escape from Low Earth Orbit (LEO) and Earth-escape trajectories with a 3U CubeSat and this thruster technology was presented in previous work. We examine a mission architecture with a trajectory that begins in an Earth orbit such as LEO or Geostationary Earth Orbit (GEO), escapes Earth orbit, and travels to Mars, Jupiter, or Saturn. The goal was to minimize the travel time to reach the destination while considering trade-offs between spacecraft dry mass, fuel mass, and solar power array size. Sensitivities to spacecraft dry mass and available power are considered. CubeSats are extremely size-, mass-, and power-constrained, and their subsystems are tightly coupled, limiting their performance potential. System-level modeling, simulation, and optimization approaches are necessary to find feasible and optimal operational solutions and to ensure system-level interactions are modeled. Thus, propulsion, power/energy, attitude, and orbit transfer models are integrated to enable system-level analysis and trades. The CAT technology broadens the possible missions achievable with small satellites. In particular, this technology enables more sophisticated maneuvers by small spacecraft, such as polar orbit insertion from an equatorial orbit, LEO-to-GEO transfers, Earth-escape trajectories, and transfers to other interplanetary bodies. This work lays the groundwork for upcoming CubeSat launch opportunities and supports future development of interplanetary and constellation CubeSat and small satellite mission concepts.
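    A back-of-the-envelope version of the system-level coupling (power to thrust to transfer time) can be sketched as below. The constant-thrust time estimate and all spacecraft numbers are rough assumptions for illustration, not results from the study:

    ```python
    import math

    G0 = 9.80665  # standard gravity, m/s^2

    def transfer_time_days(m0_kg, delta_v_ms, power_w, isp_s, efficiency):
        """Rough low-thrust estimate: thrust T = 2*eta*P/(g0*Isp), propellant from
        the rocket equation, and time ~ m_avg * delta_v / T (no trajectory model)."""
        thrust = 2.0 * efficiency * power_w / (G0 * isp_s)
        m_prop = m0_kg * (1.0 - math.exp(-delta_v_ms / (G0 * isp_s)))
        m_avg = m0_kg - 0.5 * m_prop
        return m_avg * delta_v_ms / thrust / 86400.0

    # Assumed numbers: 10 kg CubeSat, 3 km/s delta-v, 40 W to a 1000 s Isp thruster.
    t_days = transfer_time_days(10.0, 3000.0, 40.0, 1000.0, 0.5)
    ```

    Even this crude estimate shows why the subsystems are tightly coupled: a larger solar array shortens the transfer but adds dry mass, which lengthens it again.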

  8. Drone Mission Definition and Implementation for Automated Infrastructure Inspection Using Airborne Sensors

    PubMed Central

    Besada, Juan A.; Bergesio, Luca; Campaña, Iván; Vaquero-Melchor, Diego; Bernardos, Ana M.; Casar, José R.

    2018-01-01

    This paper describes a Mission Definition System and the automated flight process it enables to implement measurement plans for discrete infrastructure inspections using aerial platforms, and specifically multi-rotor drones. The mission definition aims at improving planning efficiency with respect to state-of-the-art waypoint-based techniques, using high-level mission definition primitives and linking them with realistic flight models to simulate the inspection in advance. It also provides flight scripts and measurement plans which can be executed by commercial drones. Its user interfaces facilitate mission definition, pre-flight 3D synthetic mission visualisation and flight evaluation. Results are delivered for a set of representative infrastructure inspection flights, showing the accuracy of the flight prediction tools in actual operations using automated flight control. PMID:29641506
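    A high-level inspection primitive of the kind described might expand into waypoints as in this sketch; the facade-scan primitive, its parameters, and the boustrophedon (lawnmower) expansion are invented for illustration, not the paper's system:

    ```python
    def facade_scan_waypoints(origin, width_m, height_m, spacing_m, standoff_m):
        """Expand a high-level 'scan this facade' primitive into a lawnmower
        list of (x, y, z) waypoints at a fixed standoff from the wall."""
        x0, y0 = origin
        waypoints = []
        n_rows = int(height_m // spacing_m) + 1
        for row in range(n_rows):
            z = min(row * spacing_m, height_m)
            # Alternate sweep direction on each row.
            xs = (x0, x0 + width_m) if row % 2 == 0 else (x0 + width_m, x0)
            for x in xs:
                waypoints.append((x, y0 - standoff_m, z))
        return waypoints

    wps = facade_scan_waypoints(origin=(0.0, 0.0), width_m=20.0, height_m=9.0,
                                spacing_m=3.0, standoff_m=5.0)
    ```

    The gain over hand-placed waypoints is that the operator states intent (which facade, what resolution) and the planner derives the flight script, which is the efficiency argument the paper makes.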

  9. NASA's Parker Solar Probe and Solar Orbiter Missions: Discovering the Secrets of our Star

    NASA Astrophysics Data System (ADS)

    Zurbuchen, T.

    2017-12-01

    This session will explore the importance of the Parker Solar Probe and Solar Orbiter missions to NASA Science, and the preparations for discoveries from these missions. NASA's Parker Solar Probe and Solar Orbiter are complementary missions that will provide unique and unprecedented contributions to heliophysics and to astrophysics overall. These inner-heliospheric missions will also be part of the Heliophysics System Observatory, which includes an increasing amount of innovative new technology and architectures to address science and data in an integrated fashion and to advance models through assimilation and system-level tests. During this talk, we will briefly explore how NASA Heliophysics research efforts not only increase our understanding and predictive capability of space weather phenomena, but also provide key insights on fundamental processes important throughout the universe.

  10. Drone Mission Definition and Implementation for Automated Infrastructure Inspection Using Airborne Sensors.

    PubMed

    Besada, Juan A; Bergesio, Luca; Campaña, Iván; Vaquero-Melchor, Diego; López-Araquistain, Jaime; Bernardos, Ana M; Casar, José R

    2018-04-11

    This paper describes a Mission Definition System and the automated flight process it enables to implement measurement plans for discrete infrastructure inspections using aerial platforms, and specifically multi-rotor drones. The mission definition aims at improving planning efficiency with respect to state-of-the-art waypoint-based techniques, using high-level mission definition primitives and linking them with realistic flight models to simulate the inspection in advance. It also provides flight scripts and measurement plans which can be executed by commercial drones. Its user interfaces facilitate mission definition, pre-flight 3D synthetic mission visualisation and flight evaluation. Results are delivered for a set of representative infrastructure inspection flights, showing the accuracy of the flight prediction tools in actual operations using automated flight control.

  11. Assessing Capabilities of the High Energy Liquid Laser Area Defense System through Combat Simulations

    DTIC Science & Technology

    2008-03-01

    it to strike targets with minimal collateral damage from a range of 15 kilometers. This stand-off type attack, made capable by the ATL, enables... levels they release a photon or quantum of light. This process continues until the light waves' strength builds and passes through the medium... mission level model. Lastly, these models are classified by durability as standing models or legacy models. Standing models are legacy models which have

  12. Simulation Studies of Satellite Laser CO2 Mission Concepts

    NASA Technical Reports Server (NTRS)

    Kawa, Stephan Randy; Mao, J.; Abshire, J. B.; Collatz, G. J.; Sun, X.; Weaver, C. J.

    2011-01-01

    Results of mission simulation studies are presented for a laser-based atmospheric CO2 sounder. The simulations are based on real-time carbon cycle process modeling and data analysis. The mission concept corresponds to ASCENDS as recommended by the US National Academy of Sciences Decadal Survey. Compared to passive sensors, active (lidar) sensing of CO2 from space has several potentially significant advantages that hold promise to advance CO2 measurement capability in the next decade. Although the precision and accuracy requirements remain at unprecedented levels of stringency, analysis of possible instrument technology indicates that such sensors are more than feasible. Radiative transfer model calculations, an instrument model with representative errors, and a simple retrieval approach complete the cycle from "nature" run to "pseudodata" CO2. Several mission and instrument configuration options are examined, and the sensitivity to key design variables is shown. Examples are also shown of how the resulting pseudo-measurements might be used to address key carbon cycle science questions.
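The "nature run to pseudodata" step described above can be illustrated with a toy error model: a representative zero-mean Gaussian instrument error added to modeled column-CO2 values. The 1-sigma magnitude is an assumption for illustration, not the ASCENDS error budget.

```python
import random

def pseudodata(true_xco2_ppm, sigma_ppm, seed=0):
    """Convert 'nature run' column-CO2 values into pseudo-measurements by
    adding zero-mean Gaussian instrument noise (a representative error model,
    not the actual instrument model of the study)."""
    rng = random.Random(seed)
    return [x + rng.gauss(0.0, sigma_ppm) for x in true_xco2_ppm]

truth = [400.0, 401.5, 399.2]  # hypothetical modeled XCO2 along-track (ppm)
obs = pseudodata(truth, sigma_ppm=0.5)
print(obs)
```

In the full simulation chain, the radiative transfer and retrieval steps replace this additive shortcut, but the end product is the same: pseudo-observations that carbon cycle analyses can ingest as if they were flight data.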

  13. An evaluation of noise and its effects on shuttle crewmembers during STS-50/USML-1

    NASA Technical Reports Server (NTRS)

    Koros, Anton; Wheelwright, Charles; Adam, Susan

    1993-01-01

    High noise levels can lead to physiological, psychological, and performance effects in humans, ranging from irritability, annoyance, and sleep interference to interference with verbal communication and fatigue, and to temporary or permanent threshold shift at more extreme levels. The current study evaluated the acoustic environment of the STS-50/USML-1 mission. The major objectives were to gain subjective assessments of the STS-50 noise levels, document impacts of noise upon crewmember performance, collect inflight sound level measurements, compare noise levels across missions, evaluate the current Shuttle acoustic criterion, and make recommendations regarding noise specifications for SSF and other long-duration manned space missions. Sound measurements indicated that background noise levels were 60, 64, and 61 A-weighted decibels, respectively, on the Orbiter middeck, flight deck, and Spacelab. All levels were rated acceptable, with the Spacelab environment rated most favorably. Sleep stations afforded attenuation from airborne noise sources, although all crewmembers reported being awakened by crew activity on the middeck. Models of distance for acceptable speech communications were generated, identifying situations of compromised verbal communication to be avoided.
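For context on the dBA figures above: incoherent noise sources combine by power summation, not arithmetic addition. This is standard acoustics, sketched here as a quick check rather than anything taken from the report.

```python
import math

def combine_spl(levels_db):
    """Combine incoherent sound pressure levels (dB) by power summation:
    L_total = 10 * log10(sum(10^(L_i/10)))."""
    return 10.0 * math.log10(sum(10.0 ** (l / 10.0) for l in levels_db))

# Two equal 60 dBA sources yield ~63 dBA, not 120 dBA
print(f"{combine_spl([60.0, 60.0]):.1f} dBA")
```

The same rule explains why adding one more piece of equipment at 10 dB below the ambient level barely moves the measured background.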

  14. SPACE PROPULSION SYSTEM PHASED-MISSION PROBABILITY ANALYSIS USING CONVENTIONAL PRA METHODS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Curtis Smith; James Knudsen

    As part of a series of papers on the topic of advanced probabilistic methods, a benchmark phased-mission problem has been suggested. This problem consists of modeling a space mission using an ion propulsion system, where the mission consists of seven mission phases. The mission requires that the propulsion operate for several phases, where the configuration changes as a function of phase. The ion propulsion system itself consists of five thruster assemblies and a single propellant supply, where each thruster assembly has one propulsion power unit and two ion engines. In this paper, we evaluate the probability of mission failure using the conventional methodology of event tree/fault tree analysis. The event tree and fault trees are developed and analyzed using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE). While the benchmark problem is nominally a "dynamic" problem, in our analysis the mission phases are modeled in a single event tree to show the progression from one phase to the next. The propulsion system is modeled in fault trees to account for the operation, or in this case the failure, of the system. Specifically, the propulsion system is decomposed into each of the five thruster assemblies and fed into the appropriate N-out-of-M gate to evaluate mission failure. A separate fault tree for the propulsion system is developed to account for the different success criteria of each mission phase. Common-cause failure modeling is treated using traditional (i.e., parametric) methods. As part of this paper, we discuss the overall results in addition to the positive and negative aspects of modeling dynamic situations with non-dynamic modeling techniques. One insight from the use of this conventional method for analyzing the benchmark problem is that it requires significant manual manipulation of the fault trees and of how they are linked into the event tree. The conventional method also requires editing the resultant cut sets to obtain the correct results. While conventional methods may be used to evaluate a dynamic system like that in the benchmark, the level of effort required may preclude their use on real-world problems.
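The N-out-of-M success gates mentioned above correspond to the standard k-out-of-n reliability formula for identical, independent components. The per-assembly reliability below is an assumed placeholder, not a value from the benchmark.

```python
from math import comb

def k_out_of_n(n, k, p):
    """Probability that at least k of n identical, independent components
    succeed: the logic behind an N-out-of-M fault-tree gate."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# e.g. a phase needing 3 of the 5 thruster assemblies, each at p = 0.95
print(f"{k_out_of_n(5, 3, 0.95):.6f}")
```

The phase-dependent success criteria the paper describes amount to changing k between phases while the same five assemblies feed the gate; common-cause coupling, treated parametrically in the paper, would break the independence assumed here.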

  15. MSFC Skylab contamination control systems mission evaluation

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Cluster external contamination control evaluation was made throughout the Skylab Mission. This evaluation indicated that contamination control measures instituted during the design, development, and operational phases of this program were adequate to reduce the general contamination environment external to the Cluster below the threshold sensitivity levels for experiments and affected subsystems. Launch and orbit contamination control features included eliminating certain vents, rerouting vents for minimum contamination impact, establishing filters, incorporating materials with minimum outgassing characteristics, and developing operational constraints and mission rules to minimize contamination effects. Prior to the launch of Skylab, contamination control math models were developed and used to predict Cluster surface deposition and background brightness levels throughout the mission. The report summarizes the Skylab system and experiment contamination control evaluation. The Cluster systems and experiments evaluated include the Induced Atmosphere, Corollary and ATM Experiments, Thermal Control Surfaces, Solar Array Systems, Windows, and Star Tracker.

  16. Nuclear Thermal Rocket - Arc Jet Integrated System Model

    NASA Technical Reports Server (NTRS)

    Taylor, Brian D.; Emrich, William

    2016-01-01

    In the post-shuttle era, space exploration is moving into a new regime. Commercial space flight is in development and is planned to take on many of the low Earth orbit space flight missions. With the development of a heavy-lift launch vehicle, the Space Launch System, NASA has become focused on deep space exploration. Exploration into deep space has traditionally been done with robotic probes. More ambitious missions, such as manned missions to asteroids and Mars, will require significant technology development. Propulsion system performance is tied to the achievability of these missions and to the requirements of the other developing technologies that will be required. Nuclear thermal propulsion offers a significant improvement over chemical propulsion while still achieving high levels of thrust. Opportunities exist, however, to build upon what would be considered a standard nuclear thermal engine to attain improved performance, thus further enabling deep space missions. This paper discusses the modeling of a nuclear thermal system integrated with an arc jet to further augment performance. The performance predictions and system impacts are discussed.

  17. Model Checking Artificial Intelligence Based Planners: Even the Best Laid Plans Must Be Verified

    NASA Technical Reports Server (NTRS)

    Smith, Margaret H.; Holzmann, Gerard J.; Cucullu, Gordon C., III; Smith, Benjamin D.

    2005-01-01

    Automated planning systems (APS) are gaining acceptance for use on NASA missions, as evidenced by the APS flown on missions such as Orbiter and Deep Space 1, both of which were commanded by onboard planning systems. The planning system takes high-level goals and expands them onboard into a detailed sequence of actions that the spacecraft executes. The system must be verified to ensure that the automatically generated plans achieve the goals as expected and do not generate actions that would harm the spacecraft or mission. These systems are typically tested using empirical methods. Formal methods, such as model checking, offer exhaustive or measurable test coverage, which leads to much greater confidence in correctness. This paper describes a formal method based on the SPIN model checker. This method guarantees that possible plans meet certain desirable properties. We express the input model in Promela, the language of SPIN, and express the properties of desirable plans formally.
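SPIN explores the Promela model's state space exhaustively and reports a counterexample run when a property fails. The toy Python sketch below (not Promela, and not the authors' model) shows that core idea: exhaustive reachability checking with counterexample extraction over a small, hypothetical transition system.

```python
from collections import deque

def check_safety(initial, successors, is_bad):
    """Breadth-first exhaustive exploration of a finite transition system;
    returns a shortest counterexample path to a 'bad' state, or None if the
    safety property holds in every reachable state."""
    frontier = deque([(initial, (initial,))])
    seen = {initial}
    while frontier:
        state, path = frontier.popleft()
        if is_bad(state):
            return list(path)
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append((nxt, path + (nxt,)))
    return None

# Toy 'plan' state: a resource counter that steps by 1 or 2 but must stay <= 3
cex = check_safety(0, lambda s: [s + 1, s + 2] if s < 5 else [], lambda s: s > 3)
print(cex)  # a shortest run violating the bound
```

SPIN adds partial-order reduction, temporal-logic properties, and compiled-in state compression on top of this skeleton, which is what makes exhaustive coverage tractable for real planner models.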

  18. Assessment of Medical Risks and Optimization of their Management using Integrated Medical Model

    NASA Technical Reports Server (NTRS)

    Fitts, Mary A.; Madurai, Siram; Butler, Doug; Kerstman, Eric; Risin, Diana

    2008-01-01

    The Integrated Medical Model (IMM) Project is a software-based technique that will identify and quantify the medical needs and health risks of exploration crew members during space flight and evaluate the effectiveness of potential mitigation strategies. The IMM Project employs an evidence-based approach that will quantify the probability and consequences of defined in-flight medical risks, mitigation strategies, and tactics to optimize crew member health. Using stochastic techniques, the IMM will ultimately inform decision makers at both programmatic and institutional levels and will enable objective assessment of crew health and optimization of mission success using data from relevant cohort populations and from the astronaut population. The objectives of the project include: 1) identification and documentation of conditions that may occur during exploration missions (Baseline Medical Conditions List, BMCL), 2) assessment of the likelihood of conditions in the BMCL occurring during exploration missions (incidence rate), 3) determination of the risk associated with these conditions, quantified in terms of end states (Loss of Crew, Loss of Mission, Evacuation), 4) optimization of in-flight hardware mass, volume, power, bandwidth, and cost for a given level of risk or uncertainty, and 5) validation of the methodologies used.

  19. Propensity and Risk Assessment for Solar Particle Events: Consideration of Integral Fluence at High Proton Energies

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Hayat, Matthew J.; Feiveson, Alan H.; Cucinotta, Francis A.

    2008-01-01

    For future space missions of longer duration, exposure to large solar particle events (SPEs) with high energy levels is the major concern during extra-vehicular activities (EVAs) on the lunar and Mars surfaces. The expected SPE propensity for large proton fluence was estimated from a non-homogeneous Poisson model using the historical database of measurements of protons with energy > 30 MeV, Phi(sub 30). The database includes a continuous data set for the past 5 solar cycles. The resultant SPE risk analysis for a specific mission period was made, including the 95% confidence level. In addition to the total particle intensity of an SPE, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important parameters for the risk assessment, since a significant cancer risk remains from those energetic particles during large events. Using all the recorded proton fluences of SPEs for energies >60 and >100 MeV, Phi(sub 60) and Phi(sub 100), respectively, the expected propensities of SPEs abundant in high-energy protons were estimated from the same non-homogeneous Poisson model, and the representative cancer risk was analyzed. The dependence of risk on different energy spectra, e.g., between soft and hard SPEs, was evaluated. Finally, we describe approaches to improve radiation protection of astronauts and optimize mission planning for future space missions.
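Over a short mission window with an approximately constant event rate, the non-homogeneous Poisson model in the abstract reduces to the familiar form below. The event rate used here is an illustrative assumption, not a fitted value from the paper.

```python
import math

def prob_at_least_one_spe(rate_per_year, mission_years):
    """P(at least one large SPE) under a constant-rate Poisson approximation
    of the non-homogeneous model: 1 - exp(-lambda), lambda = rate * duration."""
    lam = rate_per_year * mission_years
    return 1.0 - math.exp(-lam)

# Hypothetical: 2 large events per year near solar maximum, 6-month mission
print(f"{prob_at_least_one_spe(2.0, 0.5):.3f}")
```

The non-homogeneous version replaces the constant rate with a solar-cycle-dependent intensity function integrated over the mission dates, which is why mission timing matters as much as mission length.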

  20. LEO cooperative multi-spacecraft refueling mission optimization considering J2 perturbation and target's surplus propellant constraint

    NASA Astrophysics Data System (ADS)

    Zhao, Zhao; Zhang, Jin; Li, Hai-yang; Zhou, Jian-yong

    2017-01-01

    The optimization of an LEO cooperative multi-spacecraft refueling mission considering the J2 perturbation and the targets' surplus propellant constraint is studied in this paper. First, a mission scenario is introduced. One service spacecraft and several target spacecraft operate in an LEO near-circular orbit; the service spacecraft rendezvouses with the service positions one by one, and the target spacecraft transfer to the corresponding service positions respectively. Each target spacecraft returns to its original position after obtaining the required propellant, and the service spacecraft returns to its original position after refueling all target spacecraft. Next, an optimization model of this mission is built. The service sequence, orbital transfer time, and service positions are used as design variables, whereas the propellant cost is used as the design objective. The J2 perturbation, time constraint, and the target spacecraft's surplus propellant capability constraint are taken into account. Then, a hybrid two-level optimization approach is presented to solve the formulated mixed integer nonlinear programming (MINLP) problem. A hybrid-encoding genetic algorithm is adopted to seek the near-optimal solution in the upper-level optimization, while a linear relative dynamic equation considering the J2 perturbation is used to obtain the impulses of orbital transfer in the lower-level optimization. Finally, the effectiveness of the proposed model and method is validated by numerical examples.

  1. Flux-Level Transit Injection Experiments with NASA Pleiades Supercomputer

    NASA Astrophysics Data System (ADS)

    Li, Jie; Burke, Christopher J.; Catanzarite, Joseph; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    Flux-Level Transit Injection (FLTI) experiments are executed with NASA's Pleiades supercomputer for the Kepler Mission. The latest release (9.3, January 2016) of the Kepler Science Operations Center Pipeline is used in the FLTI experiments. Their purpose is to validate the Analytic Completeness Model (ACM), which can be computed for all Kepler target stars, thereby enabling exoplanet occurrence rate studies. Pleiades, a facility of NASA's Advanced Supercomputing Division, is one of the world's most powerful supercomputers and represents NASA's state-of-the-art technology. We discuss the details of implementing the FLTI experiments on the Pleiades supercomputer. For example, taking into account that ~16 injections are generated by one core of the Pleiades processors in an hour, the “shallow” FLTI experiment, in which ~2000 injections are required per target star, can be done for 16% of all Kepler target stars in about 200 hours. Stripping down the transit search to bare bones, i.e. only searching adjacent high/low periods at high/low pulse durations, makes the computationally intensive FLTI experiments affordable. The design of the FLTI experiments and the analysis of the resulting data are presented in “Validating an Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments” by Catanzarite et al. (#2494058). Kepler was selected as the 10th mission of the Discovery Program. Funding for the Kepler Mission has been provided by the NASA Science Mission Directorate.
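The quoted throughput figures can be sanity-checked with simple arithmetic. The total Kepler target count used to turn "16%" into a star count is an assumed round number, not taken from the abstract.

```python
# Back-of-envelope check of the FLTI throughput figures quoted above.
injections_per_core_hour = 16
injections_per_star = 2000          # "shallow" experiment requirement
core_hours_per_star = injections_per_star / injections_per_core_hour

stars = 32000                       # ~16% of an assumed ~200,000 Kepler targets
wall_clock_hours = 200
cores_needed = stars * core_hours_per_star / wall_clock_hours

print(core_hours_per_star, cores_needed)
```

At 125 core-hours per star, the 200-hour figure implies on the order of tens of thousands of cores running concurrently, which is consistent with the scale of a Pleiades allocation.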

  2. Effects of simulated space radiation on immunoassay components for life-detection experiments in planetary exploration missions.

    PubMed

    Derveni, Mariliza; Hands, Alex; Allen, Marjorie; Sims, Mark R; Cullen, David C

    2012-08-01

    The Life Marker Chip (LMC) instrument is part of the proposed payload on the ESA ExoMars rover that is scheduled for launch in 2018. The LMC will use antibody-based assays to detect molecular signatures of life in samples obtained from the shallow subsurface of Mars. For the LMC antibodies, the ability to resist inactivation due to space particle radiation (both in transit and on the surface of Mars) will therefore be a prerequisite. The proton and neutron components of the mission radiation environment are those that are expected to have the dominant effect on the operation of the LMC. Modeling of the radiation environment for a mission to Mars led to the calculation of nominal mission fluences for proton and neutron radiation. Various combinations and multiples of these values were used to demonstrate the effects of radiation on antibody activity, primarily at the radiation levels envisaged for the ExoMars mission as well as at much higher levels. Five antibodies were freeze-dried in a variety of protective molecular matrices and were exposed to various radiation conditions generated at a cyclotron facility. After exposure, the antibodies' ability to bind to their respective antigens was assessed and found to be unaffected by ExoMars mission level radiation doses. These experiments indicated that the expected radiation environment of a Mars mission does not pose a significant risk to antibodies packaged in the form anticipated for the LMC instrument.

  3. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the shortcomings of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority weaknesses within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  4. A Survey of Cost Estimating Methodologies for Distributed Spacecraft Missions

    NASA Technical Reports Server (NTRS)

    Foreman, Veronica L.; Le Moigne, Jacqueline; de Weck, Oliver

    2016-01-01

    Satellite constellations present unique capabilities and opportunities to Earth orbiting and near-Earth scientific and communications missions, but also present new challenges to cost estimators. An effective and adaptive cost model is essential to successful mission design and implementation, and as Distributed Spacecraft Missions (DSM) become more common, cost estimating tools must become more representative of these types of designs. Existing cost models often focus on a single spacecraft and require extensive design knowledge to produce high fidelity estimates. Previous research has examined the limitations of existing cost practices as they pertain to the early stages of mission formulation, for both individual satellites and small satellite constellations. Recommendations have been made for how to improve the cost models for individual satellites one-at-a-time, but much of the complexity in constellation and DSM cost modeling arises from constellation systems level considerations that have not yet been examined. This paper constitutes a survey of the current state-of-the-art in cost estimating techniques with recommendations for improvements to increase the fidelity of future constellation cost estimates. To enable our investigation, we have developed a cost estimating tool for constellation missions. The development of this tool has revealed three high-priority shortcomings within existing parametric cost estimating capabilities as they pertain to DSM architectures: design iteration, integration and test, and mission operations. Within this paper we offer illustrative examples of these discrepancies and make preliminary recommendations for addressing them. DSM and satellite constellation missions are shifting the paradigm of space-based remote sensing, showing promise in the realms of Earth science, planetary observation, and various heliophysical applications. 
To fully reap the benefits of DSM technology, accurate and relevant cost estimating capabilities must exist; this paper offers insights critical to the future development and implementation of DSM cost estimating tools.

  5. Comparison of Optimal Small Spacecraft Micro Electric Propulsion Technologies for Mission Opportunities

    NASA Technical Reports Server (NTRS)

    Spangelo, Sara

    2015-01-01

    The goal of this paper is to explore the mission opportunities that are uniquely enabled by U-class Solar Electric Propulsion (SEP) technologies. Small SEP thrusters offer significant advantages relative to existing technologies and will revolutionize the class of mission architectures that small spacecraft can accomplish by enabling trajectory maneuvers with significant change-in-velocity requirements and reaction-wheel-free attitude control. This paper aims to develop and apply a common system-level modeling framework to evaluate these thrusters for relevant upcoming mission scenarios, taking into account the mass, power, volume, and operational constraints of small, highly constrained missions. We will identify the optimal technology for broad classes of mission applications for different U-class spacecraft sizes and provide insights into what constrains the system performance, identifying technology areas where improvements are needed.

  6. Modeling the data systems role of the scientist (for the NEEDS Command and Control Task)

    NASA Technical Reports Server (NTRS)

    Hei, D. J., Jr.; Winter, W. J., Jr.; Brookes, R.; Locke, M.

    1981-01-01

    Research was conducted into the command and control activities of the scientists for five space missions: International Ultraviolet Explorer, Solar Maximum Mission, International Sun-Earth Explorer, High-Energy Astronomy Observatory 1, and Atmospheric Explorer 5. This provided a basis for developing a generalized description of the scientists' activities. On this basis, it was decided that a series of flowcharts would be used. This set of flowcharts constitutes a model of the scientists' activities within the total data system. The model was developed through three levels of detail. The first is general and provides a conceptual framework for discussing the system. The second identifies major functions and should provide a fundamental understanding of the scientists' command and control activities. The third level expands the major functions into a more detailed description.

  7. PUS Services Software Building Block Automatic Generation for Space Missions

    NASA Astrophysics Data System (ADS)

    Candia, S.; Sgaramella, F.; Mele, G.

    2008-08-01

    The Packet Utilization Standard (PUS) has been specified by the European Cooperation for Space Standardization (ECSS) and issued as ECSS-E-70-41A to define the application-level interface between Ground Segments and Space Segments. The ECSS-E-70-41A complements the ECSS-E-50 and the Consultative Committee for Space Data Systems (CCSDS) recommendations for packet telemetry and telecommand. The ECSS-E-70-41A characterizes the identified PUS Services from a functional point of view, and the ECSS-E-70-31 standard specifies the rules for their mission-specific tailoring. The current on-board software design for a space mission implies the production of several PUS terminals, each providing a specific tailoring of the PUS services. The associated on-board software building blocks are developed independently, leading to very different design choices and implementations even when the mission tailoring requires very similar services (from the Ground operative perspective). In this scenario, the automatic production of the PUS services building blocks for a mission would be a way to optimize the overall mission economy and improve the robustness and reliability of the on-board software and of the Ground-Space interactions. This paper presents the Space Software Italia (SSI) activities for the development of an integrated environment to support the PUS services tailoring activity for a specific mission; the mission-specific PUS services configuration; and the generation of the UML model of the software building block implementing the mission-specific PUS services, along with the related source code, support documentation (software requirements, software architecture, test plans/procedures, operational manuals), and the TM/TC database.
The paper deals with: (a) the project objectives, (b) the tailoring, configuration, and generation process, (c) the description of the environments supporting the process phases, (d) the characterization of the meta-model used for the generation, and (e) the characterization of the reference avionics architecture and of the reference on-board software high-level architecture.

  8. Technology Readiness of the NEXT Ion Propulsion System

    NASA Technical Reports Server (NTRS)

    Benson, Scott W.; Patterson, Michael J.

    2008-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system has been in advanced technology development under the NASA In-Space Propulsion Technology project. The highest-fidelity hardware planned has now been completed by the government/industry team, including: a flight prototype model (PM) thruster, an engineering model (EM) power processing unit, EM propellant management assemblies, a breadboard gimbal, and control unit simulators. Subsystem- and system-level technology validation testing is in progress. To achieve the objective of Technology Readiness Level 6, environmental testing is being conducted to qualification levels in ground facilities simulating the space environment. Additional tests have been conducted to characterize the performance range and life capability of the NEXT thruster. This paper presents the status and results of technology validation testing accomplished to date, the validated subsystem and system capabilities, and the plans for completion of this phase of NEXT development. The next round of competed planetary science mission announcements of opportunity, and directed mission decisions, is anticipated to occur in 2008 and 2009. Progress to date, and the success of on-going technology validation, indicate that the NEXT ion propulsion system will be a primary candidate for mission consideration in these upcoming opportunities.

  9. The Development of a Sea Surface Height Climate Data Record from Multi-mission Altimeter Data

    NASA Astrophysics Data System (ADS)

    Beckley, B. D.; Ray, R. D.; Lemoine, F. G.; Zelensky, N. P.; Desai, S. D.; Brown, S.; Mitchum, G. T.; Nerem, R.; Yang, X.; Holmes, S. A.

    2011-12-01

    The determination of the rate of change of mean sea level (MSL) has undeniable societal significance. The science value of satellite altimeter observations has grown dramatically over time as improved models and technologies have increased the value of data acquired on both past and present missions enabling credible MSL estimates. With the prospect of an observational time series extending into several decades from TOPEX/Poseidon through Jason-1 and the Ocean Surface Topography Mission (OSTM), and further in time with a future set of operational altimeters, researchers are pushing the bounds of current technology and modeling capability in order to monitor global and regional sea level rates at an accuracy of a few tenths of a mm/yr. GRACE data analysis suggests that the ice melt from Alaska alone contributes 0.3 mm/y to global sea level rise. The measurement of MSL change from satellite altimetry requires an extreme stability of the altimeter measurement system since the signal being measured is at the level of a few mm/yr. This means that the orbit and reference frame within which the altimeter measurements are situated, and the associated altimeter corrections, must be stable and accurate enough to permit a robust MSL estimate. Foremost, orbit quality and consistency are critical not only to satellite altimeter measurement accuracy across one mission, but also for the seamless transition between missions (Beckley, et. al, 2005). The analysis of altimeter data for TOPEX/Poseidon, Jason-1, and OSTM requires that the orbits for all three missions be in a consistent reference frame, and calculated with the best possible standards to minimize error and maximize the data return from the time series, particularly with respect to the demanding application of measuring sea level trends. 
In this presentation we describe the development and utility of the MEaSUREs TPJAOS V1.0 sea surface height Climate Data Record (http://podaac.jpl.nasa.gov/dataset/MERGED_TP_J1_OSTM_OST_ALL). We provide an assessment of recent improvements to the accuracy of the 19-year sea surface height time series, describe continuing calibration/validation activities, and evaluate the subsequent impact on global and regional mean sea level estimates.
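
    The trend estimation described above can be illustrated with a minimal least-squares fit of a linear rate plus an annual cycle to a sea-surface-height time series. This is a generic sketch, not the TPJAOS processing chain; the sampling interval, noise level, and synthetic series below are assumptions for illustration.

```python
import numpy as np

def msl_trend(t_years, ssh_mm):
    """Fit bias + linear rate + annual sine/cosine by least squares.

    Returns the estimated sea level rate in mm/yr.
    """
    A = np.column_stack([
        np.ones_like(t_years),
        t_years,                      # linear rate term (mm/yr)
        np.sin(2 * np.pi * t_years),  # annual cycle
        np.cos(2 * np.pi * t_years),
    ])
    coeffs, *_ = np.linalg.lstsq(A, ssh_mm, rcond=None)
    return coeffs[1]

# Synthetic 19-year series: 3 mm/yr trend, 10 mm annual cycle, 20 mm noise
rng = np.random.default_rng(0)
t = np.arange(0, 19, 10 / 365.25)   # roughly 10-day repeat sampling
ssh = 3.0 * t + 10.0 * np.sin(2 * np.pi * t) + rng.normal(0, 20, t.size)
print(round(msl_trend(t, ssh), 2))  # recovers a rate close to 3 mm/yr
```

    Fitting the seasonal terms jointly with the rate matters: leaving the annual cycle out of the design matrix aliases it into the trend when the record length is not an integer number of years.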

  10. Constellation Program Life-cycle Cost Analysis Model (LCAM)

    NASA Technical Reports Server (NTRS)

    Prince, Andy; Rose, Heidi; Wood, James

    2008-01-01

    The Constellation Program (CxP) is NASA's effort to replace the Space Shuttle, return humans to the moon, and prepare for a human mission to Mars. The major elements of the Constellation Lunar sortie design reference mission architecture are shown. Unlike the Apollo Program of the 1960s, affordability is a major concern of United States policy makers and NASA management. To measure Constellation affordability, a total ownership cost life-cycle parametric cost estimating capability is required. This capability is being developed by the Constellation Systems Engineering and Integration (SE&I) Directorate, and is called the Lifecycle Cost Analysis Model (LCAM). The requirements for LCAM are based on the need to have a parametric estimating capability in order to do top-level program analysis, evaluate design alternatives, and explore options for future systems. By estimating the total cost of ownership within the context of the planned Constellation budget, LCAM can provide Program and NASA management with the cost data necessary to identify the most affordable alternatives. LCAM is also a key component of the Integrated Program Model (IPM), an SE&I-developed capability that combines parametric sizing tools with cost, schedule, and risk models to perform program analysis. LCAM is used in the generation of cost estimates for system-level trades and analyses. It draws upon the legacy of previous architecture-level cost models, such as the Exploration Systems Mission Directorate (ESMD) Architecture Cost Model (ARCOM) developed for Simulation Based Acquisition (SBA), and ATLAS. LCAM is used to support requirements and design trade studies by calculating changes in cost relative to a baseline option cost. Estimated costs are generally low fidelity to accommodate available input data and available cost estimating relationships (CERs). LCAM is capable of interfacing with the Integrated Program Model to provide the cost estimating capability for that suite of tools.

  11. Synthesizing SMOS Zero-Baselines with Aquarius Brightness Temperature Simulator

    NASA Technical Reports Server (NTRS)

    Colliander, A.; Dinnat, E.; Le Vine, D.; Kainulainen, J.

    2012-01-01

    SMOS [1] and Aquarius [2] are ESA and NASA missions, respectively, that make L-band measurements from low Earth orbit. SMOS makes passive measurements, whereas Aquarius makes both passive and active measurements. SMOS was launched in November 2009 and Aquarius in June 2011. The scientific objectives of the missions overlap: both aim at mapping the global Sea Surface Salinity (SSS). Additionally, the SMOS mission produces a soil moisture product (and Aquarius data will eventually be used for retrieving soil moisture as well). The consistency of the brightness temperature observations made by the two instruments is essential for long-term studies of SSS and soil moisture, and the calibration of the instruments is the key to resolving that consistency. The basis of the SMOS brightness temperature level is the set of measurements performed with the so-called zero-baselines [3]: SMOS employs an interferometric measurement technique that forms a brightness temperature image from several baselines constructed by combining multiple receivers in an array, and the zero-length baseline defines the overall brightness temperature level. The basis of the Aquarius brightness temperature level is resolved from the brightness temperature simulator combined with ancillary data such as antenna patterns and environmental models [4]. Consistency between the SMOS zero-baseline measurements and the simulator output would provide a robust basis for establishing the overall comparability of the missions.

  12. The Ocean and Climate: Results from the TOPEX/POSEIDON Mission

    NASA Technical Reports Server (NTRS)

    Fu, L. -L.

    1995-01-01

    Since 1992, the TOPEX/POSEIDON satellite has been making altimetric sea surface observations with a sea level accuracy of 4.4 cm. These data can be used for studying regional and seasonal differences in sea level and for evaluating oceanic circulation models and tidal models. Longer-term changes can also be studied, such as El Niño and overall sea level rise (although the latter is still within the margin of error).

  13. Model-based system engineering approach for the Euclid mission to manage scientific and technical complexity

    NASA Astrophysics Data System (ADS)

    Lorenzo Alvarez, Jose; Metselaar, Harold; Amiaux, Jerome; Saavedra Criado, Gonzalo; Gaspar Venancio, Luis M.; Salvignol, Jean-Christophe; Laureijs, René J.; Vavrek, Roland

    2016-08-01

    In recent years, the systems engineering field has been coming to terms with a paradigm change in its approach to complexity management. Different strategies have been proposed to cope with highly interrelated systems and systems of systems, collaborative system engineering approaches have emerged, and a significant effort is being invested in standardization and ontology definition. In particular, Model Based System Engineering (MBSE) intends to introduce methodologies for systematic system definition, development, validation, deployment, operation and decommissioning, based on logical and visual relationship mapping rather than traditional 'document based' information management. Practical implementation in real large-scale projects is not uniform across fields. In space science missions, usage has been limited to subsystems or sample projects, with modeling being performed 'a posteriori' in many instances. The main hurdle for the introduction of MBSE practices in new projects is still the difficulty of demonstrating their added value to a project, and whether their benefit is commensurate with the level of effort required to put them in place. In this paper we present the implemented Euclid system modeling activities, together with an analysis of the benefits and limitations identified in supporting requirement break-down and allocation, and verification planning at mission level.

  14. Effectiveness of international surgical program model to build local sustainability.

    PubMed

    Magee, William P; Raimondi, Haley M; Beers, Mark; Koech, Maryanne C

    2012-01-01

    Background. Humanitarian medical missions may be an effective way to temporarily overcome limitations and promote long-term solutions in the local health care system. Operation Smile, an international medical not-for-profit organization that provides surgery for patients with cleft lip and palate, not only provides surgery through short-term international missions but also focuses on developing local capacity. Methods. The history of Operation Smile was evaluated globally, and then on a local level in 3 countries: Colombia, Bolivia, and Ethiopia. Historical data were assessed against a two-pronged measure of success: (1) treating the surgical need presented by cleft patients and (2) advancing the local capacity to provide primary and ongoing care to patients. Results. The number of patients treated by Operation Smile has continually increased. Though it began by using only international teams to provide care, by 2012 this had shifted to 33% of patients being treated by international teams, while the other 67% received treatment from local models of care. The highest level of sustainability was achieved in Colombia, where two permanent centers have been established, followed by Bolivia and lastly Ethiopia. Conclusions. International missions have value both for the patients who receive surgery and for the local sustainable models of care that they promote.

  15. Lessons Learned from the Wide Field Camera 3 TV1 Test Campaign and Correlation Effort

    NASA Technical Reports Server (NTRS)

    Peabody, Hume; Stavley, Richard; Bast, William

    2007-01-01

    In January 2004, shortly after the Columbia accident, future servicing missions to the Hubble Space Telescope (HST) were cancelled. In response to this, further work on the Wide Field Camera 3 instrument was ceased. Given the maturity level of the design, a characterization thermal test (TV1) was completed in case the mission was re-instated or an alternate mission found on which to fly the instrument. This thermal test yielded some valuable lessons learned with respect to testing configurations and modeling/correlation practices, including: (1) ensure that the thermal design can be tested; (2) ensure that the model has sufficient detail for accurate predictions; (3) ensure that the power associated with all active control devices is predicted; and (4) avoid unit changes for existing models. This paper documents the difficulties presented when these recommendations were not followed.

  16. TOPEX/POSEIDON orbit maintenance maneuver design

    NASA Technical Reports Server (NTRS)

    Bhat, R. S.; Frauenholz, R. B.; Cannell, Patrick E.

    1990-01-01

    The Ocean Topography Experiment (TOPEX/POSEIDON) mission orbit requirements are outlined, as well as its control and maneuver spacing requirements, including longitude and time targeting. A ground-track prediction model dealing with geopotential, luni-solar gravity, and atmospheric-drag perturbations is considered. Targeting with all modeled perturbations is discussed, and such ground-track prediction errors as initial semimajor axis, orbit-determination, maneuver-execution, and atmospheric-density modeling errors are assessed. A longitude targeting strategy for two extreme situations is investigated employing all modeled perturbations and prediction errors. It is concluded that atmospheric-drag modeling errors are the dominant source of ground-track prediction error early in the mission during high solar flux, and that the low solar-flux levels expected late in the experiment permit smaller maneuver magnitudes.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Belsher, Jeremy D.; Pierson, Kayla L.; Gimpel, Rod F.

    The Hanford site in southeast Washington contains approximately 207 million liters of radioactive and hazardous waste stored in 177 underground tanks. The U.S. Department of Energy's Office of River Protection is currently managing the Hanford waste treatment mission, which includes the storage, retrieval, treatment and disposal of the tank waste. Two recent studies, employing the modeling tools managed by the One System organization, have highlighted waste cleanup mission sensitivities. The Hanford Tank Waste Operations Simulator Sensitivity Study evaluated the impact that varying 21 different parameters had on the Hanford Tank Waste Operations Simulator model. It concluded that inaccuracies in the predicted phase partitioning of a few key components can result in significant changes in the waste treatment duration and in the amount of immobilized high-level waste that is produced. In addition, reducing the efficiency with which tank waste is retrieved and staged can increase mission duration. The 2012 WTP Tank Utilization Assessment concluded that flowsheet models need to include the latest low-activity waste glass algorithms, or the waste treatment mission duration and the amount of low-activity waste that is produced could be significantly underestimated.

  18. Computer Analysis of Spectrum Anomaly in 32-GHz Traveling-Wave Tube for Cassini Mission

    NASA Technical Reports Server (NTRS)

    Dayton, James A., Jr.; Wilson, Jeffrey D.; Kory, Carol L.

    1999-01-01

    Computer modeling of the 32-GHz traveling-wave tube (TWT) for the Cassini Mission was conducted to explain the anomaly observed in the spectrum analysis of one of the flight-model tubes. The analysis indicated that the effect, manifested as a weak signal in the neighborhood of 35 GHz, was an intermodulation product of the 32-GHz drive signal with a 66.9-GHz oscillation induced by coupling to the second-harmonic signal. The oscillation occurred only at low radio-frequency (RF) drive power levels that are not expected during the Cassini Mission. The conclusion was that the anomaly was caused by a generic defect inadvertently incorporated in the geometric design of the slow-wave circuit and that it would not change as the TWT aged. The most probable effect of aging on tube performance would be a reduction in the electron beam current. The computer modeling indicated that although not likely to occur within the mission lifetime, a reduction in beam current would reduce or eliminate the anomaly, but would do so at the cost of reduced RF output power.

  19. Integrating Model-Based Transmission Reduction into a multi-tier architecture

    NASA Astrophysics Data System (ADS)

    Straub, J.

    A multi-tier architecture consists of numerous craft distributed across orbital, aerial, and surface tiers. Each tier is able to collect progressively greater levels of information. Generally, craft from lower-level tiers are deployed to a target of interest based on its identification by a higher-level craft. While the architecture promotes significant amounts of science being performed in parallel, this may overwhelm the computational and transmission capabilities of higher-tier craft and links (particularly the deep-space link back to Earth). Because of this, a new paradigm in in-situ data processing is required. Model-Based Transmission Reduction (MBTR) is such a paradigm. Under MBTR, each node (whether a single spacecraft in orbit of the Earth or another planet, or a member of a multi-tier network) is given an a priori model of the phenomenon that it is assigned to study. It performs activities to validate this model. If the model is found to be erroneous, corrective changes are identified, assessed to ensure they are significant enough to be passed on, and prioritized for transmission. A limited amount of verification data is sent with each MBTR assertion message to allow those that might rely on the data to validate the correct operation of the spacecraft and the MBTR engine onboard. Integrating MBTR with a multi-tier framework creates an MBTR hierarchy. Higher levels of the MBTR hierarchy task lower levels with the data collection and assessment tasks required to validate or correct elements of their models. A model of the expected conditions is sent to the lower-level craft, which then engages its own MBTR engine to validate or correct the model. This may include tasking a yet lower level of craft to perform activities.
When the MBTR engine at a given level receives all of its component data (whether directly collected or from delegation), it randomly chooses some to validate (by reprocessing the validation data), performs analysis, and sends its own results (validation and/or changes of model elements and supporting validation data) to its upstream node. This constrains data transmission to only significant information (either because it includes a change or because it is validation data critical for assessing overall performance) and reduces the processing requirements at higher-level nodes, which no longer have to process insignificant data. This paper presents a framework for multi-tier MBTR and two demonstration mission concepts: an Earth sensornet and a mission to Mars. These multi-tier MBTR concepts are compared to a traditional mission approach.
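
    The core MBTR idea, transmitting only model corrections that exceed a significance threshold plus a small random validation sample, can be sketched as follows. This is an illustrative toy, not the paper's implementation; the model attributes, threshold, and `validation_fraction` parameter are assumptions.

```python
import random

def mbtr_report(a_priori, observed, threshold, validation_fraction=0.1, seed=0):
    """Return (corrections, validation_sample) to send to the upstream node."""
    # Transmit only corrections that differ significantly from the a priori model
    corrections = {
        key: value
        for key, value in observed.items()
        if abs(value - a_priori.get(key, 0.0)) > threshold
    }
    # Randomly sample a little raw data so the upstream node can audit us
    rng = random.Random(seed)
    n_validate = max(1, int(validation_fraction * len(observed)))
    validation = {k: observed[k] for k in rng.sample(sorted(observed), n_validate)}
    return corrections, validation

model = {"albedo": 0.30, "slope_deg": 4.0, "roughness": 0.10}
obs = {"albedo": 0.31, "slope_deg": 9.5, "roughness": 0.11}
fixes, check = mbtr_report(model, obs, threshold=1.0)
print(fixes)  # only the slope correction is significant enough to transmit
```

    The bandwidth saving comes from the dictionary comprehension: attributes that match the a priori model within the threshold are never sent, while the small validation sample preserves the upstream node's ability to audit the engine.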

  20. Modeling of pilot's visual behavior for low-level flight

    NASA Astrophysics Data System (ADS)

    Schulte, Axel; Onken, Reiner

    1995-06-01

    Developers of synthetic vision systems for low-level flight simulators face the problem of deciding which features to incorporate in order to achieve the most realistic training conditions. This paper supports an approach to this problem based on modeling the pilot's visual behavior. The approach is founded on the basic requirement that the pilot's mechanisms of visual perception should be identical in simulated and real low-level flight. Flight simulator experiments with pilots were conducted for knowledge acquisition. During the experiments, video material of a real low-level flight mission containing different situations was displayed to the pilot, who was acting under a realistic mission assignment in a laboratory environment. The pilot's eye movements were measured during the replay. The visual mechanisms were divided into rule-based strategies for visual navigation, based on the preflight planning process, as opposed to skill-based processes. The paper presents a model of the pilot's planning strategy for a visual fixing routine as part of the navigation task. The model is a knowledge-based system built on fuzzy evaluation of terrain features in order to determine the landmarks used by pilots. It can be shown that a computer implementation of the model selects the same features that trained pilots preferred.
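
    Fuzzy evaluation of terrain features for landmark selection, as described above, might be sketched like this. The membership functions, feature attributes, and aggregation weights are invented for illustration and are not taken from the paper.

```python
def triangular(x, low, peak, high):
    """Triangular fuzzy membership function on [low, high], peaking at `peak`."""
    if x <= low or x >= high:
        return 0.0
    if x <= peak:
        return (x - low) / (peak - low)
    return (high - x) / (high - peak)

def landmark_score(feature):
    # Fuzzy degrees for "conspicuous size", "near planned track", "unique shape"
    size = triangular(feature["size_m"], 5, 50, 200)
    nearness = triangular(feature["offset_m"], -400, 0, 400)
    uniqueness = feature["uniqueness"]  # already a degree in [0, 1]
    # Weighted aggregation of the fuzzy degrees into a single landmark score
    return 0.4 * size + 0.4 * nearness + 0.2 * uniqueness

features = [
    {"name": "water tower", "size_m": 40, "offset_m": 50, "uniqueness": 0.9},
    {"name": "forest edge", "size_m": 180, "offset_m": 350, "uniqueness": 0.3},
]
best = max(features, key=landmark_score)
print(best["name"])  # the conspicuous, near-track feature wins
```

    The point of the fuzzy formulation is graceful trade-offs: a feature slightly off the planned track can still outrank a closer but inconspicuous one, mirroring how pilots weigh competing landmark properties.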

  1. The Integrated Medical Model - Optimizing In-flight Space Medical Systems to Reduce Crew Health Risk and Mission Impacts

    NASA Technical Reports Server (NTRS)

    Kerstman, Eric; Walton, Marlei; Minard, Charles; Saile, Lynn; Myers, Jerry; Butler, Doug; Iyengar, Sriram; Fitts, Mary; Johnson-Throop, Kathy

    2009-01-01

    The Integrated Medical Model (IMM) is a decision support tool used by medical system planners and designers as they prepare for exploration planning activities of the Constellation program (CxP). IMM provides an evidence-based approach to help optimize the allocation of in-flight medical resources for a specified level of risk within spacecraft operational constraints. Eighty medical conditions and associated resources are represented in IMM. Nine conditions are due to Space Adaptation Syndrome. The IMM helps answer fundamental medical mission planning questions such as "What medical conditions can be expected?", "What type and quantity of medical resources are most likely to be used?", and "What is the probability of crew death or evacuation due to medical events?" For a specified mission and crew profile, the IMM effectively characterizes the sequence of events that could potentially occur should a medical condition happen. The mathematical relationships among mission and crew attributes, medical conditions and incidence data, in-flight medical resources, and potential clinical and crew health end states are established to generate end state probabilities. A Monte Carlo computational method is used to determine the probable outcomes and requires up to 25,000 mission trials to reach convergence. For each mission trial, the pharmaceuticals and supplies required to diagnose and treat prevalent medical conditions are tracked and decremented. The uncertainty of patient response to treatment is bounded via a best-case, worst-case, untreated-case algorithm. A Crew Health Index (CHI) metric, developed to account for functional impairment due to a medical condition, provides a quantified measure of risk and enables risk comparisons across mission scenarios.
The use of historical in-flight medical data, terrestrial surrogate data as appropriate, and space medicine subject matter expertise has enabled the development of a probabilistic, stochastic decision support tool capable of optimizing in-flight medical systems based on crew and mission parameters. This presentation will illustrate how to apply quantitative risk assessment methods to optimize the mass and volume of space-based medical systems for a space flight mission given the level of crew health and mission risk.
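
    The Monte Carlo idea behind a model like the IMM, sampling medical-event occurrences per mission trial and checking the carried resources against the resulting demand, can be sketched as below. The incidence rate, resource counts, and single-condition scope are invented placeholders; the real model tracks eighty conditions with evidence-based incidence data.

```python
import numpy as np

def prob_resource_shortfall(n_trials, mean_cases, doses_carried, doses_per_case, seed=1):
    """Estimate P(medication runs out) over many simulated mission trials."""
    rng = np.random.default_rng(seed)
    cases = rng.poisson(mean_cases, size=n_trials)  # events per mission trial
    demand = cases * doses_per_case                 # doses needed per trial
    return np.mean(demand > doses_carried)          # fraction of trials short

# 25,000 trials, matching the convergence criterion quoted above
p = prob_resource_shortfall(25_000, mean_cases=1.2, doses_carried=10, doses_per_case=4)
print(round(p, 3))  # estimated probability of exhausting the supply
```

    Sweeping `doses_carried` against the resulting shortfall probability is the essence of the optimization: it trades medical-kit mass and volume against a specified acceptable level of risk.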

  2. Gossamer-1: Mission concept and technology for a controlled deployment of gossamer spacecraft

    NASA Astrophysics Data System (ADS)

    Seefeldt, Patric; Spietz, Peter; Sproewitz, Tom; Grundmann, Jan Thimo; Hillebrandt, Martin; Hobbie, Catherin; Ruffer, Michael; Straubel, Marco; Tóth, Norbert; Zander, Martin

    2017-01-01

    Gossamer structures for innovative space applications, such as solar sails, require technology that allows their controlled, and thereby safe, deployment. Before employing such technology on a dedicated science mission, it is desirable, if not necessary, to demonstrate its reliability at a Technology Readiness Level (TRL) of six or higher. The aim of the work presented here is to provide reliable technology that enables controlled deployment, and to verify its functionality with various laboratory tests, thereby qualifying the hardware for a first demonstration in low Earth orbit (LEO). The development was carried out within the Gossamer-1 project of the German Aerospace Center (DLR). This paper provides an overview of the Gossamer-1 mission and hardware development. The system is designed based on the requirements of a technology demonstration mission. The design rests on a crossed-boom configuration with triangular sail segments. Using engineering models, all aspects of the deployment were tested under ambient environment. Several components were also subjected to environmental qualification testing. An innovative stowing and deployment strategy for a controlled deployment, as well as the designs of the bus system, mechanisms and electronics, are described. The tests conducted provide insights into the deployment process and allow a mechanical characterization of that process, in particular the measurement of the deployment forces. Deployment at system level was successfully demonstrated to be robust and controllable. The deployment technology is at TRL 4, approaching TRL 5, with a qualification model for environmental testing currently being built.

  3. Space Radiation Risk Assessment for Future Lunar Missions

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Ponomarev, Artem; Atwell, Bill; Cucinotta, Francis A.

    2007-01-01

    For lunar exploration mission design, radiation risk assessments require an understanding of future space radiation environments in support of resource management decisions, operational planning, and go/no-go decisions. The future GCR flux was estimated as a function of interplanetary deceleration potential, which was coupled with the estimated neutron monitor rate from the Climax monitor using a statistical model. A probability distribution function for solar particle event (SPE) occurrence was formed from proton fluence measurements of SPEs that occurred during the past 5 solar cycles (19-23). Large proton SPEs identified from impulsive nitrate enhancements in polar ice, for which the fluences are greater than 2 × 10⁹ protons/cm² for energies greater than 30 MeV, were also combined to extend the probability calculation to high proton-fluence levels. The probability with which any given proton fluence level of a SPE will be exceeded during a space mission of defined duration was then calculated. Analytic energy spectra of SPEs at different ranks of the integral fluences were constructed over broad energy ranges extending out to GeV, and representative exposure levels were analyzed at those fluences. For the development of an integrated strategy for radiation protection on lunar exploration missions, effective doses at various points inside a spacecraft were calculated with detailed geometry models representing proposed transfer vehicle and habitat concepts. Preliminary radiation risk assessments from SPE and GCR were compared for various configuration concepts of radiation shelter in exploratory-class spacecraft.
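
    The exceedance-probability calculation described above has a simple form if SPEs above a given fluence level are treated as a Poisson process: the chance of seeing at least one such event in a mission of T years is 1 - exp(-λT), where λ is the per-year rate at that fluence level. The rate below is an assumed placeholder, not a value from the study.

```python
import math

def prob_fluence_exceeded(rate_per_year, mission_years):
    """P(at least one SPE above the fluence level during the mission)."""
    return 1.0 - math.exp(-rate_per_year * mission_years)

# Hypothetical rate: one event above 2e9 protons/cm^2 every 20 years on average
print(round(prob_fluence_exceeded(1 / 20, mission_years=0.5), 4))  # short lunar sortie
print(round(prob_fluence_exceeded(1 / 20, mission_years=3.0), 4))  # long-stay mission
```

    The exponential form makes the mission-duration sensitivity explicit: a six-fold longer stay raises the exceedance probability nearly six-fold while the rate stays small, which is why shelter sizing depends so strongly on mission length.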

  4. Mars MetNet Mission - Martian Atmospheric Observational Post Network

    NASA Astrophysics Data System (ADS)

    Haukka, Harri; Harri, Ari-Matti; Aleksashkin, Sergey; Arruego, Ignacio; Schmidt, Walter; Genzer, Maria; Vazquez, Luis; Siikonen, Timo; Palin, Matti

    2016-10-01

    A new kind of planetary exploration mission for Mars is under development in collaboration between the Finnish Meteorological Institute (FMI), Lavochkin Association (LA), Space Research Institute (IKI) and Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor mission is divided into three categories: atmospheric instruments, optical devices, and composition and structure devices. Each of the payload instruments will provide significant insights into the Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified, and the electrical qualification model (EQM) of the payload bay has been built and successfully tested. The full Qualification Model (QM) of the MetNet landing unit with the Precursor Mission payload is currently under functional tests. In the near future the QM unit will be exposed to environmental tests at qualification levels, including vibration, thermal balance, thermal cycling and mechanical impact shock. One complete flight unit of the entry, descent and landing systems (EDLS) has been manufactured and tested at acceptance levels. Another flight-like EDLS has been exposed to most of the qualification tests, and hence it may be used for flight after refurbishment. Accordingly, two flight-capable EDLS systems exist. The eventual goal is to create a network of atmospheric observational posts around the Martian surface. The next step in the MetNet Precursor Mission is the demonstration of the technical robustness and scientific capabilities of the MetNet type of landing vehicle. Definition of the Precursor Mission and discussions on launch opportunities are currently under way. Baseline program development funding exists for the next five years.
Flight-unit manufacture of the payload bay takes about 18 months and will commence after the Precursor Mission has been defined.

  5. OneSAF as an In-Stride Mission Command Asset

    DTIC Science & Technology

    2014-06-01

    implementation approach. While DARPA began with a funded project to complete the capability as a "big bang" approach, the approach here is based on reuse and...Command (MC), Modeling and Simulation (M&S), Distributed Interactive Simulation (DIS) ABSTRACT: To provide greater interoperability and integration...within Mission Command (MC) Systems, the One Semi-Automated Forces (OneSAF) entity-level simulation is evolving from a tightly coupled client server

  6. A Multi-Attribute-Utility-Theory Model that Minimizes Interview-Data Requirements: A Consolidation of Space Launch Decisions.

    DTIC Science & Technology

    1994-12-01

    satellites on the ground or on orbit affects the priority given to a new launch. Table 3.9 (Launch Priorities) lists each priority level with a level title and description, beginning at 0.00 (No...). ...value of a satellite's mission(s) relative to the mission(s) of other satellites. As such, the rating given may reflect an entire class of satellites for... Expected Remaining Lifetime 0 Years: assign a number between 0 and 1 that best describes the utility of a satellite... at these

  7. APGEN Scheduling: 15 Years of Experience in Planning Automation

    NASA Technical Reports Server (NTRS)

    Maldague, Pierre F.; Wissler, Steve; Lenda, Matthew; Finnerty, Daniel

    2014-01-01

    In this paper, we discuss the scheduling capability of APGEN (Activity Plan Generator), a multi-mission planning application that is part of the NASA AMMOS (Advanced Multi-Mission Operations System), and how APGEN scheduling evolved over its applications to specific space missions. Our analysis identifies two major reasons for the successful application of APGEN scheduling to real problems: an expressive DSL (Domain-Specific Language) for formulating scheduling algorithms, and a well-defined process for enlisting the help of auxiliary modeling tools in providing high-fidelity, system-level simulations of the combined spacecraft and ground support system.

  8. Impact of geophysical model error for recovering temporal gravity field model

    NASA Astrophysics Data System (ADS)

    Zhou, Hao; Luo, Zhicai; Wu, Yihao; Li, Qiong; Xu, Chuang

    2016-07-01

    The impact of geophysical model error on recovered temporal gravity field models with both real and simulated GRACE observations is assessed in this paper. With real GRACE observations, we build four temporal gravity field models, i.e., HUST08a, HUST11a, HUST04 and HUST05. HUST08a and HUST11a are derived from different ocean tide models (EOT08a and EOT11a), while HUST04 and HUST05 are derived from different non-tidal models (AOD RL04 and AOD RL05). The statistical result shows that the discrepancies of the annual mass variability amplitudes in six river basins between the HUST08a and HUST11a models, and between the HUST04 and HUST05 models, are all smaller than 1 cm, which demonstrates that geophysical model error only slightly affects the current GRACE solutions. The impact of geophysical model error for future missions with more accurate satellite ranging is also assessed by simulation. The simulation results indicate that for the current mission, with range rate accuracy of 2.5 × 10⁻⁷ m/s, observation error is the main source of stripe error. However, when the range rate accuracy improves to 5.0 × 10⁻⁸ m/s in a future mission, geophysical model error will be the main source of stripe error, which will limit the accuracy and spatial resolution of the temporal gravity model. Therefore, observation error should be the primary error source taken into account at the current range rate accuracy level, while more attention should be paid to improving the accuracy of background geophysical models for future missions.
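
    The crossover described above, where the dominant error source flips as ranging accuracy improves, can be illustrated with a toy error budget in which observation error and an equivalent geophysical-model error add in quadrature. The model-error figure below is an assumed placeholder chosen only to sit between the two quoted accuracy levels, not a GRACE result.

```python
import math

MODEL_ERROR = 8.0e-8  # assumed equivalent geophysical model error, m/s (placeholder)

def dominant_source(range_rate_accuracy):
    """Return (dominant error source, combined error) for a given ranging accuracy."""
    total = math.hypot(range_rate_accuracy, MODEL_ERROR)  # quadrature sum
    source = "observation" if range_rate_accuracy > MODEL_ERROR else "model"
    return source, total

print(dominant_source(2.5e-7)[0])  # current mission: observation error dominates
print(dominant_source(5.0e-8)[0])  # improved ranging: model error dominates
```

    The quadrature sum also shows why improving ranging beyond the model-error floor yields diminishing returns: once the observation term is the smaller one, the combined error is pinned near `MODEL_ERROR`.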

  9. Radio frequency radiation exposure of the F-15 crewmember.

    PubMed

    Laughrey, Michael S; Grayson, J Kevin; Jauchem, James R; Misener, Andrea E

    2003-08-01

    In the United States Air Force, pilots of F-15 fighter aircraft use fire control radars to search for enemy targets and to launch beyond-visual-range radar missiles. The fire control radars must be of a sufficient power output to enable a target return, but pilots are concerned about deleterious health effects from the levels of radio frequency radiation (RFR) they are exposed to. Measurement of RFR while actually in flight in the F-15 has never been performed. This study was designed to document the RFR levels that pilots are exposed to on normal missions while in flight with the radar on and active. A hand-held meter was used to measure electromagnetic fields during three F-15 flights. Instrumentation consisted of a Narda Microwave Model 8718 digital survey meter and Model 8723 broadband isotropic E-field probe with a frequency range between 300 MHz and 50 GHz. The measurements were conducted in the rear cockpit of an F-15D aircraft. Three missions were flown representing the standard missions an F-15 pilot flies on an everyday basis. The missions were: night intercepts, offensive basic fighter maneuvers, and defensive basic fighter maneuvers. Based on the data collected during three F-15 missions, all recorded RFR exposure to the crewmember in the F-15 was within the OSHA Permissible Exposure Limit (PEL) of 10 mW/cm². Based on a limited sample, RFR exposures in F-15 cockpits appear to be well below the PEL.

  10. Model Learner Outcomes for Service Occupations.

    ERIC Educational Resources Information Center

    Grote, Audrey M.

    This guide to model learner outcomes for service occupations contains four chapters: (1) education values, learner values, philosophy, mission, and goals; (2) introduction, goals, and eight program-level learner outcomes; (3) general learner outcomes and outcomes for housing occupations, child care occupations, cosmetology and personal services,…

  11. Laser propulsion for orbit transfer - Laser technology issues

    NASA Technical Reports Server (NTRS)

    Horvath, J. C.; Frisbee, R. H.

    1985-01-01

    Using reasonable near-term mission traffic models (1991-2000 being the assumed operational time of the system) and the most current unclassified laser and laser thruster information available, it was found that space-based laser propulsion orbit transfer vehicles (OTVs) can outperform the aerobraked chemical OTV over a 10-year life-cycle. The conservative traffic models used resulted in an optimum laser power of about 1 MW per laser. This is significantly lower than the power levels considered in other studies. Trip time was taken into account only to the extent that the system was sized to accomplish the mission schedule.

  12. NASA Air Force Cost Model (NAFCOM): Capabilities and Results

    NASA Technical Reports Server (NTRS)

    McAfee, Julie; Culver, George; Naderi, Mahmoud

    2011-01-01

    NAFCOM is a parametric estimating tool for space hardware. It uses cost estimating relationships (CERs), which correlate historical costs with mission characteristics, to predict new project costs. It is based on historical NASA and Air Force space projects and is intended for use in the very early phases of a development project. NAFCOM can be used at the subsystem or component level and estimates development and production costs. It is applicable to various types of missions (crewed spacecraft, uncrewed spacecraft, and launch vehicles). There are two versions of the model: a restricted government version and a contractor-releasable version.

  13. Correlation of the SAGE III on ISS Thermal Models in Thermal Desktop

    NASA Technical Reports Server (NTRS)

    Amundsen, Ruth M.; Davis, Warren T.; Liles, Kaitlin A. K.; McLeod, Shawn C.

    2017-01-01

    The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III was launched on February 19, 2017 and mounted to the International Space Station (ISS) to begin its three-year mission. A detailed thermal model of the SAGE III payload, which consists of multiple subsystems, has been developed in Thermal Desktop (TD). Correlation of the thermal model is important since the payload will be expected to survive a three-year mission on ISS under varying thermal environments. Three major thermal vacuum (TVAC) tests were completed during the development of the SAGE III Instrument Payload (IP); two subsystem-level tests and a payload-level test. Additionally, a characterization TVAC test was performed in order to verify performance of a system of heater plates that was designed to allow the IP to achieve the required temperatures during payload-level testing; model correlation was performed for this test configuration as well as those including the SAGE III flight hardware. This document presents the methods that were used to correlate the SAGE III models to TVAC at the subsystem and IP level, including the approach for modeling the parts of the payload in the thermal chamber, generating pre-test predictions, and making adjustments to the model to align predictions with temperatures observed during testing. Model correlation quality will be presented and discussed, and lessons learned during the correlation process will be shared.

  14. Validating An Analytic Completeness Model for Kepler Target Stars Based on Flux-level Transit Injection Experiments

    NASA Astrophysics Data System (ADS)

    Catanzarite, Joseph; Burke, Christopher J.; Li, Jie; Seader, Shawn; Haas, Michael R.; Batalha, Natalie; Henze, Christopher; Christiansen, Jessie; Kepler Project, NASA Advanced Supercomputing Division

    2016-06-01

    The Kepler Mission is developing an Analytic Completeness Model (ACM) to estimate detection completeness contours as a function of exoplanet radius and period for each target star. Accurate completeness contours are necessary for robust estimation of exoplanet occurrence rates. The main components of the ACM for a target star are detection efficiency as a function of SNR, the window function (WF), and the one-sigma depth function (OSDF) (Ref. Burke et al. 2015). The WF captures the falloff in transit detection probability at long periods that is determined by the observation window (the duration over which the target star has been observed). The OSDF is the transit depth (in parts per million) that yields an SNR of unity for the full transit train. It is a function of period, and accounts for the time-varying properties of the noise and for missing or deweighted data. We are performing flux-level transit injection (FLTI) experiments on selected Kepler target stars with the goal of refining and validating the ACM. “Flux-level” injection machinery inserts exoplanet transit signatures directly into the flux time series, as opposed to “pixel-level” injection, which inserts transit signatures into the individual pixels using the pixel response function. See Jie Li's poster, ID #2493668, "Flux-level transit injection experiments with the NASA Pleiades Supercomputer", for details, including performance statistics. Since FLTI is affordable for only a small subset of the Kepler targets, the ACM is designed to apply to most Kepler target stars. We validate this model using “deep” FLTI experiments, with ~500,000 injection realizations on each of a small number of targets, and “shallow” FLTI experiments with ~2,000 injection realizations on each of many targets.
    From the results of these experiments, we identify anomalous targets, model their behavior, and refine the ACM accordingly. In this presentation, we discuss progress in validating and refining the ACM, and we compare our detection efficiency curves with those derived from the associated pixel-level transit injection experiments. Kepler was selected as the 10th mission of the Discovery Program. Funding for this mission is provided by NASA's Science Mission Directorate.
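    By the definitions above, a transit of depth d on a star with one-sigma depth function OSDF(P) has SNR = d / OSDF(P), and completeness is that SNR's detection efficiency times the window-function probability. A minimal sketch, in which the logistic detection-efficiency curve is a hypothetical stand-in for the fitted curve of Burke et al. (2015):

```python
import math

def snr(depth_ppm, osdf_ppm):
    """SNR of a full transit train: by the OSDF definition, a transit of
    depth equal to OSDF(P) yields SNR = 1, so SNR scales linearly with depth."""
    return depth_ppm / osdf_ppm

def detection_probability(depth_ppm, osdf_ppm, window_prob):
    """Completeness = detection efficiency at this SNR times the window
    function probability that enough transits fall within the data span.
    The logistic curve below (~50% near SNR 7.1) is an illustrative
    stand-in, not the mission's actual fitted detection-efficiency curve."""
    s = snr(depth_ppm, osdf_ppm)
    efficiency = 1.0 / (1.0 + math.exp(-(s - 7.1)))
    return efficiency * window_prob
```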

  15. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    NASA Technical Reports Server (NTRS)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating-to-mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs.
    Finally, the software is managed in accordance with the Capability Maturity Model Integration (CMMI), under which it has been appraised at maturity level 3.

  16. Cooling Technology for Large Space Telescopes

    NASA Technical Reports Server (NTRS)

    DiPirro, Michael; Cleveland, Paul; Durand, Dale; Klavins, Andy; Muheim, Daniella; Paine, Christopher; Petach, Mike; Tenerelli, Domenick; Tolomeo, Jason; Walyus, Keith

    2007-01-01

    NASA's New Millennium Program funded an effort to develop a system cooling technology, which is applicable to all future infrared, sub-millimeter and millimeter cryogenic space telescopes. In particular, this technology is necessary for the proposed large space telescope Single Aperture Far-Infrared Telescope (SAFIR) mission. This technology will also enhance the performance and lower the risk and cost for other cryogenic missions. The new paradigm for cooling to low temperatures will involve passive cooling using lightweight deployable membranes that serve both as sunshields and V-groove radiators, in combination with active cooling using mechanical coolers operating down to 4 K. The Cooling Technology for Large Space Telescopes (LST) mission planned to develop and demonstrate a multi-layered sunshield, which is actively cooled by a multi-stage mechanical cryocooler, and further the models and analyses critical to scaling to future missions. The outer four layers of the sunshield cool passively by radiation, while the innermost layer is actively cooled to enable the sunshield to decrease the incident solar irradiance by a factor of more than one million. The cryocooler cools the inner layer of the sunshield to 20 K, and provides cooling to 6 K at a telescope mounting plate. The technology readiness level (TRL) of 7 will be achieved by the active cooling technology following the technology validation flight in Low Earth Orbit. In accordance with the New Millennium charter, tests and modeling are tightly integrated to advance the technology and the flight design for "ST-class" missions. Commercial off-the-shelf engineering analysis products are used to develop validated modeling capabilities to allow the techniques and results from LST to apply to a wide variety of future missions. The LST mission plans to "rewrite the book" on cryo-thermal testing and modeling techniques, and validate modeling techniques to scale to future space telescopes such as SAFIR.

  17. CALIOP V4.10 L1 & L2 Release Announcement

    Atmospheric Science Data Center

    2016-11-14

    ... CALIPSO mission announces the release of new versions of its standard Level 1 and Level 2 lidar data products.  These products are ... essential ancillary data sets. The GTOPO30 Digital Elevation Model (DEM) used in V4.00 has been replaced by a substantially more accurate ...

  18. Opinion polls and the US civil space program

    NASA Technical Reports Server (NTRS)

    Fries, Sylvia Doughty

    1992-01-01

    An analysis of two public opinion polls that sought to determine NASA's level of support among the American people is presented. One poll models public participation in policy-making as a pyramid. The model provides for three levels of public participation: the attentive public, the interested public, and the non-attentive public. The three groups are discussed in the context of how best to promulgate NASA's mission to the American public.

  19. NOVA: A new multi-level logic simulator

    NASA Technical Reports Server (NTRS)

    Miles, L.; Prins, P.; Cameron, K.; Shovic, J.

    1990-01-01

    A new logic simulator developed at the NASA Space Engineering Research Center for VLSI Design is described. The simulator is multi-level, able to simulate from the switch level through the functional-model level. NOVA is currently in the Beta test phase and has been used to simulate chips designed for the NASA Space Station and the Explorer missions. A new algorithm was devised to simulate bi-directional pass transistors, and a preliminary version of the algorithm is presented. The usage of functional models in NOVA is also described, and performance figures are presented.

  20. Lunar Exploration Architecture Level Key Drivers and Sensitivities

    NASA Technical Reports Server (NTRS)

    Goodliff, Kandyce; Cirillo, William; Earle, Kevin; Reeves, J. D.; Shyface, Hilary; Andraschko, Mark; Merrill, R. Gabe; Stromgren, Chel; Cirillo, Christopher

    2009-01-01

    Strategic-level analysis of the integrated behavior of lunar transportation and lunar surface systems architecture options is performed to assess the benefit, viability, affordability, and robustness of system design choices. This analysis employs both deterministic and probabilistic modeling techniques so that the extent of potential future uncertainties associated with each option is properly characterized. The results of these analyses are summarized in a predefined set of high-level Figures of Merit (FOMs) so as to provide senior NASA Constellation Program (CxP) and Exploration Systems Mission Directorate (ESMD) management with pertinent information to better inform strategic-level decision making. The strategic-level exploration architecture model is designed to perform analysis at as high a level as possible while still capturing those details that have major impacts on system performance. The strategic analysis methodology focuses on integrated performance, affordability, and risk analysis, and captures the linkages and feedbacks among these three areas. Each of these results leads into the determination of the high-level FOMs. This strategic-level analysis methodology has previously been applied to Space Shuttle and International Space Station assessments and is now being applied to the development of the Constellation Program point-of-departure lunar architecture. This paper provides an overview of the strategic analysis methodology and the lunar exploration architecture analyses to date. In studying these analysis results, the strategic analysis team has identified and characterized key drivers affecting the integrated architecture behavior. These key drivers include inclusion of a cargo lander, mission rate, mission location, fixed-versus-variable costs/return on investment, and the requirement for probabilistic analysis. Results of sensitivity analysis performed on lunar exploration architecture scenarios are also presented.

  1. SCOSII OL: A dedicated language for mission operations

    NASA Technical Reports Server (NTRS)

    Baldi, Andrea; Elgaard, Dennis; Lynenskjold, Steen; Pecchioli, Mauro

    1994-01-01

    The Spacecraft Control and Operations System 2 (SCOSII) is the new generation of Mission Control Systems (MCS) to be used at ESOC. The system is generic in that it offers a collection of standard functions, configured through a database, upon which a dedicated MCS is established for a given mission. An integral component of SCOSII is the support of a dedicated Operations Language (OL). The spacecraft operations engineers edit, test, validate, and install OL scripts as part of the configuration of the system, e.g., expressions for computing derived parameters and procedures for performing flight operations, all without involvement of software support engineers. A layered approach has been adopted for the implementation, centered around the explicit representation of a data model. The data model is object-oriented, defining the structure of the objects in terms of attributes (data) and services (functions) which can be accessed by the OL. SCOSII supports the creation of a mission model: system elements, e.g., a gyro, are explicit, as are the attributes which describe them and the services they provide. The data-model-driven approach makes it possible to take immediate advantage of this higher level of abstraction without requiring expansion of the language. This article describes the background and context leading to the OL, its concepts, language facilities, implementation, status, and the conclusions drawn so far.
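    The data-model-driven idea (system elements exposing attributes and services that OL-style scripts combine into derived parameters, without extending the language) can be sketched as follows; the element, service, and parameter names are invented, and the actual OL is of course not Python:

```python
# Hypothetical sketch: a "mission model" element offers attributes and
# services, and an installed script derives a new parameter from them.

class Gyro:
    """A system element with attributes (rates) and a service (magnitude)."""
    def __init__(self, rate_x, rate_y, rate_z):
        self.rate_x, self.rate_y, self.rate_z = rate_x, rate_y, rate_z

    def magnitude(self):  # a "service" offered by the element
        return (self.rate_x**2 + self.rate_y**2 + self.rate_z**2) ** 0.5

derived_parameters = {}

def register(name, fn):
    """Install a derived-parameter expression, much as an operations
    engineer would install an OL script, with no software-support change."""
    derived_parameters[name] = fn

gyro = Gyro(0.1, 0.2, 0.2)
register("GYRO_RATE_MAG", lambda: gyro.magnitude())
print(round(derived_parameters["GYRO_RATE_MAG"](), 3))  # 0.3
```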

  2. Slim Battery Modelling Features

    NASA Astrophysics Data System (ADS)

    Borthomieu, Y.; Prevot, D.

    2011-10-01

    Saft has developed a life prediction model for VES and MPS cells and batteries. The Saft Li-ion Model (SLIM) is a macroscopic electrochemical model based on energy (global at cell level). Its main purpose is to predict battery performance over life for GEO, MEO and LEO missions. The model is based on electrochemical characteristics such as energy, capacity, EMF, internal resistance, and end-of-charge voltage. It applies fading and calendar-law effects to energy and internal impedance as functions of time, temperature, and end-of-charge voltage. Based on the mission profile and satellite power system characteristics, the model proposes various battery configurations. For each configuration, the model gives the battery performance using mission figures and profiles: power, duration, DOD, end-of-charge voltages, temperatures during eclipses and solstices, thermal dissipation, and cell failures. For GEO/MEO missions, eclipse and solstice periods can include specific profiles such as plasmic propulsion fires and specific balancing operations. For LEO missions, the model is able to simulate high power peaks to represent radar pulses. Saft's main customers have been using the SLIM model, available in house, for two years. The purpose is to enable the satellite builder's power engineers to perform their own battery simulations during battery pre-dimensioning activities. The simulations can be shared with Saft engineers to refine the power system designs. The model has been correlated with existing life and calendar tests performed on all the VES and MPS cells. In comparison with life tests lasting more than 10 years, the voltage accuracy of the model is within 10 mV at End of Life. In addition, a comparison with in-orbit data has also been performed. This paper presents the main features of the SLIM software and a comparison of its outputs with real-life test results.
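    The abstract does not give Saft's fading and calendar laws; the sketch below illustrates only the general shape such a law might take (energy fading with time, faster at higher temperature), with made-up coefficients that are in no way Saft's:

```python
import math

def energy_fade(initial_wh, years, temp_c, k_cal=0.01, e_act=0.05):
    """Hypothetical calendar-fade law in the spirit of the SLIM description:
    available energy decays with time, with a rate that grows with
    temperature. k_cal and e_act are illustrative coefficients only."""
    rate = k_cal * math.exp(e_act * (temp_c - 20.0))
    return initial_wh * math.exp(-rate * years)
```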

  3. Modeling the Multi-Body System Dynamics of a Flexible Solar Sail Spacecraft

    NASA Technical Reports Server (NTRS)

    Kim, Young; Stough, Robert; Whorton, Mark

    2005-01-01

    Solar sail propulsion systems enable a wide range of space missions that are not feasible with current propulsion technology. Hardware concepts and analytical methods have matured through ground development to the point that a flight validation mission is now realizable. Much attention has been given to modeling the structural dynamics of the constituent elements, but to date an integrated system level dynamics analysis has been lacking. Using a multi-body dynamics and control analysis tool called TREETOPS, the coupled dynamics of the sailcraft bus, sail membranes, flexible booms, and control system sensors and actuators of a representative solar sail spacecraft are investigated to assess system level dynamics and control issues. With this tool, scaling issues and parametric trade studies can be performed to study achievable performance, control authority requirements, and control/structure interaction assessments.

  4. Micro-Arcsec mission: implications of the monitoring, diagnostic and calibration of the instrument response in the data reduction chain. .

    NASA Astrophysics Data System (ADS)

    Busonero, D.; Gai, M.

    The goals of 21st-century high-angular-precision experiments rely on the limiting performance associated with the selected instrumental configuration and observational strategy. Both global and narrow-angle micro-arcsec space astrometry require that the instrument contribution to the overall error budget be less than the desired micro-arcsec level of precision. Appropriate modelling of the astrometric response is required for optimal definition of the data reduction and calibration algorithms, in order to ensure high sensitivity to the astrophysical source parameters and, in general, high accuracy. We refer to the framework of the SIM-Lite and Gaia missions, the most challenging space missions of the next decade in the narrow-angle and global astrometry fields, respectively. We focus on the Gaia data reduction issues and instrument calibration implications. We describe selected topics in the framework of Astrometric Instrument Modelling for the Gaia mission, evidencing their role in the data reduction chain, and we give a brief overview of the Astrometric Instrument Model Data Analysis Software System, a Java-based pipeline under development by our team.

  5. Integrated Medical Model (IMM) Optimization Version 4.0 Functional Improvements

    NASA Technical Reports Server (NTRS)

    Arellano, John; Young, M.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

    The IMM's ability to assess mission outcome risk levels relative to available resources provides a unique capability to guide the selection of optimal operational medical kit and vehicle resources. Post-processing optimization allows the IMM to optimize essential resources to improve a specific model outcome, such as maximization of the Crew Health Index (CHI) or minimization of the probability of evacuation (EVAC) or of loss of crew life (LOCL). Mass and/or volume constrain the optimized resource set. The IMM's probabilistic simulation uses input data on one hundred medical conditions to simulate medical events that may occur in spaceflight, the resources required to treat those events, and the resulting impact to the mission based on specific crew and mission characteristics. Because IMM version 4.0 provides for partial treatment of medical events, IMM Optimization 4.0 scores resources at the individual resource-unit level, as opposed to the full condition-specific treatment-set level as done in version 3.0. This allows the inclusion of as many resources as possible when an entire set of resources called out for treatment cannot satisfy the constraints. IMM Optimization version 4.0 adds capabilities that increase efficiency by creating multiple resource sets based on differing constraints and priorities: CHI, EVAC, or LOCL. It also provides sets of resources that improve mission-related IMM v4.0 outputs with improved performance compared to the prior optimization. The new optimization represents much-improved fidelity that will increase the utility of the IMM 4.0 for decision support.
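    Scoring at the individual resource-unit level under mass and volume constraints is essentially a knapsack-style packing problem. A greedy sketch under that assumption follows; the resource names, benefit scores, and the greedy heuristic itself are all illustrative, not the IMM's actual optimizer:

```python
# Greedy knapsack-style sketch of resource-set optimization: pack
# individual resource units (not whole treatment sets) under mass and
# volume constraints, ranked by benefit density. Purely illustrative.

def optimize_kit(resources, max_mass, max_volume):
    """resources: list of (name, benefit, mass, volume) tuples, where
    'benefit' stands in for a unit's contribution to an outcome such as
    CHI. Returns the names of the selected units."""
    ranked = sorted(resources, key=lambda r: r[1] / (r[2] + r[3]), reverse=True)
    kit, mass, volume = [], 0.0, 0.0
    for name, benefit, m, v in ranked:
        if mass + m <= max_mass and volume + v <= max_volume:
            kit.append(name)
            mass += m
            volume += v
    return kit

# Hypothetical units: a high-value heavy item is skipped when it cannot fit.
print(optimize_kit([("bandage", 5, 0.1, 0.1), ("scanner", 8, 10, 5)], 1, 1))
```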

  6. Optimum solar electric interplanetary mission opportunities from 1975 to 1990

    NASA Technical Reports Server (NTRS)

    Mann, F. I.; Horsewood, J. L.

    1971-01-01

    A collection of optimum trajectory and spacecraft data is presented for unmanned interplanetary missions from 1975 to 1990 using solar electric propulsion. Data are presented for one-way flyby and orbiter missions from Earth to Venus, Mars, Jupiter, Saturn, Uranus, Neptune, and Pluto. The solar system model assumes planetary ephemerides which very closely approximate the true motion of the planets. Direct and indirect flight profiles are investigated. Data are presented for two representative flight times for each mission. The launch vehicle is the Titan 3 B (core)/Centaur, and a constant jet exhaust speed solar electric propulsion system having a specific mass of 30 kg/kW is completely optimized in terms of power level and jet exhaust speed to yield maximum net spacecraft mass. The hyperbolic excess speeds at departure and arrival and the launch date are optimized for each mission. For orbiter missions, a chemical retro stage is used to brake the spacecraft into a highly eccentric capture orbit about the target planet.
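    The mass bookkeeping behind such an optimization can be sketched with the rocket equation and the quoted specific mass of 30 kg/kW: propellant mass follows from the jet exhaust speed, power-system mass from the power level, and what remains is net spacecraft mass. A simplified illustration only; the study's actual trajectory optimization is far more involved:

```python
import math

def net_mass(m0, delta_v, ve, power_kw, alpha=30.0):
    """Net spacecraft mass after subtracting propellant (rocket equation)
    and power-system mass (specific mass alpha = 30 kg/kW, per the
    abstract). m0 in kg, delta_v and ve in m/s. The delta_v, ve, and
    power values used in any call here are illustrative assumptions."""
    m_prop = m0 * (1.0 - math.exp(-delta_v / ve))   # propellant expended
    m_power = alpha * power_kw                      # power-system mass
    return m0 - m_prop - m_power

# Raising the exhaust speed cuts propellant but (in the full problem)
# demands more power, which is why both are optimized jointly.
print(round(net_mass(10000.0, 5000.0, 30000.0, 100.0), 1))
```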

  7. Situation awareness-based agent transparency for human-autonomy teaming effectiveness

    NASA Astrophysics Data System (ADS)

    Chen, Jessie Y. C.; Barnes, Michael J.; Wright, Julia L.; Stowers, Kimberly; Lakhmani, Shan G.

    2017-05-01

    We developed the Situation awareness-based Agent Transparency (SAT) model to support human operators' situation awareness of the mission environment through teaming with intelligent agents. The model includes the agent's current actions and plans (Level 1), its reasoning process (Level 2), and its projection of future outcomes (Level 3). Human-in-the-loop simulation experiments have been conducted (Autonomous Squad Member and IMPACT) to illustrate the utility of the model for human-autonomy team interface designs. Across studies, the results consistently showed that human operators' task performance improved as the agents became more transparent. They also perceived transparent agents as more trustworthy.
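    The three SAT levels map naturally onto a three-field interface payload. A sketch with invented field names and example content, not the published model's schema:

```python
from dataclasses import dataclass

# Sketch of the three SAT transparency levels as an interface payload.
# Field names and the example text are illustrative assumptions.
@dataclass
class AgentTransparency:
    actions_and_plans: str  # Level 1: what the agent is doing and intends
    reasoning: str          # Level 2: why, i.e. the agent's rationale
    projection: str         # Level 3: projected future outcomes

report = AgentTransparency(
    actions_and_plans="Rerouting convoy via checkpoint B",
    reasoning="Primary route blocked; B minimizes exposure",
    projection="ETA +12 min; high confidence of on-time arrival",
)
print(report.projection)
```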

  8. Review of TRMM/GPM Rainfall Algorithm Validation

    NASA Technical Reports Server (NTRS)

    Smith, Eric A.

    2004-01-01

    A review is presented concerning current progress on evaluation and validation of standard Tropical Rainfall Measuring Mission (TRMM) precipitation retrieval algorithms and the prospects for implementing an improved validation research program for the next-generation Global Precipitation Measurement (GPM) Mission. All standard TRMM algorithms are physical in design, and are thus based on fundamental principles of microwave radiative transfer and its interaction with semi-detailed cloud microphysical constituents. They are evaluated for consistency and degree of equivalence with one another, as well as intercompared to radar-retrieved rainfall at TRMM's four main ground validation sites. Similarities and differences are interpreted in the context of the radiative and microphysical assumptions underpinning the algorithms. Results indicate that the current accuracies of the TRMM Version 6 algorithms are approximately 15% at zonal-averaged / monthly scales, with precisions of approximately 25% for full-resolution / instantaneous rain rate estimates (i.e., level 2 retrievals). Strengths and weaknesses of the TRMM validation approach are summarized. Because the degree of convergence of level 2 TRMM algorithms is being used as a guide for setting validation requirements for the GPM mission, it is important that the GPM algorithm validation program be improved to ensure concomitant improvement in the standard GPM retrieval algorithms. An overview of the GPM Mission's validation plan is provided, including a description of a new type of physical validation model using an analytic 3-dimensional radiative transfer model.

  9. Utilizing feedback in adaptive SAR ATR systems

    NASA Astrophysics Data System (ADS)

    Horsfield, Owen; Blacknell, David

    2009-05-01

    Existing SAR ATR systems are usually trained off-line with samples of target imagery or CAD models prior to conducting a mission. If the training data are not representative of mission conditions, then poor performance may result. In addition, it is difficult to acquire suitable training data for the many target types of interest. The Adaptive SAR ATR Problem Set (AdaptSAPS) program provides a MATLAB framework and image database for developing systems that adapt to mission conditions, meaning less reliance on accurate training data. A key function of an adaptive system is the ability to utilise truth feedback to improve performance, and it is this feature which AdaptSAPS is intended to exploit. This paper presents a new method for SAR ATR that does not use offline training data, instead learning in a supervised manner from truth feedback during the mission. This is achieved by using feature-based classification, and several new shadow features have been developed for this purpose. These features allow discrimination of vehicles from clutter, and classification of vehicles into two classes: targets, comprising military combat types, and non-targets, comprising bulldozers and trucks. The performance of the system is assessed using three baseline missions provided with AdaptSAPS, as well as three additional missions. All performance metrics indicate a distinct learning trend over the course of a mission, with most third- and fourth-quartile performance levels exceeding 85% correct classification. It has been demonstrated that these performance levels can be maintained even when truth feedback rates are reduced by up to 55% over the course of a mission.

  10. Electric Propulsion

    NASA Astrophysics Data System (ADS)

    Baggett, R.

    2004-11-01

    Next Generation Electric Propulsion (NGEP) technology development tasks are working towards advancing solar-powered electric propulsion systems and components to levels ready for transition to flight systems. Current tasks within NGEP include NASA's Evolutionary Xenon Thruster (NEXT), Carbon Based Ion Optics (CBIO), the NSTAR Extended Life Test (ELT), and low-power Hall Effect thrusters. The growing number of solar electric propulsion options provides reduced cost and flexibility to capture a wide range of Solar System exploration missions. Benefits of electric propulsion systems over state-of-the-art chemical systems include increased launch windows, which reduce mission risk; increased deliverable payload mass for more science; and a reduction in launch vehicle size--all of which increase the opportunities for New Frontiers and Discovery class missions. The Dawn Discovery mission makes use of electric propulsion for sequential rendezvous with two large asteroids (Vesta, then Ceres), something not possible using chemical propulsion. The NEXT components and thruster system under development have NSTAR heritage with significant increases in maximum power and Isp, along with deep throttling capability to accommodate changes in input power over the mission trajectory. NEXT will produce engineering model system components that will be validated (through qualification-level and integrated system testing) and ready for transition to flight system development. NEXT offers Discovery, New Frontiers, Mars Exploration, and outer-planet missions a larger deliverable payload mass and a smaller launch vehicle size. CBIO addresses the need to further extend ion thruster lifetime by using low-erosion carbon-based materials. Testing of 30-cm Carbon-Carbon and pyrolytic graphite grids using a lab model NSTAR thruster is complete. In addition, JPL completed a 1000-hr life test on 30-cm Carbon-Carbon grids.
    The NSTAR ELT was a lifetime qualification test started in 1999 with a goal of 88 kg throughput of xenon propellant. The test was intentionally terminated in 2003 after accumulating 233 kg throughput. The thruster has been completely disassembled and the condition of all components documented. Because most of the NSTAR design features have been used in the NEXT thruster, the success of the ELT goes a long way toward qualifying NEXT by similarity. Recent mission analyses for Discovery and New Frontiers class missions have also identified potential benefits of low-power, high-thrust Hall Effect thrusters. Estimated to be ready for mission implementation by 2008, low-power Hall systems could increase mission capture for electric propulsion by greatly reducing propulsion cost, mass, and complexity.

  11. The SMAP level 4 carbon product for monitoring ecosystem land-atmosphere CO2 exchange

    USDA-ARS?s Scientific Manuscript database

    The NASA Soil Moisture Active Passive (SMAP) mission Level 4 Carbon (L4C) product provides model estimates of Net Ecosystem CO2 exchange (NEE) incorporating SMAP soil moisture information. The L4C product includes NEE, computed as total ecosystem respiration less gross photosynthesis, at a daily ti...

  12. Requirements based system level risk modeling

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Cornford, Steven; Feather, Martin

    2004-01-01

    The problem that we address in this paper is assessing the expected degree of success of the system or mission based on the degree to which each requirement is satisfied and the relative weight of the requirements.
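    The framing above, expected mission success driven by per-requirement satisfaction and relative weights, can be sketched as a weight-normalized sum. The weights and satisfaction scores below are hypothetical illustration values, not taken from the paper:

```python
def expected_success(weights, satisfaction):
    """Expected mission success: weight-normalized sum of per-requirement satisfaction.

    weights      -- relative importance of each requirement
    satisfaction -- degree (0..1) to which each requirement is satisfied
    """
    assert len(weights) == len(satisfaction)
    return sum(w * s for w, s in zip(weights, satisfaction)) / sum(weights)

# Hypothetical: one critical requirement fully met, two lesser ones partially met.
print(expected_success([2, 1, 1], [1.0, 0.5, 0.0]))  # 0.625
```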

  13. High-Power, High-Thrust Ion Thruster (HPHTion)

    NASA Technical Reports Server (NTRS)

    Peterson, Peter Y.

    2015-01-01

    Advances in high-power photovoltaic technology have enabled the possibility of reasonably sized, high-specific power solar arrays. At high specific powers, power levels ranging from 50 to several hundred kilowatts are feasible. Ion thrusters offer long life and overall high efficiency (typically greater than 70 percent efficiency). In Phase I, the team at ElectroDynamic Applications, Inc., built a 25-kW, 50-cm ion thruster discharge chamber and fabricated a laboratory model. This was in response to the need for a single, high-powered engine to fill the gulf between the 7-kW NASA's Evolutionary Xenon Thruster (NEXT) system and a notional 25-kW engine. The Phase II project matured the laboratory model into a protoengineering model ion thruster. This involved the evolution of the discharge chamber to a high-performance thruster by performance testing and characterization via simulated and full beam extraction testing. Through such testing, the team optimized the design and built a protoengineering model thruster. Coupled with gridded ion thruster technology, this technology can enable a wide range of missions, including ambitious near-Earth NASA missions, Department of Defense missions, and commercial satellite activities.

  14. Information for Successful Interaction with Autonomous Systems

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Johnson, Kathy A.

    2003-01-01

    Interaction in heterogeneous mission operations teams is not well matched to classical models of coordination with autonomous systems. We describe methods of loose coordination and information management in mission operations. We describe an information agent and information management tool suite for managing information from many sources, including autonomous agents. We present an integrated model of levels of complexity of agent and human behavior, which shows types of information processing and points of potential error in agent activities. We discuss the types of information needed for diagnosing problems and planning interactions with an autonomous system. We discuss types of coordination for which designs are needed for autonomous system functions.

  15. General Mission Analysis Tool (GMAT) Architectural Specification. Draft

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Conway, Darrel, J.

    2007-01-01

    Early in 2002, Goddard Space Flight Center (GSFC) began to identify requirements for the flight dynamics software needed to fly upcoming missions that use formations of spacecraft to collect data. These requirements ranged from low-level modeling features to large-scale interoperability requirements. In 2003 we began work on a system designed to meet these requirements; this system is GMAT. The General Mission Analysis Tool (GMAT) is a general purpose flight dynamics modeling tool built on open source principles. The GMAT code is written in C++, and uses modern C++ constructs extensively. GMAT can be run through either a fully functional Graphical User Interface (GUI) or as a command line program with minimal user feedback. The system is built and runs on Microsoft Windows, Linux, and Macintosh OS X platforms. The GMAT GUI is written using wxWidgets, a cross platform library of components that streamlines the development and extension of the user interface. Flight dynamics modeling is performed in GMAT by building components that represent the players in the analysis problem being modeled. These components interact through the sequential execution of instructions, embodied in the GMAT Mission Sequence. A typical Mission Sequence models the trajectories of a set of spacecraft evolving over time, calculates relevant parameters during this propagation, and maneuvers individual spacecraft to maintain a set of mission constraints established by the mission analyst. All of the elements used in GMAT for mission analysis can be viewed in the GMAT GUI or through a custom scripting language. Analysis problems modeled in GMAT are saved as script files, and these files can be read back into GMAT. When a script is read into the GMAT GUI, the corresponding user interface elements are constructed automatically. The GMAT system was developed from the ground up to run in a platform-agnostic environment. 
The source code compiles on numerous different platforms, and is regularly exercised running on Windows, Linux and Macintosh computers by the development and analysis teams working on the project. The system can be run using either a graphical user interface, written using the open source wxWidgets framework, or from a text console. The GMAT source code was written using open source tools. GSFC has released the code using the NASA open source license.

  16. Qualification of Commercial XIPS(R) Ion Thrusters for NASA Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Goebel, Dan M.; Polk, James E.; Wirz, Richard E.; Snyder, J.Steven; Mikellides, Ioannis G.; Katz, Ira; Anderson, John

    2008-01-01

    Electric propulsion systems based on commercial ion and Hall thrusters have the potential for significantly reducing the cost and schedule risk of Ion Propulsion Systems (IPS) for deep space missions. The large fleet of geosynchronous communication satellites that use solar electric propulsion (SEP), which will approach 40 satellites by year-end, demonstrates the significant level of technical maturity and spaceflight heritage achieved by commercial IPS. A program to delta-qualify XIPS(R) ion thrusters for deep space missions is underway at JPL. This program includes modeling of the thruster grid and cathode life, environmental testing of a 25-centimeter electromagnetic (EM) thruster over Dawn-like vibration and temperature profiles, and wear testing of the thruster cathodes to demonstrate the life and benchmark the model results. This paper will present the delta-qualification status of the XIPS thruster and discuss its life and reliability with respect to known failure mechanisms.

  17. Space radiation incident on SATS missions

    NASA Technical Reports Server (NTRS)

    Stassinopoulos, E. G.

    1973-01-01

    A special orbital radiation study was conducted in order to evaluate mission-encountered energetic particle fluxes. This information is to be supplied to the project subsystem engineers for their guidance in designing flight hardware to withstand the expected radiation levels. Flux calculations were performed for a set of 20 nominal trajectories placed at several altitudes and inclinations. Temporal variations in the ambient electron environment were considered and partially accounted for. Magnetic field calculations were performed with a current field model, extrapolated to the tentative SATS launch epoch with linear time terms. Orbital flux integrations were performed with the latest proton and electron environment models, using new computational methods. The results are presented in graphical and tabular form. Estimates of energetic solar proton fluxes are given for a one-year mission at selected integral energies ranging from 10 to 100 MeV, calculated for a year of maximum solar activity during the next solar cycle.

  18. The simulation of automatic ladar sensor control during flight operations using USU LadarSIM software

    NASA Astrophysics Data System (ADS)

    Pack, Robert T.; Saunders, David; Fullmer, Rees; Budge, Scott

    2006-05-01

    USU LadarSIM Release 2.0 is a ladar simulator that has the ability to feed high-level mission scripts into a processor that automatically generates scan commands during flight simulations. The scan generation depends on specified flight trajectories and scenes consisting of terrain and targets. The scenes and trajectories can consist of either simulated or actual data. The first modeling step produces an outline of scan footprints in xyz space. Once mission goals have been analyzed and it is determined that the scan footprints are appropriately distributed or placed, specific scans can then be chosen for the generation of complete radiometry-based range images and point clouds. The simulation is capable of quickly modeling ray-trace geometry associated with (1) various focal plane arrays and scanner configurations and (2) various scenes and trajectories associated with particular maneuvers or missions.

  19. Planetary Cartography and Mapping: where we are Today, and where we are Heading For?

    NASA Astrophysics Data System (ADS)

    Naß, A.; Di, K.; Elgner, S.; van Gasselt, S.; Hare, T.; Hargitai, H.; Karachevtseva, I.; Kersten, E.; Manaud, N.; Roatsch, T.; Rossi, A. P.; Skinner, J., Jr.; Wählisch, M.

    2017-07-01

    Planetary Cartography not only provides the basis to support planning (e.g., landing-site selection, orbital observations, traverse planning) and to facilitate mission conduct during the lifetime of a mission (e.g., observation tracking and hazard avoidance); it also provides the means to create science products after the successful termination of a planetary mission by distilling data into maps. After a mission's lifetime, data and higher-level products like mosaics and digital terrain models (DTMs) are stored in archives, and eventually turned into maps and higher-level data products, to form a basis for research and for new scientific and engineering studies. The complexity of such tasks increases with every new dataset added to this stack of information, and just as the complexity of autonomous probes increases, the tools that support these challenges require new levels of sophistication. In planetary science, cartography and mapping have a history dating back to the roots of telescopic space exploration and are now facing new technological and organizational challenges with the rise of new missions, new global initiatives, organizations, and opening research markets. The focus of this contribution is to summarize recent activities in Planetary Cartography, highlighting current issues the community is facing, and to identify future opportunities in this field. By this we would like to invite cartographers/researchers to join this community and to start thinking about how we can jointly solve some of these challenges.

  20. JPL Innovation Foundry

    NASA Technical Reports Server (NTRS)

    Sherwood, Brent; McCleese, Daniel

    2012-01-01

    Space science missions are increasingly challenged today: in ambition, by increasingly sophisticated hypotheses tested; in development, by the increasing complexity of advanced technologies; in budgeting, by the decline of flagship-class mission opportunities; in management, by expectations for breakthrough science despite a risk-averse programmatic climate; and in planning, by increasing competition for scarce resources. How are the space-science missions of tomorrow being formulated? The paper describes the JPL Innovation Foundry, created in 2011, to respond to this evolving context. The Foundry integrates methods, tools, and experts that span the mission concept lifecycle. Grounded in JPL's heritage of missions, flight instruments, mission proposals, and concept innovation, the Foundry seeks to provide continuity of support and cost-effective, on-call access to the right domain experts at the right time, as science definition teams and Principal Investigators mature mission ideas from "cocktail napkin" to PDR. The Foundry blends JPL capabilities in proposal development and concurrent engineering, including Team X, with new approaches for open-ended concept exploration in earlier, cost-constrained phases, and with ongoing research and technology projects. It applies complexity and cost models, project-formulation lessons learned, and strategy analyses appropriate to each level of concept maturity. The Foundry is organizationally integrated with JPL formulation program offices; staffed by JPL's line organizations for engineering, science, and costing; and overseen by senior Laboratory leaders to assure experienced coordination and review. Incubation of each concept is tailored depending on its maturity and proposal history, and its highest-leverage modeling and analysis needs.

  1. Assimilation of river altimetry data for effective bed elevation and roughness coefficient

    NASA Astrophysics Data System (ADS)

    Brêda, João Paulo L. F.; Paiva, Rodrigo C. D.; Bravo, Juan Martin; Passaia, Otávio

    2017-04-01

    Hydrodynamic models of large rivers are important tools for predicting river discharge, water level, and floods. However, these models still carry considerable errors, partly due to parameter uncertainties in river bathymetry and roughness coefficient. Data from recent spatial altimetry missions offer an opportunity to reduce parameter uncertainty through inverse methods. This study aims to develop and assess different methods of altimetry data assimilation to improve estimates of river bottom levels and Manning roughness in a 1-D hydrodynamic model. The case study was a 1,100 km reach of the Madeira River, a tributary of the Amazon. The tested assimilation methods are direct insertion, linear interpolation, the SCE-UA global optimization algorithm, and a Kalman filter adaptation. The Kalman filter method is built on new physically based covariance functions derived from the steady-flow and backwater equations. The benefits of altimetry missions with different spatio-temporal resolutions, such as ICESat-1, Envisat, and Jason-2, are assessed. Water level time series from 5 gauging stations and 5 GPS river height profiles are used to assess and validate the assimilation methods. Finally, the potential of future missions, such as the ICESat-2 and SWOT satellites, is discussed.
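    The core of a Kalman filter adaptation like the one described can be sketched in scalar form: a prior bed-elevation estimate is blended with an altimetry observation in proportion to their variances. The prior, observation, and variance values below are illustrative, not from the study:

```python
def kalman_update(x_prior, p_prior, z_obs, r_obs):
    """Scalar Kalman update for one state (e.g., effective bed elevation).

    x_prior, p_prior -- prior estimate and its variance
    z_obs, r_obs     -- altimetry-derived observation and its error variance
    """
    k = p_prior / (p_prior + r_obs)           # Kalman gain
    x_post = x_prior + k * (z_obs - x_prior)  # innovation-weighted correction
    p_post = (1.0 - k) * p_prior              # reduced posterior variance
    return x_post, p_post

# Hypothetical numbers: uncertain 50 m prior, accurate 52 m altimetry observation.
print(kalman_update(50.0, 4.0, 52.0, 1.0))  # estimate moves most of the way to 52
```

    The real method replaces the scalar variances with the physically based covariance functions mentioned in the abstract, so one observation corrects bed levels along the whole reach.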

  2. Multi-Agent Modeling and Simulation Approach for Design and Analysis of MER Mission Operations

    NASA Technical Reports Server (NTRS)

    Seah, Chin; Sierhuis, Maarten; Clancey, William J.

    2005-01-01

    A space mission operations system is a complex network of human organizations, information and deep-space network systems, and spacecraft hardware. As in other organizations, one of the problems in mission operations is managing the relationship of the mission information systems to how people actually work (practices). Brahms, a multi-agent modeling and simulation tool, was used to model and simulate NASA's Mars Exploration Rover (MER) mission work practice. The objective was to investigate the value of work practice modeling for mission operations design. From spring 2002 until winter 2003, a Brahms modeler participated in mission systems design sessions and operations testing for the MER mission held at the Jet Propulsion Laboratory (JPL). He observed how designers interacted with the Brahms tool. This paper discusses mission system designers' reactions to the simulation output during model validation and the presentation of generated work procedures. This project spurred JPL's interest in the Brahms model, but it was never included as part of the formal mission design process. We discuss why this occurred. Subsequently, we used the MER model to develop a future mission operations concept. Team members were reluctant to use the MER model, even though it appeared to be highly relevant to their effort. We describe some of the tool issues we encountered.

  3. Preliminary Assessment of Thrust Augmentation of NEP Based Missions

    NASA Technical Reports Server (NTRS)

    Chew, Gilbert; Pelaccio, Dennis G.; Chiroux, Robert; Pervan, Sherry; Rauwolf, Gerald A.; White, Charles

    2005-01-01

    Science Applications International Corporation (SAIC), with support from NASA Marshall Space Flight Center, has conducted a preliminary study to compare options for augmenting the thrust of a conventional nuclear electric propulsion (NEP) system. These options include a novel nuclear propulsion system concept known as Hybrid Indirect Nuclear Propulsion (HINP) and conventional chemical propulsion. The utility and technical feasibility of the HINP concept are assessed, and features and potential of this new in-space propulsion system concept are identified. As part of the study, SAIC developed top-level design tools to model the size and performance of an HINP system, as well as for several chemical propulsion options, including liquid and gelled propellants. A mission trade study was performed to compare a representative HINP system with chemical propulsion options for thrust augmentation of NEP systems for a mission to Saturn's moon Titan. Details pertaining to the approach, features, initial demonstration results for HINP model development, and the mission trade study are presented. Key technology and design issues associated with the HINP concept and future work recommendations are also identified.

  4. Application of the GEM-T2 gravity field to altimetric satellite orbit computation

    NASA Technical Reports Server (NTRS)

    Haines, Bruce J.; Born, George H.; Williamson, Ronald G.; Koblinsky, Chester I.

    1994-01-01

    As part of a continuing effort to provide improved orbits for use with existing altimeter data, we have recomputed ephemerides for both the Seasat and Geosat Exact Repeat altimeter missions. The orbits were computed in a consistent fashion, using the Goddard Earth Model T2 (GEM-T2) gravity field along with available ground-based tracking data. Such an approach allows direct comparisons of sea level between the two altimeter systems. Evaluation of the resulting ephemerides indicates that root-mean-square accuracies of 30-50 cm have been achieved for the radial component of the orbits for both satellites. An exception occurs for the last year of the Geosat Exact Repeat Mission, when the rms radial orbit accuracy degrades to the 1-m level at times owing to the inability to adequately model the drag force arising from the increased solar activity.

  5. Recoding low-level simulator data into a record of meaningful task performance: the integrated task modeling environment (ITME).

    PubMed

    King, Robert; Parker, Simon; Mouzakis, Kon; Fletcher, Winston; Fitzgerald, Patrick

    2007-11-01

    The Integrated Task Modeling Environment (ITME) is a user-friendly software tool that has been developed to automatically recode low-level data into an empirical record of meaningful task performance. The present research investigated and validated the performance of the ITME software package by conducting complex simulation missions and comparing the task analyses produced by ITME with task analyses produced by experienced video analysts. A very high interrater reliability (≥ .94) existed between experienced video analysts and the ITME for the task analyses produced for each mission. The mean session time:analysis time ratio was 1:24 using video analysis techniques and 1:5 using the ITME. It was concluded that the ITME produced task analyses that were as reliable as those produced by experienced video analysts, and significantly reduced the time cost associated with these analyses.

  6. The NASA Evolutionary Xenon Thruster (NEXT): NASA's Next Step for U.S. Deep Space Propulsion

    NASA Technical Reports Server (NTRS)

    Schmidt, George R.; Patterson, Michael J.; Benson, Scott W.

    2008-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing next-generation ion propulsion technologies to enhance the performance and lower the costs of future NASA space science missions. This is being accomplished by producing Engineering Model (EM) and Prototype Model (PM) components, validating these via qualification-level and integrated system testing, and preparing the transition of NEXT technologies to flight system development. The project is currently completing one of the final milestones of the effort: operation of an integrated NEXT Ion Propulsion System (IPS) in a simulated space environment. This test will advance the NEXT system to a NASA Technology Readiness Level (TRL) of 6 (i.e., operation of a prototypical system in a representative environment), and will confirm its readiness for flight. Besides its promise for upcoming NASA science missions, NEXT may have excellent potential for future commercial and international spacecraft applications.

  7. Multiple Autonomous Discrete Event Controllers for Constellations

    NASA Technical Reports Server (NTRS)

    Esposito, Timothy C.

    2003-01-01

    The Multiple Autonomous Discrete Event Controllers for Constellations (MADECC) project is an effort within the National Aeronautics and Space Administration Goddard Space Flight Center's (NASA/GSFC) Information Systems Division to develop autonomous positioning and attitude control for constellation satellites. It will be accomplished using traditional control theory and advanced coordination algorithms developed by the Johns Hopkins University Applied Physics Laboratory (JHU/APL). This capability will be demonstrated in the discrete event control test-bed located at JHU/APL. This project will be modeled for the Leonardo constellation mission, but is intended to be adaptable to any constellation mission. To develop a common software architecture, the controllers will only model very high-level responses. For instance, after determining that a maneuver must be made, the MADECC system will output a ΔV (velocity change) value. Lower-level systems must then decide which thrusters to fire and for how long to achieve that ΔV.

  8. CoMET: Cost and Mass Evaluation Tool for Spacecraft and Mission Design

    NASA Technical Reports Server (NTRS)

    Bieber, Ben S.

    2005-01-01

    New technology in space exploration is often developed without a complete knowledge of its impact. While the immediate benefits of a new technology are obvious, it is harder to understand its indirect consequences, which ripple through the entire system. CoMET is a technology evaluation tool designed to illuminate how specific technology choices affect a mission at each system level. CoMET uses simplified models for mass, power, and cost to analyze performance parameters of technologies of interest. The sensitivity analysis that CoMET provides shows whether developing a certain technology will greatly benefit the project or not. CoMET is an ongoing project approaching a web-based implementation phase. This year, development focused on the models for planetary daughter craft, such as atmospheric probes, blimps and balloons, and landers. These models are developed through research into historical data, well-established rules of thumb, and the engineering judgment of experts at JPL. The model is validated by corroboration with JPL advanced mission studies. Other enhancements to CoMET include adding launch vehicle analysis and integrating an updated cost model. When completed, CoMET will allow technological development to be focused on areas that will most drastically improve spacecraft performance.
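    The kind of system-level ripple such a tool is designed to expose can be illustrated with a toy mass cascade, where a payload saving is amplified through dry-mass and propellant multipliers. The multipliers below are hypothetical, not CoMET's actual models:

```python
def wet_mass(payload_kg, dry_multiplier=2.0, prop_fraction=0.5):
    """Toy cascade: payload mass drives dry mass, which drives propellant and wet mass."""
    dry_kg = payload_kg * dry_multiplier    # structure, power, thermal scale with payload
    return dry_kg / (1.0 - prop_fraction)   # propellant scales with dry mass

baseline = wet_mass(100.0)  # 400.0 kg at the system level
lighter = wet_mass(90.0)    # a 10 kg payload saving...
print(baseline - lighter)   # ...ripples into 40.0 kg of wet mass
```

    Even this crude sketch shows why a component-level improvement can be worth several times its face value once launch mass and cost models are chained behind it.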

  9. Medical resource preparation and allocation for humanitarian assistance based on module organization.

    PubMed

    Zhu, Min; Chen, Ruxue; Zhong, Shaobo; Qian, Yangming; Huang, Quanyi

    2017-02-01

    This research aims to associate the allocation of medical resources with the function of the modular organization and the possible needs of humanitarian assistance missions. The overseas humanitarian medical assistance mission conducted aboard the hospital ship Peace Ark of China's People's Liberation Army (PLA) Navy, deployed after a disaster, was taken as the study model. The cases used for clustering and matching-sample formation were randomly selected from existing information related to Peace Ark's mission. Categories of reusable resources clustered by this research met the requirements of actual consumption almost completely (more than 95%), and categories of non-reusable resources met the requirements by more than 80%. In the mission's original resource-preparation plan, more than 30% of the non-reusable resource categories remained unused during the mission. Under the original plan, some key non-reusable resource inventories were completely exhausted by the end of the mission, whereas under the allocation plan generated by this research, 5% to 30% of non-reusable resources remained at the end of the mission. The medical resource allocation plan generated here can enhance the level of support for humanitarian assistance missions. This research could lay the foundation for a decision-support system for humanitarian assistance missions.

  10. H2 arcjet performance mapping program

    NASA Astrophysics Data System (ADS)

    1992-01-01

    Work performed during the period of Mar. 1991 to Jan. 1992 is reviewed. High power H2 arcjets are being considered for electric powered orbit transfer vehicles (EOTV). Mission analyses indicate that the overall arcjet thrust efficiency is very important since increasing the efficiency increases the thrust, and thereby reduces the total trip time for the same power. For example, increasing the thrust efficiency at the same specific impulse from 30 to 40 percent will reduce the trip time by 25 percent. For a 200 day mission, this equates to 50 days, which results in lower ground costs and less time during which the payload is dormant. Arcjet performance levels of 1200 seconds specific impulse (Isp) at 35 to 40 percent efficiency with lifetimes over 1000 hours are needed to support EOTV missions. Because of the potentially very high efficiency levels, the objective of this program was to evaluate the ability of a scaled Giannini-style thruster to achieve these performance levels while operating at a reduced nominal power of 10 kW. To meet this objective, a review of past literature was conducted; scaling relationships were developed and applied to establish critical dimensions; a development thruster was designed with the aid of the plasma analysis model KARNAC and finite element thermal modeling; test hardware was fabricated; and a series of performance tests were conducted in RRC's Cell 11 vacuum chamber with its null-balance thrust stand.
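    The trip-time arithmetic in the abstract follows from the fact that, at fixed power and specific impulse, thrust scales with efficiency, so trip time scales inversely with it. A quick check of the quoted numbers:

```python
def trip_time_days(base_days, eta_old, eta_new):
    """At fixed power and Isp, thrust scales with efficiency, so trip time scales inversely."""
    return base_days * eta_old / eta_new

# The abstract's example: 30% -> 40% efficiency on a 200-day mission.
print(trip_time_days(200.0, 0.30, 0.40))  # ~150 days, i.e. the quoted 25% / 50-day saving
```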

  11. Bimodal Nuclear Thermal Rocket Sizing and Trade Matrix for Lunar, Near Earth Asteroid and Mars Missions

    NASA Astrophysics Data System (ADS)

    McCurdy, David R.; Krivanek, Thomas M.; Roche, Joseph M.; Zinolabedini, Reza

    2006-01-01

    The concept of a human-rated transport vehicle for various near-Earth missions is evaluated using a liquid hydrogen fueled Bimodal Nuclear Thermal Propulsion (BNTP) approach. In an effort to determine the preliminary sizing and optimal propulsion system configuration, as well as the key operating design points, an initial investigation into the main system-level parameters was conducted. This assessment considered not only the performance variables but also the more subjective reliability, operability, and maintainability attributes. The SIZER preliminary sizing tool was used to facilitate rapid modeling of the trade studies, which included tank materials, propulsive versus an aero-capture trajectory, use of artificial gravity, reactor chamber operating pressure and temperature, fuel element scaling, engine thrust rating, engine thrust augmentation by adding oxygen to the flow in the nozzle for supersonic combustion, and the baseline turbopump configuration to address mission redundancy and safety requirements. A high-level system perspective was maintained to avoid focusing solely on individual component optimization at the expense of system-level performance, operability, and development cost.

  12. Planetary spacecraft cost modeling utilizing labor estimating relationships

    NASA Technical Reports Server (NTRS)

    Williams, Raymond

    1990-01-01

    A basic computerized technology is presented for estimating labor hours and cost of unmanned planetary and lunar programs. The user-friendly methodology, designated Labor Estimating Relationship/Cost Estimating Relationship (LERCER), organizes the forecasting process according to vehicle subsystem levels. The level of input variables required by the model in predicting cost is consistent with pre-Phase A type mission analysis. Twenty-one program categories were used in the modeling. To develop the model, numerous LER and CER studies were surveyed and modified when required. The results of the research, along with components of the LERCER program, are reported.

  13. Engineering-Level Model Atmospheres for Titan and Neptune

    NASA Technical Reports Server (NTRS)

    Justus, C. G.; Duvall, Aleta; Johnson, D. L.

    2003-01-01

    Engineering-level atmospheric models for Titan and Neptune have been developed for use in NASA's systems analysis studies of aerocapture applications in missions to the outer planets. Analogous to the highly successful Global Reference Atmospheric Models for Earth (GRAM, Justus et al., 2000) and Mars (Mars-GRAM, Justus and Johnson, 2001, Justus et al., 2002), the new models are called Titan-GRAM and Neptune-GRAM. Like GRAM and Mars-GRAM, an important feature of Titan-GRAM and Neptune-GRAM is their ability to simulate quasi-random perturbations for Monte Carlo analyses in developing guidance, navigation, and control algorithms, and for thermal systems design.
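    The quasi-random perturbation capability described here can be sketched as Gaussian fractional perturbations applied to a nominal density profile, one draw per Monte Carlo trial. The densities and the 10% dispersion below are made-up illustration values, not GRAM outputs:

```python
import random

def perturbed_profile(rho_nominal, sigma_frac, rng):
    """One Monte Carlo draw: nominal densities times (1 + Gaussian fractional perturbation)."""
    return [rho * (1.0 + rng.gauss(0.0, sigma_frac)) for rho in rho_nominal]

rng = random.Random(42)                 # seeded for reproducible trials
nominal = [1.2e-3, 8.0e-4, 5.0e-4]      # kg/m^3 at three hypothetical altitudes
draws = [perturbed_profile(nominal, 0.10, rng) for _ in range(1000)]
```

    Feeding each draw through an aerocapture trajectory simulation yields the dispersion statistics used to size guidance margins and thermal protection.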

  14. End-to-End Trade-space Analysis for Designing Constellation Missions

    NASA Astrophysics Data System (ADS)

    LeMoigne, J.; Dabney, P.; Foreman, V.; Grogan, P.; Hache, S.; Holland, M. P.; Hughes, S. P.; Nag, S.; Siddiqi, A.

    2017-12-01

    Multipoint measurement missions can provide a significant advancement in science return, and this science interest, coupled with many recent technological advances, is driving a growing trend in exploring distributed architectures for future NASA missions. Distributed Spacecraft Missions (DSMs) leverage multiple spacecraft to achieve one or more common goals. In particular, a constellation is the most general form of DSM, with two or more spacecraft placed into specific orbit(s) for the purpose of serving a common objective (e.g., CYGNSS). Because a DSM architectural trade-space includes both monolithic and distributed design variables, DSM optimization is a large and complex problem with multiple conflicting objectives. Over the last two years, our team has been developing a Trade-space Analysis Tool for Constellations (TAT-C), implemented in common programming languages for pre-Phase A constellation mission analysis. By evaluating alternative mission architectures, TAT-C seeks to minimize cost and maximize performance for pre-defined science goals. This presentation will describe the overall architecture of TAT-C including: a User Interface (UI) at several levels of detail and user expertise; Trade-space Search Requests that are created from the Science requirements gathered by the UI and validated by a Knowledge Base; a Knowledge Base to compare the current requests to prior mission concepts to potentially prune the trade-space; and a Trade-space Search Iterator which, with inputs from the Knowledge Base and in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the General Mission Analysis Tool (GMAT) to compute coverage and ancillary data, modeling orbits to balance accuracy and performance. 
The current version includes uniform and non-uniform Walker constellations as well as Ad-Hoc and precessing constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.
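    The uniform Walker constellations mentioned above follow the standard delta pattern t/p/f: t satellites in p evenly spaced planes with inter-plane phasing parameter f. A minimal sketch of the geometry (an illustration of the pattern, not TAT-C's actual implementation):

```python
def walker_delta(t, p, f):
    """Walker delta pattern t/p/f: (RAAN, in-plane anomaly) in degrees per satellite."""
    s = t // p  # satellites per plane
    sats = []
    for k in range(p):
        raan = k * 360.0 / p                             # evenly spaced planes
        for j in range(s):
            anom = (j * 360.0 / s + k * f * 360.0 / t) % 360.0  # inter-plane phasing
            sats.append((raan, anom))
    return sats

# A 6/3/1 pattern: 6 satellites, 3 planes, phasing factor 1.
for raan, anom in walker_delta(6, 3, 1):
    print(f"RAAN {raan:6.1f} deg, anomaly {anom:6.1f} deg")
```

    Enumerating (t, p, f) triples like this, and scoring each against coverage and cost models, is the kind of trade-space sweep the tool automates.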

  15. Computational Design of Materials: Planetary Entry to Electric Aircraft and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA's projects and missions push the bounds of what is possible. To support the agency's work, materials development must stay on the cutting edge. Today, researchers at NASA Ames Research Center perform multiscale modeling to aid the development of new materials and provide insight into existing ones. Multiscale modeling enables researchers to determine micro- and macroscale properties by connecting computational methods ranging from the atomic level (density functional theory, molecular dynamics) to the macroscale (finite element method). The output of one level is passed on as input to the next level, creating a powerful predictive model.
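The hand-off between scales can be pictured with a toy two-stage chain, where a stand-in for an atomistic calculation produces a property that a continuum step then consumes. Both functions and all numerical relations here are invented purely for illustration:

```python
def atomistic_stage(lattice_constant_nm):
    """Stand-in for a DFT/MD run: returns an upscaled material property
    (the inverse relation used here is a toy, not real physics)."""
    return {"modulus_gpa": 400.0 / lattice_constant_nm}

def continuum_stage(props, load_n, area_m2):
    """Finite-element-style macroscale step consuming the upscaled modulus;
    returns the elastic strain under a uniaxial load."""
    stress_pa = load_n / area_m2
    return stress_pa / (props["modulus_gpa"] * 1e9)
```

The pattern is the point: each stage's output dictionary is the next stage's input, so refining the atomistic model automatically propagates into the macroscale prediction.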

  16. Space Radiation Risks for Astronauts on Multiple International Space Station Missions

    PubMed Central

    Cucinotta, Francis A.

    2014-01-01

    Mortality and morbidity risks from space radiation exposure are an important concern for astronauts participating in International Space Station (ISS) missions. NASA's radiation limits set a 3% cancer fatality probability as the upper bound of acceptable risk and consider uncertainties in risk predictions using the upper 95% confidence level (CL) of the assessment. Beyond risk limitation, an important question arises as to the likelihood of a causal association between a crew member's past radiation exposure and a diagnosis of cancer. For the first time, we report on predictions of age- and sex-specific cancer risks, expected years of life lost for specific diseases, and probability of causation (PC) at different post-mission times for participants in 1-year or multiple ISS missions. Risk projections with uncertainty estimates are within NASA acceptable radiation standards for mission lengths of 1 year or less for likely crew demographics. However, for solar minimum conditions the upper 95% CL exceeds a 3% risk of exposure-induced death (REID) by 18 months for females and 24 months for males. Median PC and upper 95% confidence intervals are found to exceed 50% for several cancers for participation in two or more ISS missions of 18 months or longer total duration near solar minimum, or for longer ISS missions at other phases of the solar cycle. However, current risk models only consider estimates of quantitative differences between high and low linear energy transfer (LET) radiation. We also make predictions of the risks and uncertainties that would result from an increase in tumor lethality for highly ionizing radiation reported in animal studies, and from the additional risks of circulatory diseases. These additional concerns could further reduce the maximum duration of ISS missions within acceptable risk levels, and will require new knowledge to evaluate properly. PMID:24759903

  17. Space radiation risks for astronauts on multiple International Space Station missions.

    PubMed

    Cucinotta, Francis A

    2014-01-01

    Mortality and morbidity risks from space radiation exposure are an important concern for astronauts participating in International Space Station (ISS) missions. NASA's radiation limits set a 3% cancer fatality probability as the upper bound of acceptable risk and consider uncertainties in risk predictions using the upper 95% confidence level (CL) of the assessment. Beyond risk limitation, an important question arises as to the likelihood of a causal association between a crew member's past radiation exposure and a diagnosis of cancer. For the first time, we report on predictions of age- and sex-specific cancer risks, expected years of life lost for specific diseases, and probability of causation (PC) at different post-mission times for participants in 1-year or multiple ISS missions. Risk projections with uncertainty estimates are within NASA acceptable radiation standards for mission lengths of 1 year or less for likely crew demographics. However, for solar minimum conditions the upper 95% CL exceeds a 3% risk of exposure-induced death (REID) by 18 months for females and 24 months for males. Median PC and upper 95% confidence intervals are found to exceed 50% for several cancers for participation in two or more ISS missions of 18 months or longer total duration near solar minimum, or for longer ISS missions at other phases of the solar cycle. However, current risk models only consider estimates of quantitative differences between high and low linear energy transfer (LET) radiation. We also make predictions of the risks and uncertainties that would result from an increase in tumor lethality for highly ionizing radiation reported in animal studies, and from the additional risks of circulatory diseases. These additional concerns could further reduce the maximum duration of ISS missions within acceptable risk levels, and will require new knowledge to evaluate properly.

  18. JIMM: the next step for mission-level models

    NASA Astrophysics Data System (ADS)

    Gump, Jamieson; Kurker, Robert G.; Nalepka, Joseph P.

    2001-09-01

    The Simulation Based Acquisition (SBA) process is one in which the planning, design, and test of a weapon system or other product is done through the more effective use of modeling and simulation, information technology, and process improvement. This process results in a product that is produced faster, cheaper, and more reliably than its predecessors. Because the SBA process requires realistic and detailed simulation conditions, a simulation tool was needed that could provide an environment suitable for SBA analysis. The Joint Integrated Mission Model (JIMM) was created to help define and meet the analysis, test and evaluation, and training requirements of a Department of Defense program utilizing SBA. Through its generic representation of simulation entities, its data analysis capability, and its robust configuration management process, JIMM can support a wide range of simulation applications as both a constructive and a virtual simulation tool. JIMM is a Mission Level Model (MLM): a model capable of evaluating the effectiveness and survivability of a composite force of air and space systems executing operational objectives in a specific scenario against an integrated air and space defense system. Because MLMs are useful for assessing a system's performance in a realistic, integrated threat environment, they are key to implementing the SBA process. JIMM is a merger of the capabilities of one legacy model, the Suppressor MLM, into another, the Simulated Warfare Environment Generator (SWEG) MLM. By creating a more capable MLM, JIMM will not only support the SBA initiative, but could also provide the framework for the next generation of MLMs.

  19. SCRL-Model for Human Space Flight Operations Enterprise Supply Chain

    NASA Technical Reports Server (NTRS)

    Tucker, Brian

    2010-01-01

    A standard approach for evaluating and configuring adaptable, sustainable program and mission supply chains at an enterprise level. The model takes an end-to-end, total-lifecycle view and evaluates the readiness of the supply chain during the supply chain development phase.

  20. The Green Propellant Infusion Mission Thruster Performance Testing for Plume Diagnostics

    NASA Technical Reports Server (NTRS)

    Deans, Matthew C.; Reed, Brian D.; Arrington, Lynn A.; Williams, George J.; Kojima, Jun J.; Kinzbach, McKenzie I.; McLean, Christopher H.

    2014-01-01

    The Green Propellant Infusion Mission (GPIM) is sponsored by NASA's Space Technology Mission Directorate (STMD) Technology Demonstration Mission (TDM) office. The goal of GPIM is to advance the technology readiness level of a green propulsion system, specifically, one using the monopropellant AF-M315E, by demonstrating ground handling, spacecraft processing, and on-orbit operations. One of the risks identified for GPIM is potential contamination of sensitive spacecraft surfaces from the effluents in the plumes of AF-M315E thrusters. NASA Glenn Research Center (GRC) is conducting activities to characterize the effects of AF-M315E plume impingement and deposition. GRC has established individual plume models of the 22-N and 1-N thrusters that will be used on the GPIM spacecraft. The model simulations will be correlated with plume measurement data from Laboratory and Engineering Model 22-N AF-M315E thrusters. The thrusters are currently being tested in a small-rocket altitude facility at NASA GRC. A suite of diagnostics, including Raman spectroscopy, Rayleigh spectroscopy, and Schlieren imaging, is being used to acquire plume measurements of AF-M315E thrusters. Plume data will include temperature, velocity, relative density, and species concentration. The plume measurement data will be compared to the corresponding simulations of the plume model. The GRC effort will establish a data set of AF-M315E plume measurements and a plume model that can be used for future AF-M315E applications.

  1. Comparing simulations and test data of a radiation damaged CCD for the Euclid mission

    NASA Astrophysics Data System (ADS)

    Skottfelt, Jesper; Hall, David; Gow, Jason; Murray, Neil; Holland, Andrew; Prod'homme, Thibaut

    2016-07-01

    The radiation damage effects from the harsh radiative environment outside the Earth's atmosphere are a cause for concern for most space missions. With science goals becoming ever more demanding, the requirements on the precision of the instruments on board these missions also increase, and it is therefore important to investigate how radiation-induced damage affects the Charge-Coupled Devices (CCDs) that most of these instruments rely on. The primary goal of the Euclid mission is to study the nature of dark matter and dark energy using weak lensing and baryonic acoustic oscillation techniques. The weak lensing technique depends on very precise shape measurements of distant galaxies obtained by a large CCD array. It is anticipated that over the 6-year nominal lifetime of the mission, the CCDs will be degraded to an extent that these measurements will not be possible unless the radiation damage effects are corrected. We have therefore created a Monte Carlo model that simulates the physical processes taking place when transferring signal through a radiation-damaged CCD. The software is based on Shockley-Read-Hall theory and is made to mimic the physical properties of the CCD as closely as possible. The code runs on a single electrode level and takes charge cloud size and density, three-dimensional trap position, and multi-level clocking into account. A key element of the model is that it takes device-specific simulations of electron density as a direct input, thereby avoiding analytical assumptions about the size and density of the charge cloud. This paper illustrates how test data and simulated data can be compared in order to further our understanding of the positions and properties of the individual radiation-induced traps.
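The trap capture and emission mechanics described above can be sketched as a toy Monte Carlo over one electrode transfer, assuming Shockley-Read-Hall emission time constants and a fixed capture probability. All coefficients, function names, and values here are illustrative textbook-style numbers, not the authors' code:

```python
import math
import random

K_B = 8.617e-5  # Boltzmann constant, eV/K

def emission_tau(sigma_cm2, e_trap_ev, temp_k):
    """SRH emission time constant; thermal velocity and density-of-states
    pre-factors are rough assumed values."""
    v_th = 1.0e7 * math.sqrt(temp_k / 300.0)        # thermal velocity, cm/s
    n_c = 2.5e19 * (temp_k / 300.0) ** 1.5          # effective density of states, cm^-3
    return math.exp(e_trap_ev / (K_B * temp_k)) / (sigma_cm2 * v_th * n_c)

def transfer(signal_e, traps, dwell_s, capture_p=0.5, rng=None):
    """One electrode transfer: filled traps may emit an electron back into the
    signal; empty traps may capture one from it (fixed capture probability)."""
    rng = rng or random.Random(42)
    for trap in traps:
        if trap["filled"]:
            # probability of emission during the dwell time under SRH theory
            if rng.random() < 1.0 - math.exp(-dwell_s / trap["tau_e"]):
                trap["filled"] = False
                signal_e += 1
        elif signal_e > 0 and rng.random() < capture_p:
            trap["filled"] = True
            signal_e -= 1
    return signal_e
```

A deeper trap level (larger `e_trap_ev`) yields exponentially slower emission, which is what smears captured charge into trailing pixels during readout.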

  2. Cryogenic Propellant Storage and Transfer (CPST) Technology Maturation: Establishing a Foundation for a Technology Demonstration Mission (TDM)

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.; Meyer, Michael L.; Motil, Susan M.; Ginty, Carol A.

    2014-01-01

    As part of U.S. National Space Policy, NASA is seeking an innovative path for human space exploration, which strengthens the capability to extend human and robotic presence throughout the solar system. NASA is laying the groundwork to enable humans to safely reach multiple potential destinations, including asteroids, Lagrange points, the Moon and Mars. In support of this, NASA is embarking on the Technology Demonstration Mission Cryogenic Propellant Storage and Transfer (TDM CPST) Project to test and validate key cryogenic capabilities and technologies required for future exploration elements, opening up the architecture for large cryogenic propulsion stages (CPS) and propellant depots. The TDM CPST project will provide an on-orbit demonstration of the capability to store, transfer, and measure cryogenic propellants for a duration which is relevant to enable long term human space exploration missions beyond low Earth orbit (LEO). Recognizing that key cryogenic fluid management technologies anticipated for on-orbit (flight) demonstration needed to be matured to a readiness level appropriate for infusion into the design of the flight demonstration, the NASA Headquarters Space Technology Mission Directorate authorized funding for a one-year (FY12) ground based technology maturation program. The strategy, proposed by the CPST Project Manager, focused on maturation through modeling, studies, and ground tests of the storage and fluid transfer Cryogenic Fluid Management (CFM) technology sub-elements and components that were not already at a Technology Readiness Level (TRL) of 5. A technology maturation plan (TMP) was subsequently approved which described: the CFM technologies selected for maturation, the ground testing approach to be used, quantified success criteria of the technologies, hardware and data deliverables, and a deliverable to provide an assessment of the technology readiness after completion of the test, study or modeling activity. 
This paper will present the testing, studies, and modeling that occurred in FY12 to mature cryogenic fluid management technologies for propellant storage, transfer, and supply, to examine extensibility to full scale, long duration missions, and to develop and validate analytical models. Finally, the paper will briefly describe an upcoming test to demonstrate Liquid Oxygen (LO2) Zero Boil-Off (ZBO).

  3. Cryogenic Propellant Storage and Transfer (CPST) Technology Maturation: Establishing a Foundation for a Technology Demonstration Mission (TDM)

    NASA Technical Reports Server (NTRS)

    Doherty, Michael P.; Meyer, Michael L.; Motil, Susan M.; Ginty, Carol A.

    2013-01-01

    As part of U.S. National Space Policy, NASA is seeking an innovative path for human space exploration, which strengthens the capability to extend human and robotic presence throughout the solar system. NASA is laying the groundwork to enable humans to safely reach multiple potential destinations, including asteroids, Lagrange points, the Moon and Mars. In support of this, NASA is embarking on the Technology Demonstration Mission Cryogenic Propellant Storage and Transfer (TDM CPST) Project to test and validate key cryogenic capabilities and technologies required for future exploration elements, opening up the architecture for large cryogenic propulsion stages (CPS) and propellant depots. The TDM CPST project will provide an on-orbit demonstration of the capability to store, transfer, and measure cryogenic propellants for a duration which is relevant to enable long term human space exploration missions beyond low Earth orbit (LEO). Recognizing that key cryogenic fluid management technologies anticipated for on-orbit (flight) demonstration needed to be matured to a readiness level appropriate for infusion into the design of the flight demonstration, the NASA Headquarters Space Technology Mission Directorate authorized funding for a one-year (FY12) ground based technology maturation program. The strategy, proposed by the CPST Project Manager, focused on maturation through modeling, studies, and ground tests of the storage and fluid transfer Cryogenic Fluid Management (CFM) technology sub-elements and components that were not already at a Technology Readiness Level (TRL) of 5. A technology maturation plan (TMP) was subsequently approved which described: the CFM technologies selected for maturation, the ground testing approach to be used, quantified success criteria of the technologies, hardware and data deliverables, and a deliverable to provide an assessment of the technology readiness after completion of the test, study or modeling activity. 
This paper will present the testing, studies, and modeling that occurred in FY12 to mature cryogenic fluid management technologies for propellant storage, transfer, and supply, to examine extensibility to full scale, long duration missions, and to develop and validate analytical models. Finally, the paper will briefly describe an upcoming test to demonstrate Liquid Oxygen (LO2) Zero Boil-Off (ZBO).

  4. Integrated Vehicle and Trajectory Design of Small Spacecraft with Electric Propulsion for Earth and Interplanetary Missions

    NASA Technical Reports Server (NTRS)

    Spangelo, Sara; Dalle, Derek; Longmier, Benjamin

    2015-01-01

    This paper investigates the feasibility of Earth-transfer and interplanetary mission architectures for miniaturized spacecraft using emerging small solar electric propulsion (SEP) technologies. Emerging small SEP thrusters offer significant advantages relative to existing technologies and will enable U-class systems to perform trajectory maneuvers with significant Delta V requirements. The approach in this paper is unique because it integrates trajectory design with vehicle sizing and accounts for the system and operational constraints of small U-class missions. The modeling framework includes integrated propulsion, orbit, energy, and external environment dynamics and systems-level power, energy, mass, and volume constraints. The trajectory simulation environment models orbit boosts in Earth orbit and flyby and capture trajectories to interplanetary destinations. A family of small spacecraft mission architectures is studied, including altitude and inclination transfers in Earth orbit and trajectories that escape Earth orbit and travel to interplanetary destinations such as Mercury, Venus, and Mars. Results are presented visually to show the trade-offs between competing performance objectives such as maximizing available mass and volume for payloads and minimizing transfer time. The results demonstrate the feasibility of using small spacecraft to perform significant Earth and interplanetary orbit transfers in less than one year with reasonable U-class mass, power, volume, and mission durations.
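For a first-cut feasibility check of the kind this paper automates, the duration of a continuous low-thrust burn can be bounded from the rocket equation and the thruster's mass flow rate. This is a sizing shortcut with assumed inputs, not the paper's integrated trajectory model:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def transfer_time_days(m0_kg, thrust_n, isp_s, dv_ms):
    """Days of continuous thrusting needed to deliver dv_ms of delta-V."""
    mdot = thrust_n / (isp_s * G0)                            # propellant mass flow, kg/s
    m_prop = m0_kg * (1.0 - math.exp(-dv_ms / (isp_s * G0)))  # rocket equation
    return m_prop / mdot / 86400.0
```

Sweeping thrust and specific impulse in such a model quickly shows which thruster classes keep a 2000 m/s transfer under the one-year mark for a U-class wet mass.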

  5. Transforming Systems Engineering through Model-Centric Engineering

    DTIC Science & Technology

    2018-02-28

    intelligence (e.g., Artificial Intelligence, etc.), because they provide a means for representing knowledge. We see these capabilities coming to use in both... level, including: Performance is measured by the degree of success of a mission; Artificial Intelligence (AI) is applied to counterparties so that they... Modeling, Artificial Intelligence, Simulation and Modeling, 1989. [140] SAE ARP4761, Guidelines and Methods for Conducting the Safety Assessment Process

  6. A mission executor for an autonomous underwater vehicle

    NASA Technical Reports Server (NTRS)

    Lee, Yuh-Jeng; Wilkinson, Paul

    1991-01-01

    The Naval Postgraduate School has been conducting research into the design and testing of an Autonomous Underwater Vehicle (AUV). One facet of this research is to incrementally design a software architecture and implement it in an advanced testbed, the AUV II. As part of the high level architecture, a Mission Executor is being constructed using CLIPS (C Language Integrated Production System) version 5.0. The Mission Executor is an expert system designed to oversee progress from the AUV launch point to a goal area and back to the origin. It is expected that the executor will make informed decisions about the mission, taking into account the navigational path, the vehicle subsystem health, and the sea environment, as well as the specific mission profile which is downloaded from an offboard mission planner. Heuristics for maneuvering, avoidance of uncharted obstacles, waypoint navigation, and reaction to emergencies (essentially the expert knowledge of a submarine captain) are required. Many of the vehicle subsystems are modeled as objects using the CLIPS Object Oriented Language (COOL) embedded in CLIPS 5.0. Also, truth maintenance is applied to the knowledge base to keep configurations updated.

  7. Active-Reserve Force Cost Model

    DTIC Science & Technology

    2015-01-01

    structure to be maintained for a given level of expenditure. We have developed this methodology and set of associated computer-based tools to... rotational, and deployed units or systems • Attain acceptable steady-state operational or presence levels, as measured by the number of units a... at the community level. By community, we mean the set of units of a given type: mission, platform, or capability. We do this because AC-RC force-mix

  8. Analog model study of the ground-water basin of the Upper Coachella Valley, California

    USGS Publications Warehouse

    Tyley, Stephen J.

    1974-01-01

    An analog model of the ground-water basin of the upper Coachella Valley was constructed to determine the effects of imported water on ground-water levels. The model was considered verified when the ground-water levels generated by the model approximated the historical change in water levels of the ground-water basin caused by man's activities for the period 1936-67. The ground-water basin was almost unaffected by man's activities until about 1945, when ground-water development caused the water levels to begin to decline. The Palm Springs area has had the largest water-level decline, 75 feet since 1936, because of large pumpage, reduced natural inflow from the San Gorgonio Pass area, and diversions of natural inflows at Snow and Falls Creeks and Chino Canyon starting in 1945. The San Gorgonio Pass inflow had been reduced from about 18,000 acre-feet in 1936 to about 9,000 acre-feet by 1967 because of increased ground-water pumpage in the San Gorgonio Pass area, dewatering of the San Gorgonio Pass area that took place when the tunnel for the Metropolitan Water District of Southern California was drilled, and diversions of surface inflow at Snow and Falls Creeks. In addition, 1944-64 was a period of below-normal precipitation which, in part, contributed to the declines in water levels in the Coachella Valley. The Desert Hot Springs, Garnet Hill, and Mission Creek subbasins have had relatively little development; consequently, the water-level declines have been small, ranging from 5 to 15 feet since 1936. In the Point Happy area a decline of about 2 feet per year continued until 1949, when delivery of Colorado River water to the lower valley through the Coachella Canal was initiated. Since 1949 the water levels in the Point Happy area have been rising and by 1967 were above their 1936 levels. 
The Whitewater River subbasin includes the largest aquifer in the basin, having sustained ground-water pumpage of about 740,000 acre-feet from 1936 to 1967, and will probably continue to provide the most significant supply of ground water for the upper valley. The total ground-water storage depletion for the entire upper valley for 1936-67 was about 600,000 acre-feet, an average storage decrease of about 25,000 acre-feet per year since 1945. Transmissivity for the Whitewater River subbasin ranges from 860,000 gallons per day per foot (near Point Happy) to 50,000 gallons per day per foot, with most of the subbasin about 800,000 gallons per day per foot. In contrast, the transmissivities of the Desert Hot Springs, Mission Creek, and Garnet Hill subbasins generally range from 2,000 to 100,000, but the highest value, beneath the Mission Creek streambed deposits, is 200,000 gallons per day per foot; the transmissivity for most of the area of the three subbasins is 80,000 gallons per day per foot. The storage coefficients are representative of water-table conditions, ranging from 0.18 beneath the Mission Creek stream deposits to 0.06 in the Palm Springs area. The model indicated that the outflow at Point Happy decreased from 50,000 acre-feet in 1936 to 30,000 acre-feet by 1967 as a result of the rising water levels in the lower valley. The most logical area to recharge the Colorado River water is the Windy Point-Whitewater area, where adequate percolation rates of 2-4 acre-feet per acre per day are probable. The Whitewater River bed may be the best location to spread the water if the largest part of the imported water can be recharged during low-flow periods. The area in sec. 21, T. 2 S., R. 4 E., would be adequate for the smaller quantities of recharge proposed for the Mission Creek area. Projected pumpage for the period 1968-2000 was programmed on the model with the proposed recharge of Colorado River water for the same period. 
The model indicated a maximum water-level increase of 200 feet above the 1967 water level at Windy Point, the proposed recharge site, by the year 2000, a 130-foot increase by 1990, and a 20-foot increase

  9. Space Missions Trade Space Generation and Assessment Using JPL Rapid Mission Architecture (RMA) Team Approach

    NASA Technical Reports Server (NTRS)

    Moeller, Robert C.; Borden, Chester; Spilker, Thomas; Smythe, William; Lock, Robert

    2011-01-01

    The JPL Rapid Mission Architecture (RMA) capability is a novel collaborative team-based approach to generate new mission architectures, explore broad trade space options, and conduct architecture-level analyses. RMA studies address feasibility and identify best candidates to proceed to further detailed design studies. Development of RMA first began at JPL in 2007 and has evolved to address the need for rapid, effective early mission architectural development and trade space exploration as a precursor to traditional point design evaluations. The RMA approach integrates a small team of architecture-level experts (typically 6-10 people) to generate and explore a wide-ranging trade space of mission architectures driven by the mission science (or technology) objectives. Group brainstorming and trade space analyses are conducted at a higher level of assessment across multiple mission architectures and systems to enable rapid assessment of a set of diverse, innovative concepts. This paper describes the overall JPL RMA team, process, and high-level approach. Some illustrative results from previous JPL RMA studies are discussed.

  10. Health Management Applications for International Space Station

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Duncavage, Dan

    2005-01-01

    Traditional mission and vehicle management involves teams of highly trained specialists monitoring vehicle status and crew activities, responding rapidly to any anomalies encountered during operations. These teams work from the Mission Control Center and have access to engineering support teams with specialized expertise in International Space Station (ISS) subsystems. Integrated System Health Management (ISHM) applications can significantly augment these capabilities by providing enhanced monitoring, prognostic and diagnostic tools for critical decision support and mission management. The Intelligent Systems Division of NASA Ames Research Center is developing many prototype applications using model-based reasoning, data mining and simulation, working with Mission Control through the ISHM Testbed and Prototypes Project. This paper will briefly describe information technology that supports current mission management practice, and will extend this to a vision for future mission control workflow incorporating new ISHM applications. It will describe ISHM applications currently under development at NASA and will define technical approaches for implementing our vision of future human exploration mission management incorporating artificial intelligence and distributed web service architectures using specific examples. Several prototypes are under development, each highlighting a different computational approach. The ISStrider application allows in-depth analysis of Caution and Warning (C&W) events by correlating real-time telemetry with the logical fault trees used to define off-nominal events. The application uses live telemetry data and the Livingstone diagnostic inference engine to display the specific parameters and fault trees that generated the C&W event, allowing a flight controller to identify the root cause of the event from thousands of possibilities by simply navigating animated fault tree models on their workstation. 
SimStation models the functional power flow for the ISS Electrical Power System and can predict power balance for nominal and off-nominal conditions. SimStation uses real-time telemetry data to keep detailed computational physics models synchronized with the actual ISS power system state. In the event of a failure, the application can then rapidly diagnose root cause, predict future resource levels, and even correlate technical documents relevant to the specific failure. These advanced computational models will allow better insight into and more precise control of ISS subsystems, increasing safety margins by speeding up anomaly resolution and reducing engineering team effort and cost. This technology will make operating ISS more efficient and is directly applicable to next-generation exploration missions and Crew Exploration Vehicles.
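The power-balance bookkeeping at the heart of a tool like SimStation can be reduced to a one-step sketch: generation minus loads charges or drains a battery, clamped to its capacity. The function signature and all numbers are illustrative, not the actual ISS model:

```python
def power_balance(solar_w, loads_w, battery_wh, dt_h, capacity_wh):
    """Battery energy after one time step of net generation minus loads,
    clamped between empty and full."""
    net_w = solar_w - sum(loads_w)
    return min(capacity_wh, max(0.0, battery_wh + net_w * dt_h))
```

Stepping this forward with telemetry-derived loads is what lets an off-nominal condition (a failed channel, a stuck load) be projected into future resource levels.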

  11. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    NASA Technical Reports Server (NTRS)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results, while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools, and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers, enables scalable computation and storage, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools, and data.
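A linear toolchain with per-stage provenance, the core idea behind keeping distributed models traceable, might look like the following sketch. The tool names and the hashing scheme are assumptions for illustration, not the mission's actual pipeline:

```python
import hashlib
import json

def run_toolchain(tools, inputs):
    """Pass each tool's output dict to the next tool in the chain; hash every
    intermediate result so a run can be traced back to its data versions."""
    data, provenance = dict(inputs), []
    for name, tool in tools:
        data = tool(data)
        digest = hashlib.sha256(
            json.dumps(data, sort_keys=True).encode()).hexdigest()
        provenance.append((name, digest[:12]))
    return data, provenance
```

Storing the provenance list alongside each run gives the configuration-management traceability the abstract emphasizes: any historical result can be matched to the exact intermediate data that produced it.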

  12. Comparative Benchmark Dose Modeling as a Tool to Make the First Estimate of Safe Human Exposure Levels to Lunar Dust

    NASA Technical Reports Server (NTRS)

    James, John T.; Lam, Chiu-wing; Scully, Robert R.

    2013-01-01

    Brief exposures of Apollo astronauts to lunar dust occasionally elicited upper respiratory irritation; however, no limits were ever set for prolonged exposure to lunar dust. Habitats for exploration, whether mobile or fixed, must be designed to limit human exposure to lunar dust to safe levels. We have used a new technique, which we call Comparative Benchmark Dose Modeling, to estimate safe exposure limits for lunar dust collected during the Apollo 14 mission.
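The comparative scaling idea can be illustrated with a deliberately simple linear dose-response: find each dust's benchmark dose, then scale a reference dust's established limit by the ratio of benchmark doses. The linear form, slopes, and limit values below are all hypothetical, not the authors' fitted models:

```python
def benchmark_dose(slope_per_mg, bmr=0.10):
    """Dose at which a linear dose-response (response = slope * dose)
    reaches the benchmark response level bmr."""
    return bmr / slope_per_mg

def scaled_exposure_limit(ref_limit_mg_m3, bmd_ref, bmd_test):
    """Comparative scaling: a lower benchmark dose (a more potent dust)
    yields a proportionally lower permissible exposure limit."""
    return ref_limit_mg_m3 * (bmd_test / bmd_ref)
```

The comparison step is what makes the method work without human dose-response data for lunar dust: potency is measured relative to a dust whose safe human exposure level is already established.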

  13. Model-Based Trade Space Exploration for Near-Earth Space Missions

    NASA Technical Reports Server (NTRS)

    Cohen, Ronald H.; Boncyk, Wayne; Brutocao, James; Beveridge, Iain

    2005-01-01

    We developed a capability for model-based trade space exploration to be used in the conceptual design of Earth-orbiting space missions. We have created a set of reusable software components to model various subsystems and aspects of space missions. Several example mission models were created to test the tools and process. This technique and toolset has demonstrated itself to be valuable for space mission architectural design.

  14. Mars Mission Analysis Trades Based on Legacy and Future Nuclear Propulsion Options

    NASA Astrophysics Data System (ADS)

    Joyner, Russell; Lentati, Andrea; Cichon, Jaclyn

    2007-01-01

    The purpose of this paper is to discuss the results of mission-based system trades when using a nuclear thermal propulsion (NTP) system for Solar System exploration. The results are based on comparing reactor designs that use ceramic-metallic (CERMET), graphite matrix, graphite composite matrix, or carbide matrix fuel elements. The composite graphite matrix and CERMET designs have been examined for providing power as well as propulsion. Approaches to the design of the NTP to be discussed will include an examination of graphite, composite, carbide, and CERMET core designs and the attributes of each with regard to performance and power generation capability. The focus is on NTP approaches based on tested fuel materials within a prismatic fuel form per the Argonne National Laboratory testing and the ROVER/NERVA program. NTP concepts have been examined for several years at Pratt & Whitney Rocketdyne for use as the primary propulsion for human missions beyond Earth. Recently, an approach was taken to examine the design trades between specific NTP concepts: NERVA-based (UC)C-Graphite, (UC,ZrC)C-Composite, (U,Zr)C-Solid Carbide, and UO2-W CERMET. Using Pratt & Whitney Rocketdyne's multidisciplinary design analysis capability, a detailed mission and vehicle model has been used to examine how several of these NTP designs impact a human Mars mission. Trends for the propulsion system mass as a function of power level (i.e. thrust size) for the graphite-carbide and CERMET designs were established and correlated against data created over the past forty years. These were used for the mission trade study. The resulting mission trades presented in this paper used a comprehensive modeling approach that captures the mission, vehicle subsystems, and NTP sizing.
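
    A mass-versus-power trend of the kind described above is typically correlated as a power law fitted in log space. The sketch below shows that fit with notional data points, not the paper's engine data.

```python
import math

def fit_power_law(power_mw, mass_kg):
    """Least-squares fit of mass = a * P**b in log-log space -- the
    standard way to correlate engine mass against power level."""
    xs = [math.log(p) for p in power_mw]
    ys = [math.log(m) for m in mass_kg]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    b = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
         / sum((x - xbar) ** 2 for x in xs))
    a = math.exp(ybar - b * xbar)
    return a, b

# Notional mass trend points (MW, kg) for one fuel-element family:
a, b = fit_power_law([250, 500, 1000], [2000, 3000, 4500])
predicted_750mw = a * 750 ** b  # interpolate the trend for sizing trades
```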

  15. Computational Modeling of Interventions and Protective Thresholds to Prevent Disease Transmission in Deploying Populations

    PubMed Central

    2014-01-01

    Military personnel are deployed abroad for missions ranging from humanitarian relief efforts to combat actions; delay or interruption in these activities due to disease transmission can cause operational disruptions, significant economic loss, and stressed or exceeded military medical resources. Deployed troops function in environments favorable to the rapid and efficient transmission of many viruses particularly when levels of protection are suboptimal. When immunity among deployed military populations is low, the risk of vaccine-preventable disease outbreaks increases, impacting troop readiness and achievement of mission objectives. However, targeted vaccination and the optimization of preexisting immunity among deployed populations can decrease the threat of outbreaks among deployed troops. Here we describe methods for the computational modeling of disease transmission to explore how preexisting immunity compares with vaccination at the time of deployment as a means of preventing outbreaks and protecting troops and mission objectives during extended military deployment actions. These methods are illustrated with five modeling case studies for separate diseases common in many parts of the world, to show different approaches required in varying epidemiological settings. PMID:25009579
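
    The central trade-off — preexisting immunity versus outbreak size — can be illustrated with a minimal discrete-time SIR model. All parameters below (population size, infectious period, R0) are illustrative, not drawn from the five case studies.

```python
def outbreak_fraction(r0, immune_fraction, n=10000, i0=10, days=730):
    """Discrete-time SIR sketch: the fraction immune at deployment
    (vaccination or preexisting immunity) sets the effective
    reproduction number and hence the final outbreak size."""
    gamma = 1.0 / 10.0          # ~10-day infectious period
    beta = r0 * gamma
    s = n * (1.0 - immune_fraction) - i0
    i = float(i0)
    cumulative = float(i0)
    for _ in range(days):
        new_inf = min(s, beta * s * i / n)  # daily new infections
        s -= new_inf
        i += new_inf - gamma * i
        cumulative += new_inf
    return cumulative / n

herd_threshold = 1 - 1 / 2.5               # 60% immunity for R0 = 2.5
protected = outbreak_fraction(2.5, 0.75)   # above threshold: no outbreak
unprotected = outbreak_fraction(2.5, 0.10) # below threshold: large outbreak
```

    With immunity above the herd threshold the introduction fizzles; below it, most of the deployed population is eventually infected — the contrast the modeling case studies explore per disease.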

  16. Computational modeling of interventions and protective thresholds to prevent disease transmission in deploying populations.

    PubMed

    Burgess, Colleen; Peace, Angela; Everett, Rebecca; Allegri, Buena; Garman, Patrick

    2014-01-01

    Military personnel are deployed abroad for missions ranging from humanitarian relief efforts to combat actions; delay or interruption in these activities due to disease transmission can cause operational disruptions, significant economic loss, and stressed or exceeded military medical resources. Deployed troops function in environments favorable to the rapid and efficient transmission of many viruses particularly when levels of protection are suboptimal. When immunity among deployed military populations is low, the risk of vaccine-preventable disease outbreaks increases, impacting troop readiness and achievement of mission objectives. However, targeted vaccination and the optimization of preexisting immunity among deployed populations can decrease the threat of outbreaks among deployed troops. Here we describe methods for the computational modeling of disease transmission to explore how preexisting immunity compares with vaccination at the time of deployment as a means of preventing outbreaks and protecting troops and mission objectives during extended military deployment actions. These methods are illustrated with five modeling case studies for separate diseases common in many parts of the world, to show different approaches required in varying epidemiological settings.

  17. The Department of the Navy Systems Engineering Career Competency Model (SECCM)

    DTIC Science & Technology

    2015-05-13

    Respond 71% Value 18% Organize 3% Characterize 4% Affective Domain Total KSAs : 869 ENG Career Field Competency Model 10 1.0 Mission Level...The Department of the Navy Systems Engineering Career Competency Model (SECCM) 2015 Acquisition Symposium Naval Postgraduate School Monterey...Career Competency Model (SECCM) 5a. CONTRACT NUMBER 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) 5d. PROJECT NUMBER 5e. TASK NUMBER

  18. Transportation analyses for the lunar-Mars initiative

    NASA Technical Reports Server (NTRS)

    Woodcock, Gordon R.; Buddington, Patricia A.

    1991-01-01

    This paper focuses on certain results of an ongoing NASA-sponsored study by Boeing, including (1) a series of representative space exploration scenarios; (2) the levels of effort required to accomplish each; and (3) a range of candidate transportation systems as partial implementations of the scenarios. This effort predated release of the Synthesis report; the three levels of activity described are not responses to the Synthesis architectures. These three levels (minimum, median, and ambitious) do envelope the range of scope described in the four Synthesis architecture models. The analysis was carried to the currently known level of detail of transportation hardware systems and mission scenarios. The study did not include detailed analysis of earth-to-orbit transportation, surface systems, or tracking and communications systems. The influence of earth-to-orbit systems was considered in terms of delivery capacity and cost. Aspects of additional options, such as in situ resource utilization, are explored as needed to indicate potential benefits. Results favored cryogenic chemical propulsion for low activity levels and undemanding missions (such as cargo and some lunar missions), nuclear thermal propulsion for median activity levels similar to the Synthesis architectures, and nuclear thermal propulsion with aerobraking or nuclear electric propulsion for high activity levels. Solar electric propulsion was seen as having an important role if the present high unit cost (i.e., dollars per watt) of space photovoltaics could be reduced by a factor of five or more at production rates of megawatts per year.

  19. KSC00pp0075

    NASA Image and Video Library

    2000-01-14

    KENNEDY SPACE CENTER, Fla. -- At the 195-foot level of the Fixed Service Structure on Launch Pad 39A, the STS-99 crew pose for a photograph during Terminal Countdown Demonstration Test (TCDT) activities. Standing left to right are Pilot Dominic Gorie, Mission Specialist Mamoru Mohri (Ph.D.), Mission Specialist Janice Voss (Ph.D.), Commander Kevin Kregel, Mission Specialist Janet Lynn Kavandi (Ph.D.), and Mission Specialist Gerhard Thiele (Ph.D.). Thiele is with the European Space Agency and Mohri is with the National Space Development Agency (NASDA) of Japan. The TCDT provides the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  1. Exoplanet Yield Estimation for Decadal Study Concepts using EXOSIMS

    NASA Astrophysics Data System (ADS)

    Morgan, Rhonda; Lowrance, Patrick; Savransky, Dmitry; Garrett, Daniel

    2016-01-01

    The anticipated upcoming large mission study concepts for the direct imaging of exo-earths present an exciting opportunity for exoplanet discovery and characterization. While these telescope concepts would also be capable of conducting a broad range of astrophysical investigations, the most difficult technology challenges are driven by the requirements for imaging exo-earths. The exoplanet science yield for these mission concepts will drive design trades and mission concept comparisons. To assist in these trade studies, the Exoplanet Exploration Program Office (ExEP) is developing a yield estimation tool that emphasizes transparency and consistent comparison of various design concepts. The tool will provide a parametric estimate of science yield of various mission concepts using contrast curves from physics-based model codes and Monte Carlo simulations of design reference missions using realistic constraints, such as solar avoidance angles, the observatory orbit, propulsion limitations of star shades, the accessibility of candidate targets, local and background zodiacal light levels, and background confusion by stars and galaxies. The python tool utilizes Dmitry Savransky's EXOSIMS (Exoplanet Open-Source Imaging Mission Simulator) design reference mission simulator that is being developed for the WFIRST Preliminary Science program. ExEP is extending and validating the tool for future mission concepts under consideration for the upcoming 2020 decadal review. We present a validation plan and preliminary yield results for a point design.
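
    The Monte Carlo structure of such a yield estimate can be sketched very simply: simulate many mission realizations, each drawing whether a target star hosts an exo-earth and whether the instrument can detect it. Here the contrast-curve and keep-out-angle checks that a full simulator like EXOSIMS performs are collapsed into a single detection probability; all numbers are illustrative, not ExEP values.

```python
import random

def estimated_yield(n_stars=100, eta_earth=0.2, p_detect=0.3,
                    n_trials=2000, seed=0):
    """Monte Carlo sketch of a design reference mission yield: each
    target star hosts an exo-earth with probability eta_earth, which
    is detected with probability p_detect."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_trials):
        total += sum(1 for _ in range(n_stars)
                     if rng.random() < eta_earth and rng.random() < p_detect)
    return total / n_trials

mean_yield = estimated_yield()  # expected value is 100 * 0.2 * 0.3 = 6
```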

  2. Increasing Small Satellite Reliability- A Public-Private Initiative

    NASA Technical Reports Server (NTRS)

    Johnson, Michael A.; Beauchamp, Patricia; Schone, Harald; Sheldon, Doug; Fuhrman, Linda; Sullivan, Erica; Fairbanks, Tom; Moe, Miquel; Leitner, Jesse

    2017-01-01

    At present, CubeSat components and buses are generally appropriate only for missions where significant risk of failure, or the inability to quantify risk or confidence, is acceptable. However, in the future we anticipate that CubeSats will be used for missions requiring reliability of 1-3 years for Earth-observing missions and even longer for Planetary, Heliophysics, and Astrophysics missions. Their growing potential utility is driving an interagency effort to improve and quantify CubeSat reliability, and more generally, small satellite mission risk. The Small Satellite Reliability Initiative (SSRI)—an ongoing activity with broad collaborative participation from civil, DoD, and commercial space systems providers and stakeholders—targets this challenge. The Initiative seeks to define implementable and broadly-accepted approaches to achieve reliability and acceptable risk postures associated with several SmallSat mission risk classes—from “do no harm” missions, to those associated with missions whose failure would result in loss or delay of key national objectives. These approaches will maintain, to the extent practical, cost efficiencies associated with small satellite missions and consider constraints associated with supply chain elements, as appropriate. The SSRI addresses this challenge from two architectural levels—the mission- or system-level, and the component- or subsystem-level. The mission- or system-level scope targets assessment approaches that are efficient and effective, with mitigation strategies that facilitate resiliency to mission or system anomalies while the component- or subsystem-level scope addresses the challenge at lower architectural levels. The initiative does not limit strategies and approaches to proven and traditional methodologies, but is focused on fomenting thought on novel and innovative solutions.
This paper discusses the genesis of and drivers for this initiative, how the public-private collaboration is being executed, findings and recommendations derived to date, and next steps towards broadening small satellite mission potential.

  3. Global Tropospheric Noise Maps for InSAR Observations

    NASA Astrophysics Data System (ADS)

    Yun, S. H.; Hensley, S.; Agram, P. S.; Chaubell, M.; Fielding, E. J.; Pan, L.

    2014-12-01

    The differential phase delay of radio waves propagating through the troposphere is the largest error source in Interferometric Synthetic Aperture Radar (InSAR) measurements, and water vapor variability in the troposphere is known to be the dominant factor. We use the precipitable water vapor (PWV) products from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) sensors mounted on Terra and Aqua satellites to produce tropospheric noise maps of InSAR. We estimate the slope and y-intercept of power spectral density curve of MODIS PWV and calculate the structure function to estimate the expected tropospheric noise level as a function of distance. The results serve two purposes: 1) to provide guidance on the expected covariance matrix for geophysical modeling, 2) to provide quantitative basis for the science Level-1 requirements of the planned NASA-ISRO L-band SAR mission (NISAR mission). We populate lookup tables of such power spectrum parameters derived from each 1-by-1 degree tile of global coverage. The MODIS data were retrieved from OSCAR (Online Services for Correcting Atmosphere in Radar) server. Users will be able to use the lookup tables and calculate expected tropospheric noise level of any date of MODIS data at any distance scale. Such calculation results can be used for constructing covariance matrix for geophysical modeling, or building statistics to support InSAR missions' requirements. For example, about 74% of the world had InSAR tropospheric noise level (along a radar line-of-sight for an incidence angle of 40 degrees) of 2 cm or less at 50 km distance scale during the time period of 2010/01/01 - 2010/01/09.
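
    The structure-function evaluation behind such a lookup is simple: for a power-law structure function D(rho) = c * rho**alpha, the expected RMS differential delay at separation rho is sqrt(D). The coefficient and exponent below are placeholders standing in for the per-tile values a lookup table would supply, not MODIS-derived numbers.

```python
def tropo_noise_cm(distance_km, c=0.05, alpha=5.0 / 3.0):
    """Expected tropospheric noise (RMS differential delay, cm) at a
    given separation from a power-law structure function
    D(rho) = c * rho**alpha."""
    return (c * distance_km ** alpha) ** 0.5

# Noise grows with separation: compare the 10 km and 50 km scales.
n10 = tropo_noise_cm(10.0)
n50 = tropo_noise_cm(50.0)
```

    With the Kolmogorov-like exponent of 5/3 used here, the noise grows as distance to the 5/6 power, so widening the separation by 5x raises the expected noise by roughly 3.8x.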

  4. JPL future missions and energy storage technology implications

    NASA Technical Reports Server (NTRS)

    Pawlik, Eugene V.

    1987-01-01

    The mission model for JPL future programs is presented. This model identifies mission areas where JPL is expected to have a major role and/or participate in a significant manner. These missions are focused on space science and applications missions, but they also include some participation in space station activities. The mission model is described in detail followed by a discussion on the needs for energy storage technology required to support these future activities.

  5. Lagrangian Photochemical Box-Model Calculations of Asian Pacific Rim Outflow During TRACE-P

    NASA Astrophysics Data System (ADS)

    Hamlin, A.; Crawford, J.; Olson, J.; Avery, M.; Sachse, G.; Barrick, J.; Blake, D.; Tan, D.; Sandholm, S.; Kondo, Y.; Singh, H.; Eisele, F.; Zondlo, M.; Flocke, F.; Talbot, R.

    2006-12-01

    NASA's TRACE-P (TRAnsport and Chemical Evolution over the Pacific) mission was conducted over the northwestern Pacific February-April 2001. During two transit flights across the Pacific, extensive pollution was observed from an Asian outflow event that split into two branches over the central Pacific, one subsiding and moving southward over the central Pacific and the other continuing eastward in the upper troposphere. The subsiding branch was observed as a widespread stagnant pollution layer between 2 and 4 km over the central Pacific during transit flights from Kona, HI to Guam. In this region, high levels of O3 (70 ppbv), CO (217 ppbv), and NOx (114 pptv) were well in excess of typical values observed during TRACE-P along the Asian coast. Evidence suggests that the subsiding branch experienced extensive photochemical processing compared to the branch that remained at altitude. To examine the processes controlling the chemical evolution of ozone and its precursors in this outflow event, data collected during the TRACE-P mission have been combined with Lagrangian photochemical box model calculations. One of the largest sources of uncertainty in these calculations was associated with predicted water vapor levels along the transport trajectories calculated using the HYSPLIT model. Water vapor levels predicted by HYSPLIT trajectory calculations in the subsiding layer ranged from 3390 to 4880 ppm, while the median level observed in the pollution layer was only 637 ppm. Simulations of ozone production and associated radical chemistry differed dramatically when using water vapor levels based on trajectory calculations versus observed water vapor levels. Levels of PAN and HO2NO2, NOx reservoir species, are also influenced by uncertainties in temperature along the trajectories. These results highlight the importance of accurately representing the humidification and warming of subsiding air masses in 3-D chemical-transport models.
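
    Why water vapor matters so much can be seen from the primary OH source, O3 + hv → O(1D) followed by O(1D) + H2O → 2 OH: when water vapor is a minor constituent, that source is roughly proportional to its mixing ratio. The rate factors below are notional and cancel in the ratio; only the H2O values come from the abstract.

```python
def primary_oh_source(h2o_ppm, j_o1d=2.0e-5, o3_ppb=70.0, f_h2o=0.1):
    """Relative primary OH production rate; the photolysis frequency,
    ozone level, and branching factor are notional constants used only
    to form a ratio between scenarios."""
    return 2.0 * j_o1d * o3_ppb * f_h2o * h2o_ppm

# Trajectory-predicted vs observed water vapor in the subsiding layer:
ratio = primary_oh_source(4880.0) / primary_oh_source(637.0)
```

    Under this linear approximation, the upper-bound trajectory water vapor implies roughly 7-8 times more primary OH production than the observed levels support, which is consistent with the dramatic differences in simulated ozone production reported above.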

  6. Assessment of the SMAP Level-4 Surface and Root-Zone Soil Moisture Product Using In Situ Measurements

    USDA-ARS?s Scientific Manuscript database

    The Soil Moisture Active Passive (SMAP) mission Level-4 Surface and Root-Zone Soil Moisture (L4_SM) data product is generated by assimilating SMAP L-band brightness temperature observations into the NASA Catchment land surface model. The L4_SM product is available from 31 March 2015 to present (with...

  7. Study of alternative probe technologies

    NASA Technical Reports Server (NTRS)

    1977-01-01

    A number of implied technologies for a deep probe mission were examined; i.e., one that would provide the capability to scientifically examine planetary atmospheres at the 1000 bar level. Conditions imposed by current Jupiter, Saturn, and Uranus atmospheric models were considered. The major thrust of the measurements was to determine lower atmosphere composition, even to trace constituents of one part per billion. Two types of instruments having the necessary accuracy to meet the science objectives were considered and integrated into a deep probe configuration. One deep probe option that resulted was identified as a Minimum Technology Development approach. The significant feature of this option is that only three technology developments are required to enable the mission, i.e., (1) science instrument development, (2) advanced data processing, and (3) external high pressure/thermal insulation. It is concluded that a probe designed for a Jupiter mission could, with minor changes, be used for a Saturn or Uranus mission.

  8. A Survey of Formal Methods for Intelligent Swarms

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Rash, James; Hinchey, Mike; Rouff, Christopher A.

    2004-01-01

    Swarms of intelligent autonomous spacecraft, involving complex behaviors and interactions, are being proposed for future space exploration missions. Such missions provide greater flexibility and offer the possibility of gathering more science data than traditional single spacecraft missions. The emergent properties of swarms make these missions powerful, but simultaneously far more difficult to design, and to assure that the proper behaviors will emerge. These missions are also considerably more complex than previous types of missions, and NASA, like other organizations, has little experience in developing or in verifying and validating these types of missions. A significant challenge when verifying and validating swarms of intelligent interacting agents is how to determine that the possible exponential interactions and emergent behaviors are producing the desired results. Assuring correct behavior and interactions of swarms will be critical to mission success. The Autonomous Nano Technology Swarm (ANTS) mission is an example of one of the swarm types of missions NASA is considering. The ANTS mission will use a swarm of picospacecraft that will fly from Earth orbit to the Asteroid Belt. Using an insect colony analogy, ANTS will be composed of specialized workers for asteroid exploration. Exploration would consist of cataloguing the mass, density, morphology, and chemical composition of the asteroids, including any anomalous concentrations of specific minerals. To perform this task, ANTS would carry miniaturized instruments, such as imagers, spectrometers, and detectors. Since ANTS and other similar missions are going to consist of autonomous spacecraft that may be out of contact with the earth for extended periods of time, and have low bandwidths due to weight constraints, it will be difficult to observe improper behavior and to correct any errors after launch. 
Providing V&V (verification and validation) for this type of mission is new to NASA and represents the cutting edge in system correctness; it requires higher levels of assurance than other (traditional) missions that use a single or small number of spacecraft that are deterministic in nature and have near-continuous communication access. One of the highest possible levels of assurance comes from the application of formal methods. Formal methods are mathematics-based tools and techniques for specifying and verifying (software and hardware) systems. They are particularly useful for specifying complex parallel systems, such as exemplified by the ANTS mission, where the entire system is difficult for a single person to fully understand, a problem that is multiplied with multiple developers. Once written, a formal specification can be used to prove properties of a system (e.g., the underlying system will go from one state to another or not into a specific state) and check for particular types of errors (e.g., race or livelock conditions). A formal specification can also be used as input to a model checker for further validation. This report gives the results of a survey of formal methods techniques for verification and validation of space missions that use swarm technology. Multiple formal methods were evaluated to determine their effectiveness in modeling and assuring the behavior of swarms of spacecraft using the ANTS mission as an example system. This report is the first result of the project to determine formal approaches that are promising for formally specifying swarm-based systems. From this survey, the most promising approaches were selected and are discussed relative to their possible application to the ANTS mission. Future work will include the application of an integrated approach, based on the selected approaches identified in this report, to the formal specification of the ANTS mission.
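
    The core of what a model checker automates — exhaustively exploring every reachable state and testing a safety property in each — can be shown on a toy two-spacecraft protocol. The protocol below (two craft sharing one downlink) is invented for illustration; it is not the ANTS specification.

```python
from collections import deque

def reachable_states(initial, transitions):
    """Breadth-first exhaustive exploration of a finite state space --
    the brute-force core of explicit-state model checking."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        state = frontier.popleft()
        for nxt in transitions(state):
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return seen

# Toy protocol: each craft is idle, waiting, or transmitting, and may
# start transmitting only if the other craft is not transmitting.
def transitions(state):
    a, b = state
    out = []
    for i, (mine, other) in enumerate([(a, b), (b, a)]):
        nxt = {"idle": "waiting",
               "waiting": "transmitting" if other != "transmitting" else "waiting",
               "transmitting": "idle"}[mine]
        out.append((nxt, other) if i == 0 else (other, nxt))
    return out

states = reachable_states(("idle", "idle"), transitions)
# Safety property: the shared downlink is never used by both at once.
safe = all(s != ("transmitting", "transmitting") for s in states)
```

    For swarms of many spacecraft the state space explodes combinatorially, which is exactly why the symbolic and compositional techniques surveyed in the report are needed rather than brute-force enumeration.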

  9. NEXT Ion Propulsion System Development Status and Capabilities

    NASA Technical Reports Server (NTRS)

    Patterson, Michael J.; Benson, Scott W.

    2008-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing next generation ion propulsion technologies to provide future NASA science missions with enhanced mission performance benefit at a low total development cost. The objective of the NEXT project is to advance next generation ion propulsion technology by producing engineering model system components, validating these through qualification-level and integrated system testing, and ensuring preparedness for transitioning to flight system development. As NASA's Evolutionary Xenon Thruster technology program completes advanced development activities, it is advantageous to review the existing technology capabilities of the system under development. This paper describes the NEXT ion propulsion system development status, characteristics and performance. A review of mission analyses results conducted to date using the NEXT system is also provided.

  10. Office of Aeronautics and Space Technology preliminary requirements for space science and applications platform studies

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Needs and requirements for a free flying space science and applications platform to host groupings of compatible, extended mission experiments in earth orbit are discussed. A payload model which serves to define a typical set of mission requirements in the form of a descriptive data base is presented along with experiment-level and group-level data summarizations and flight schedules. The payload descriptions are grouped by technology into the following categories: communications, materials (long term effect upon), materials technology development, power, sensors, and thermal control.

  11. An Analysis of Fuel Cell Options for an All-electric Unmanned Aerial Vehicle

    NASA Technical Reports Server (NTRS)

    Kohout, Lisa L.; Schmitz, Paul C.

    2007-01-01

    A study was conducted to assess the performance characteristics of both PEM and SOFC-based fuel cell systems for an all-electric high altitude, long endurance Unmanned Aerial Vehicle (UAV). Primary and hybrid systems were considered. Fuel options include methane, hydrogen, and jet fuel. Excel-based models were used to calculate component mass as a function of power level and mission duration. Total system mass and stored volume as a function of mission duration for an aircraft operating at 65 kft altitude were determined and compared.
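
    The structure of such a spreadsheet sizing model is easy to sketch: system mass is a stack term set by power level plus a reactant term set by total mission energy. The specific-power and fuel specific-energy values below are notional, not the study's PEM or SOFC numbers.

```python
def system_mass_kg(power_kw, duration_hr, stack_kw_per_kg, fuel_kwh_per_kg):
    """Fuel-cell system mass = stack mass (sized by power level) plus
    reactant mass (sized by total mission energy)."""
    stack = power_kw / stack_kw_per_kg
    fuel = power_kw * duration_hr / fuel_kwh_per_kg
    return stack + fuel

# Illustrative comparison at 10 kW for a 120-hour mission: a heavier
# stack with a denser fuel can still win for long durations.
pem = system_mass_kg(10, 120, stack_kw_per_kg=0.5, fuel_kwh_per_kg=2.0)
sofc = system_mass_kg(10, 120, stack_kw_per_kg=0.2, fuel_kwh_per_kg=3.0)
```

    Because the fuel term grows linearly with duration while the stack term is fixed, the crossover between candidate systems shifts with mission length — the reason total mass was compared as a function of mission duration.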

  12. Space Radiation Cancer Risks and Uncertainties for Different Mission Time Periods

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2012-01-01

    Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which includes high energy protons and high charge and energy (HZE) nuclei. For long duration missions, space radiation presents significant health risks including cancer mortality. Probabilistic risk assessment (PRA) is essential for radiation protection of crews on long term space missions outside the protection of the Earth's magnetic field and for optimization of mission planning and costs. For the assessment of organ dosimetric quantities and cancer risks, the particle spectra at each critical body organ must be characterized. In implementing a PRA approach, a statistical model of SPE fluence was developed, because the individual SPE occurrences themselves are random in nature while the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. An overall cumulative probability of a GCR environment for a specified mission period was estimated for the temporal characterization of the GCR environment represented by the deceleration potential (theta). Finally, this probabilistic approach to space radiation cancer risk was coupled with a model of the radiobiological factors and uncertainties in projecting cancer risks. Probabilities of fatal cancer risk and 95% confidence intervals will be reported for various periods of space missions.
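
    The final coupling step — propagating environment and radiobiological uncertainties into a risk distribution with confidence intervals — can be sketched as a small Monte Carlo. All distribution shapes and parameters below are illustrative, not the paper's fitted models.

```python
import math
import random

def risk_ci(mean_dose_sv=0.3, dose_sd=0.05, risk_per_sv=0.05,
            risk_gsd=1.8, n=20000, seed=1):
    """PRA sketch: sample the mission organ dose (normal) and the risk
    coefficient (lognormal, standing in for radiobiological
    uncertainty), then report the median and 95% confidence interval
    of the fatal cancer risk probability."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        dose = max(0.0, rng.gauss(mean_dose_sv, dose_sd))
        coeff = risk_per_sv * math.exp(rng.gauss(0.0, math.log(risk_gsd)))
        samples.append(dose * coeff)
    samples.sort()
    return samples[n // 2], samples[int(0.025 * n)], samples[int(0.975 * n)]

median, lo95, hi95 = risk_ci()
```

    Even with a well-determined median, the multiplicative uncertainty in the risk coefficient spreads the 95% interval over several-fold — the kind of wide confidence bound such assessments report.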

  13. Grand Challenge Problems in Real-Time Mission Control Systems for NASA's 21st Century Missions

    NASA Technical Reports Server (NTRS)

    Pfarr, Barbara B.; Donohue, John T.; Hughes, Peter M.

    1999-01-01

    Space missions of the 21st Century will be characterized by constellations of distributed spacecraft, miniaturized sensors and satellites, increased levels of automation, intelligent onboard processing, and mission autonomy. Programmatically, these missions will be noted for dramatically decreased budgets and mission development lifecycles. Current progress towards flexible, scalable, low-cost, reusable mission control systems must accelerate given the current mission deployment schedule, and new technology will need to be infused to achieve desired levels of autonomy and processing capability. This paper will discuss current and future missions being managed at NASA's Goddard Space Flight Center in Greenbelt, MD. It will describe the current state of mission control systems and the problems they need to overcome to support the missions of the 21st Century.

  14. Analyzing the Critical Supply Chain For Unmanned Aircraft Systems

    DTIC Science & Technology

    2017-03-23

    with a decision support tool that facilitates interdiction strategy planning. Overall, the different models developed in the study provide modeling...allow adaptation to different levels of fidelity of the supply chain, based on the user’s mission objectives and available data. A House of Quality...priorities are unknown or incorrect. 1.7 Implications The models presented in this research can be utilized from two different perspectives of

  15. Experiments in Error Propagation within Hierarchal Combat Models

    DTIC Science & Technology

    2015-09-01

    Bayesian Information Criterion CNO Chief of Naval Operations DOE Design of Experiments DOD Department of Defense MANA Map Aware Non-uniform Automata ...ground up” approach. First, it develops a mission-level model for one on one submarine combat in Map Aware Non-uniform Automata (MANA) simulation, an... Automata (MANA), an agent based simulation that can model the different postures of submarines. It feeds the results from MANA into stochastic

  16. Planning a pharmacy-led medical mission trip, part 4: an exploratory study of student experiences.

    PubMed

    Brown, Dana A; Fairclough, Jamie L; Ferrill, Mary J

    2012-09-01

    At the Gregory School of Pharmacy (GSOP), pharmacy students routinely participate in domestic and international medical mission trips. Participation can be for academic credit as part of final-year Advanced Pharmacy Practice Experiences (APPEs) or as required community service hours. These mission experiences could potentially result in both professional and personal transformations for participating students. To evaluate data collected from GSOP pharmacy students regarding their experiences on the medical mission field in 2011 and how that participation has impacted the students professionally and personally. GSOP students participating in an international or domestic medical mission trip in the summer of 2011 were asked to voluntarily complete pre- and posttrip surveys. Of the 68 final-year APPE students and student volunteers who participated in a summer 2011 GSOP medical mission trip, 36 (53%) completed pre- and posttrip surveys. The mission trips significantly impacted students' beliefs regarding better preparation to care for the medical needs of patients, identification of others' needs, understanding team dynamics, perceptions about the value of patient care, and comfort level with the provision of medical and pharmaceutical care in a foreign country. However, there were no statistically significant improvements in students' perceptions of their ability to care for the emotional needs of patients, the importance of team unity, and their level of respect for team members; their ability to lead or participate in future trips; and their belief that participating preceptors and faculty serve as effective role models of servant leaders. Based on the findings from this exploratory study, participation in a domestic or international medical mission trip as a student volunteer or APPE student appears to have a positive impact on some of the beliefs and perceptions of GSOP students. 
By continuing to follow these particular students and similar cohorts of students in the future, further insight may be gained regarding the long-term impact of medical mission experiences during pharmacy school training.

  17. MPST Software: grl_pef_check

    NASA Technical Reports Server (NTRS)

    Call, Jared A.; Kwok, John H.; Fisher, Forest W.

    2013-01-01

This innovation is a tool used to verify and validate spacecraft sequences at the predicted events file (PEF) level for the GRAIL (Gravity Recovery and Interior Laboratory, see http://www.nasa.gov/mission_pages/grail/main/index.html) mission as part of the Multi-Mission Planning and Sequencing Team (MPST) operations process to reduce the possibility for errors. This tool is used to catch any sequence-related errors or issues immediately after the seqgen modeling to streamline downstream processes. This script verifies and validates the seqgen modeling for the GRAIL MPST process. A PEF is provided as input, and dozens of checks are performed on it to verify and validate the command products, including command content, command ordering, flight-rule violations, modeling boundary consistency, resource limits, and ground commanding consistency. By performing as many checks as early in the process as possible, grl_pef_check streamlines the MPST task of generating GRAIL command and modeled products on an aggressive schedule. By enumerating each check being performed, and clearly stating the criteria and assumptions made at each step, grl_pef_check can be used as a manual checklist as well as an automated tool. This helper script was written with a focus on providing users with the information they need to evaluate a sequence quickly and efficiently, while keeping them informed and active in the overall sequencing process. grl_pef_check verifies and validates the modeling and sequence content before any further effort is invested in the build. There are dozens of items in the modeling run that need to be checked, which is a time-consuming and error-prone task, and no other software provides this functionality. Compared to a manual process, this script reduces human error and saves considerable man-hours by automating and streamlining the mission planning and sequencing task for the GRAIL mission.
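The check-runner pattern the abstract describes (enumerated, named checks that double as a manual checklist) can be sketched as follows; the check names, event format, and limits are hypothetical illustrations, not GRAIL's actual flight rules:

```python
# Hypothetical sketch of a PEF check runner in the spirit of grl_pef_check:
# each check is named and reports pass/fail, so the list doubles as a manual
# checklist. The event dictionaries stand in for parsed PEF records.

def check_time_ordering(events):
    """Commands must be time-ordered."""
    times = [e["time"] for e in events]
    return all(a <= b for a, b in zip(times, times[1:]))

def check_resource_limits(events, max_power_w=400.0):
    """Modeled power draw must stay under the (made-up) limit."""
    return all(e.get("power_w", 0.0) <= max_power_w for e in events)

CHECKS = [
    ("command time ordering", check_time_ordering),
    ("power resource limit", check_resource_limits),
]

def run_checks(events):
    results = {}
    for name, check in CHECKS:
        results[name] = check(events)
        print(f"[{'PASS' if results[name] else 'FAIL'}] {name}")
    return results

events = [
    {"time": 0.0, "cmd": "PWR_ON", "power_w": 120.0},
    {"time": 5.0, "cmd": "SEQ_START", "power_w": 380.0},
]
results = run_checks(events)
```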

  18. Estimating the Deep Space Network modification costs to prepare for future space missions by using major cost drivers

    NASA Technical Reports Server (NTRS)

    Remer, Donald S.; Sherif, Josef; Buchanan, Harry R.

    1993-01-01

This paper develops a cost model for long-range planning estimates of Deep Space Network (DSN) support for future space missions. The paper focuses on the costs required to modify and/or enhance the DSN to prepare for future space missions. The model is a function of eight major mission cost drivers and estimates both the total cost and the annual costs of a similar future space mission. The model is derived from actual cost data from three space missions: Voyager (Uranus), Voyager (Neptune), and Magellan. Estimates derived from the model are tested against actual cost data for two independent missions, Viking and Mariner Jupiter/Saturn (MJS).
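As a hedged illustration of this kind of cost-estimating relationship (the paper's eight actual drivers and fitted coefficients are not given here, so the driver names, weights, and year count below are placeholders):

```python
# Illustrative cost-estimating relationship (CER): total modification cost as
# a weighted function of mission cost drivers, then spread over the
# preparation years. All names and numbers are hypothetical stand-ins.

def estimate_total_cost(drivers, weights, base_cost=10.0):
    """Total DSN modification cost as a weighted sum of cost drivers."""
    return base_cost + sum(weights[k] * drivers[k] for k in weights)

def annual_costs(total, years, profile=None):
    """Spread the total over the preparation years (uniform by default)."""
    profile = profile or [1.0 / years] * years
    return [total * p for p in profile]

drivers = {"downlink_rate_kbps": 115.2, "n_antennas": 3, "mission_years": 6}
weights = {"downlink_rate_kbps": 0.05, "n_antennas": 8.0, "mission_years": 2.5}
total = estimate_total_cost(drivers, weights)   # in, say, $M
yearly = annual_costs(total, years=4)
```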

  19. Current Level of Mission Control Automation at NASA/Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Maks, Lori; Breed, Julie; Rackley, Michael; Powers, Edward I. (Technical Monitor)

    2001-01-01

    NASA is particularly concerned with reducing mission operations costs through increased automation. This paper examines the operations procedures within NASA Mission Control Centers in order to uncover the level of automation that currently exists within them. Based on an assessment of mission operations procedures within three representative control centers, this paper recommends specific areas where there is potential for mission cost reduction through increased automation.

  20. General relativistic satellite astrometry. II. Modeling parallax and proper motion

    NASA Astrophysics Data System (ADS)

    de Felice, F.; Bucciarelli, B.; Lattanzi, M. G.; Vecchiato, A.

    2001-07-01

The non-perturbative general relativistic approach to global astrometry introduced by de Felice et al. is here extended to account for star motions on the Schwarzschild celestial sphere. A new expression of the observables, i.e. angular distances among stars, is provided, which takes into account the effects of parallax and proper motions. This dynamical model is then tested on an end-to-end simulation of the global astrometry mission GAIA. The results confirm the findings of our earlier work, which applied to the case of a static (angular coordinates only) sphere. In particular, measurements of large arcs among stars (each measurement good to ~100 μarcsec, as expected for V ~ 17 mag stars) repeated over an observing period comparable to the mission lifetime foreseen for GAIA can be modeled to yield estimates of positions, parallaxes, and annual proper motions good to ~15 μarcsec. This second round of experiments confirms, within the limitations of the simulation and the assumptions of the current relativistic model, that the space-borne global astrometry initiated with Hipparcos can be pushed down to the 10^-5 arcsec accuracy level proposed with the GAIA mission. Finally, the simplified case we have solved can be used as a reference for testing the limiting behavior of more realistic models as they become available.
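A quick sanity check of the quoted numbers: averaging N independent arc measurements, each good to ~100 μarcsec, improves the formal error roughly as 1/sqrt(N), so on the order of 45 repeats per star reach the ~15 μarcsec level. (The real solution is a global least-squares adjustment, not a simple average; this is only the order-of-magnitude scaling.)

```python
import math

# Formal error after averaging n independent measurements of equal precision.
def formal_error(sigma_single, n_measurements):
    return sigma_single / math.sqrt(n_measurements)

err = formal_error(100.0, 45)  # in microarcseconds, ~15
```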

  1. Large/Complex Antenna Performance Validation for Spaceborne Radar/Radiometric Instruments

    NASA Technical Reports Server (NTRS)

    Focardi, Paolo; Harrell, Jefferson; Vacchione, Joseph

    2013-01-01

    Over the past decade, Earth observing missions which employ spaceborne combined radar & radiometric instruments have been developed and implemented. These instruments include the use of large and complex deployable antennas whose radiation characteristics need to be accurately determined over 4π steradians. Given the size and complexity of these antennas, the performance of the flight units cannot be readily measured. In addition, the radiation performance is impacted by the presence of the instrument's service platform, which cannot easily be included in any measurement campaign. In order to meet the system performance knowledge requirements, a two-pronged approach has been employed: the first is to use modeling tools to characterize the system, and the second is to build a scale model of the system and use RF measurements to validate the results of the modeling tools. This paper demonstrates the resulting level of agreement between scale model and numerical modeling for two recent missions: (1) the earlier Aquarius instrument currently in Earth orbit and (2) the upcoming Soil Moisture Active Passive (SMAP) mission. The results from two modeling approaches, Ansoft's High Frequency Structure Simulator (HFSS) and TICRA's General RF Applications Software Package (GRASP), were compared with measurements of approximately 1/10th scale models of the Aquarius and SMAP systems. Generally good agreement was found between the three methods, but each approach had its shortcomings, as will be detailed in this paper.

  2. Venus Mobile Explorer with RPS for Active Cooling: A Feasibility Study

    NASA Technical Reports Server (NTRS)

    Leifer, Stephanie D.; Green, Jacklyn R.; Balint, Tibor S.; Manvi, Ram

    2009-01-01

    We present our findings from a study to evaluate the feasibility of a radioisotope power system (RPS) combined with active cooling to enable a long-duration Venus surface mission. On-board power with active cooling technology featured prominently in both the National Research Council's Decadal Survey and in the 2006 NASA Solar System Exploration Roadmap as mission-enabling for the exploration of Venus. Power and cooling system options were reviewed and the most promising concepts modeled to develop an assessment tool for Venus mission planners considering a variety of future potential missions to Venus, including a Venus Mobile Explorer (either a balloon or rover concept), a long-lived Venus static lander, or a Venus Geophysical Network. The concepts modeled were based on the integration of General Purpose Heat Source (GPHS) modules with different types of Stirling cycle heat engines for power and cooling. Unlike prior investigations which reported on single point design concepts, this assessment tool allows the user to generate either a point design or parametric curves of approximate power and cooling system mass, power level, and number of GPHS modules needed for a "black box" payload housed in a spherical pressure vessel.

  3. Space Radiation Effects and Reliability Consideration for the Proposed Jupiter Europa Orbiter

    NASA Technical Reports Server (NTRS)

    Johnston, Allan

    2011-01-01

    The proposed Jupiter Europa Orbiter (JEO) mission to explore the Jovian moon Europa poses a number of challenges. The spacecraft must operate for about seven years during the transit time to the vicinity of Jupiter, and then endure unusually high radiation levels during the exploration and orbiting phases. The ability to withstand unusually high total dose levels is critical for the mission, along with meeting the high reliability standards for flagship NASA missions. Reliability of new microelectronic components must be sufficiently understood to meet overall mission requirements.

  4. Assessment and Improvement of GOCE based Global Geopotential Models Using Wavelet Decomposition

    NASA Astrophysics Data System (ADS)

    Erol, Serdar; Erol, Bihter; Serkan Isik, Mustafa

    2016-07-01

    The contribution of recent Earth gravity field satellite missions, specifically the GOCE mission, has led to significant improvements in the accuracy and resolution of gravity field models. However, the performance and quality of each released model vary not only with spatial location on the Earth but also across the bands of the spectral expansion. Assessing global model performance through validation with in-situ data in different territories is therefore essential for clarifying their local performance. In addition, spectral evaluation and quality assessment of the signal in each part of the spherical harmonic spectrum are essential for quantifying the commission error content of a model and for determining the optimal degree at which it yields the best results. These analyses also provide a perspective on, and comparison of, the global behavior of the models, and an opportunity to report their sequential improvement as the missions develop and new mission data contribute. This study reviews spectral assessment results in Turkey, against terrestrial data, for the recently released GOCE-based global geopotential models DIR-R5 and TIM-R5, enhanced using EGM2008 as the reference model. Besides reporting the GOCE mission's contribution to the models in Turkish territory, it aims to improve the spectral quality of those model components that are highly contaminated by noise, via wavelet decomposition. The analyses aim at an optimal amount of improvement, conserving the useful component of the GOCE signal as much as possible while fusing the filtered GOCE-based models with EGM2008 in the appropriate spectral bands.
The investigation also contains an assessment of the coherence and correlation between the Earth gravity field parameters (free-air gravity anomalies and geoid undulations) derived from the validated geopotential models and from terrestrial data (GPS/leveling, terrestrial gravity observations, DTMs, etc.), as well as from the WGM2012 products. In conclusion, the numerical results clarify the performance of the assessed models in Turkish territory and verify the potential of wavelet decomposition for the improvement of the geopotential models.

  5. Space Station needs, attributes and architectural options. Volume 2, book 1, part 1: Mission requirements

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The baseline mission model used to develop the space station mission-related requirements is described, as well as the 90 civil missions that were evaluated (including the 62 missions that formed the baseline model). Mission-related requirements for the space station baseline are defined and related to space station architectural development. Mission-related sensitivity analyses are discussed.

  6. Advances in Autonomous Systems for Missions of Space Exploration

    NASA Astrophysics Data System (ADS)

    Gross, A. R.; Smith, B. D.; Briggs, G. A.; Hieronymus, J.; Clancy, D. J.

    New missions of space exploration will require unprecedented levels of autonomy to successfully accomplish their objectives. Both inherent complexity and communication distances will preclude levels of human involvement common to current and previous space flight missions. With exponentially increasing capabilities of computer hardware and software, including networks and communication systems, a new balance of work is being developed between humans and machines. This new balance holds the promise of meeting the greatly increased space exploration requirements, along with dramatically reduced design, development, test, and operating costs. New information technologies, which take advantage of knowledge-based software, model-based reasoning, and high performance computer systems, will enable the development of a new generation of design and development tools, schedulers, and vehicle and system health monitoring and maintenance capabilities. Such tools will provide a degree of machine intelligence and associated autonomy that has previously been unavailable. These capabilities are critical to the future of space exploration, since the science and operational requirements of such missions, together with the budgetary constraints, preclude monitoring and controlling these missions with a standing army of ground-based controllers. System autonomy capabilities have made great strides in recent years, for both ground and space flight applications. Autonomous systems have flown on advanced spacecraft, providing new levels of spacecraft capability and mission safety. Such systems operate by utilizing model-based reasoning that provides the capability to work from high-level mission goals, while deriving the detailed system commands internally, rather than having to have such commands transmitted from Earth.
This enables missions of a complexity and communication distance not otherwise possible, as well as many more efficient and low-cost applications. Notable examples of such missions are those that explore for the existence of water on planets such as Mars and the moons of Jupiter. It is clear that liquid water does not exist on the surfaces of such bodies, but it may well be located at considerable depth below the surface, thus requiring a subsurface drilling capability. Subsurface drilling on planetary surfaces will require a robust autonomous control and analysis system, currently a major challenge, but within conceivable reach of planned technology developments. This paper will focus on new and innovative software for remote, autonomous, space systems flight operations, including flight test results, lessons learned, and implications for the future. An additional focus will be on technologies for planetary exploration using autonomous systems and astronaut-assistance systems that employ new spoken language technology. Topics to be presented will include a description of key autonomous control concepts, illustrated by the Remote Agent program that commanded the Deep Space 1 spacecraft to new levels of system autonomy, recent advances in distributed autonomous system capabilities, and concepts for autonomous vehicle health management systems. A brief description of teaming spacecraft and rovers for complex exploration missions will also be provided. New software for autonomous science data acquisition for planetary exploration will also be described, as well as advanced systems for safe planetary landings. Current results of autonomous planetary drilling system research will be presented. A key thrust within NASA is to develop technologies that will leverage the capabilities of human astronauts during planetary surface explorations.
One such technology is spoken dialogue interfaces, which would allow collaboration with semi-autonomous agents that are engaged in activities that are normally accomplished using language, e.g., astronauts in space suits interacting with groups of semi-autonomous rovers and other astronauts. This technology will be described and discussed in the context of future exploration missions and the major new capabilities enabled by such systems. Finally, plans and directions for the future of autonomous systems will be presented.

  7. Recent Development Activities and Future Mission Applications of NASA's Evolutionary Xenon Thruster (NEXT)

    NASA Technical Reports Server (NTRS)

    Patterson, Michael J.; Pencil, Eric J.

    2014-01-01

    NASA's Evolutionary Xenon Thruster (NEXT) project is developing next generation ion propulsion technologies to enhance the performance and lower the costs of future NASA space science missions. This is being accomplished by producing Engineering Model (EM) and Prototype Model (PM) components, validating these via qualification-level and integrated system testing, and preparing the transition of NEXT technologies to flight system development. This presentation is a follow-up to the NEXT project overviews presented in 2009-2010. It reviews the status of the NEXT project, presents the current system performance characteristics, and describes planned activities in continuing the transition of NEXT technology to a first flight. In 2013 a voluntary decision was made to terminate the long-duration test of the NEXT thruster, the design having exceeded all expectations by accumulating over 50,000 hours of operation and demonstrating around 900 kg of xenon throughput. Besides its promise for upcoming NASA science missions, NEXT has excellent potential for future commercial and international spacecraft applications.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferraioli, Luigi; Hueller, Mauro; Vitale, Stefano

    The scientific objectives of the LISA Technology Package experiment on board the LISA Pathfinder mission demand accurate calibration and validation of the data analysis tools in advance of the mission launch. The level of confidence required in the mission outcomes can be reached only by intensively testing the tools on synthetically generated data. A flexible procedure allowing the generation of cross-correlated stationary noise time series was set up. A multichannel time series with the desired cross-correlation behavior can be generated once a model for a multichannel cross-spectral matrix is provided. The core of the procedure comprises a noise-coloring multichannel filter designed via a frequency-by-frequency eigendecomposition of the model cross-spectral matrix and a subsequent fit in the Z domain. The common problem of initial transients in a filtered time series is solved with a proper initialization of the filter recursion equations. The noise generator performance was tested in a two-dimensional case study of the closed-loop LISA Technology Package dynamics along the two principal degrees of freedom.
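A simplified frequency-domain sketch of the idea: factor the model cross-spectral matrix at each frequency and use the factor to color independent white-noise spectra. This stand-in uses a Cholesky factor and an inverse FFT rather than the eigendecomposition plus Z-domain filter fit described above (the latter is what makes long, streaming time series and transient-free filter initialization practical).

```python
import numpy as np

# Generate an m-channel time series whose cross-spectral matrix approximates
# a given model S(f), by shaping independent complex white noise with a
# frequency-by-frequency factor of S(f).

def correlated_noise(csd, n_samples, fs=1.0, seed=0):
    """csd: callable f -> (m, m) Hermitian cross-spectral matrix."""
    rng = np.random.default_rng(seed)
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / fs)
    m = csd(freqs[1]).shape[0]
    white = rng.normal(size=(m, freqs.size)) + 1j * rng.normal(size=(m, freqs.size))
    shaped = np.zeros_like(white)
    for k, f in enumerate(freqs):
        if f == 0.0:
            continue  # leave the DC bin at zero
        L = np.linalg.cholesky(csd(f))  # S(f) = L L^H
        shaped[:, k] = L @ white[:, k]
    return np.fft.irfft(shaped, n=n_samples, axis=1)

# Example: two channels with flat spectra and cross-correlation 0.8.
def flat_csd(f):
    return np.array([[1.0, 0.8], [0.8, 1.0]])

x = correlated_noise(flat_csd, n_samples=4096)
rho = np.corrcoef(x)[0, 1]  # empirical cross-correlation, near 0.8
```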

  9. Assimilation of satellite altimetry data in hydrological models for improved inland surface water information: Case studies from the "Sentinel-3 Hydrologic Altimetry Processor prototypE" project (SHAPE)

    NASA Astrophysics Data System (ADS)

    Gustafsson, David; Pimentel, Rafael; Fabry, Pierre; Bercher, Nicolas; Roca, Mónica; Garcia-Mondejar, Albert; Fernandes, Joana; Lázaro, Clara; Ambrózio, Américo; Restano, Marco; Benveniste, Jérôme

    2017-04-01

    This communication is about the Sentinel-3 Hydrologic Altimetry Processor prototypE (SHAPE) project, with a focus on the components dealing with assimilation of satellite altimetry data into hydrological models. The SHAPE research and development project started in September 2015, within the Scientific Exploitation of Operational Missions (SEOM) programme of the European Space Agency. The objectives of the project are to further develop and assess recent improvements in altimetry data, processing algorithms, and methods for assimilation in hydrological models, with the overarching goal to support improved scientific use of altimetry data and improved inland water information. The objective is also to take scientific steps towards a future Inland Water dedicated processor on the Sentinel-3 ground segment. The study focuses on three main variables of interest in hydrology: river stage, river discharge, and lake level. The improved altimetry data from the project are used to estimate river stage, river discharge, and lake level information in a data assimilation framework using the hydrological dynamic and semi-distributed model HYPE (Hydrological Predictions for the Environment). This model has been developed by SMHI and includes a data assimilation module based on the Ensemble Kalman filter method. The method will be developed and assessed for a number of case studies with available in situ reference data and satellite altimetry data, based mainly on the CryoSat-2 mission, on which the new processor will be run. Results will be presented from case studies on the Amazon and Danube rivers and Lake Vänern (Sweden). The production of alti-hydro products (water level time series) is improved thanks to the use of water masks, which ease the geo-selection of the CryoSat-2 altimetric measurements since they are acquired from a geodetic orbit and are thus spread along the river course in space and time.
The specific processing of data from this geodetic orbit space-time pattern will be discussed, as well as the subsequent possible strategies for data assimilation into models (and eventually a generalized approach toward multi-mission data processing). Notably, in the case of data assimilation along the course of rivers, the river slope might be estimated and compensated for, in order to produce local water level "pseudo time series" at arbitrary locations, and specifically at the model's inlets.
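The Ensemble Kalman filter analysis step that merges an altimetric water level into an ensemble of modeled levels can be sketched minimally as follows; the scalar state and all numbers are illustrative, not HYPE's actual state vector or error statistics:

```python
import numpy as np

# Minimal perturbed-observation EnKF update for a scalar state (water level).

def enkf_update(ensemble, obs, obs_var, seed=0):
    """ensemble: (n_members,) modeled water levels; obs: one observation."""
    rng = np.random.default_rng(seed)
    prior_var = ensemble.var(ddof=1)
    gain = prior_var / (prior_var + obs_var)          # scalar Kalman gain
    perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), ensemble.size)
    return ensemble + gain * (perturbed - ensemble)   # analysis ensemble

prior = np.array([2.1, 2.4, 2.0, 2.6, 2.3])   # modeled levels (m)
posterior = enkf_update(prior, obs=2.8, obs_var=0.01)
# The analysis mean moves toward the observation and the spread shrinks.
```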

  10. Stellar atmosphere modeling of extremely hot, compact stars

    NASA Astrophysics Data System (ADS)

    Rauch, Thomas; Ringat, Ellen; Werner, Klaus

    Present X-ray missions like Chandra and XMM-Newton provide excellent spectra of extremely hot white dwarfs, e.g. burst spectra of novae. Their analysis requires adequate NLTE model atmospheres. The Tuebingen Non-LTE Model-Atmosphere Package (TMAP) can calculate such model atmospheres and spectral energy distributions at a high level of sophistication. We present a new grid of models that is calculated in the parameter range of novae and supersoft X-ray sources and show examples of their application.

  11. End-to-End Trade-Space Analysis for Designing Constellations

    NASA Technical Reports Server (NTRS)

    Le Moigne, Jacqueline; Dabney, Philip; Foreman, Veronica; Grogan, Paul T.; Hache, Sigfried; Holland, Matthew; Hughes, Steven; Nag, Sreeja; Siddiqi, Afreen

    2017-01-01

    Multipoint measurement missions can provide a significant advancement in science return, and this science interest, coupled with many recent technological advances, is driving a growing trend in exploring distributed architectures for future NASA missions. Distributed Spacecraft Missions (DSMs) leverage multiple spacecraft to achieve one or more common goals. In particular, a constellation is the most general form of DSM, with two or more spacecraft placed into specific orbit(s) for the purpose of serving a common objective (e.g., CYGNSS). Because a DSM architectural trade-space includes both monolithic and distributed design variables, DSM optimization is a large and complex problem with multiple conflicting objectives. Over the last two years, our team has been developing a Trade-space Analysis Tool for Constellations (TAT-C), implemented in common programming languages for pre-Phase A constellation mission analysis. By evaluating alternative mission architectures, TAT-C seeks to minimize cost and maximize performance for pre-defined science goals. This presentation will describe the overall architecture of TAT-C including: a User Interface (UI) at several levels of detail and user expertise; Trade-space Search Requests that are created from the science requirements gathered by the UI and validated by a Knowledge Base; a Knowledge Base to compare the current requests to prior mission concepts to potentially prune the trade-space; and a Trade-space Search Iterator which, with inputs from the Knowledge Base, and in collaboration with the Orbit & Coverage, Reduction & Metrics, and Cost & Risk modules, generates multiple potential architectures and their associated characteristics. TAT-C leverages the Goddard Mission Analysis Tool (GMAT) to compute coverage and ancillary data, modeling orbits to balance accuracy and performance.
The current version includes uniform and non-uniform Walker constellations as well as Ad-Hoc and precessing constellations, and its cost model represents an aggregate model consisting of Cost Estimating Relationships (CERs) from widely accepted models. The current GUI automatically generates graphics representing metrics such as average revisit time or coverage as a function of cost. The end-to-end system will be demonstrated as part of the presentation.
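The iterate-and-score loop at the heart of such a tool can be illustrated with a toy trade-space sweep; the cost and revisit formulas below are placeholders, not TAT-C's actual CERs or coverage models:

```python
# Toy trade-space enumeration: sweep simple Walker-style constellation
# variables, score each architecture with placeholder cost and revisit
# models, and keep the Pareto-optimal (non-dominated) set.

def score(n_sats, n_planes):
    cost = 5.0 + 2.0 * n_sats + 3.0 * n_planes       # hypothetical CER ($M)
    revisit_hr = 24.0 / n_sats + 2.0 / n_planes      # toy revisit metric
    return cost, revisit_hr

def pareto(architectures):
    """Keep architectures not dominated in both cost and revisit time."""
    return [a for a in architectures
            if not any(b != a and b["cost"] <= a["cost"]
                       and b["revisit_hr"] <= a["revisit_hr"]
                       for b in architectures)]

archs = []
for n_sats in (2, 4, 8):
    for n_planes in (1, 2, 4):
        if n_planes <= n_sats:
            cost, revisit = score(n_sats, n_planes)
            archs.append({"n_sats": n_sats, "n_planes": n_planes,
                          "cost": cost, "revisit_hr": revisit})

front = pareto(archs)  # the cost-vs-revisit trade curve
```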

  12. Planetary Image Geometry Library

    NASA Technical Reports Server (NTRS)

    Deen, Robert C.; Pariser, Oleg

    2010-01-01

    The Planetary Image Geometry (PIG) library is a multi-mission library used for projecting images (EDRs, or Experiment Data Records) and managing their geometry for in-situ missions. A collection of models describes cameras and their articulation, allowing application programs such as mosaickers, terrain generators, and pointing correction tools to be written in a multi-mission manner, without any knowledge of parameters specific to the supported missions. Camera model objects allow transformation of image coordinates to and from view vectors in XYZ space. Pointing models, specific to each mission, describe how to orient the camera models based on telemetry or other information. Surface models describe the surface in general terms. Coordinate system objects manage the various coordinate systems involved in most missions. File objects manage access to metadata (labels, including telemetry information) in the input EDRs and RDRs (Reduced Data Records). Label models manage metadata information in output files. Site objects keep track of different locations where the spacecraft might be at a given time. Radiometry models allow correction of radiometry for an image. Mission objects contain basic mission parameters. Pointing adjustment ("nav") files allow pointing to be corrected. The object-oriented structure (C++) makes it easy to subclass just the pieces of the library that are truly mission-specific. Typically, this involves just the pointing model and coordinate systems, and parts of the file model. Once the library was developed (initially for Mars Polar Lander, MPL), adding new missions ranged from two days to a few months, resulting in significant cost savings as compared to rewriting all the application programs for each mission. Currently supported missions include Mars Pathfinder (MPF), MPL, Mars Exploration Rover (MER), Phoenix, and Mars Science Lab (MSL). Applications based on this library create the majority of operational image RDRs for those missions. 
A Java wrapper around the library allows parts of it to be used from Java code (via a native JNI interface). Future conversions of all or part of the library to Java are contemplated.
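The core camera-model abstraction (mapping image coordinates to and from view vectors in XYZ space) can be illustrated with a toy pinhole model; real PIG camera models (e.g. the CAHV family) are considerably more general:

```python
import math

# Toy pinhole camera: pixel (x, y) <-> unit view vector, camera looking +Z.
# Focal length is in pixels; (cx, cy) is the principal point.

class PinholeCamera:
    def __init__(self, focal_px, cx, cy):
        self.f, self.cx, self.cy = focal_px, cx, cy

    def image_to_vector(self, x, y):
        """Unit view vector through pixel (x, y)."""
        v = ((x - self.cx) / self.f, (y - self.cy) / self.f, 1.0)
        n = math.sqrt(sum(c * c for c in v))
        return tuple(c / n for c in v)

    def vector_to_image(self, v):
        """Project a view vector back to pixel coordinates."""
        return (self.cx + self.f * v[0] / v[2], self.cy + self.f * v[1] / v[2])

cam = PinholeCamera(focal_px=1000.0, cx=512.0, cy=512.0)
vec = cam.image_to_vector(612.0, 512.0)
x, y = cam.vector_to_image(vec)   # round-trips to (612.0, 512.0)
```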

  13. GPD+ wet tropospheric corrections for eight altimetric missions for the Sea Level ECV generation

    NASA Astrophysics Data System (ADS)

    Fernandes, Joana; Lázaro, Clara; Benveniste, Jérôme

    2016-04-01

    Due to its large spatio-temporal variability, the delay induced by the water vapour and liquid water content of the atmosphere in the altimeter signal, or wet tropospheric correction (WTC), is still one of the largest sources of uncertainty in satellite altimetry. In the scope of the Sea Level (SL) Climate Change Initiative (cci) project, the University of Porto (UPorto) has been developing methods to improve the WTC (Fernandes et al., 2015). Started as a coastal algorithm to remove land effects in the microwave radiometers (MWR) on board altimeter missions, the GNSS-derived Path Delay (GPD) methodology evolved to cover the open ocean, including high latitudes, correcting for invalid observations due to land, ice and rain contamination, and instrument malfunction. The most recent version of the algorithm, GPD Plus (GPD+), computes wet path delays based on: i) WTC from the on-board MWR measurements, whenever they exist and are valid; ii) new WTC values estimated through space-time objective analysis of all available data sources, whenever the previous are considered invalid. In the estimation of the new WTC values, the following data sets are used: valid measurements from the on-board MWR, water vapour products derived from a set of 17 scanning imaging radiometers (SI-MWR) on board various remote sensing satellites, and tropospheric delays derived from Global Navigation Satellite Systems (GNSS) coastal and island stations. In the estimation process, WTC derived from an atmospheric model such as the European Centre for Medium-range Weather Forecasts (ECMWF) ReAnalysis (ERA) Interim or the operational (Op) model is used as first guess, which is the adopted value in the absence of measurements. The corrections are provided for all missions used to generate the SL Essential Climate Variable (ECV): TOPEX/Poseidon (T/P), Jason-1, Jason-2, ERS-1, ERS-2, Envisat, CryoSat-2 and SARAL/AltiKa.
To ensure consistency and long term stability of the WTC datasets, the radiometers used in the GPD+ estimations have been inter-calibrated against the stable and independently-calibrated Special Sensor Microwave Imager (SSM/I) and SSM/I Sounder (SSM/IS) sensors on board the Defense Meteorological Satellite Program satellite series (F10, F11, F13, F14, F16 and F17). The new products reduce the sea level anomaly variance, both along-track and at crossovers, with respect to previous non-calibrated versions and to other WTC data sets such as the AVISO Composite (Comp) correction and atmospheric models. Improvements are particularly significant for T/P and all ESA missions, especially in the coastal regions and at high latitudes. In comparison with previous GPD versions, the main impacts are on the sea level trends at decadal time scales and on regional sea level trends. For CryoSat-2, the GPD+ WTC improves the SL ECV when compared to the baseline correction from the ECMWF Op model. With a view to obtaining the best WTC for use in version 2 of the SL_cci ECV, new products are under development, based on recently released on-board MWR WTC for missions such as Jason-1, Envisat and SARAL. Reference: Fernandes, M.J., Lázaro, C., Ablain, M., Pires, N. (2015). Improved wet path delays for all ESA and reference altimetric missions. Remote Sensing of Environment, 169, 50-74. http://dx.doi.org/10.1016/j.rse.2015.07.023
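The estimation idea (a model first guess updated by valid measurements, falling back to the first guess when none exist) reduces, at a single point, to an inverse-variance weighted combination. The sketch below is a loose stand-in for the full space-time objective analysis, with made-up values and variances:

```python
# Single-point, inverse-variance combination of a model first guess with
# valid WTC measurements from nearby sources. Values are illustrative only.

def combine_wtc(first_guess, first_guess_var, observations):
    """observations: list of (value_cm, variance) pairs; returns WTC in cm."""
    num = first_guess / first_guess_var
    den = 1.0 / first_guess_var
    for value, var in observations:
        num += value / var
        den += 1.0 / var
    return num / den

# Model first guess (e.g. from a reanalysis) plus two valid measurements.
wtc = combine_wtc(first_guess=-12.0, first_guess_var=4.0,
                  observations=[(-10.0, 1.0), (-11.0, 2.0)])
# With no valid observations the estimate falls back to the first guess.
fallback = combine_wtc(-12.0, 4.0, [])
```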

  14. Probability Estimates of Solar Proton Doses During Periods of Low Sunspot Number for Short Duration Missions

    NASA Technical Reports Server (NTRS)

    Atwell, William; Tylka, Allan J.; Dietrich, William F.; Rojdev, Kristina; Matzkind, Courtney

    2016-01-01

In an earlier paper presented at ICES in 2015, we investigated solar particle event (SPE) radiation exposures (absorbed dose) to small, thinly-shielded spacecraft during a period when the monthly smoothed sunspot number (SSN) was less than 30. Although such months are generally considered "solar-quiet", SPEs observed during these months even include Ground Level Events (GLEs), the most energetic type of SPE. In this paper, we add to the previous study those SPEs that occurred in 1973-2015 when the SSN was greater than 30 but less than 50. Based on the observable energy range of the solar protons, we classify the events as GLEs, sub-GLEs, and sub-sub-GLEs, all of which are potential contributors to the radiation hazard. We use the spectra of these events to construct a probabilistic model of the absorbed dose due to solar protons when SSN < 50 at various confidence levels, for various depths of shielding, and for various mission durations. We provide plots and tables of solar proton-induced absorbed dose as functions of confidence level, shielding thickness, and mission duration that will be useful to system designers.
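The confidence-level structure of such a model can be sketched with a small Monte Carlo calculation. This is an illustration of the general approach, not the authors' method: the per-event doses are hypothetical, and the Poisson occurrence assumption is an assumption of this sketch:

```python
import math, random

def dose_at_confidence(event_doses, events_per_year, mission_years,
                       confidence=0.95, trials=4000, seed=1):
    """Total mission absorbed dose at a given confidence level.

    event_doses     : sampled per-event doses (cGy) at one shielding depth
    events_per_year : mean SPE occurrence rate for the SSN regime
    Each trial draws the number of events from Poisson(rate * duration)
    and resamples a dose for each event; the requested percentile of the
    trial totals is the dose not exceeded with that confidence.
    """
    rng = random.Random(seed)
    rate = events_per_year * mission_years
    limit = math.exp(-rate)
    totals = []
    for _ in range(trials):
        n, p = 0, 1.0                 # Poisson draw (Knuth's method)
        while True:
            p *= rng.random()
            if p <= limit:
                break
            n += 1
        totals.append(sum(rng.choice(event_doses) for _ in range(n)))
    totals.sort()
    return totals[int(confidence * (len(totals) - 1))]
```

Raising the confidence level or the mission duration moves the reported dose up the tail of the trial distribution, which is the behaviour the paper's tables capture.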

  15. Assessments on GOCE-based Gravity Field Model Comparisons with Terrestrial Data Using Wavelet Decomposition and Spectral Enhancement Approaches

    NASA Astrophysics Data System (ADS)

    Erol, Serdar; Serkan Isık, Mustafa; Erol, Bihter

    2016-04-01

Data from the recent Earth gravity field satellite missions have led to significant improvements in Global Geopotential Models in terms of both accuracy and resolution. However, the improvement in accuracy is not uniform over the Earth, and quantifying the level of improvement locally therefore requires independent data. The validation of the level-3 products from the gravity field satellite missions, independently from the estimation procedures of these products, is possible using various data sets, such as terrestrial gravity observations, astrogeodetic vertical deflections, GPS/leveling data and the stationary sea surface topography. Quantifying the quality of the gravity field functionals derived from recent products is important for regional geoid modeling, based on fusion of satellite and terrestrial data with an optimal algorithm, besides statistically reporting improvement rates depending on spatial location. In the validations, the errors and systematic differences between the data sets and the varying spectral content of the compared signals should be considered in order to obtain comparable results. Accordingly, this study compares the performance of wavelet decomposition and spectral enhancement techniques in the validation of GOCE/GRACE-based Earth gravity field models using GPS/leveling and terrestrial gravity data in Turkey. The terrestrial validation data are filtered using the wavelet decomposition technique, and the numerical results from varying levels of decomposition are compared with results derived using the spectral enhancement approach with the contribution of an ultra-high-resolution Earth gravity field model. The tests include the GO-DIR-R5, GO-TIM-R5, GOCO05S, EIGEN-6C4 and EGM2008 global models. The conclusions discuss the strengths and drawbacks of both concepts, as well as reporting the performance of the tested gravity field models with an estimate of their contribution to geoid modeling in Turkish territory.
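The wavelet filtering idea, separating the long-wavelength band of the terrestrial data so it can be compared with a satellite-only model, can be illustrated with a single Haar level. The Haar wavelet is chosen here only for brevity; the study's choice of mother wavelet and decomposition depth is not reproduced:

```python
def haar_decompose(signal):
    """One level of a Haar wavelet decomposition.

    The approximation keeps the long-wavelength content, comparable in
    spirit to the spectral band resolved by a satellite-only model; the
    detail holds the short-wavelength residual of the terrestrial data.
    """
    approx = [(signal[i] + signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    detail = [(signal[i] - signal[i + 1]) / 2 ** 0.5
              for i in range(0, len(signal) - 1, 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Inverse of haar_decompose (perfect reconstruction)."""
    out = []
    for a, d in zip(approx, detail):
        out.extend([(a + d) / 2 ** 0.5, (a - d) / 2 ** 0.5])
    return out
```

Re-applying `haar_decompose` to successive approximations yields the multi-level decomposition whose levels are then compared against the spectrally enhanced validation results.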

  16. Development of the Architectural Simulation Model for Future Launch Systems and its Application to an Existing Launch Fleet

    NASA Technical Reports Server (NTRS)

    Rabadi, Ghaith

    2005-01-01

A significant portion of the lifecycle costs for launch vehicles is generated during the operations phase. Research indicates that operations costs can account for a large percentage of the total life-cycle costs of reusable space transportation systems. These costs are largely determined by decisions made early during conceptual design. Therefore, operational considerations are an important part of the vehicle design and concept analysis process that needs to be modeled and studied early in the design phase. However, this is a difficult and challenging task due to the uncertainty of operations definitions, the dynamic and combinatorial nature of the processes, the lack of analytical models, and the scarcity of historical data during the conceptual design phase. Ultimately, NASA would like to know the best mix of launch vehicle concepts that would meet the missions' launch dates at the minimum cost. To answer this question, we first need to develop a model to estimate the total cost, including the operational cost, to accomplish this set of missions. In this project, we have developed and implemented a discrete-event simulation model using ARENA (a simulation modeling environment) to determine this cost assessment. Discrete-event simulation is widely used in modeling complex systems, including transportation systems, due to its flexibility and ability to capture the dynamics of the system. The simulation model accepts manifest inputs including the set of missions that need to be accomplished over a period of time, the clients (e.g., NASA or DoD) who wish to transport the payload to space, the payload weights, and their destinations (e.g., International Space Station, LEO, or GEO). A user of the simulation model can define an architecture of reusable or expendable launch vehicles to achieve these missions. Launch vehicles may belong to different families, where each family may have its own set of resources, processing times, and cost factors. The goal is to capture the required resource levels of the major launch elements and their required facilities. The model's output can show whether or not a certain architecture of vehicles can meet the launch dates and, if not, how much the delay cost would be. It will also produce aggregate figures of mission cost based on element procurement cost, processing cost, cargo integration cost, delay cost, and mission support cost. One of the most useful features of this model is that it is stochastic: it accepts statistical distributions to represent the processing times, mimicking the stochastic nature of real systems.
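The core mechanics of such a discrete-event model, vehicles as resources, stochastic processing times, delay cost against a manifest, can be sketched without ARENA. This is a minimal stand-in, not the project's model; all rates and distributions are assumptions:

```python
import heapq, random

def simulate_campaign(missions, n_vehicles, proc_time, delay_cost_rate, seed=7):
    """Event-driven sketch of a launch campaign.

    missions        : list of (payload_id, due_day) launch requests
    n_vehicles      : size of one (reusable) vehicle family
    proc_time       : callable returning a sampled processing time in days
    delay_cost_rate : cost per day of launch slip (arbitrary units)
    Vehicles are resources that return to the pool after processing; the
    accumulated delay cost over the manifest is returned.
    """
    rng = random.Random(seed)
    free_at = [0.0] * n_vehicles          # next availability per vehicle
    heapq.heapify(free_at)
    total_delay_cost = 0.0
    for payload, due in sorted(missions, key=lambda m: m[1]):
        ready = heapq.heappop(free_at)    # earliest-free vehicle
        launch = ready + proc_time(rng)   # stochastic ground processing
        total_delay_cost += max(0.0, launch - due) * delay_cost_rate
        heapq.heappush(free_at, launch)   # reusable vehicle rejoins pool
    return total_delay_cost

# Two missions, one vehicle, triangular processing times: the second
# launch can slip past its due date, incurring delay cost.
cost = simulate_campaign([("ISS", 30.0), ("GEO", 45.0)], 1,
                         lambda r: r.triangular(20.0, 40.0, 30.0), 1.0)
```

Replaying the simulation over many random seeds gives the distribution of delay cost for a candidate architecture, which is the kind of output used to compare vehicle mixes.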

  17. A high power ion thruster for deep space missions

    NASA Astrophysics Data System (ADS)

    Polk, James E.; Goebel, Dan M.; Snyder, John S.; Schneider, Analyn C.; Johnson, Lee K.; Sengupta, Anita

    2012-07-01

    The Nuclear Electric Xenon Ion System ion thruster was developed for potential outer planet robotic missions using nuclear electric propulsion (NEP). This engine was designed to operate at power levels ranging from 13 to 28 kW at specific impulses of 6000-8500 s and for burn times of up to 10 years. State-of-the-art performance and life assessment tools were used to design the thruster, which featured 57-cm-diameter carbon-carbon composite grids operating at voltages of 3.5-6.5 kV. Preliminary validation of the thruster performance was accomplished with a laboratory model thruster, while in parallel, a flight-like development model (DM) thruster was completed and two DM thrusters fabricated. The first thruster completed full performance testing and a 2000-h wear test. The second successfully completed vibration tests at the full protoflight levels defined for this NEP program and then passed performance validation testing. The thruster design, performance, and the experimental validation of the design tools are discussed in this paper.
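The quoted operating envelope implies sub-newton thrust levels, which can be checked from the ideal power-thrust relation F = 2·η·P/(g0·Isp). The 70% total efficiency used below is an assumed placeholder, not a NEXIS figure from the paper:

```python
def thrust_newtons(power_w, isp_s, efficiency=0.7):
    """Ideal ion-engine thrust from input power: F = 2*eta*P / (g0*Isp).

    efficiency is the assumed total (power-to-jet) efficiency; g0*Isp is
    the effective exhaust velocity.
    """
    g0 = 9.80665                       # standard gravity, m/s^2
    return 2.0 * efficiency * power_w / (g0 * isp_s)

# Corner points of the quoted envelope: 13 kW / 6000 s and 28 kW / 8500 s
low_end = thrust_newtons(13e3, 6000.0)     # roughly 0.3 N
high_end = thrust_newtons(28e3, 8500.0)    # roughly 0.47 N
```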

  18. A high power ion thruster for deep space missions.

    PubMed

    Polk, James E; Goebel, Dan M; Snyder, John S; Schneider, Analyn C; Johnson, Lee K; Sengupta, Anita

    2012-07-01

    The Nuclear Electric Xenon Ion System ion thruster was developed for potential outer planet robotic missions using nuclear electric propulsion (NEP). This engine was designed to operate at power levels ranging from 13 to 28 kW at specific impulses of 6000-8500 s and for burn times of up to 10 years. State-of-the-art performance and life assessment tools were used to design the thruster, which featured 57-cm-diameter carbon-carbon composite grids operating at voltages of 3.5-6.5 kV. Preliminary validation of the thruster performance was accomplished with a laboratory model thruster, while in parallel, a flight-like development model (DM) thruster was completed and two DM thrusters fabricated. The first thruster completed full performance testing and a 2000-h wear test. The second successfully completed vibration tests at the full protoflight levels defined for this NEP program and then passed performance validation testing. The thruster design, performance, and the experimental validation of the design tools are discussed in this paper.

  19. Electrochromic Radiator Coupon Level Testing and Full Scale Thermal Math Modeling for Use on Altair Lunar Lander

    NASA Technical Reports Server (NTRS)

    Sheth, Rubik; Bannon, Erika; Bower, Chad

    2009-01-01

In order to control system and component temperatures, many spacecraft thermal control systems use a radiator coupled with a pumped fluid loop to reject waste heat from the vehicle. Since heat loads and radiation environments can vary considerably according to mission phase, the thermal control system must be able to vary the heat rejection. The ability to "turn down" the heat rejected from the thermal control system is critically important when designing the system. Electrochromic technology as a radiator coating is being investigated to vary the amount of heat being rejected by a radiator. Coupon level tests were performed to test the feasibility of the technology. Furthermore, thermal math models were developed to better understand the turndown ratios required by full scale radiator architectures to handle the various operation scenarios during a mission profile for the Altair Lunar Lander. This paper summarizes results from coupon level tests as well as thermal math models developed to investigate how electrochromics can be used to provide the largest turn down ratio for a radiator. Data from the various design concepts of radiators and their architectures are outlined. Recommendations are made on which electrochromic radiator concept should be carried further for future thermal vacuum testing.
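The turndown ratio at the heart of these trades is driven largely by the achievable emissivity swing. A minimal sketch, assuming a hypothetical coating that switches emissivity between 0.2 and 0.8 and a fixed effective sink temperature (none of these values are from the paper):

```python
SIGMA = 5.670374419e-8   # Stefan-Boltzmann constant, W m^-2 K^-4

def heat_rejected(emissivity, area_m2, t_rad_k, t_sink_k):
    """Net radiative heat rejection of a panel to an effective sink."""
    return emissivity * SIGMA * area_m2 * (t_rad_k ** 4 - t_sink_k ** 4)

# Hypothetical coating switching between emissivity 0.2 and 0.8 on a
# 10 m^2 panel at 300 K against a 150 K effective sink
q_hot = heat_rejected(0.8, 10.0, 300.0, 150.0)
q_cold = heat_rejected(0.2, 10.0, 300.0, 150.0)
turndown = q_hot / q_cold   # equals the emissivity ratio at fixed temperatures
```

In practice fluid-loop temperatures and environments change between the hot and cold cases, so full-scale turndown ratios come from the thermal math models rather than this single-panel relation.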

  20. Electrochromic Radiator Coupon Level Testing and Full Scale Thermal Math Modeling for Use on Altair Lunar Lander

    NASA Technical Reports Server (NTRS)

    Bannon, Erika T.; Bower, Chad E.; Sheth, Rubik; Stephan, Ryan

    2010-01-01

    In order to control system and component temperatures, many spacecraft thermal control systems use a radiator coupled with a pumped fluid loop to reject waste heat from the vehicle. Since heat loads and radiation environments can vary considerably according to mission phase, the thermal control system must be able to vary the heat rejection. The ability to "turn down" the heat rejected from the thermal control system is critically important when designing the system. Electrochromic technology as a radiator coating is being investigated to vary the amount of heat rejected by a radiator. Coupon level tests were performed to test the feasibility of this technology. Furthermore, thermal math models were developed to better understand the turndown ratios required by full scale radiator architectures to handle the various operation scenarios encountered during a mission profile for the Altair Lunar Lander. This paper summarizes results from coupon level tests as well as the thermal math models developed to investigate how electrochromics can be used to increase turn down ratios for a radiator. Data from the various design concepts of radiators and their architectures are outlined. Recommendations are made on which electrochromic radiator concept should be carried further for future thermal vacuum testing.

  1. Space Station needs, attributes and architectural options, volume 2, book 3: Cost and programmatics

    NASA Technical Reports Server (NTRS)

    1983-01-01

The cost and programmatic considerations which integrate mission requirements and architectural options into a cohesive system for exploitation of space opportunities within affordable limits are discussed. The mission requirements, baseline architecture, a top level baseline schedule, and acquisition costs are summarized. The work breakdown structure (WBS) used to structure the program and the WBS dictionary are included. The costing approach used, including the operation of the primary costing tool, the SPACE cost model, is described. The rationale for the choice of cost estimating relationships is given and costs at the module level are shown. Detailed costs at the subsystem level are shown. The baseline schedule and annual funding profiles are provided. Alternate schedules are developed to provide different funding profiles. Alternate funding sources are discussed and foreign and contractor participation is outlined. The results of the benefit analysis are given and the accrued benefits deriving from an implemented space station program are outlined.

  2. High-Performance, Radiation-Hardened Electronics for Space Environments

    NASA Technical Reports Server (NTRS)

    Keys, Andrew S.; Watson, Michael D.; Frazier, Donald O.; Adams, James H.; Johnson, Michael A.; Kolawa, Elizabeth A.

    2007-01-01

The Radiation Hardened Electronics for Space Environments (RHESE) project endeavors to advance the current state-of-the-art in high-performance, radiation-hardened electronics and processors, ensuring successful performance of space systems required to operate within extreme radiation and temperature environments. Because RHESE is a project within the Exploration Technology Development Program (ETDP), RHESE's primary customers will be the human and robotic missions being developed by NASA's Exploration Systems Mission Directorate (ESMD) in partial fulfillment of the Vision for Space Exploration. Benefits are also anticipated for NASA's science missions to planetary and deep-space destinations. As a technology development effort, RHESE provides a broad-scoped, full spectrum of approaches to environmentally harden space electronics, including new materials, advanced design processes, reconfigurable hardware techniques, and software modeling of the radiation environment. The RHESE sub-project tasks are: Self-Reconfigurable Electronics for Extreme Environments, Radiation Effects Predictive Modeling, Radiation Hardened Memory, Single Event Effects (SEE) Immune Reconfigurable Field Programmable Gate Array (FPGA) (SIRF), Radiation Hardening by Software, Radiation Hardened High Performance Processors (HPP), Reconfigurable Computing, Low Temperature Tolerant MEMS by Design, and Silicon-Germanium (SiGe) Integrated Electronics for Extreme Environments. These nine sub-project tasks are managed by technical leads located across five NASA field centers, including Ames Research Center, Goddard Space Flight Center, the Jet Propulsion Laboratory, Langley Research Center, and Marshall Space Flight Center. The overall RHESE integrated project management responsibility resides with NASA's Marshall Space Flight Center (MSFC).
Initial technology development emphasis within RHESE focuses on the hardening of Field Programmable Gate Arrays (FPGAs) and Field Programmable Analog Arrays (FPAAs) for use in reconfigurable architectures. As these component/chip level technologies mature, the RHESE project emphasis shifts to focus on efforts encompassing total processor hardening techniques and board-level electronic reconfiguration techniques featuring spare and interface modularity. This phased approach to distributing emphasis between technology developments provides hardened FPGAs/FPAAs for early mission infusion, then migrates to hardened, board-level, high speed processors with associated memory elements and high density storage for the longer duration missions encountered for Lunar Outpost and Mars Exploration occurring later in the Constellation schedule.

3. Investigations of space-time variability of the sea level in the Barents Sea and the White Sea by satellite altimetry data and results of hydrodynamic modelling

    NASA Astrophysics Data System (ADS)

    Lebedev, S. A.; Zilberstein, O. I.; Popov, S. K.; Tikhonova, O. V.

    2003-04-01

The problem of retrieving sea level anomalies in the Barents and White Seas from satellite altimetry can be considered as two different problems. The first is to calculate the anomalies of the sea level along the track, taking into account all corrections including tidal heights. The second is to obtain fields of the sea level anomalies on a grid over one cycle of the exact repeat altimetry mission. Experience shows that it is preferable to use the regional tidal model for calculating tidal heights. To construct the sea level anomaly fields over one cycle of the exact repeat mission (35 days for ERS-1 and ERS-2), when the density of coverage of the Barents and White Seas by satellite measurements reaches its maximum, it is necessary to minimize the error arising from the temporal spread of the measurements over one cycle and from the specifics of the hydrodynamic regime of both seas (tidal and storm surge variations, tidal currents). To solve this problem, the results of hydrodynamic modelling are used: the error is minimized by regression of the model results against the satellite measurements. As an alternative, the possibility of using a neural network trained on the model results to construct maps of the sea level anomalies is considered. The comparison of the model results with the sea level variability of the Barents and White Seas calculated from satellite altimetry shows good agreement between them. The satellite altimetry data of ERS-1/2 and TOPEX/POSEIDON from the Ocean Altimeter Pathfinder Project (NASA/GSFC) have been used in this study. Results of the regional tidal model computations and of the three-dimensional baroclinic model created at the Hydrometeocenter have been used as well. This study also exploited the atmospheric data of the REANALYSIS project. The research was undertaken with partial support from the Russian Basic Research Foundation (Project No. 01-07-90106).
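The regression step that ties the hydrodynamic-model sea level to the altimetric anomalies can be sketched as an ordinary least-squares fit; this is an illustration of the general idea, not the authors' procedure, and the inputs would be collocated level values over one repeat cycle:

```python
def fit_regression(model_vals, altimetry_vals):
    """Ordinary least-squares fit: altimetry ≈ a * model + b.

    model_vals     : hydrodynamic-model sea level at the collocated points
    altimetry_vals : along-track altimetric anomalies at the same points
    Returns the slope a and intercept b that minimize the squared misfit.
    """
    n = len(model_vals)
    mx = sum(model_vals) / n
    my = sum(altimetry_vals) / n
    sxx = sum((x - mx) ** 2 for x in model_vals)
    sxy = sum((x - mx) * (y - my)
              for x, y in zip(model_vals, altimetry_vals))
    a = sxy / sxx
    return a, my - a * mx
```

The fitted relation can then be applied to the model fields to fill the gaps between tracks when gridding the anomalies.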

  4. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  5. Impact modeling and prediction of attacks on cyber targets

    NASA Astrophysics Data System (ADS)

    Khalili, Aram; Michalk, Brian; Alford, Lee; Henney, Chris; Gilbert, Logan

    2010-04-01

    In most organizations, IT (information technology) infrastructure exists to support the organization's mission. The threat of cyber attacks poses risks to this mission. Current network security research focuses on the threat of cyber attacks to the organization's IT infrastructure; however, the risks to the overall mission are rarely analyzed or formalized. This connection of IT infrastructure to the organization's mission is often neglected or carried out ad-hoc. Our work bridges this gap and introduces analyses and formalisms to help organizations understand the mission risks they face from cyber attacks. Modeling an organization's mission vulnerability to cyber attacks requires a description of the IT infrastructure (network model), the organization mission (business model), and how the mission relies on IT resources (correlation model). With this information, proper analysis can show which cyber resources are of tactical importance in a cyber attack, i.e., controlling them enables a large range of cyber attacks. Such analysis also reveals which IT resources contribute most to the organization's mission, i.e., lack of control over them gravely affects the mission. These results can then be used to formulate IT security strategies and explore their trade-offs, which leads to better incident response. This paper presents our methodology for encoding IT infrastructure, organization mission and correlations, our analysis framework, as well as initial experimental results and conclusions.
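The correlation-model analysis described above reduces, in its simplest form, to intersecting each mission task's required resources with the set an attacker controls. The sketch below collapses the network, business and correlation models into one dictionary for illustration; the task and resource names are invented:

```python
def mission_impact(correlation, lost_resources):
    """Tasks that fail when an attacker controls the given resources.

    correlation : mapping mission_task -> set of IT resources it relies on
    A task is affected when its required resource set intersects the set
    of lost resources; resources appearing in many tasks' sets are the
    tactically important ones.
    """
    lost = set(lost_resources)
    return {task for task, needs in correlation.items() if needs & lost}

# Toy correlation model: the ERP server is tactically important because
# two mission tasks depend on it.
corr = {
    "logistics": {"erp_server", "dns"},
    "payroll":   {"erp_server", "db"},
    "email":     {"mail_server", "dns"},
}
affected = mission_impact(corr, {"erp_server"})   # logistics and payroll
```

Ranking resources by how many tasks they affect gives a first-order version of the "tactical importance" analysis, which can then inform incident-response priorities.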

  6. Definition of mission requirements for the follow-on EUMETSAT polar system

    NASA Astrophysics Data System (ADS)

    Phillips, P. L.; Schlüssel, P.; Accadia, C. J.; Munro, R.; Wilson, J. J. W.; Perez-Albinana, A.; Banfi, S.

    2007-10-01

EUMETSAT has initiated preparatory activities for the definition of the follow-on EUMETSAT Polar System (post- EPS) needed for the timeframe 2020 onwards as a replacement for the current EUMETSAT Polar System. Based on the first outputs of the EUMETSAT post-EPS user consultation process initiated in 2005, mission requirements for potential post-EPS missions have been drafted. Expertise from a variety of communities was drawn upon in order to ascertain user needs expressed in terms of geophysical variables, for operational meteorology, climate monitoring, atmospheric chemistry, oceanography, and hydrology. Current trends in the evolution of these applications were considered in order to derive the necessary satellite products that will be required in the post-EPS era. The increasing complexity of models with regard to parameterisation and data assimilation, along with the trend towards coupled atmosphere, ocean and land models, generates new requirements, particularly in the domains of clouds and precipitation, trace gases and ocean/land surface products. Following the requirements definition, concept studies at instrument and system levels will shortly commence with the support of the European Space Agency (ESA), together with industry and representatives of the user and science communities. Such studies, planned for completion by the end of 2008, aim at defining and trading off possible mission and system concepts and will establish preliminary functional requirements for full or partial implementation of post-EPS mission requirements. Cost drivers and needs for critical research and development will also be identified. The generation of both the user and mission requirements has been supported substantially by the post-EPS Mission Experts Team and the Application Expert Groups. Their support is gratefully acknowledged.

  7. EMC: Mission Statement

    Science.gov Websites

EMC: Mission Statement — Mesoscale Modeling Branch. The Mesoscale Modeling Branch works on advanced numerical techniques applied to mesoscale modeling problems, parameterization of mesoscale processes, and the use of new observing systems. The Mesoscale Modeling Branch publishes research results in various media.

  8. Utilizing Mars Global Reference Atmospheric Model (Mars-GRAM 2005) to Evaluate Entry Probe Mission Sites

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, Carl G.

    2008-01-01

The Mars Global Reference Atmospheric Model (Mars-GRAM 2005) is an engineering-level atmospheric model widely used for diverse mission applications. An overview is presented of Mars-GRAM 2005 and its new features. The "auxiliary profile" option is one new feature of Mars-GRAM 2005. This option uses an input file of temperature and density versus altitude to replace the mean atmospheric values from Mars-GRAM's conventional (General Circulation Model) climatology. Any source of data or alternate model output can be used to generate an auxiliary profile. Auxiliary profiles for this study were produced from mesoscale model output (Southwest Research Institute's Mars Regional Atmospheric Modeling System (MRAMS) model and Oregon State University's Mars mesoscale model (MMM5) model) and a global Thermal Emission Spectrometer (TES) database. The global TES database has been specifically generated for purposes of making Mars-GRAM auxiliary profiles. This database contains averages and standard deviations of temperature, density, and thermal wind components, averaged over 5-by-5 degree latitude-longitude bins and 15 degree Ls bins, for each of three Mars years of TES nadir data. The Mars Science Laboratory (MSL) sites are used as a sample of how Mars-GRAM could be a valuable tool for planning of future Mars entry probe missions. Results are presented using auxiliary profiles produced from the mesoscale model output and TES observed data for candidate MSL landing sites. Input parameters rpscale (for density perturbations) and rwscale (for wind perturbations) can be used to "recalibrate" Mars-GRAM perturbation magnitudes to better replicate observed or mesoscale model variability.
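The auxiliary-profile and rpscale ideas can be sketched together: a user-supplied table replaces the climatological mean, and a scaled perturbation is superposed. The linear interpolation, the 2% one-sigma perturbation and the table format below are illustrative assumptions, not Mars-GRAM internals:

```python
import bisect, random

def sample_density(profile, altitude_km, rpscale=1.0, sigma_frac=0.02,
                   rng=None):
    """Mean density from an auxiliary profile, plus a scaled perturbation.

    profile : list of (altitude_km, density) pairs standing in for the
              input file that replaces the built-in climatology.
    rpscale scales the magnitude of the Gaussian density perturbation,
    analogous to the recalibration role it plays in Mars-GRAM.
    """
    rng = rng or random.Random(0)
    alts = [a for a, _ in profile]
    i = min(max(bisect.bisect_left(alts, altitude_km), 1), len(profile) - 1)
    (a0, d0), (a1, d1) = profile[i - 1], profile[i]
    # Linear interpolation of the tabulated mean density
    mean = d0 + (d1 - d0) * (altitude_km - a0) / (a1 - a0)
    return mean * (1.0 + rpscale * rng.gauss(0.0, sigma_frac))
```

Setting `rpscale=0.0` returns the unperturbed auxiliary-profile mean, which is how the mean-state replacement can be checked in isolation from the dispersion settings.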

  9. Management of unmanned moving sensors through human decision layers: a bi-level optimization process with calls to costly sub-processes

    NASA Astrophysics Data System (ADS)

    Dambreville, Frédéric

    2013-10-01

While there is a variety of approaches and algorithms for optimizing the mission of an unmanned moving sensor, there is much less work dealing with the implementation of several sensors within a human organization. In this case, the management of the sensors is done through at least one human decision layer, and the sensor management problem as a whole arises as a bi-level optimization process. In this work, the following hypotheses are considered realistic: sensor handlers at the first level plan their sensors by means of elaborate algorithmic tools based on accurate modelling of the environment; the higher level plans the handled sensors according to a global observation mission and on the basis of an approximated model of the environment and of the first-level sub-processes. This problem is formalized very generally as the maximization of an unknown function, defined a priori by sampling a known random function (law of model error). In such a case, each actual evaluation of the function increases the knowledge about the function, and subsequently the efficiency of the maximization. The issue is to optimize the sequence of values to be evaluated, with regard to the evaluation costs. There is a fundamental link here with the domain of experimental design. Jones, Schonlau and Welch proposed a general method, Efficient Global Optimization (EGO), for solving this problem in the case of an additive functional Gaussian law. In our work, a generalization of EGO is proposed, based on a rare event simulation approach. It is applied to the aforementioned bi-level sensor planning problem.
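The selection criterion at the heart of EGO is the expected improvement, which has a closed form under a Gaussian surrogate. A minimal sketch of that criterion (the surrogate model itself, and the paper's rare-event generalization, are not reproduced):

```python
import math

def expected_improvement(mu, sigma, f_best):
    """Expected improvement of a candidate evaluation (maximization).

    mu, sigma : surrogate prediction and its standard deviation at a
                candidate plan; f_best is the best value evaluated so far.
    EI trades exploitation (mu above f_best) against exploration (large
    sigma), which is how costly calls to the first-level sub-processes
    are rationed.
    """
    if sigma <= 0.0:
        return max(0.0, mu - f_best)
    z = (mu - f_best) / sigma
    pdf = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    cdf = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return (mu - f_best) * cdf + sigma * pdf
```

A candidate whose predicted mean is slightly below f_best can still have the larger EI when its uncertainty is high, which is precisely the exploration behaviour that makes EGO sample-efficient.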

  10. Solar Power System Options for the Radiation and Technology Demonstration Spacecraft

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.; Haraburda, Francis M.; Riehl, John P.

    2000-01-01

The Radiation and Technology Demonstration (RTD) Mission has the primary objective of demonstrating high-power (10 kilowatts) electric thruster technologies in Earth orbit. This paper discusses the conceptual design of the RTD spacecraft photovoltaic (PV) power system and mission performance analyses. These power system studies assessed multiple options for PV arrays, battery technologies and bus voltage levels. To quantify performance attributes of these power system options, a dedicated Fortran code was developed to predict power system performance and estimate system mass. The low-thrust mission trajectory was analyzed and important Earth orbital environments were modeled. Baseline power system design options are recommended on the basis of performance, mass and risk/complexity. Important findings from parametric studies are discussed, along with the resulting impacts on the spacecraft design and cost.

  11. Human Mars Entry, Descent, and Landing Architecture Study Overview

    NASA Technical Reports Server (NTRS)

    Cianciolo, Alicia D.; Polsgrove, Tara T.

    2016-01-01

    The Entry, Descent, and Landing (EDL) Architecture Study is a multi-NASA center activity to analyze candidate EDL systems as they apply to human Mars landing in the context of the Evolvable Mars Campaign. The study, led by the Space Technology Mission Directorate (STMD), is performed in conjunction with the NASA's Science Mission Directorate and the Human Architecture Team, sponsored by NASA's Human Exploration and Operations Mission Directorate. The primary objective is to prioritize future STMD EDL technology investments by (1) generating Phase A-level designs for selected concepts to deliver 20 t human class payloads, (2) developing a parameterized mass model for each concept capable of examining payloads between 5 and 40 t, and (3) evaluating integrated system performance using trajectory simulations. This paper summarizes the initial study results.

  12. Probing the Physics and Chemistry in Hot Jupiter Exoclimes for Future Missions

    NASA Astrophysics Data System (ADS)

    Afrin Badhan, Mahmuda; Kopparapu, Ravi Kumar; Domagal-Goldman, Shawn; Deming, Drake; Hébrard, Eric; Irwin, Patrick GJ; Batalha, Natasha; Mandell, Avi

    2017-01-01

    Unique and exotic planets give us an opportunity to understand how planetary systems form and evolve over their lifetime, by placing our own planetary system in the context of vastly different extrasolar systems. In particular, close-in planets such as Hot Jupiters provide us with valuable insights about the host stellar atmosphere and planetary atmospheres subjected to such high levels of stellar insolation. Observed spectroscopic signatures from a planet reveal all spectrally active species in its atmosphere, along with information about its thermal structure and dynamics, allowing us to characterize the planet's atmosphere. NASA’s upcoming missions will give us the high-resolution spectra necessary to constrain such atmospheric properties with unprecedented accuracy. However, to interpret the observed signals from exoplanetary transit events with any certainty, we need reliable atmospheric modeling tools that map both the physical and chemical processes affecting the particular type of planet under investigation. My work seeks to expand on past efforts in these two categories for irradiated giant exoplanets. These atmospheric models can be combined with future mission simulations to build tools that allow us to self-consistently “retrieve” the signatures we can expect to observe with the instruments. In my work thus far, I have built the robust Markov Chain Monte Carlo convergence scheme, with an analytical radiative equilibrium formulation to represent the thermal structures, within the NEMESIS atmospheric radiative transfer modeling and retrieval tool. I have combined this physics-based thermal structure with photochemical abundance profiles for the major gas atmospheric constituents, using the NASA Astrobiology Institute’s VPL/Atmos photochemistry model, which I recently extended to giant planet regimes. Here I will present my new Hot Jupiter models and retrievals results constructed from these latest enhancements. 
For comparison, I will show applications to both archival data from present missions and JWST/NIRSpec simulations, and discuss any new information we expect to reliably extract from the upcoming JWST mission.
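
    The retrieval machinery described above can be illustrated at toy scale. The sketch below is a minimal Metropolis-Hastings random walk retrieving a single absorption-feature amplitude from a synthetic spectrum; the one-parameter Gaussian forward model and all numbers are invented stand-ins for NEMESIS, not the author's code.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a radiative-transfer forward model: transit depth vs.
# wavelength with a single Gaussian absorption feature of amplitude "amp".
wave = np.linspace(1.0, 2.0, 50)  # microns (illustrative)

def forward(amp):
    return 0.01 + amp * np.exp(-0.5 * ((wave - 1.4) / 0.05) ** 2)

true_amp, noise = 3e-4, 5e-5
data = forward(true_amp) + rng.normal(0.0, noise, wave.size)

def log_like(amp):
    return -0.5 * np.sum(((data - forward(amp)) / noise) ** 2)

# Metropolis-Hastings random walk over the single free parameter
amp, ll, chain = 1e-4, log_like(1e-4), []
for _ in range(5000):
    prop = amp + rng.normal(0.0, 5e-5)
    ll_prop = log_like(prop)
    if np.log(rng.random()) < ll_prop - ll:   # accept/reject step
        amp, ll = prop, ll_prop
    chain.append(amp)

posterior = np.array(chain[1000:])            # discard burn-in
```

    A real retrieval samples many parameters (thermal profile, abundances) jointly, but the accept/reject core is the same.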

  13. Analytical solution of perturbed relative motion: an application of satellite formations to geodesy

    NASA Astrophysics Data System (ADS)

    Wnuk, Edwin

    In the upcoming years, several space missions will be operated using a number of spacecraft flying in formation. Clusters of spacecraft with carefully designed orbits and optimal formation geometry enable a wide variety of applications ranging from remote sensing to astronomy, geodesy and basic physics. Many of the applications require precise relative navigation and autonomous orbit control of satellites moving in a formation. For many missions a centimeter level of orbit control accuracy is required. The GRACE mission, since its launch in 2002, has been improving the Earth's gravity field model to a very high level of accuracy. This formation-flying mission consists of two satellites moving in coplanar orbits and provides range and range-rate measurements between the satellites in the along-track direction. Future geodetic missions will probably employ alternative architectures using additional satellites and/or performing out-of-plane motion, e.g. cartwheel orbits. The paper presents an analytical model of satellite formation motion that enables propagation of the relative spacecraft motion. The model is based on the analytical theory of satellite relative motion presented in our previous papers (Wnuk and Golebiewska, 2005, 2006). This theory takes into account the influence of the following gravitational perturbation effects: 1) zonal and tesseral harmonic geopotential coefficients up to arbitrary degree and order, 2) lunar gravity, 3) solar gravity. Formulas for differential perturbations were derived without any restriction on the plane of the satellite orbits; they can be applied in both the in-plane and out-of-plane cases. Using this propagator we calculated relative orbits and future relative satellite positions for different types of formations: in-plane, out-of-plane, cartwheel and others. We analyzed the influence of particular perturbation effects and estimated the accuracy of predicted relative spacecraft positions.
    References: 1. Wnuk E., Golebiewska J., 2005, "The relative motion of Earth's orbiting satellites", Celestial Mechanics, 91, 373-389. 2. Wnuk E., Golebiewska J., 2006, "Differential Perturbations and Semimajor Axis Estimation for Satellite Formation Orbits", American Institute of Aeronautics and Astronautics, Electronic Library, 6018.
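
    For context, the unperturbed core that such relative-motion theories extend is the closed-form Clohessy-Wiltshire (Hill) solution. The sketch below shows only that textbook case, without any of the geopotential or third-body perturbation terms the paper derives.

```python
import numpy as np

def cw_propagate(state0, n, t):
    """Closed-form Clohessy-Wiltshire relative motion about a circular
    chief orbit. state0 = (x, y, z, vx, vy, vz) in the Hill frame
    (x radial, y along-track, z cross-track); n = chief mean motion."""
    x0, y0, z0, vx0, vy0, vz0 = state0
    s, c = np.sin(n * t), np.cos(n * t)
    x = (4 - 3 * c) * x0 + s / n * vx0 + 2 * (1 - c) / n * vy0
    y = (6 * (s - n * t) * x0 + y0 - 2 * (1 - c) / n * vx0
         + (4 * s - 3 * n * t) / n * vy0)
    z = c * z0 + s / n * vz0
    return np.array([x, y, z])

# A bounded (drift-free) relative orbit requires vy0 = -2*n*x0; after one
# chief period the deputy returns to its initial offset.
n = 2 * np.pi / 5400.0            # ~90-minute LEO orbit [rad/s]
r0 = cw_propagate((100.0, 0.0, 0.0, 0.0, -2 * n * 100.0, 0.0), n, 2 * np.pi / n)
```

    Zonal/tesseral and lunisolar perturbations, the paper's contribution, appear as corrections to this linear solution.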

  14. Probabilistic Model Development

    NASA Technical Reports Server (NTRS)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment, and a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for: a) peak flux; b) event-integrated fluence; and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium, and heavier ions.
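
    The confidence-level idea can be sketched by Monte-Carlo sampling a mission's solar-particle fluence and reading off the quantile at the user-specified confidence. The Poisson event count and lognormal per-event fluence below are hypothetical stand-ins, not the model's fitted distributions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical mission ensemble: Poisson number of SPEs per mission,
# lognormal proton fluence per event (all parameters illustrative).
n_missions = 20000
counts = rng.poisson(lam=8.0, size=n_missions)
mission_fluence = np.array(
    [rng.lognormal(mean=19.0, sigma=1.0, size=k).sum() for k in counts]
)

# Reference environment: fluence not exceeded at the chosen confidence
confidence = 0.95
reference_env = np.quantile(mission_fluence, confidence)
```

    A design would then shield against reference_env rather than the ensemble mean, which the skewed fluence distribution can substantially exceed.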

  15. Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)

    DTIC Science & Technology

    2005-02-01

    method, Model Order Reduction (MOR) tools, system-level, mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique...Mission Area: Command and Control mixed signal circuit simulation parasitic extraction time-domain simulation IC design flow model order reduction... Extraction 1.2 Overall Program Milestones CHAPTER 2 FAST TIME DOMAIN MIXED-SIGNAL CIRCUIT SIMULATION 2.1 HAARSPICE Algorithms 2.1.1 Mathematical Background

  16. Approximation Model Building for Reliability & Maintainability Characteristics of Reusable Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Unal, Resit; Morris, W. Douglas; White, Nancy H.; Lepsch, Roger A.; Brown, Richard W.

    2000-01-01

    This paper describes the development of parametric models for estimating operational reliability and maintainability (R&M) characteristics for reusable vehicle concepts, based on vehicle size and technology support level. An R&M analysis tool (RMAT) and response surface methods are utilized to build parametric approximation models for rapidly estimating operational R&M characteristics such as mission completion reliability. These models, which approximate RMAT, can then be utilized for fast analysis of operational requirements, for life-cycle cost estimating, and for multidisciplinary design optimization.
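
    The response-surface step can be sketched as a least-squares quadratic fit to a handful of runs of the expensive tool. The function below merely stands in for RMAT, and its coefficients are invented.

```python
import numpy as np

rng = np.random.default_rng(2)

def rmat_stub(x1, x2):
    # Stand-in for an expensive RMAT run: mission completion reliability
    # as a function of two normalized design variables (illustrative).
    return 0.98 - 0.010 * x1 + 0.005 * x2 - 0.002 * x1 * x2

# Sampled design points (a real study would use a designed experiment)
x1, x2 = rng.uniform(0, 1, 30), rng.uniform(0, 1, 30)
y = rmat_stub(x1, x2)

# Full quadratic response-surface basis and least-squares fit
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The cheap surrogate can now replace RMAT inside optimization loops
def surrogate(u1, u2):
    return np.array([1.0, u1, u2, u1 * u2, u1**2, u2**2]) @ coef
```

    Because the surrogate is a closed-form polynomial, it evaluates in microseconds, which is what makes the "fast analysis" and optimization uses feasible.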

  17. Space Radiation Cancer Risks

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.

    2007-01-01

    Space radiation presents major challenges to astronauts on the International Space Station and for future missions to the Earth's moon or Mars. Methods used to project risks on Earth need to be modified because of the large uncertainties in projecting cancer risks from space radiation, which in turn impact safety factors. We describe NASA's unique approach to radiation safety, which applies uncertainty-based criteria within the occupational health program for astronauts: the two terrestrial criteria of a point estimate of a maximum acceptable level of risk and application of the principle of As Low As Reasonably Achievable (ALARA) are supplemented by a third requirement that protects against risk-projection uncertainties using the upper 95% confidence level (CL) in the radiation cancer projection model. NASA's acceptable level of risk for the ISS and its new lunar program has been set at a point estimate of a 3-percent risk of exposure-induced death (REID). Tissue-averaged organ dose equivalents are combined with age-at-exposure- and gender-dependent risk coefficients to project the cumulative occupational radiation risks incurred by astronauts. The 95% CL criterion is in practice stronger than ALARA, but not an absolute cut-off as is applied to a point projection of a 3% REID. We describe the most recent astronaut dose limits, present a historical review of astronaut organ dose estimates from the Mercury program through the current ISS program, and give future projections for lunar and Mars missions. NASA's 95% CL criterion is linked to a vibrant ground-based research program investigating the radiobiology of high-energy protons and heavy ions. The near-term goal of research is new knowledge leading to the reduction of uncertainties in projection models. Risk projections involve a product of many biological and physical factors, each of which has a differential range of uncertainty due to lack of data and knowledge.
The current model for projecting space radiation cancer risk relies on the three assumptions of linearity, additivity, and scaling along with the use of population averages. We describe uncertainty estimates for this model, and new experimental data that sheds light on the accuracy of the underlying assumptions. These methods make it possible to express risk management objectives in terms of quantitative metrics, i.e., the number of days in space without exceeding a given risk level within well defined confidence limits. The resulting methodology is applied to several human space exploration mission scenarios including lunar station, deep space outpost, and a Mars mission. Factors that dominate risk projection uncertainties and application of this approach to assess candidate mitigation approaches are described.
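
    The relation between the point estimate and the 95% CL can be sketched by propagating assumed uncertainty factors through the risk product. The dose and the lognormal factor distributions below are illustrative only, not NASA's model.

```python
import numpy as np

rng = np.random.default_rng(3)

# Illustrative only: REID as a product of an organ dose (taken as known)
# and uncertain risk-model factors; distributions are hypothetical.
dose_sv = 0.6                                          # mission dose equivalent [Sv]
risk_coef = rng.lognormal(np.log(0.05), 0.4, 100000)   # REID per Sv (uncertain)
model_factor = rng.lognormal(0.0, 0.3, 100000)         # scaling/additivity etc.

reid = dose_sv * risk_coef * model_factor
point_estimate = dose_sv * 0.05            # median-parameter point projection
upper_95cl = np.quantile(reid, 0.95)
```

    The 95% CL criterion becomes the binding constraint whenever upper_95cl exceeds the 3% REID limit even though the point estimate does not, which is exactly why it is stronger in practice than ALARA.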

  18. System of systems design: Evaluating aircraft in a fleet context using reliability and non-deterministic approaches

    NASA Astrophysics Data System (ADS)

    Frommer, Joshua B.

    This work develops and implements a solution framework that allows for an integrated solution to a resource-allocation system-of-systems problem associated with designing vehicles for integration into an existing fleet to extend that fleet's capability while improving efficiency. Typically, aircraft design focuses on a specific design mission, while a fleet perspective would provide a broader capability. Aspects of design for both the vehicles and the missions may be treated, for simplicity, as deterministic in nature or, in a model that better reflects actual conditions, as uncertain. Toward this end, the set of tasks or goals for the to-be-planned system-of-systems will be modeled more accurately with non-deterministic values, and the designed platforms will be evaluated using reliability analysis. The reliability, defined as the probability of a platform or set of platforms to complete possible missions, will contribute to the fitness of the overall system. The framework includes building surrogate models for metrics such as capability and cost, and includes the ideas of reliability in the overall system-level design space. The concurrent design-and-allocation system-of-systems problem is a multi-objective mixed-integer nonlinear programming (MINLP) problem. This study considered two system-of-systems problems that seek to simultaneously design new aircraft and allocate these aircraft into a fleet to provide a desired capability. The Coast Guard's Integrated Deepwater System program inspired the first problem, which consists of a suite of search-and-find missions for aircraft based on descriptions from the National Search and Rescue Manual. The second represents suppression of enemy air defense (SEAD) operations similar to those carried out by the U.S. Air Force, proposed as part of the Department of Defense Network Centric Warfare structure, and depicted in MIL-STD-3013.
The two problems seem similar, with long surveillance segments, but because of the complex nature of aircraft design, the analysis of the vehicle for high-speed attack combined with a long loiter period is considerably different from that for quick cruise to an area combined with a low speed search. However, the framework developed to solve this class of system-of-systems problem handles both scenarios and leads to a solution type for this kind of problem. On the vehicle-level of the problem, different technology can have an impact on the fleet-level. One such technology is Morphing, the ability to change shape, which is an ideal candidate technology for missions with dissimilar segments, such as the aforementioned two. A framework, using surrogate models based on optimally-sized aircraft, and using probabilistic parameters to define a concept of operations, is investigated; this has provided insight into the setup of the optimization problem, the use of the reliability metric, and the measurement of fleet level impacts of morphing aircraft. The research consisted of four phases. The two initial phases built and defined the framework to solve system-of-systems problem; these investigations used the search-and-find scenario as the example application. The first phase included the design of fixed-geometry and morphing aircraft for a range of missions and evaluated the aircraft capability using non-deterministic mission parameters. The second phase introduced the idea of multiple aircraft in a fleet, but only considered a fleet consisting of one aircraft type. The third phase incorporated the simultaneous design of a new vehicle and allocation into a fleet for the search-and-find scenario; in this phase, multiple types of aircraft are considered. The fourth phase repeated the simultaneous new aircraft design and fleet allocation for the SEAD scenario to show that the approach is not specific to the search-and-find scenario. 
The framework presented in this work appears to be a viable approach for concurrently designing and allocating constituents in a system, specifically aircraft in a fleet. The research also shows that new technology impact can be assessed at the fleet level using conceptual design principles.
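
    At its smallest, the allocation layer of such a concurrent design-and-allocation problem reduces to an integer program. A brute-force toy with two hypothetical aircraft types follows; all costs and capabilities are invented, and the real problem couples these integers to nonlinear aircraft-design variables.

```python
from itertools import product

# Hypothetical fleet-allocation toy: choose integer counts of two
# aircraft types to maximize fleet capability within a budget.
cost = {"A": 10.0, "B": 4.0}          # unit cost (illustrative)
capability = {"A": 3.0, "B": 1.0}     # missions covered per unit (illustrative)
budget = 30.0

feasible = [
    (na, nb) for na, nb in product(range(6), range(10))
    if na * cost["A"] + nb * cost["B"] <= budget
]
best = max(feasible,
           key=lambda c: c[0] * capability["A"] + c[1] * capability["B"])
```

    Replacing the fixed capability numbers with surrogate models of vehicle performance, and the budget check with reliability constraints, turns this enumeration into the MINLP the dissertation solves.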

  19. Self-Directed Cooperative Planetary Rovers

    NASA Technical Reports Server (NTRS)

    Zilberstein, Shlomo; Morris, Robert (Technical Monitor)

    2003-01-01

    The project is concerned with the development of decision-theoretic techniques to optimize the scientific return of planetary rovers. Planetary rovers are small unmanned vehicles equipped with cameras and a variety of sensors used for scientific experiments. They must operate under tight constraints over such resources as operation time, power, storage capacity, and communication bandwidth. Moreover, the limited computational resources of the rover limit the complexity of on-line planning and scheduling. We have developed a comprehensive solution to this problem that involves high-level tools to describe a mission; a compiler that maps a mission description and additional probabilistic models of the components of the rover into a Markov decision problem; and algorithms for solving the rover control problem that are sensitive to the limited computational resources and high level of uncertainty in this domain.
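
    The compiler's target, a Markov decision problem, can be illustrated at toy scale with value iteration. The three-state battery model, transitions, and rewards below are invented, not the project's rover model.

```python
import numpy as np

# Toy rover MDP: state = battery level (0 empty, 1 low, 2 full).
# Transition matrices P[a][s, s'] and rewards R[a][s] are illustrative.
P = {
    "science":  np.array([[1.0, 0.0, 0.0],
                          [0.8, 0.2, 0.0],
                          [0.0, 0.9, 0.1]]),
    "recharge": np.array([[0.3, 0.7, 0.0],
                          [0.0, 0.3, 0.7],
                          [0.0, 0.0, 1.0]]),
}
R = {"science": np.array([0.0, 1.0, 2.0]), "recharge": np.zeros(3)}

gamma, V = 0.95, np.zeros(3)
for _ in range(500):                      # value iteration to convergence
    V = np.max([R[a] + gamma * P[a] @ V for a in P], axis=0)

# Greedy policy: best action in each state under the converged values
policy = [max(P, key=lambda a: (R[a] + gamma * P[a] @ V)[s]) for s in range(3)]
```

    The resulting policy does science when the battery allows and recharges when it is empty; the project's contribution is solving such problems under the rover's own computational limits.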

  20. Collaboration support system for "Phobos-Soil" space mission.

    NASA Astrophysics Data System (ADS)

    Nazarov, V.; Nazirov, R.; Zakharov, A.

    2009-04-01

    Rapid development of communication facilities leads to growth of interactions conducted via electronic means. However, a paradox has emerged in this segment in recent times: extending communication facilities increases collaboration chaos. This is especially sensitive for space missions in general, and scientific space missions in particular, because an effective solution to this task supports successful realization of the missions and promises to increase the overall ratio of functional capability to mission cost. This problem may be resolved by using modern technologies and methods that are widely used in many branches, not only in space research. Approaches such as Social Networking, Web 2.0 and Enterprise 2.0 look most promising in this context. The primary goal of the "Phobos-Soil" mission is an investigation of Phobos, the Martian moon, and particularly its regolith, internal structure, and the peculiarities of its orbital and proper motion, as well as a number of scientific measurements and experiments for investigation of the Martian environment. Many investigators are involved in the mission; an effective collaboration system is therefore a key facility for its information support. Beyond the main goal of communication between users of the system, modern approaches allow such capabilities as a self-organizing community, user-generated content, and centralized and federative control of the system. It may also offer one unique possibility, knowledge management, which is very important for space mission realization. The collaboration support system for the "Phobos-Soil" mission is therefore designed on the basis of a multilayer model that includes the levels of Communications, Announcement and Information, Data Sharing, and Knowledge Management.
The collaboration support system for "Phobos-Soil" mission will be used as prototype for prospective Russian scientific space missions and the presentation describes its architecture, methodological and technical aspects of its design.

  1. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Astrophysics Data System (ADS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.
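
    The non-homogeneous Poisson occurrence model can be sketched with Lewis-Shedler thinning. The sinusoidal solar-cycle rate below is a made-up stand-in for the rate actually fitted to the historical proton database.

```python
import numpy as np

rng = np.random.default_rng(4)

def spe_rate(t):
    """Hypothetical SPE rate (events/year) over an 11-year cycle,
    peaking near solar maximum; not the fitted model from the paper."""
    return 2.0 + 8.0 * np.sin(np.pi * t / 11.0) ** 2

LAM_MAX = 10.0                            # upper bound on spe_rate

def simulate_spe_times(T=11.0):
    """Lewis-Shedler thinning for a non-homogeneous Poisson process."""
    times, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / LAM_MAX)       # candidate at rate LAM_MAX
        if t > T:
            return np.array(times)
        if rng.random() < spe_rate(t) / LAM_MAX:  # keep with prob rate/max
            times.append(t)

counts = [simulate_spe_times().size for _ in range(300)]
```

    Each simulated event would then be assigned a fluence drawn between the 5th and 95th percentile of the event-fluence distribution to build the mission risk distribution.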

  2. Probabilistic Assessment of Cancer Risk from Solar Particle Events

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Cucinotta, Francis A.

    2010-01-01

    For long duration missions outside of the protection of the Earth's magnetic field, space radiation presents significant health risks including cancer mortality. Space radiation consists of solar particle events (SPEs), comprised largely of medium energy protons (less than several hundred MeV); and galactic cosmic ray (GCR), which include high energy protons and heavy ions. While the frequency distribution of SPEs depends strongly upon the phase within the solar activity cycle, the individual SPE occurrences themselves are random in nature. We estimated the probability of SPE occurrence using a non-homogeneous Poisson model to fit the historical database of proton measurements. Distributions of particle fluences of SPEs for a specified mission period were simulated ranging from its 5th to 95th percentile to assess the cancer risk distribution. Spectral variability of SPEs was also examined, because the detailed energy spectra of protons are important especially at high energy levels for assessing the cancer risk associated with energetic particles for large events. We estimated the overall cumulative probability of GCR environment for a specified mission period using a solar modulation model for the temporal characterization of the GCR environment represented by the deceleration potential (φ). Probabilistic assessment of cancer fatal risk was calculated for various periods of lunar and Mars missions. This probabilistic approach to risk assessment from space radiation is in support of mission design and operational planning for future manned space exploration missions. In future work, this probabilistic approach to the space radiation will be combined with a probabilistic approach to the radiobiological factors that contribute to the uncertainties in projecting cancer risks.

  3. CLARREO Cornerstone of the Earth Observing System: Measuring Decadal Change Through Accurate Emitted Infrared and Reflected Solar Spectra and Radio Occultation

    NASA Technical Reports Server (NTRS)

    Sandford, Stephen P.

    2010-01-01

    The Climate Absolute Radiance and Refractivity Observatory (CLARREO) is one of four Tier 1 missions recommended by the recent NRC Decadal Survey report on Earth Science and Applications from Space (NRC, 2007). The CLARREO mission addresses the need to provide accurate, broadly acknowledged climate records that are used to enable validated long-term climate projections that become the foundation for informed decisions on mitigation and adaptation policies that address the effects of climate change on society. The CLARREO mission accomplishes this critical objective through rigorous SI traceable decadal change observations that are sensitive to many of the key uncertainties in climate radiative forcings, responses, and feedbacks that in turn drive uncertainty in current climate model projections. These same uncertainties also lead to uncertainty in attribution of climate change to anthropogenic forcing. For the first time CLARREO will make highly accurate, global, SI-traceable decadal change observations sensitive to the most critical, but least understood, climate forcings, responses, and feedbacks. The CLARREO breakthrough is to achieve the required levels of accuracy and traceability to SI standards for a set of observations sensitive to a wide range of key decadal change variables. The required accuracy levels are determined so that climate trend signals can be detected against a background of naturally occurring variability. Climate system natural variability therefore determines what level of accuracy is overkill, and what level is critical to obtain. In this sense, the CLARREO mission requirements are considered optimal from a science value perspective. The accuracy for decadal change traceability to SI standards includes uncertainties associated with instrument calibration, satellite orbit sampling, and analysis methods. 
Unlike most space missions, the CLARREO requirements are driven not by the instantaneous accuracy of the measurements, but by accuracy in the large time/space scale averages that are key to understanding decadal changes.

  4. Constraining early and interacting dark energy with gravitational wave standard sirens: the potential of the eLISA mission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Caprini, Chiara; Tamanini, Nicola, E-mail: chiara.caprini@cea.fr, E-mail: nicola.tamanini@cea.fr

    We perform a forecast analysis of the capability of the eLISA space-based interferometer to constrain models of early and interacting dark energy using gravitational wave standard sirens. We employ simulated catalogues of standard sirens given by merging massive black hole binaries visible by eLISA, with an electromagnetic counterpart detectable by future telescopes. We consider three-arm mission designs with arm lengths of 1, 2 and 5 million km, 5 years of mission duration and the best-level low frequency noise as recently tested by the LISA Pathfinder. Standard sirens with eLISA give access to an intermediate range of redshift 1 ∼< z ∼< 8, and can therefore provide competitive constraints on models where the onset of the deviation from ΛCDM (i.e. the epoch when early dark energy starts to be non-negligible, or when the interaction with dark matter begins) occurs relatively late, at z ∼< 6. If instead early or interacting dark energy is relevant already in the pre-recombination era, current cosmological probes (especially the cosmic microwave background) are more efficient than eLISA in constraining these models, except possibly in the interacting dark energy model if the energy exchange is proportional to the energy density of dark energy.
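
    The standard-siren inference pattern, fitting cosmology to pairs of redshift and gravitational-wave luminosity distance, can be sketched in the low-redshift limit d_L ≈ cz/H0. eLISA's sources sit at 1 ∼< z ∼< 8, where the full ΛCDM distance integral replaces this toy Hubble-law fit, and the catalogue below is entirely invented.

```python
import numpy as np

rng = np.random.default_rng(5)
C_KM_S = 299792.458                    # speed of light [km/s]

# Toy siren catalogue: redshifts with 2%-scatter luminosity distances
# drawn from a fiducial Hubble law (all numbers illustrative).
H0_TRUE = 70.0                         # km/s/Mpc
z = rng.uniform(0.01, 0.1, 40)
d_gw = C_KM_S * z / H0_TRUE * (1.0 + rng.normal(0.0, 0.02, 40))

# Least-squares slope of d = (c/H0) z, then invert for H0
slope = np.sum(d_gw * z) / np.sum(z * z)
H0_fit = C_KM_S / slope
```

    The forecast papers do the analogous fit over the full dark-energy parameter space, with the distance errors set by the eLISA design (arm length, mission duration, noise level).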

  5. Experimental And Numerical Evaluation Of Gaseous Agents For Suppressing Cup-Burner Flames In Low Gravity

    NASA Technical Reports Server (NTRS)

    Takahashi, Fumiaki; Linteris, Gregory T.; Katta, Viswanath R.

    2003-01-01

    Longer duration missions to the moon, to Mars, and on the International Space Station (ISS) increase the likelihood of accidental fires. NASA's fire safety program for human-crewed space flight is based largely on removing ignition sources and controlling the flammability of the material on-board. There is ongoing research to improve the flammability characterization of materials in low gravity; however, very little research has been conducted on fire suppression in the low-gravity environment. Although the existing suppression systems aboard the Space Shuttle (halon 1301, CF3Br) and the ISS (CO2 or water-based form) may continue to be used, alternative effective agents or techniques are desirable for long-duration missions. The goal of the present investigation is to: (1) understand the physical and chemical processes of fire suppression in various gravity and O2 levels simulating spacecraft, Mars, and moon missions; (2) provide rigorous testing of analytical models, which include detailed combustion-suppression chemistry and radiation sub-models, so that the model can be used to interpret (and predict) the suppression behavior in low gravity; and (3) provide basic research results useful for advances in space fire safety technology, including new fire-extinguishing agents and approaches.

  6. Constellation Probabilistic Risk Assessment (PRA): Design Consideration for the Crew Exploration Vehicle

    NASA Technical Reports Server (NTRS)

    Prassinos, Peter G.; Stamatelatos, Michael G.; Young, Jonathan; Smith, Curtis

    2010-01-01

    Managed by NASA's Office of Safety and Mission Assurance, a pilot probabilistic risk analysis (PRA) of the NASA Crew Exploration Vehicle (CEV) was performed in early 2006. The PRA methods used follow the general guidance provided in the NASA PRA Procedures Guide for NASA Managers and Practitioners. Phased-mission-based event trees and fault trees are used to model a lunar sortie mission of the CEV, involving the following phases: launch of a cargo vessel and a crew vessel; rendezvous of these two vessels in low Earth orbit; transit to the moon; lunar surface activities; ascent from the lunar surface; and return to Earth. The analysis is based upon assumptions, preliminary system diagrams, and failure data that may involve large uncertainties or may lack formal validation. Furthermore, some of the data used were based upon expert judgment or extrapolated from similar components and systems. This paper includes a discussion of the system-level models and provides an overview of the analysis results used to identify insights into CEV risk drivers, and trade and sensitivity studies. Lastly, the PRA model was used to determine changes in risk as the system configurations or key parameters are modified.
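
    In the simplest serial case, the phased-mission backbone of such a PRA reduces to a product of per-phase success probabilities. The numbers below are purely illustrative, not CEV results.

```python
from math import prod

# Serial phased-mission model: the mission is lost if any phase fails.
# Per-phase success probabilities are invented for illustration.
phases = {
    "cargo launch":   0.995,
    "crew launch":    0.998,
    "LEO rendezvous": 0.999,
    "translunar":     0.999,
    "surface ops":    0.997,
    "lunar ascent":   0.998,
    "Earth return":   0.999,
}

p_mission_success = prod(phases.values())
p_loss_of_mission = 1.0 - p_mission_success
```

    A real phased-mission PRA links event trees and fault trees across phases, with shared components and common-cause failures, rather than assuming the phases are independent as this product does.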

  7. Knowledge-based control for robot self-localization

    NASA Technical Reports Server (NTRS)

    Bennett, Bonnie Kathleen Holte

    1993-01-01

    Autonomous robot systems are being proposed for a variety of missions including the Mars rover/sample return mission. Prior to any other mission objectives being met, an autonomous robot must be able to determine its own location. This will be especially challenging because location sensors like GPS, which are available on Earth, will not be useful, nor will INS sensors because their drift is too large. Another approach to self-localization is required. In this paper, we describe a novel approach to localization by applying a problem solving methodology. The term 'problem solving' implies a computational technique based on logical representational and control steps. In this research, these steps are derived from observing experts solving localization problems. The objective is not specifically to simulate human expertise but rather to apply its techniques where appropriate for computational systems. In doing this, we describe a model for solving the problem and a system built on that model, called localization control and logic expert (LOCALE), which is a demonstration of concept for the approach and the model. The results of this work represent the first successful solution to high-level control aspects of the localization problem.

  8. Selenide isotope generator for the Galileo Mission: SIG/Galileo hermetic receptacle test program final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Roedel, S.

    1979-06-01

    The purpose of the receptacle test program was to test various types of hermetically sealed electrical receptacles and to select one model as the spaceflight hardware item for SIG/Galileo thermoelectric generators. The design goal of the program was to qualify a hermetic seal integrity of less than or equal to 1 × 10⁻⁹ std cc He/sec-atm at 400°F (204°C) and verify a reliability of 0.95 at a 50% confidence level for a flight mission in excess of 7 years.

  9. Envisioning Cognitive Robots for Future Space Exploration

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Stoica, Adrian

    2010-01-01

    Cognitive robots in the context of space exploration are envisioned with advanced capabilities of model building, continuous planning/re-planning, self-diagnosis, as well as the ability to exhibit a level of 'understanding' of new situations. An overview of some JPL components (e.g. CASPER, CAMPOUT) and a description of the architecture CARACaS (Control Architecture for Robotic Agent Command and Sensing) that combines these in a cognitive robotic system operating in various scenarios are presented. Finally, two examples of typical scenarios, a multi-robot construction mission and a human-robot mission involving direct collaboration with humans, are given.

  10. STS-103 crew pose at 195-foot level of Fixed Service Structure

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the 195-foot level of the Fixed Service Structure on Launch Pad 39B, the STS-103 crew take a break from Terminal Countdown Demonstration Test (TCDT) activities. Standing from left to right are Mission Specialists Jean-François Clervoy of France and Claude Nicollier of Switzerland, who are with the European Space Agency; Commander Curtis L. Brown Jr.; Pilot Scott J. Kelly; and Mission Specialists John M. Grunsfeld (Ph.D.), C. Michael Foale (Ph.D.) and Steven L. Smith. The TCDT provides the crew with emergency egress training, opportunities to inspect their mission payloads in the orbiter's payload bay, and simulated countdown exercises. STS-103 is a 'call-up' mission due to the need to replace and repair portions of the Hubble Space Telescope, including the gyroscopes that allow the telescope to point at stars, galaxies and planets. The STS-103 crew will be replacing a Fine Guidance Sensor, an older computer with a new enhanced model, an older data tape recorder with a solid-state digital recorder, a failed spare transmitter with a new one, and degraded insulation on the telescope with new thermal insulation. The crew will also install a Battery Voltage/Temperature Improvement Kit to protect the spacecraft batteries from overcharging and overheating when the telescope goes into a safe mode. Four EVAs are planned to make the necessary repairs and replacements on the telescope. The mission is targeted for launch Dec. 6 at 2:37 a.m. EST.

  11. Process modeling KC-135 aircraft

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.

    1991-01-01

    Instrumentation will be provided for the KC-135 aircraft to give a quantitative measure of g-level variation during parabolic flights and of its effect on experiments that demonstrate differences in results obtained under differing convective-flow conditions. The flight apparatus will provide video recordings of the effects of the g-level variations on varying fluid samples. The apparatus will be constructed to be available to fly on the KC-135 during most missions.

  12. Multi-level systems modeling and optimization for novel aircraft

    NASA Astrophysics Data System (ADS)

    Subramanian, Shreyas Vathul

    This research combines the disciplines of system-of-systems (SoS) modeling, platform-based design, optimization and evolving design spaces to achieve a novel capability for designing solutions to key aeronautical mission challenges. A central innovation in this approach is the confluence of multi-level modeling (from sub-systems to the aircraft system to aeronautical system-of-systems) in a way that coordinates the appropriate problem formulations at each level and enables parametric search in design libraries for solutions that satisfy level-specific objectives. The work here addresses the topic of SoS optimization and discusses problem formulation, solution strategy, the need for new algorithms that address special features of this problem type, and also demonstrates these concepts using two example application problems - a surveillance UAV swarm problem, and the design of noise optimal aircraft and approach procedures. This topic is critical since most new capabilities in aeronautics will be provided not just by a single air vehicle, but by aeronautical Systems of Systems (SoS). At the same time, many new aircraft concepts are pressing the boundaries of cyber-physical complexity through the myriad of dynamic and adaptive sub-systems that are rising up the TRL (Technology Readiness Level) scale. This compositional approach is envisioned to be active at three levels: validated sub-systems are integrated to form conceptual aircraft, which are further connected with others to perform a challenging mission capability at the SoS level. While these multiple levels represent layers of physical abstraction, each discipline is associated with tools of varying fidelity forming strata of 'analysis abstraction'. 
Further, the design (composition) will be guided by a suitable hierarchical complexity metric formulated for the management of complexity in both the problem (as part of the generative procedure and selection of fidelity level) and the product (i.e., is the mission best achieved via a large collection of interacting simple systems, or relatively few highly capable, complex air vehicles?). The vastly unexplored area of optimization in evolving design spaces will be studied and incorporated into the SoS optimization framework. We envision a framework that resembles a multi-level, multi-fidelity, multi-disciplinary assemblage of optimization problems. The challenge is not simply one of scaling up to a new level (the SoS), but recognizing that the aircraft sub-systems and the integrated vehicle are now intensely cyber-physical, with hardware and software components interacting in complex ways that give rise to new and improved capabilities. The work presented here is a step closer to modeling the information flow that exists in realistic SoS optimization problems between sub-contractors, contractors and the SoS architect.
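    The "few complex vs. many simple systems" trade at the SoS level can be illustrated with a toy fleet-composition search. The vehicle library, capability, and cost figures below are hypothetical stand-ins for illustration, not values from the thesis:

```python
import itertools

# Hypothetical design library: (name, capability_per_unit, unit_cost, trl)
LIBRARY = [("simple_uav", 1.0, 2.0, 8), ("complex_uav", 4.0, 9.0, 5)]

def sos_cost(design):
    """Total cost of a fleet design given as {vehicle_name: count}."""
    costs = {name: cost for name, _, cost, _ in LIBRARY}
    return sum(costs[name] * count for name, count in design.items())

def sos_capability(design):
    """Aggregate mission capability of the fleet (additive toy model)."""
    caps = {name: cap for name, cap, _, _ in LIBRARY}
    return sum(caps[name] * count for name, count in design.items())

def optimize_sos(required_capability, max_units=10):
    """Enumerate fleet compositions (the SoS-level decision) and keep
    the cheapest one meeting the mission-level capability target."""
    best = None
    for counts in itertools.product(range(max_units + 1), repeat=len(LIBRARY)):
        design = {LIBRARY[i][0]: k for i, k in enumerate(counts)}
        if sos_capability(design) >= required_capability:
            cost = sos_cost(design)
            if best is None or cost < best[1]:
                best = (design, cost)
    return best
```

In a full framework each `design` entry would itself be the output of a vehicle-level (and below that, sub-system-level) optimization rather than a fixed library row.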

  13. Formal Methods for Automated Diagnosis of Autosub 6000

    NASA Technical Reports Server (NTRS)

    Ernits, Juhan; Dearden, Richard; Pebody, Miles

    2009-01-01

    This is a progress report on applying formal methods in the context of building an automated diagnosis and recovery system for Autosub 6000, an Autonomous Underwater Vehicle (AUV). The diagnosis task involves building abstract models of the control system of the AUV. The diagnosis engine is based on Livingstone 2, a model-based diagnoser originally built for aerospace applications. Large parts of the diagnosis model can be built without concrete knowledge about each mission, but actual mission scripts and configuration parameters that carry important information for diagnosis are changed for every mission. Thus we use formal methods for generating the mission control part of the diagnosis model automatically from the mission script and perform a number of invariant checks to validate the configuration. After the diagnosis model is augmented with the generated mission control component model, it needs to be validated using verification techniques.
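    The idea of generating a mission-control model fragment from the mission script, plus static invariant checks on the configuration, might be sketched as follows. The script format, depth rating, and rule set are illustrative assumptions, not Autosub 6000 specifics:

```python
# Hypothetical mission script: ordered (behaviour, target_depth_m) legs.
MISSION_SCRIPT = [("descend", 600.0), ("track_bottom", 600.0), ("ascend", 0.0)]

VEHICLE_DEPTH_RATING_M = 6000.0

def generate_mission_component(script):
    """Turn the script into a simple state-machine fragment of the kind
    a model-based diagnoser could consume: states plus nominal transitions."""
    states = [f"leg_{i}_{behaviour}" for i, (behaviour, _) in enumerate(script)]
    transitions = list(zip(states, states[1:]))
    return {"states": states, "transitions": transitions}

def check_invariants(script):
    """Static validation of the mission configuration before the dive."""
    violations = []
    for i, (_, depth) in enumerate(script):
        if depth > VEHICLE_DEPTH_RATING_M:
            violations.append(f"leg {i}: depth {depth} exceeds vehicle rating")
    if script and script[-1][0] != "ascend":
        violations.append("mission does not end with an ascend leg")
    return violations
```

The generated fragment would then be composed with the hand-built, mission-independent parts of the diagnosis model before verification.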

  14. Space Weather Models and Their Validation and Verification at the CCMC

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  15. Manned Mars mission cost estimate

    NASA Technical Reports Server (NTRS)

    Hamaker, Joseph; Smith, Keith

    1986-01-01

    The potential costs of several options of a manned Mars mission are examined. A cost estimating methodology based primarily on existing Marshall Space Flight Center (MSFC) parametric cost models is summarized. These models include the MSFC Space Station Cost Model and the MSFC Launch Vehicle Cost Model as well as other models and techniques. The ground rules and assumptions of the cost estimating methodology are discussed and cost estimates presented for six potential mission options which were studied. The estimated manned Mars mission costs are compared to the cost of the somewhat analogous Apollo Program cost after normalizing the Apollo cost to the environment and ground rules of the manned Mars missions. It is concluded that a manned Mars mission, as currently defined, could be accomplished for under $30 billion in 1985 dollars excluding launch vehicle development and mission operations.

  16. Thermospheric density and wind retrieval from Swarm observations

    NASA Astrophysics Data System (ADS)

    Visser, Pieter; Doornbos, Eelco; van den IJssel, Jose; Teixeira da Encarnação, João

    2013-11-01

    The three-satellite ESA Swarm mission aims at mapping the Earth's global geomagnetic field at unprecedented spatial and temporal resolution and precision. Swarm also aims at observing thermospheric density and possibly horizontal winds. Precise orbit determination (POD) and Thermospheric Density and Wind (TDW) chains form part of the Swarm Constellation and Application Facility (SCARF), which will provide the so-called Level 2 products. The POD and TDW chains generate the orbit, accelerometer calibration, and thermospheric density and wind Level 2 products. The POD and TDW chains have been tested with data from the CHAMP and GRACE missions, indicating that a 3D orbit precision of about 10 cm can be reached. In addition, POD allows determination of daily accelerometer bias and scale-factor values with a precision of around 10-15 nm/s2 and 0.01-0.02, respectively, for the flight direction. With these accelerometer calibration parameter values, derived thermospheric density is consistent at the 9-11% level (standard deviation) with values predicted by models (taking into account that model values are 20-30% higher). The retrieval of crosswinds forms part of the processing chain, but will be challenging. The Swarm observations will be used for further developing and improving density and wind retrieval algorithms.
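    A minimal sketch of the calibration step, assuming the common convention a_cal = scale * (a_raw - bias) (the convention used in the actual Swarm Level 2 processing may differ), together with the percent-level comparison against model densities:

```python
def calibrate_accelerations(raw, bias, scale):
    """Apply daily bias and scale-factor estimates to raw accelerometer
    readings in the flight direction: a_cal = scale * (a_raw - bias)."""
    return [scale * (a - bias) for a in raw]

def percent_difference(observed, modelled):
    """Relative difference of an observed density vs. a model value, in
    percent; the abstract reports 9-11% consistency by this kind of metric."""
    return 100.0 * (observed - modelled) / modelled
```

A real pipeline would estimate `bias` and `scale` per day from the POD-derived non-gravitational accelerations rather than take them as inputs.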

  17. Targeted observations to improve tropical cyclone track forecasts in the Atlantic and eastern Pacific basins

    NASA Astrophysics Data System (ADS)

    Aberson, Sim David

    In 1997, the National Hurricane Center and the Hurricane Research Division began conducting operational synoptic surveillance missions with the Gulfstream IV-SP jet aircraft to improve operational forecast models. During the first two years, twenty-four missions were conducted around tropical cyclones threatening the continental United States, Puerto Rico, and the Virgin Islands. Global Positioning System dropwindsondes were released from the aircraft at 150--200 km intervals along the flight track in the tropical cyclone environment to obtain wind, temperature, and humidity profiles from flight level (around 150 hPa) to the surface. The observations were processed and formatted aboard the aircraft and transmitted to the National Centers for Environmental Prediction (NCEP). There, they were ingested into the Global Data Assimilation System that subsequently provides initial and time-dependent boundary conditions for numerical models that forecast tropical cyclone track and intensity. Three dynamical models were employed in testing the targeting and sampling strategies. With the assimilation into the numerical guidance of all the observations gathered during the surveillance missions, only the 12-h Geophysical Fluid Dynamics Laboratory Hurricane Model forecast showed statistically significant improvement. Neither the forecasts from the Aviation run of the Global Spectral Model nor the shallow-water VICBAR model were improved with the assimilation of the dropwindsonde data. This mediocre result is found to be due mainly to the difficulty in operationally quantifying the storm-motion vector used to create accurate synthetic data to represent the tropical cyclone vortex in the models. A secondary limit on forecast improvements from the surveillance missions is the limited amount of data provided by the one surveillance aircraft in regular missions. 
The inability of some surveillance missions to surround the tropical cyclone with dropwindsonde observations is a possible third limit, though the results are inconclusive. Due to limited aircraft resources, optimal observing strategies for these missions must be developed. Since observations in areas of decaying error modes are unlikely to have large impact on subsequent forecasts, such strategies should be based on taking observations in those geographic locations corresponding to the most rapidly growing error modes in the numerical models and on known deficiencies in current data assimilation systems. Here, the most rapidly growing modes are represented by areas of large forecast spread in the NCEP bred-mode global ensemble forecasting system. The sampling strategy requires sampling the entire target region at approximately the same resolution as the North American rawinsonde network to limit the possibly spurious spread of information from dropwindsonde observations into data-sparse regions where errors are likely to grow. When only the subset of data in these fully-sampled target regions is assimilated into the numerical models, statistically significant reduction of the track forecast errors of up to 25% within the critical first two days of the forecast are seen. These model improvements are comparable with the cumulative business-as-usual track forecast model improvements expected over eighteen years.
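    The targeting idea, sampling where the ensemble forecast spread (a proxy for the most rapidly growing error modes) is largest, can be sketched in plain Python; the grid values and threshold are illustrative, not NCEP ensemble output:

```python
import math

def ensemble_spread(members):
    """Standard deviation across ensemble members at each grid point;
    large spread flags regions of rapidly growing forecast error."""
    n = len(members)
    npts = len(members[0])
    spread = []
    for j in range(npts):
        vals = [m[j] for m in members]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        spread.append(math.sqrt(var))
    return spread

def target_points(members, threshold):
    """Indices of grid points whose ensemble spread exceeds a threshold,
    i.e. candidate dropwindsonde target locations."""
    return [j for j, s in enumerate(ensemble_spread(members)) if s > threshold]
```

The sampling-density constraint described in the text (covering the whole target region at roughly rawinsonde-network resolution) would then be applied on top of this target mask.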

  18. Improved Traceability of a Small Satellite Mission Concept to Requirements Using Model Based System Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin L.

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the "traditional" document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This paper presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to demonstrate the completeness and consistency of the requirements to the mission concept. Anecdotal information and process-duration metrics are presented for both the MBSE and original DBSE design efforts of SporeSat.
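    The completeness and consistency checks that such dependency relationships enable might look like the following sketch. The requirement IDs and element names are hypothetical, and a real SysML model would be queried through the modeling tool rather than through plain dictionaries:

```python
# Hypothetical extract of a SysML-style model: requirements and the
# mission-concept elements each one traces to.
REQUIREMENTS = {
    "REQ-1": ["payload_camera"],
    "REQ-2": ["comm_subsystem"],
    "REQ-3": [],                      # orphan: traces to nothing
}
CONCEPT_ELEMENTS = {"payload_camera", "comm_subsystem", "eps"}

def untraced_requirements(reqs):
    """Completeness: requirements with no link into the mission concept."""
    return sorted(r for r, links in reqs.items() if not links)

def dangling_links(reqs, elements):
    """Consistency: links pointing at elements absent from the concept."""
    return sorted((r, e) for r, links in reqs.items()
                  for e in links if e not in elements)
```

In a document-based process these checks are manual reviews of traceability matrices; in the model they become queries that stay current as the design evolves.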

  19. Improved Traceability of Mission Concept to Requirements Using Model Based Systems Engineering

    NASA Technical Reports Server (NTRS)

    Reil, Robin

    2014-01-01

    Model Based Systems Engineering (MBSE) has recently been gaining significant support as a means to improve the traditional document-based systems engineering (DBSE) approach to engineering complex systems. In the spacecraft design domain, there are many perceived and proposed benefits of an MBSE approach, but little analysis has been presented to determine the tangible benefits of such an approach (e.g. time and cost saved, increased product quality). This thesis presents direct examples of how developing a small satellite system model can improve traceability of the mission concept to its requirements. A comparison of the processes and approaches for MBSE and DBSE is made using the NASA Ames Research Center SporeSat CubeSat mission as a case study. A model of the SporeSat mission is built using the Systems Modeling Language standard and No Magic's MagicDraw modeling tool. The model incorporates mission concept and requirement information from the mission's original DBSE design efforts. Active dependency relationships are modeled to analyze the completeness and consistency of the requirements to the mission concept. Overall experience and methodology are presented for both the MBSE and original DBSE design efforts of SporeSat.

  20. Career Excess Mortality Risk from Diagnostic Radiological Exams Required for Crewmembers Participating in Long Duration Space Flight

    NASA Technical Reports Server (NTRS)

    Dodge, C. W.; Gonzalez, S. M.; Picco, C. E.; Johnston, S. L.; Shavers, M. R.; VanBaalen, M.

    2008-01-01

    NASA requires astronauts to undergo diagnostic x-ray examinations as a condition for their employment. The purpose of these procedures is to assess the astronaut's overall health and to diagnose conditions that could jeopardize the success of long duration space missions. These include exams for acceptance into the astronaut corps, routine periodic exams, as well as pre- and post-mission evaluations. Issues: According to NASA policy these medical examinations are considered occupational radiological exposures, and thus, are included when computing the astronaut's overall radiation dose and associated excess cancer mortality risk. As such, astronauts and administrators are concerned about the amount of radiation received from these procedures due to the possibility that these additional doses may cause astronauts to exceed NASA's administrative limits, thus disqualifying them from future flights. Methods: Radiation doses and cancer mortality risks following required medical radiation exposures are presented herein for representative male and female astronaut careers. Calculation of the excess cancer mortality risk was performed by adapting NASA's operational risk assessment model. Averages for astronaut height, weight, number of space missions and age at selection into the astronaut corps were used as inputs to the NASA risk model. Conclusion: The results show that the level of excess cancer mortality imposed by all required medical procedures over an entire astronaut's career is approximately the same as that resulting from a single short duration space flight (i.e., a space shuttle mission). In short, the summation of all medical procedures involving ionizing radiation should have no impact on the number of missions an astronaut can fly over their career. Learning Objectives: 1. The types of diagnostic medical exams which astronauts are subjected to will be presented. 2. 
The level of radiation dose and excess mortality risk to the average male and female astronaut will be presented.
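    The bookkeeping described, summing mission and medical doses and comparing against an administrative career limit, reduces to a simple calculation. All dose values and the limit below are illustrative placeholders, not NASA figures:

```python
# Hypothetical effective doses in millisievert (mSv); illustrative only.
MEDICAL_EXAM_DOSES_MSV = [0.1, 0.02, 0.5, 0.1]   # selection, periodic, pre/post
CAREER_LIMIT_MSV = 1000.0                        # placeholder administrative limit

def career_dose(mission_doses, exam_doses):
    """Total occupational dose: space missions plus medical exposures."""
    return sum(mission_doses) + sum(exam_doses)

def remaining_margin(mission_doses, exam_doses, limit=CAREER_LIMIT_MSV):
    """Dose margin left under the administrative career limit."""
    return limit - career_dose(mission_doses, exam_doses)
```

The abstract's conclusion corresponds to the medical term in `career_dose` being comparable to a single short-duration mission's dose, hence negligible against the career limit.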

  1. Technical Feasibility Assessment of Lunar Base Mission Scenarios

    NASA Astrophysics Data System (ADS)

    Magelssen, Trygve ``Spike''; Sadeh, Eligar

    2005-02-01

    Investigation of the literature pertaining to lunar base (LB) missions and the technologies required for LB development has revealed an information gap that hinders technical feasibility assessment. This information gap is the absence of technical readiness levels (TRL) (Mankins, 1995) and information pertaining to the criticality of the critical enabling technologies (CETs) that enable mission success. TRL is a means of identifying technical readiness stages of a technology. Criticality is defined as the level of influence the CET has on the mission scenario. The hypothesis of this research study is that technical feasibility is a function of technical readiness and technical readiness is a function of criticality. A newly developed research analysis method is used to identify the technical feasibility of LB mission scenarios. A Delphi is used to ascertain technical readiness levels and CET criticality-to-mission. The research analysis method is applied to the Delphi results to determine the technical feasibility of the LB mission scenarios that include: observatory, science research, lunar settlement, space exploration gateway, space resource utilization, and space tourism. The CETs identified encompass four major system-level technologies: transportation, life support, structures, and power systems. Results of the technical feasibility assessment show the observatory and science research LB mission scenarios to be more technically ready than the other scenarios, but all mission scenarios are in very close proximity to each other in regard to criticality and TRL and no one mission scenario stands out as being absolutely more technically ready than any of the other scenarios. What is significant and of value are the Delphi results concerning CET criticality-to-mission and the TRL values evidenced in the Tables that can be used by anyone assessing the technical feasibility of LB missions.
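    One plausible way to turn Delphi TRL and criticality outputs into a scenario-level feasibility score is a criticality-weighted mean TRL. The scoring rule and the numbers below are assumptions for illustration, not the study's actual method or data:

```python
# Hypothetical Delphi outputs: per-CET (TRL on the 1-9 scale, criticality 0-1).
SCENARIO_CETS = {
    "observatory": {"transportation": (6, 0.9), "power": (7, 0.8)},
    "settlement":  {"transportation": (6, 0.9), "life_support": (4, 1.0)},
}

def feasibility_score(cets):
    """Criticality-weighted mean TRL, normalised to 0-1: highly critical
    but immature technologies drag the scenario score down the most."""
    weight_sum = sum(w for _, w in cets.values())
    return sum(trl * w for trl, w in cets.values()) / (9.0 * weight_sum)

scores = {name: feasibility_score(cets) for name, cets in SCENARIO_CETS.items()}
```

Under this toy rule the observatory scenario scores higher because its critical technologies sit higher on the TRL scale, mirroring the ordering reported in the abstract.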

  2. Astronaut Thermal Exposure: Re-Entry After Low Earth Orbit Rescue Mission

    NASA Technical Reports Server (NTRS)

    Gillis, David B.; Hamilton, Douglas; Ilcus, Stana; Stepaniak, Phil; Son, Chang; Bue, Grant

    2009-01-01

    The STS-125 mission, launched May 11, 2009, is the final servicing mission to the Hubble Space Telescope. The repair mission's EVA tasks are described, including: installing a new wide field camera; installing the Cosmic Origins Spectrograph; repairing the Space Telescope Imaging Spectrograph; installing a new outer blanket layer; adding a Soft Capture and Rendezvous System for eventual controlled deorbit in about 2014; replacing the 'A' side Science Instrument Command and Data Handling module; repairing the Advanced Camera for Surveys; and, replacing the rate sensor unit gyroscopes, fine guidance sensors and 3 batteries. Additionally, the Shuttle crew cabin thermal environment is described. A CFD model of per person CO2 demonstrates a discrepancy between crew breathing volume and general mid-deck levels of CO2. A follow-on CFD analysis of the mid-deck temperature distribution is provided. Procedural and engineering mitigation plans are presented to counteract thermal exposure upon reentry to the Earth atmosphere. Some of the procedures include: full cold soak the night prior to deorbit; modifying deck stowage to reduce interference with air flow; and early securing of avionics post-landing to reduce cabin thermal load prior to hatch opening. Engineering mitigation activities include modifying the location of the aft starboard ICUs, eliminating the X3 stack and eliminating ICU exhaust air directed onto astronauts; improved engineering data of ICU performance; and, verifying the adequacy of mid-deck temperature control using CFD models in addition to lumped parameter models. Post-mitigation CFD models of mid-deck temperature profiles and distribution are provided.

  3. Fluid Distribution for In-space Cryogenic Propulsion

    NASA Technical Reports Server (NTRS)

    Lear, William

    2005-01-01

    The ultimate goal of this task is to enable the use of a single supply of cryogenic propellants for three distinct spacecraft propulsion missions: main propulsion, orbital maneuvering, and attitude control. A fluid distribution system is sought which allows large propellant flows during the first two missions while still allowing control of small propellant flows during attitude control. Existing research has identified the probable benefits of a combined thermal management/power/fluid distribution system based on the Solar Integrated Thermal Management and Power (SITMAP) cycle. Both a numerical model and an experimental model are constructed in order to predict the performance of such an integrated thermal management/propulsion system. This research task provides a numerical model and an experimental apparatus which will simulate an integrated thermal/power/fluid management system based on the SITMAP cycle, and assess its feasibility for various space missions. Various modifications are done to the cycle, such as the addition of a regeneration process that allows heat to be transferred into the working fluid prior to the solar collector, thereby reducing the collector size and weight. Fabri choking analysis was also accounted for. Finally the cycle is to be optimized for various space missions based on a mass-based figure of merit, namely the System Mass Ratio (SMR). The theoretical and experimental results from these models will be used to develop a design code (the JETSIT code), which is able to provide design parameters for such a system over a range of cooling loads, power generation, and attitude control thrust levels. The performance gains and mass savings will be compared to those of existing spacecraft systems.

  4. Exploration Medical System Technical Architecture Overview

    NASA Technical Reports Server (NTRS)

    Cerro, J.; Rubin, D.; Mindock, J.; Middour, C.; McGuire, K.; Hanson, A.; Reilly, J.; Burba, T.; Urbina, M.

    2018-01-01

    The Exploration Medical Capability (ExMC) Element Systems Engineering (SE) goals include defining the technical system needed to support medical capabilities for a Mars exploration mission. A draft medical system architecture was developed based on stakeholder needs, system goals, and system behaviors, as captured in an ExMC concept of operations document and a system model. This talk will discuss a high-level view of the medical system, as part of a larger crew health and performance system, both of which will support crew during Deep Space Transport missions. Other mission components, such as the flight system, ground system, caregiver, and patient, will be discussed as aspects of the context because the medical system will have important interactions with each. Additionally, important interactions with other aspects of the crew health and performance system are anticipated, such as health & wellness, mission task performance support, and environmental protection. This talk will highlight areas in which we are working with other disciplines to understand these interactions.

  5. Considerations in miniaturizing simplified agro-ecosystems for advanced life support

    NASA Technical Reports Server (NTRS)

    Volk, T.

    1996-01-01

    Miniaturizing the Earth's biogeochemical cycles to support human life during future space missions is the goal of the NASA research and engineering program in advanced life support. Mission requirements to reduce mass, volume, and power have focused efforts on (1) a maximally simplified agro-ecosystem of humans, food crops, and microbes; and, (2) a design for optimized productivity of food crops with high light levels over long days, with hydroponics, with elevated carbon dioxide and other controlled environmental factors, as well as with genetic selection for desirable crop properties. Mathematical modeling contributes to the goals by establishing trade-offs, by analyzing the growth and development of experimental crops, and by pointing to the possibilities of directed phasic control using modified field crop models to increase the harvest index.

  6. Considerations in miniaturizing simplified agro-ecosystems for advanced life support.

    PubMed

    Volk, T

    1996-01-01

    Miniaturizing the Earth's biogeochemical cycles to support human life during future space missions is the goal of the NASA research and engineering program in advanced life support. Mission requirements to reduce mass, volume, and power have focused efforts on (1) a maximally simplified agro-ecosystem of humans, food crops, and microbes; and, (2) a design for optimized productivity of food crops with high light levels over long days, with hydroponics, with elevated carbon dioxide and other controlled environmental factors, as well as with genetic selection for desirable crop properties. Mathematical modeling contributes to the goals by establishing trade-offs, by analyzing the growth and development of experimental crops, and by pointing to the possibilities of directed phasic control using modified field crop models to increase the harvest index.

  7. An improved and homogeneous altimeter sea level record from the ESA Climate Change Initiative

    NASA Astrophysics Data System (ADS)

    Legeais, Jean-François; Ablain, Michaël; Zawadzki, Lionel; Zuo, Hao; Johannessen, Johnny A.; Scharffenberg, Martin G.; Fenoglio-Marc, Luciana; Joana Fernandes, M.; Baltazar Andersen, Ole; Rudenko, Sergei; Cipollini, Paolo; Quartly, Graham D.; Passaro, Marcello; Cazenave, Anny; Benveniste, Jérôme

    2018-02-01

    Sea level is a very sensitive index of climate change since it integrates the impacts of ocean warming and ice mass loss from glaciers and the ice sheets. Sea level has been listed as an essential climate variable (ECV) by the Global Climate Observing System (GCOS). During the past 25 years, the sea level ECV has been measured from space by different altimetry missions that have provided global and regional observations of sea level variations. As part of the Climate Change Initiative (CCI) program of the European Space Agency (ESA) (established in 2010), the Sea Level project (SL_cci) aimed to provide an accurate and homogeneous long-term satellite-based sea level record. At the end of the first phase of the project (2010-2013), an initial version (v1.1) of the sea level ECV was made available to users (Ablain et al., 2015). During the second phase of the project (2014-2017), improved altimeter standards were selected to produce new sea level products (called SL_cci v2.0) based on nine altimeter missions for the period 1993-2015 (https://doi.org/10.5270/esa-sea_level_cci-1993_2015-v_2.0-201612; Legeais and the ESA SL_cci team, 2016c). Corresponding orbit solutions, geophysical corrections and altimeter standards used in this v2.0 dataset are described in detail in Quartly et al. (2017). The present paper focuses on the description of the SL_cci v2.0 ECV and associated uncertainty and discusses how it has been validated. Various approaches have been used for the quality assessment such as internal validation, comparisons with sea level records from other groups and with in situ measurements, sea level budget closure analyses and comparisons with model outputs. Compared with the previous version of the sea level ECV, we show that use of improved geophysical corrections, careful bias reduction between missions and inclusion of new altimeter missions lead to improved sea level products with reduced uncertainties on different spatial and temporal scales. 
However, there is still room for improvement since the uncertainties remain larger than the GCOS requirements (GCOS, 2011). Perspectives on subsequent evolution are also discussed.
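    The "careful bias reduction between missions" typically relies on a tandem phase in which two altimeters observe the same sea level. A minimal sketch, assuming a simple mean-difference tie over the overlap (real processing weights by region and corrects each mission's standards first):

```python
def intermission_bias(ref_series, new_series):
    """Mean difference of a new altimeter mission against the reference
    record over a tandem (overlap) phase, sampled at matched epochs."""
    diffs = [b - a for a, b in zip(ref_series, new_series)]
    return sum(diffs) / len(diffs)

def debias(series, bias):
    """Remove the estimated inter-mission bias before splicing records."""
    return [x - bias for x in series]
```

Once each mission is tied to the reference in this way, the per-mission records can be merged into a single homogeneous sea level ECV.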

  8. Managing Analysis Models in the Design Process

    NASA Technical Reports Server (NTRS)

    Briggs, Clark

    2006-01-01

    Design of large, complex space systems depends on significant model-based support for exploration of the design space. Integrated models predict system performance in mission-relevant terms given design descriptions and multiple physics-based numerical models. Both the design activities and the modeling activities warrant explicit process definitions and active process management to protect the project from excessive risk. Software and systems engineering processes have been formalized and similar formal process activities are under development for design engineering and integrated modeling. JPL is establishing a modeling process to define development and application of such system-level models.

  9. Determining Crust and Upper Mantle Structure by Bayesian Joint Inversion of Receiver Functions and Surface Wave Dispersion at a Single Station: Preparation for Data from the InSight Mission

    NASA Astrophysics Data System (ADS)

    Jia, M.; Panning, M. P.; Lekic, V.; Gao, C.

    2017-12-01

    The InSight (Interior Exploration using Seismic Investigations, Geodesy and Heat Transport) mission will deploy a geophysical station on Mars in 2018. Using seismology to explore the interior structure of Mars is one of the main targets, and as part of the mission, we will use 3-component seismic data to constrain the crust and upper mantle structure including P and S wave velocities and densities underneath the station. We will apply a reversible jump Markov chain Monte Carlo algorithm in the transdimensional hierarchical Bayesian inversion framework, in which the number of parameters in the model space and the noise level of the observed data are also treated as unknowns in the inversion process. Bayesian based methods produce an ensemble of models which can be analyzed to quantify uncertainties and trade-offs of the model parameters. In order to get better resolution, we will simultaneously invert three different types of seismic data: receiver functions, surface wave dispersion (SWD), and ZH ratios. Because the InSight mission will only deliver a single seismic station to Mars, and both the source location and the interior structure will be unknown, we will jointly invert the ray parameter in our approach. In preparation for this work, we first verify our approach by using a set of synthetic data. We find that SWD can constrain the absolute value of velocities while receiver functions constrain the discontinuities. By joint inversion, the velocity structure in the crust and upper mantle is well recovered. Then, we apply our approach to real data from the Earth-based seismic station BFO at the Black Forest Observatory in Germany, as already used in a demonstration study for single station location methods. From the comparison of the results, our hierarchical treatment shows its advantage over the conventional method in which the noise level of observed data is fixed as a prior.
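    A stripped-down, fixed-dimension analogue of the hierarchical sampler, treating the data noise level sigma as an unknown alongside a single velocity parameter, can be sketched as follows (the full method is transdimensional reversible-jump MCMC over layered models, which this toy omits):

```python
import math
import random

def log_likelihood(data, pred, sigma):
    """Gaussian log-likelihood with the noise level sigma itself unknown,
    which is the hierarchical treatment described in the abstract."""
    n = len(data)
    sse = sum((d - p) ** 2 for d, p in zip(data, pred))
    return -n * math.log(sigma) - sse / (2.0 * sigma ** 2)

def metropolis(data, forward, n_iter=5000, seed=1):
    """Fixed-dimension Metropolis sampler over (velocity v, noise sigma);
    a stand-in for the full reversible-jump machinery."""
    rng = random.Random(seed)
    v, sigma = 3.0, 1.0
    ll = log_likelihood(data, forward(v), sigma)
    samples = []
    for _ in range(n_iter):
        v_new = v + rng.gauss(0.0, 0.1)
        s_new = abs(sigma + rng.gauss(0.0, 0.05))   # reflect to keep sigma > 0
        ll_new = log_likelihood(data, forward(v_new), s_new)
        if math.log(rng.random()) < ll_new - ll:    # symmetric proposals
            v, sigma, ll = v_new, s_new, ll_new
        samples.append((v, sigma))
    return samples
```

The posterior ensemble of `(v, sigma)` pairs is what gets analyzed for uncertainties and trade-offs; letting `sigma` float keeps the velocity uncertainty honest when the assumed noise level would otherwise be wrong.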

  10. Benchmark Problems for Spacecraft Formation Flying Missions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Leitner, Jesse A.; Burns, Richard D.; Folta, David C.

    2003-01-01

    To provide high-level focus to distributed space system flight dynamics and control research, several benchmark problems are suggested. These problems are not specific to any current or proposed mission, but instead are intended to capture high-level features that would be generic to many similar missions.

  11. Reinventing The Design Process: Teams and Models

    NASA Technical Reports Server (NTRS)

    Wall, Stephen D.

    1999-01-01

    The future of space mission designing will be dramatically different from the past. Formerly, performance-driven paradigms emphasized data return with cost and schedule being secondary issues. Now and in the future, costs are capped and schedules fixed; these two variables must be treated as independent in the design process. Accordingly, JPL has redesigned its design process. At the conceptual level, design times have been reduced by properly defining the required design depth, improving the linkages between tools, and managing team dynamics. In implementation-phase design, system requirements will be held in crosscutting models, linked to subsystem design tools through a central database that captures the design and supplies needed configuration management and control. Mission goals will then be captured in timelining software that drives the models, testing their capability to execute the goals. Metrics are used to measure and control both processes and to ensure that design parameters converge through the design process within schedule constraints. This methodology manages margins controlled by acceptable risk levels. Thus, teams can evolve risk tolerance (and cost) as they would any engineering parameter. This new approach allows more design freedom for a longer time, which tends to encourage revolutionary and unexpected improvements in design.

  12. Radiation transport modeling and assessment to better predict radiation exposure, dose, and toxicological effects to human organs on long duration space flights.

    PubMed

    Denkins, P; Badhwar, G; Obot, V; Wilson, B; Jejelewo, O

    2001-01-01

    NASA is very interested in improving its ability to monitor and forecast the radiation levels that pose a health risk to space-walking astronauts as they construct the International Space Station and astronauts that will participate in long-term and deep-space missions. Human exploratory missions to the moon and Mars within the next quarter century will expose crews to transient radiation from solar particle events which include high-energy galactic cosmic rays and high-energy protons. Because the radiation levels in space are high and solar activity is presently unpredictable, adequate shielding is needed to minimize the deleterious health effects of exposure to radiation. Today, numerous models have been developed and used to predict radiation exposure. Such a model is the Space Environment Information Systems (SPENVIS) modeling program, developed by the Belgian Institute for Space Aeronautics. SPENVIS, which has been assessed to be an excellent tool in characterizing the radiation environment for microelectronics and investigating orbital debris, is being evaluated for its usefulness with determining the dose and dose-equivalent for human exposure. Thus far, the calculations for dose-depth relations under varying shielding conditions have been in agreement with calculations done using HZETRN and PDOSE, which are well-known and widely used models for characterizing the environments for human exploratory missions. There is disagreement when assessing the impact of secondary radiation particles since SPENVIS does a crude estimation of the secondary radiation particles when calculating LET versus Flux. SPENVIS was used to model dose-depth relations for the blood-forming organs. Radiation sickness and cancer are life-threatening consequences resulting from radiation exposure. In space, exposure to radiation generally includes all of the critical organs. 
Biological and toxicological impacts have been included for discussion along with alternative risk mitigation methods--shielding and anti-carcinogens. c 2001. Elsevier Science Ltd. All rights reserved.

  13. Radiation transport modeling and assessment to better predict radiation exposure, dose, and toxicological effects to human organs on long duration space flights

    NASA Technical Reports Server (NTRS)

    Denkins, P.; Badhwar, G.; Obot, V.; Wilson, B.; Jejelewo, O.

    2001-01-01

NASA is very interested in improving its ability to monitor and forecast the radiation levels that pose a health risk to space-walking astronauts as they construct the International Space Station and astronauts that will participate in long-term and deep-space missions. Human exploratory missions to the moon and Mars within the next quarter century will expose crews to transient radiation from solar particle events, which include high-energy galactic cosmic rays and high-energy protons. Because the radiation levels in space are high and solar activity is presently unpredictable, adequate shielding is needed to minimize the deleterious health effects of exposure to radiation. Today, numerous models have been developed and used to predict radiation exposure. One such model is the Space Environment Information Systems (SPENVIS) modeling program, developed by the Belgian Institute for Space Aeronomy. SPENVIS, which has been assessed to be an excellent tool for characterizing the radiation environment for microelectronics and investigating orbital debris, is being evaluated for its usefulness in determining the dose and dose-equivalent for human exposure. Thus far, the calculations for dose-depth relations under varying shielding conditions have been in agreement with calculations done using HZETRN and PDOSE, which are well-known and widely used models for characterizing the environments for human exploratory missions. There is disagreement when assessing the impact of secondary radiation particles, since SPENVIS makes only a crude estimate of the secondary radiation particles when calculating LET versus flux. SPENVIS was used to model dose-depth relations for the blood-forming organs. Radiation sickness and cancer are life-threatening consequences resulting from radiation exposure. In space, exposure to radiation generally includes all of the critical organs.
Biological and toxicological impacts have been included for discussion along with alternative risk mitigation methods: shielding and anti-carcinogens. © 2001 Elsevier Science Ltd. All rights reserved.

  14. Radiation transport modeling and assessment to better predict radiation exposure, dose, and toxicological effects to human organs on long duration space flights

    NASA Astrophysics Data System (ADS)

    Denkins, Pamela; Badhwar, Gautam; Obot, Victor; Wilson, Bobby; Jejelewo, Olufisayo

    2001-08-01

NASA is very interested in improving its ability to monitor and forecast the radiation levels that pose a health risk to space-walking astronauts as they construct the International Space Station and astronauts that will participate in long-term and deep-space missions. Human exploratory missions to the moon and Mars within the next quarter century will expose crews to transient radiation from solar particle events, which include high-energy galactic cosmic rays and high-energy protons. Because the radiation levels in space are high and solar activity is presently unpredictable, adequate shielding is needed to minimize the deleterious health effects of exposure to radiation. Today, numerous models have been developed and used to predict radiation exposure. One such model is the Space Environment Information Systems (SPENVIS) modeling program, developed by the Belgian Institute for Space Aeronomy. SPENVIS, which has been assessed to be an excellent tool for characterizing the radiation environment for microelectronics and investigating orbital debris, is being evaluated for its usefulness in determining the dose and dose-equivalent for human exposure. Thus far, the calculations for dose-depth relations under varying shielding conditions have been in agreement with calculations done using HZETRN and PDOSE, which are well-known and widely used models for characterizing the environments for human exploratory missions. There is disagreement when assessing the impact of secondary radiation particles, since SPENVIS makes only a crude estimate of the secondary radiation particles when calculating LET versus flux. SPENVIS was used to model dose-depth relations for the blood-forming organs. Radiation sickness and cancer are life-threatening consequences resulting from radiation exposure. In space, exposure to radiation generally includes all of the critical organs.
Biological and toxicological impacts have been included for discussion along with alternative risk mitigation methods — shielding and anti-carcinogens.

  15. CubeSat mission design software tool for risk estimating relationships

    NASA Astrophysics Data System (ADS)

    Gamble, Katharine Brumbaugh; Lightsey, E. Glenn

    2014-09-01

    In an effort to make the CubeSat risk estimation and management process more scientific, a software tool has been created that enables mission designers to estimate mission risks. CubeSat mission designers are able to input mission characteristics, such as form factor, mass, development cycle, and launch information, in order to determine the mission risk root causes which historically present the highest risk for their mission. Historical data was collected from the CubeSat community and analyzed to provide a statistical background to characterize these Risk Estimating Relationships (RERs). This paper develops and validates the mathematical model based on the same cost estimating relationship methodology used by the Unmanned Spacecraft Cost Model (USCM) and the Small Satellite Cost Model (SSCM). The RER development uses general error regression models to determine the best fit relationship between root cause consequence and likelihood values and the input factors of interest. These root causes are combined into seven overall CubeSat mission risks which are then graphed on the industry-standard 5×5 Likelihood-Consequence (L-C) chart to help mission designers quickly identify areas of concern within their mission. This paper is the first to document not only the creation of a historical database of CubeSat mission risks, but, more importantly, the scientific representation of Risk Estimating Relationships.
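The Risk Estimating Relationship idea above can be sketched numerically. The following is a minimal illustration, not the paper's actual model: it fits a hypothetical single-factor power-law relationship between one mission characteristic (mass) and a root-cause likelihood score, then bins continuous likelihood-consequence scores onto the industry-standard 5×5 L-C chart. All data values, the single-factor form, and the function names are invented for illustration.

```python
import math

# Hypothetical RER sketch: fit likelihood = a * mass^b to historical
# CubeSat data via least squares in log-log space, in the spirit of the
# cost-estimating-relationship methodology the paper adapts.
history = [(1.0, 1.2), (2.0, 1.9), (3.0, 2.6), (4.0, 3.1)]  # (mass kg, likelihood score), invented

n = len(history)
sx = sum(math.log(m) for m, _ in history)
sy = sum(math.log(l) for _, l in history)
sxx = sum(math.log(m) ** 2 for m, _ in history)
sxy = sum(math.log(m) * math.log(l) for m, l in history)
b = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # power-law exponent
a = math.exp((sy - b * sx) / n)                 # power-law coefficient

def predict_likelihood(mass_kg: float) -> float:
    return a * mass_kg ** b

def lc_cell(likelihood: float, consequence: float) -> tuple:
    # Bin continuous scores onto the 5x5 Likelihood-Consequence chart.
    clamp = lambda v: max(1, min(5, round(v)))
    return clamp(likelihood), clamp(consequence)
```

The paper's general error regression models are more sophisticated than this ordinary least-squares fit, but the pipeline shape (historical data in, L-C chart cell out) is the same.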

  16. Quantitative Validation of the Integrated Medical Model (IMM) for ISS Missions

    NASA Technical Reports Server (NTRS)

    Young, Millennia; Arellano, J.; Boley, L.; Garcia, Y.; Saile, L.; Walton, M.; Kerstman, E.; Reyes, D.; Goodenow, D. A.; Myers, J. G.

    2016-01-01

Lifetime Surveillance of Astronaut Health (LSAH) provided observed medical event data on 33 ISS and 111 STS person-missions for use in further improving and validating the Integrated Medical Model (IMM). Using only the crew characteristics from these observed missions, the newest development version, IMM v4.0, will simulate these missions to predict medical events and outcomes. Comparing IMM predictions to the actual observed medical event counts will provide external validation and identify areas of possible improvement. To improve the power of detecting differences in this validation study, the total over each program (ISS and STS) will serve as the main quantitative comparison objective, specifically the following parameters: total medical events (TME), probability of loss of crew life (LOCL), and probability of evacuation (EVAC). Scatter plots of observed versus median predicted TMEs (with error bars reflecting the simulation intervals) will graphically display comparisons, while linear regression will serve as the statistical test of agreement. Two scatter plots will be analyzed: 1) where each point reflects a mission, and 2) where each point reflects a condition-specific total number of occurrences. The coefficient of determination (R2) resulting from a linear regression with no intercept bias (intercept fixed at zero) will serve as an overall metric of agreement between IMM and the real-world system (RWS). To identify as many discrepancies as possible for further inspection, the α-level for all statistical tests comparing IMM predictions to observed data will be set to 0.1. This less stringent criterion, along with the multiple testing being conducted, should detect all perceived differences, including many false-positive signals resulting from random variation.
The results of these analyses will reveal areas of the model requiring adjustment to improve overall IMM output, which will thereby provide better decision support for mission critical applications.
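The agreement metric described above (a zero-intercept regression of observed on predicted totals, summarized by R2) can be sketched in a few lines. The observed/predicted pairs below are invented for illustration; only the computation mirrors the record's description.

```python
# Zero-intercept linear regression of observed medical-event totals on
# IMM-predicted totals, and the resulting coefficient of determination.
# Data values are invented for illustration.
predicted = [10.0, 22.0, 35.0, 41.0]
observed = [11.0, 20.0, 37.0, 40.0]

# Slope of a no-intercept least-squares fit: sum(x*y) / sum(x^2).
slope = sum(p * o for p, o in zip(predicted, observed)) / sum(p * p for p in predicted)

# R^2 for the zero-intercept model, computed against the uncentered total
# sum of squares (the conventional choice when the intercept is fixed at 0).
ss_res = sum((o - slope * p) ** 2 for p, o in zip(predicted, observed))
ss_tot = sum(o * o for o in observed)
r_squared = 1.0 - ss_res / ss_tot
```

A slope near 1 with high R2 would indicate good IMM/RWS agreement; points driving the residual sum would be the conditions flagged for further inspection.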

  17. The optimisation, design and verification of feed horn structures for future Cosmic Microwave Background missions

    NASA Astrophysics Data System (ADS)

    McCarthy, Darragh; Trappe, Neil; Murphy, J. Anthony; O'Sullivan, Créidhe; Gradziel, Marcin; Doherty, Stephen; Huggard, Peter G.; Polegro, Arturo; van der Vorst, Maarten

    2016-05-01

In order to investigate the origins of the Universe, it is necessary to carry out full-sky surveys of the temperature and polarisation of the Cosmic Microwave Background (CMB) radiation, the remnant of the Big Bang. Missions such as COBE and Planck have previously mapped the CMB temperature; however, in order to further constrain evolutionary and inflationary models, it is necessary to measure the polarisation of the CMB with greater accuracy and sensitivity than before. Missions undertaking such observations require large arrays of feed horn antennas to feed the detector arrays. Corrugated horns provide the best performance; however, owing to the large number required (circa 5000 in the case of the proposed COrE+ mission), such horns are prohibitive in terms of thermal, mechanical and cost limitations. In this paper we consider the optimisation of an alternative smooth-walled piecewise conical profiled horn, using the mode-matching technique alongside a genetic algorithm. The technique is optimised to return a suitable design using efficient modelling software and standard desktop computing power. A design is presented showing a directional beam pattern and low levels of return loss, cross-polar power and sidelobes, as required by future CMB missions. This design was manufactured and the measured results compared with simulation, showing excellent agreement and meeting the required performance criteria. The optimisation process described here is robust and can be applied to many other applications where specific performance characteristics are required, with the user simply defining the beam requirements.
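The optimisation loop described above pairs a genetic algorithm with a mode-matching electromagnetic model. The sketch below shows only the genetic-algorithm skeleton; a simple analytic "merit function" stands in for the mode-matching simulation, and the section count, target profile, and tuning constants are all invented for illustration.

```python
import random

# Minimal genetic algorithm over a piecewise-conical horn profile
# (N_SECTIONS radii). In the real workflow, merit() would call the
# mode-matching model and score return loss, cross-polarisation, and
# sidelobe levels; here it is a stand-in distance to an "ideal" profile.
random.seed(0)

N_SECTIONS = 5
TARGET = [1.0, 1.4, 1.9, 2.3, 2.8]  # stand-in for an ideal profile

def merit(profile):
    # Lower is better.
    return sum((p - t) ** 2 for p, t in zip(profile, TARGET))

def evolve(pop_size=30, generations=60, mutation=0.1):
    pop = [[random.uniform(0.5, 3.5) for _ in range(N_SECTIONS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=merit)
        parents = pop[: pop_size // 2]        # keep the better half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_SECTIONS)   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.5:               # mutate one radius
                i = random.randrange(N_SECTIONS)
                child[i] += random.gauss(0.0, mutation)
            children.append(child)
        pop = parents + children
    return min(pop, key=merit)

best = evolve()
```

The expensive part in practice is the merit evaluation, which is why the paper emphasises an efficient mode-matching implementation that runs on desktop computing power.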

  18. Error Analysis for High Resolution Topography with Bi-Static Single-Pass SAR Interferometry

    NASA Technical Reports Server (NTRS)

    Muellerschoen, Ronald J.; Chen, Curtis W.; Hensley, Scott; Rodriguez, Ernesto

    2006-01-01

We present a flow-down error analysis, from the radar system to topographic height errors, for bi-static single-pass SAR interferometry with a satellite tandem pair. Because the baseline length and baseline orientation evolve spatially and temporally due to orbital dynamics, the height accuracy of the system is modeled as a function of the spacecraft position and ground location. Vector sensitivity equations for height and the planar error components due to metrology, media effects, and radar system errors are derived and evaluated globally for a baseline mission. Included in the model are terrain effects that contribute to layover and shadow, as well as slope effects on height errors. The analysis also accounts for non-overlapping spectra and the non-overlapping bandwidth due to differences between the two platforms' viewing geometries. The model is applied to a 514 km altitude, 97.4 degree inclination tandem satellite mission with a 300 m baseline separation and an X-band SAR. Results from our model indicate that global DTED level 3 accuracy can be achieved.
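The core of any such flow-down is the first-order sensitivity mapping interferometric phase error to height error. The sketch below uses the textbook single-pass relation dh = (λ R sinθ / (2π B⊥)) dφ; the wavelength, range, look angle, and baseline values are illustrative placeholders, not the mission parameters analyzed in the paper.

```python
import math

# First-order InSAR height sensitivity: how a small interferometric phase
# error maps into a topographic height error for a single-pass bistatic pair.
wavelength = 0.031           # X-band wavelength, m (illustrative)
slant_range = 600e3          # slant range R, m (illustrative)
look_angle = math.radians(35.0)
perp_baseline = 300.0        # perpendicular baseline, m

def height_error(phase_error_rad: float) -> float:
    # dh = (lambda * R * sin(theta)) / (2*pi * B_perp) * dphi
    return (wavelength * slant_range * math.sin(look_angle)
            / (2.0 * math.pi * perp_baseline)) * phase_error_rad
```

The full analysis in the paper sums many such contributions (metrology, media, radar noise) as vectors and lets the baseline geometry vary along the orbit; this scalar form only shows why a longer perpendicular baseline reduces the height error per radian of phase noise.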

  19. Radiation Hardness Assurance (RHA) for Small Missions

    NASA Technical Reports Server (NTRS)

    Campola, Michael J.

    2016-01-01

Mission life and complexity are growing for small spacecraft. Small missions benefit from the same detailed hazard definition and evaluation performed in the past for larger ones. Requirements need to flow from the system down to the parts level and support system-level radiation tolerance. The importance of RHA grows with increasing COTS usage.

  20. Space transfer vehicle concepts and requirements study. Volume 2, book 3: STV system interfaces

    NASA Technical Reports Server (NTRS)

    Weber, Gary A.

    1991-01-01

This report presents the results of systems analyses and conceptual design of space transfer vehicles (STV). The missions examined included piloted and unpiloted lunar outpost support and spacecraft servicing, and unpiloted payload delivery to various earth and solar orbits. The study goal was to examine the mission requirements and provide a decision data base for future programmatic development plans. The final lunar transfer vehicles provided a wide range of capabilities and interface requirements while maintaining a constant payload mission model. Launch vehicle and space station sensitivities were examined, with the final vehicles serving as point designs covering the range of possible options. Development programs were defined and technology readiness levels for different options were determined. Volume 1 presents the executive summary, volume 2 provides the study results, and volume 3 the cost and WBS data.

  1. Long-term Preservation of Data Analysis Capabilities

    NASA Astrophysics Data System (ADS)

    Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.

    2015-09-01

While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have contributed so far to maintain some instrument- or mission-specific data reduction packages on top of high-level general purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over. This is especially difficult under the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA, using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service, to keep data analysis and calibration software fully available for decades at minimal cost.

  2. [Medium-term forecast of solar cosmic rays radiation risk during a manned Mars mission].

    PubMed

    Petrov, V M; Vlasov, A G

    2006-01-01

Medium-term forecasting of the radiation hazard from solar cosmic rays (SCR) will be vital in a manned Mars mission. Modern methods of space physics lack acceptable reliability in medium-term forecasting of SCR onset and parameters. The proposed estimation of average radiation risk from SCR during the manned Mars mission is made with the use of existing SCR fluence and spectrum models and the correlation of solar particle event frequency with the predicted Wolf number. Radiation risk is considered an additional death probability from acute radiation reactions (the ergonomic component) or acute radiation disease in flight. The algorithm for radiation risk calculation is described, and the resulting risk levels for various periods of the 23rd solar cycle are presented. Applicability of this method to advance forecasting and possible improvements are being investigated. Recommendations to the crew based on risk estimation are exemplified.

  3. Calculation of Operations Efficiency Factors for Mars Surface Missions

    NASA Technical Reports Server (NTRS)

    Layback, Sharon L.

    2014-01-01

    For planning of Mars surface missions, to be operated on a sol-by-sol basis by a team on Earth (where a "sol" is a Martian day), activities are described in terms of "sol types" that are strung together to build a surface mission scenario. Some sol types require ground decisions based on a previous sol's results to feed into the activity planning ("ground in the loop"), while others do not. Due to the differences in duration between Earth days and Mars sols, for a given Mars local solar time, the corresponding Earth time "walks" relative to the corresponding times on the prior sol/day. In particular, even if a communication window has a fixed Mars local solar time, the Earth time for that window will be approximately 40 minutes later each succeeding day. Further complexity is added for non-Mars synchronous communication relay assets, and when there are multiple control centers in different Earth time zones. The solution is the development of "ops efficiency factors" that reflect the efficiency of a given operations configuration (how many and location of control centers, types of communication windows, synchronous or non-synchronous nature of relay assets, sol types, more-or-less sustainable operations schedule choices) against a theoretical "optimal" operations configuration for the mission being studied. These factors are then incorporated into scenario models in order to determine the surface duration (and therefore minimum spacecraft surface lifetime) required to fulfill scenario objectives. The resulting model is used to perform "what-if" analyses for variations in scenario objectives. The ops efficiency factor is the ratio of the figure of merit for a given operations factor to the figure of merit for the theoretical optimal configuration. The current implementation is a pair of models in Excel. 
The first represents a ground operations schedule for 500 sols in each operations configuration for the mission being studied (500 sols was chosen as long enough to capture variations in relay asset interactions, Earth/Mars time phasing, and seasonal variations in holidays). This model is used to estimate the ops efficiency factor for each operations configuration. The second model in a separate Excel spreadsheet is a scenario model, which uses the sol types to tally the total number of "scenario sols" for that scenario (in other words, the ideal number of sols it would take to perform the scenario objectives). Then, the number of sols requiring ground in the loop is calculated based on the sol types contained in the given scenario. Next, the scenario contains a description of what sequence of operations configurations is used, for how many days each, and this is used with the corresponding ops efficiency factors for each configuration to calculate the "ops duration" corresponding to that scenario. Finally, a margin is applied to determine the minimum surface lifetime required for that scenario. Typically, this level of analysis has not been performed until much later in the mission, and has not been able to influence mission design. Further, the notion of moving to sustainable operations during Prime Mission - and the effect that move would have on surface mission productivity and mission objective choices - has not been encountered until the most recent rover missions (MSL and Mars 2018).
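Two of the quantities described above are easy to sketch: the daily Earth-time "walk" of a fixed Mars local solar time, and the stretch a scenario suffers when run at a given ops efficiency factor. Sol length and the example efficiency value below are illustrative round numbers.

```python
# The Earth-time walk of a fixed Mars local solar time, and the use of an
# ops efficiency factor to size a surface mission duration.
SOL_MINUTES = 24 * 60 + 39.6   # one Mars sol is ~24 h 39.6 min of Earth time
DAY_MINUTES = 24 * 60

def window_walk_minutes(n_sols: int) -> float:
    # A comm window at fixed Mars local time starts ~39.6 Earth minutes
    # later each successive sol.
    return n_sols * (SOL_MINUTES - DAY_MINUTES)

def ops_duration_sols(scenario_sols: int, ops_efficiency: float) -> float:
    # The ops efficiency factor is the ratio of a configuration's figure of
    # merit to the theoretical optimum, so the real schedule stretches by
    # its inverse (margin would be applied on top of this).
    return scenario_sols / ops_efficiency
```

For example, a 100-sol scenario run in a configuration with an ops efficiency factor of 0.8 needs 125 sols of surface lifetime before margin.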

  4. Utilizing Mars Global Reference Atmospheric Model (Mars-GRAM 2005) to Evaluate Entry Probe Mission Sites

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.; Justus, C. G.

    2008-01-01

Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Mars-GRAM's perturbation modeling capability is commonly used, in a Monte-Carlo mode, to perform high-fidelity engineering end-to-end simulations for entry, descent, and landing (EDL). Traditional Mars-GRAM options for representing the mean atmosphere along entry corridors include: a) TES Mapping Years 1 and 2, with Mars-GRAM data coming from MGCM model results driven by observed TES dust optical depth; and b) TES Mapping Year 0, with user-controlled dust optical depth and Mars-GRAM data interpolated from MGCM model results driven by selected values of globally uniform dust optical depth. From the surface to 80 km altitude, Mars-GRAM is based on the NASA Ames Mars General Circulation Model (MGCM). Mars-GRAM and MGCM use surface topography from the Mars Global Surveyor Mars Orbiter Laser Altimeter (MOLA), with altitudes referenced to the MOLA areoid, or constant-potential surface. Mars-GRAM 2005 has been validated against Radio Science data, and against both nadir and limb data from the Thermal Emission Spectrometer (TES).
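The Monte-Carlo usage pattern described above can be sketched generically: a mean density profile from the engineering model, dispersed by random perturbations, feeds each EDL simulation sample. The exponential mean profile, scale height, and 5% perturbation sigma below are illustrative stand-ins, not Mars-GRAM outputs.

```python
import math
import random

# Generic Monte-Carlo atmosphere dispersion sketch (not Mars-GRAM itself):
# each sample multiplies a mean density profile by random perturbations.
random.seed(1)

SCALE_HEIGHT_KM = 11.1     # illustrative Mars density scale height
SURFACE_DENSITY = 0.020    # kg/m^3, illustrative surface value

def mean_density(alt_km: float) -> float:
    # Simple exponential stand-in for the model's mean atmosphere.
    return SURFACE_DENSITY * math.exp(-alt_km / SCALE_HEIGHT_KM)

def dispersed_profile(alts_km, sigma_frac=0.05):
    # One Monte-Carlo sample: mean profile times a random perturbation
    # at each altitude. Real perturbations are correlated in altitude;
    # independent draws are used here only for brevity.
    return [mean_density(z) * (1.0 + random.gauss(0.0, sigma_frac))
            for z in alts_km]
```

An EDL simulation would draw one `dispersed_profile` per Monte-Carlo run and fly the trajectory against it.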

  5. Recent and upcoming observations of the CARacterisation et Modelisation de l'ENvironnement (CARMEN) mission

    NASA Astrophysics Data System (ADS)

    Ecoffet, Robert; Maget, Vincent; Rolland, Guy; Lorfevre, Eric; Bourdarie, Sébastien; Boscher, Daniel

    2016-07-01

We have developed a series of instruments for energetic particle measurements, associated with component test beds "MEX". The aim of this program is to check and improve space radiation engineering models and techniques. The first series of instruments, "ICARE", has flown on the MIR space station (SPICA mission), the ISS (SPICA-S mission) and the SAC-C low Earth polar orbiting satellite (ICARE mission, 2001-2011) in cooperation with the Argentinian space agency CONAE. A second series of instruments, "ICARE-NG", was and is flown as: - CARMEN-1 mission on CONAE's SAC-D, 650 km, 98°, 2011-2015, along with three "SODAD" space micro-debris detectors - CARMEN-2 mission on the JASON-2 satellite (CNES, JPL, EUMETSAT, NOAA), 1336 km, 66°, 2008-now, along with JAXA's LPT energetic particle detector - CARMEN-3 mission on the JASON-3 satellite in the same orbit as JASON-2, launched 17 January 2016, along with a plasma detector "AMBRE", and JAXA's LPT again. The ICARE-NG spectrometer is composed of a set of three fully depleted silicon solid-state detectors used in single and coincident mode. The on-board measurements consist of accumulating energy-loss spectra in the detectors over a programmable accumulation period. The spectra are generated through signal amplitude classification using 8-bit ADCs, resulting in 128- or 256-channel histograms. The discriminators' reference levels, amplifier gain and accumulation time for the spectra are programmable to provide for possible on-board tuning optimization. Ground-level calibrations have been made at ONERA-DESP using a radioactive source emitting alpha particles in order to determine the exact correspondence between channel number and particle energy. To obtain the response functions to particles, a detailed sectoring analysis of the satellite associated with GEANT-4/MCNP-X calculations has been performed to characterize the geometrical factors of each detector for p+ as well as for e- with different energies.
The component test bed "MEX" is equipped with two different types of active dosimeters, P-MOS silicon dosimeters and OSL (optically stimulated luminescence). Those dosimeters provide independent measurements of ionizing and displacement damage doses and consolidate spectrometers' observations. The data sets obtained cover more than one solar cycle. Dynamics of the radiation belts, effects of solar particle events, coronal mass ejections and coronal holes were observed. Spectrometer measurements and dosimeter readings were used to evaluate current engineering models, and helped in developing improved ones, along with "space weather" radiation belt indices. The presentation will provide a comprehensive review of detector features and mission results.
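The channel-to-energy calibration step described above can be sketched as a simple gain/offset mapping from ADC histogram channel to deposited energy, which is what an alpha-source calibration ultimately pins down. The gain and offset constants below are invented for illustration; the real correspondence comes from the ONERA-DESP measurements.

```python
# Sketch of channel-to-energy conversion for an 8-bit ADC histogram.
# Calibration constants are illustrative placeholders, not ICARE-NG values.
GAIN_KEV_PER_CHANNEL = 40.0
OFFSET_KEV = 15.0

def channel_to_energy_kev(channel: int, n_channels: int = 256) -> float:
    # Map an ADC histogram channel to deposited energy via the linear
    # calibration determined on the ground with an alpha source.
    if not 0 <= channel < n_channels:
        raise ValueError("channel outside 8-bit histogram range")
    return OFFSET_KEV + GAIN_KEV_PER_CHANNEL * channel
```

With the programmable gain and discriminator levels mentioned in the record, the two constants would differ per configuration, which is why each tuning needs its own calibration.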

  6. Effect of Voltage Level on Power System Design for Solar Electric Propulsion Missions

    NASA Technical Reports Server (NTRS)

    Kerslake, Thomas W.

    2003-01-01

This paper presents study results quantifying the benefits of higher-voltage electric power system designs for a typical solar electric propulsion spacecraft on an Earth-orbiting mission. A conceptual power system architecture was defined and design points were generated for system voltages of 28 V, 50 V, 120 V, and 300 V using state-of-the-art or advanced technologies. A 300-V 'direct-drive' architecture was also analyzed to assess the benefits of directly powering the electric thruster from the photovoltaic array without up-conversion. Fortran and spreadsheet computational models were exercised to predict performance and to size power system components to meet spacecraft mission requirements. Pertinent space environments, such as electron and proton radiation, were calculated along the spiral trajectory. In addition, a simplified electron current collection model was developed to estimate photovoltaic array losses for the orbital plasma environment and for the environment created by the thruster plume. The secondary benefits of power system mass savings for spacecraft propulsion and attitude control systems were also quantified. Results indicate that considerable spacecraft wet mass savings were achieved by the 300-V and 300-V direct-drive architectures.
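One intuition behind the voltage trade above can be shown with a back-of-envelope scaling argument (my addition, not the paper's model): for fixed delivered power and a fixed allowed resistive-loss fraction, distribution current falls as 1/V, the allowable harness resistance rises as V², and conductor cross-section (hence mass) falls as 1/V².

```python
# Back-of-envelope harness-mass scaling with distribution voltage.
# P_loss = I^2 * R with I = P/V; holding P_loss/P fixed allows R ~ V^2,
# and conductor mass ~ area ~ 1/R for fixed length, so mass ~ 1/V^2.
def relative_harness_mass(voltage: float, reference_voltage: float = 28.0) -> float:
    return (reference_voltage / voltage) ** 2

# Relative conductor mass at the study's four system voltages.
masses = {v: relative_harness_mass(v) for v in (28.0, 50.0, 120.0, 300.0)}
```

This first-order scaling ignores insulation, plasma-interaction, and converter effects, which is exactly why the paper's detailed component models (and the electron current collection model) are needed for a real trade.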

  7. Lidar remote sensing of savanna biophysical attributes

    NASA Astrophysics Data System (ADS)

    Gwenzi, David

Although savanna ecosystems cover approximately 20% of the terrestrial land surface and can have productivity equal to some closed forests, their role in the global carbon cycle is poorly understood. This study explored the applicability of a past spaceborne lidar mission and the potential of future missions to estimate canopy height and carbon storage in these biomes. The research used data from two oak savannas in California, USA: the Tejon Ranch Conservancy in Kern County and the Tonzi Ranch in Santa Clara County. In the first paper, we used non-parametric regression techniques to estimate canopy height from waveform parameters derived from the Ice, Cloud, and land Elevation Satellite's Geoscience Laser Altimeter System (ICESat-GLAS) data. Merely adopting the methods derived for forests did not produce adequate results, but the modeling was significantly improved by incorporating canopy cover information and interaction terms to address the high structural heterogeneity inherent to savannas. Paper 2 explored the relationship between canopy height and aboveground biomass. To accomplish this, we developed generalized models using the classical least squares regression modeling approach to relate canopy height to aboveground woody biomass, and then employed Hierarchical Bayesian Analysis (HBA) to explore the implications of using generalized instead of species composition-specific models. Models that incorporated canopy cover proxies performed better than those that did not. Although the model parameters indicated interspecific variability, the distribution of the posterior densities of the differences between composition-level and global-level parameter values showed high support for the use of global parameters, suggesting that these canopy height-biomass models are applicable at large scales.
As the spatial coverage of spaceborne lidar will remain limited for the immediate future, our objective in paper 3 was to explore the best means of extrapolating plot-level biomass into wall-to-wall maps that provide more ecological information. We evaluated the utility of three spatial modeling approaches to address this problem: deterministic methods, geostatistical methods and an image segmentation approach. Overall, the mean pixel biomass estimated by the three approaches did not differ significantly, but the output maps showed marked differences in the estimation precision and the ability of each model to mimic the primary variable's trend across the landscape. The results emphasized the need for future satellite lidar missions to consider increasing the sampling intensity across track so that biomass observations are made and characterized at the scale at which they vary. We used data from the Multiple Altimeter Beam Experimental Lidar (MABEL), an airborne photon-counting lidar sensor developed by NASA Goddard, to simulate ICESat-2 data. We segmented each transect into different block sizes and calculated canopy top and mean ground elevation based on the structure of the histogram of the block's aggregated photons. Our algorithm was able to compute canopy height and generate visually meaningful vegetation profiles at MABEL's signal and noise levels. However, a simulation of the expected performance of ICESat-2, made by adjusting the number of signal and noise photons detected in the MABEL data to that predicted using ATLAS instrument model design cases, indicated that signal photons will be substantially fewer. The lower data resolution reduces canopy height estimation precision, especially in areas of low-density vegetation cover. Given the clear difficulties in processing simulated ATLAS data, it appears unlikely that it will provide the kind of data required for mapping the biophysical properties of savanna vegetation.
Rather, resources are better concentrated on preparing for the Global Ecosystem Dynamics Investigation (GEDI) mission, a waveform lidar mission scheduled to launch by the end of this decade. In addition to the full-waveform technique, GEDI will collect data from 25 m diameter contiguous footprints with a high across-track density, a requirement that we identified as critically necessary in paper 3. (Abstract shortened by UMI.)
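The block-wise photon-counting retrieval described above (aggregate a block's photon elevations, separate ground from canopy top, difference them) can be sketched with quantiles standing in for the histogram-structure analysis. The quantile choices and the synthetic photon cloud below are invented for illustration.

```python
# Sketch of a block-wise photon-counting canopy height retrieval:
# ground elevation from a low quantile of the block's photon elevations,
# canopy top from a high quantile, height as the difference.
def quantile(sorted_vals, q):
    i = min(len(sorted_vals) - 1, int(q * len(sorted_vals)))
    return sorted_vals[i]

def block_canopy_height(photon_elevs_m, ground_q=0.05, top_q=0.98):
    elevs = sorted(photon_elevs_m)
    return quantile(elevs, top_q) - quantile(elevs, ground_q)

# Synthetic block: ground photons near 100 m, canopy photons near 112 m.
photons = [100.0 + 0.1 * i for i in range(20)] + [112.0 + 0.1 * i for i in range(5)]
height = block_canopy_height(photons)
```

With fewer signal photons per block (the ATLAS case discussed above), both quantiles become noisier, which is the mechanism behind the reduced height precision over sparse vegetation.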

  8. Statistical Analysis of a Large Sample Size Pyroshock Test Data Set Including Post Flight Data Assessment. Revision 1

    NASA Technical Reports Server (NTRS)

    Hughes, William O.; McNelis, Anne M.

    2010-01-01

    The Earth Observing System (EOS) Terra spacecraft was launched on an Atlas IIAS launch vehicle on its mission to observe planet Earth in late 1999. Prior to launch, the new design of the spacecraft's pyroshock separation system was characterized by a series of 13 separation ground tests. The analysis methods used to evaluate this unusually large amount of shock data will be discussed in this paper, with particular emphasis on population distributions and finding statistically significant families of data, leading to an overall shock separation interface level. The wealth of ground test data also allowed a derivation of a Mission Assurance level for the flight. All of the flight shock measurements were below the EOS Terra Mission Assurance level thus contributing to the overall success of the EOS Terra mission. The effectiveness of the statistical methodology for characterizing the shock interface level and for developing a flight Mission Assurance level from a large sample size of shock data is demonstrated in this paper.
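A common way to turn a sample of shock tests into an interface or assurance level, in the spirit of the statistical analysis above (though not necessarily the paper's exact method), is to treat shock response spectrum levels as lognormal, take statistics in dB, and quote a P95/50 limit (95th percentile, 50% confidence), which for normally distributed dB values is the mean plus 1.645 standard deviations. The sample values below are invented for illustration.

```python
import math

# P95/50 shock limit sketch: statistics in dB space for lognormal levels.
levels_g = [850.0, 920.0, 780.0, 1010.0, 890.0, 960.0]  # invented peak SRS levels

db = [20.0 * math.log10(g) for g in levels_g]
mean_db = sum(db) / len(db)
sigma_db = math.sqrt(sum((d - mean_db) ** 2 for d in db) / (len(db) - 1))

# 95th percentile at 50% confidence: mean + 1.645*sigma in dB space.
p95_50_db = mean_db + 1.645 * sigma_db
p95_50_g = 10.0 ** (p95_50_db / 20.0)
```

Higher-confidence limits (e.g. P95/90) replace 1.645 with a sample-size-dependent tolerance factor, which is where a larger test sample, as in the 13-test EOS Terra campaign, pays off directly.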

  9. Early Formulation Model-centric Engineering on Nasa's Europa Mission Concept Study

    NASA Technical Reports Server (NTRS)

Bayer, Todd; Chung, Seung; Cole, Bjorn; Cooke, Brian; Dekens, Frank; Delp, Chris; Gontijo, I.; Lewis, Kari; Moshir, Mehrdad; Rasmussen, Robert

    2012-01-01

    By leveraging the existing Model-Based Systems Engineering (MBSE) infrastructure at JPL and adding a modest investment, the Europa Mission Concept Study made striking advances in mission concept capture and analysis. This effort has reaffirmed the importance of architecting and successfully harnessed the synergistic relationship of system modeling to mission architecting. It clearly demonstrated that MBSE can provide greater agility than traditional systems engineering methods. This paper will describe the successful application of MBSE in the dynamic environment of early mission formulation, the significant results produced and lessons learned in the process.

  10. The neutron star interior composition explorer (NICER): mission definition

    NASA Astrophysics Data System (ADS)

    Arzoumanian, Z.; Gendreau, K. C.; Baker, C. L.; Cazeau, T.; Hestnes, P.; Kellogg, J. W.; Kenyon, S. J.; Kozon, R. P.; Liu, K.-C.; Manthripragada, S. S.; Markwardt, C. B.; Mitchell, A. L.; Mitchell, J. W.; Monroe, C. A.; Okajima, T.; Pollard, S. E.; Powers, D. F.; Savadkin, B. J.; Winternitz, L. B.; Chen, P. T.; Wright, M. R.; Foster, R.; Prigozhin, G.; Remillard, R.; Doty, J.

    2014-07-01

    Over a 10-month period during 2013 and early 2014, development of the Neutron star Interior Composition Explorer (NICER) mission [1] proceeded through Phase B, Mission Definition. An external attached payload on the International Space Station (ISS), NICER is scheduled to launch in 2016 for an 18-month baseline mission. Its prime scientific focus is an in-depth investigation of neutron stars—objects that compress up to two Solar masses into a volume the size of a city—accomplished through observations in 0.2-12 keV X-rays, the electromagnetic band into which the stars radiate significant fractions of their thermal, magnetic, and rotational energy stores. Additionally, NICER enables the Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) demonstration of spacecraft navigation using pulsars as beacons. During Phase B, substantive refinements were made to the mission-level requirements, concept of operations, and payload and instrument design. Fabrication and testing of engineering-model components improved the fidelity of the anticipated scientific performance of NICER's X-ray Timing Instrument (XTI), as well as of the payload's pointing system, which enables tracking of science targets from the ISS platform. We briefly summarize advances in the mission's formulation that, together with strong programmatic performance in project management, culminated in NICER's confirmation by NASA into Phase C, Design and Development, in March 2014.

  11. Building a Shared Definitional Model of Long Duration Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Orr, M.; Whitmire, A.; Sandoval, L.; Leveton, L.; Arias, D.

    2011-01-01

    In 1956, on the eve of human space travel, Strughold first proposed a simple classification of the present and future stages of manned flight that identified key factors, risks, and developmental stages for the evolutionary journey ahead. As we look to optimize the potential of the ISS as a gateway to new destinations, we need a current, shared working definitional model of long duration human space flight to help guide our path. An initial search of the formal and grey literature was augmented by liaison with subject matter experts. The search strategy focused on the use of the terms "long duration mission" and "long duration spaceflight", as well as broader related current and historical definitions and classification models of spaceflight. The related sea and air travel literature was also subsequently explored with a view to identifying analogous models or classification systems. There are multiple different definitions and classification systems for spaceflight, including phase and type of mission, craft and payload, and related risk management models. However, the frequently used concepts of long duration mission and long duration spaceflight are infrequently operationally defined by authors, and no commonly referenced classical or gold standard definition or model of these terms emerged from the search. The categorization (Cat) system for sailing was found to be of potential analogous utility, with its focus on understanding the need for crew and craft autonomy at various levels of potential adversity and inability to gain outside support or return to a safe location, due to factors of time, distance, and location.

  12. Lessons Learned During Instrument Testing for the Thermal Infrared Sensor (TIRS)

    NASA Technical Reports Server (NTRS)

    Peabody, Hume L.; Otero, Veronica; Neuberger, David

    2013-01-01

    The Thermal InfraRed Sensor (TIRS) instrument, set to launch on the Landsat Data Continuity Mission in 2013, features a passively cooled telescope and IR detectors which are actively cooled by a two-stage cryocooler. In order to proceed to the instrument-level test campaign, at least one full functional test was required, necessitating a thermal vacuum test to sufficiently cool the detectors and demonstrate performance. This was fairly unique in that this test occurred before the Pre-Environmental Review (PER), but it yielded significant knowledge gains before the planned instrument-level test. During the pre-PER test, numerous discrepancies were found between the model and the actual hardware, revealed by poor correlation between model predictions and test data. With the inclusion of pseudo-balance points, the test also provided an opportunity to perform a pre-correlation to test data prior to the instrument-level test campaign. Various lessons were learned during this test related to modeling and design of both the flight hardware and the ground support equipment and test setup. The lessons learned in the pre-PER test resulted in a better test setup for the instrument-level test and the completion of the final instrument model correlation in a shorter period of time. Upon completion of the correlation, the flight predictions were generated, including the full suite of off-nominal cases as well as some new cases defined by the spacecraft. For some of these new cases, components revealed limit exceedances, in particular for a portion of the hardware that could not be tested due to its size and chamber limitations. Further lessons were learned during the completion of flight predictions. With a correlated detailed instrument model, significant efforts were made to generate a reduced model suitable for observatory-level analyses. This proved a major effort, both to generate an appropriate network and to convert the final model to the required format, and it yielded additional lessons learned. In spite of all the challenges encountered by TIRS, the instrument was successfully delivered to the spacecraft and will soon be tested at observatory level in preparation for a successful mission launch.

  13. The Integrated Mission Design Center (IMDC) at NASA Goddard Space Flight Center

    NASA Technical Reports Server (NTRS)

    Karpati, Gabriel; Martin, John; Steiner, Mark; Reinhardt, K.

    2002-01-01

    NASA Goddard has used its Integrated Mission Design Center (IMDC) to perform more than 150 mission concept studies. The IMDC performs rapid development of high-level, end-to-end mission concepts, typically in just 4 days. The approach to the studies varies, depending on whether the proposed mission is near-future using existing technology, mid-future using new technology being actively developed, or far-future using technology which may not yet be clearly defined. The emphasis and level of detail developed during any particular study depends on which timeframe (near-, mid-, or far-future) is involved and the specific needs of the study client. The most effective mission studies are those where mission capabilities required and emerging technology developments can synergistically work together; thus both enhancing mission capabilities and providing impetus for ongoing technology development.

  14. Evaluation of GOCE-based Global Geoid Models in Finnish Territory

    NASA Astrophysics Data System (ADS)

    Saari, Timo; Bilker-Koivula, Mirjam

    2015-04-01

    The gravity satellite mission GOCE made its final observations in the fall of 2013. By then it had exceeded its expected lifespan of one year by more than three years. Thus, the mission collected more data from the Earth's gravitational field than expected, and more comprehensive global geoid models have been derived ever since. The GOCE High-level Processing Facility (HPF) by ESA has published GOCE global gravity field models annually. We compared all of the 12 HPF models as well as 3 additional GOCE, 11 GRACE, and 6 combined GOCE+GRACE models with GPS-levelling data and gravity observations in Finland. The most accurate models were compared against the high resolution global geoid models EGM96 and EGM2008. The models were evaluated up to three different degree and order limits: 150 (the common maximum for the GRACE models), 240 (the common maximum for the GOCE models), and the maximum of each model. When coefficients up to degree and order 150 are used, the results of the GOCE models are comparable with the results of the latest GRACE models. Generally, all of the latest GOCE and GOCE+GRACE models give standard deviations of the height anomaly differences of around 15 cm and of gravity anomaly differences of around 10 mGal over Finland. The best solutions were not always achieved with the highest maximum degree and order of the satellite gravity field models, since the highest coefficients (above 240) may be less accurately determined. Over Finland, the latest GOCE and GOCE+GRACE models give similar results as the high resolution models EGM96 and EGM2008 when coefficients up to degree and order 240 are used. This is mainly due to the high resolution terrestrial data available in the area of Finland, which was used in the high resolution models.

  15. Toward a Climate OSSE for NASA Earth Sciences

    NASA Astrophysics Data System (ADS)

    Leroy, S. S.; Collins, W. D.; Feldman, D.; Field, R. D.; Ming, Y.; Pawson, S.; Sanderson, B.; Schmidt, G. A.

    2016-12-01

    In the Continuity Study, the National Academy of Sciences advised that future space missions be rated according to five categories: the importance of a well-defined scientific objective, the utility of the observation in addressing the scientific objective, the quality with which the observation can be made, the probability of the mission's success, and the mission's affordability. The importance, probability, and affordability are evaluated subjectively by scientific consensus, by engineering review panels, and by cost models; however, the utility and quality can be evaluated objectively by a climate observation system simulation experiment (COSSE). A discussion of the philosophical underpinnings of a COSSE for NASA Earth Sciences will be presented. A COSSE is built upon a perturbed physics ensemble of a sophisticated climate model that can simulate a mission's prospective observations and its well-defined quantitative scientific objective and that can capture the uncertainty associated with each. A strong correlation between observation and scientific objective after consideration of physical uncertainty leads to a high quality. Persistence of a high correlation after inclusion of the proposed measurement error leads to a high utility. There are five criteria that govern the nature of a particular COSSE: (1) whether the mission's scientific objective is one of hypothesis testing or climate prediction, (2) whether the mission is empirical or inferential, (3) whether the core climate model captures essential physical uncertainties, (4) the level of detail of the simulated observations, and (5) whether complementarity or redundancy of information is to be valued. Computation of the quality and utility is done using Bayesian statistics, as has been done previously for multi-decadal climate prediction conditioned on existing data. We advocate for a new program within NASA Earth Sciences to establish a COSSE capability.
Creation of a COSSE program within NASA Earth Sciences will require answers from the climate research community to basic questions, such as whether a COSSE capability should be centralized or de-centralized. Most importantly, the quantified scientific objective of a proposed mission must be defined with extreme specificity for a COSSE to be applied.

  16. Integrated Medical Model Project - Overview and Summary of Historical Application

    NASA Technical Reports Server (NTRS)

    Myers, J.; Boley, L.; Butler, D.; Foy, M.; Goodenow, D.; Griffin, D.; Keenan, A.; Kerstman, E.; Melton, S.; McGuire, K.

    2015-01-01

    Introduction: The Integrated Medical Model (IMM) Project represents one aspect of NASA's Human Research Program (HRP) to quantitatively assess medical risks to astronauts for existing operational missions as well as missions associated with future exploration and commercial space flight ventures. The IMM takes a probabilistic approach to assessing the likelihood and specific outcomes of one hundred medical conditions within the envelope of accepted space flight standards of care over a selectable range of mission capabilities. A specially developed Integrated Medical Evidence Database (iMED) maintains evidence-based, organizational knowledge across a variety of data sources. Since becoming operational in 2011, version 3.0 of the IMM, the supporting iMED, and the expertise of the IMM project team have contributed to a wide range of decision and informational processes for the space medical and human research community. This presentation provides an overview of the IMM conceptual architecture and range of application through examples of actual space flight community questions posed to the IMM project. Methods: Figure 1 [see document] illustrates the IMM modeling system and scenario process. As illustrated, the IMM computational architecture is based on Probabilistic Risk Assessment techniques. Nineteen assumptions and limitations define the IMM application domain. Scenario definitions include crew medical attributes and mission specific details. The IMM forecasts probabilities of loss of crew life (LOCL), evacuation (EVAC), quality time lost during the mission, number of medical resources utilized and the number and type of medical events by combining scenario information with in-flight, analog, and terrestrial medical information stored in the iMED. In addition, the metrics provide the integrated information necessary to estimate optimized in-flight medical kit contents under constraints of mass and volume or acceptable level of mission risk. 
Results and Conclusions: Historically, IMM simulations support Science and Technology planning, Exploration mission planning, and ISS program operations by supplying simulation support, iMED data information, and subject matter expertise to Crew Health and Safety and the HRP. Upcoming release of IMM version 4.0 seeks to provide enhanced functionality to increase the quality of risk decisions made using the IMM through a more accurate representation of the real world system.
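As a toy illustration of the probabilistic combination an IMM-style forecast performs, the sketch below estimates the chance of at least one in-flight medical event. It assumes independent Poisson occurrences with made-up per-condition incidence rates; the function name and numbers are hypothetical, not IMM's actual model.

```python
import math

def mission_event_probability(incidence_per_day, mission_days):
    """Probability of at least one medical event of any kind during the
    mission, assuming independent Poisson occurrence of each condition
    (illustrative assumption only)."""
    total_rate = sum(incidence_per_day.values())       # combined events/day
    p_none = math.exp(-total_rate * mission_days)      # Poisson P(no events)
    return 1.0 - p_none
```

For example, two conditions with daily incidences of 0.001 and 0.002 over a 100-day mission give a combined event probability of about 26%.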

  17. Capability-Based Modeling Methodology: A Fleet-First Approach to Architecture

    DTIC Science & Technology

    2014-02-01

    reconnaissance (ISR) aircraft, or unmanned systems. Accordingly, a mission architecture used to model SAG operations for a given Fleet unit should include all... would use an ISR aircraft to increase fidelity of a targeting solution; another mission thread to show how unmanned systems can augment targeting... unmanned systems. Therefore, an architect can generate, from a comprehensive SAG mission architecture, individual mission threads that model how a SAG

  18. Spacecraft Complexity Subfactors and Implications on Future Cost Growth

    NASA Technical Reports Server (NTRS)

    Leising, Charles J.; Wessen, Randii; Ellyin, Ray; Rosenberg, Leigh; Leising, Adam

    2013-01-01

    During the last ten years the Jet Propulsion Laboratory has used a set of cost-risk subfactors to independently estimate the magnitude of development risks that may not be covered in the high level cost models employed during early concept development. Within the last several years the Laboratory has also developed a scale of Concept Maturity Levels with associated criteria to quantitatively assess a concept's maturity. This latter effort has been helpful in determining whether a concept is mature enough for accurate costing but it does not provide any quantitative estimate of cost risk. Unfortunately today's missions are significantly more complex than when the original cost-risk subfactors were first formulated. Risks associated with complex missions are not being adequately evaluated and future cost growth is being underestimated. The risk subfactor process needed to be updated.

  19. Engineering Risk Assessment of Space Thruster Challenge Problem

    NASA Technical Reports Server (NTRS)

    Mathias, Donovan L.; Mattenberger, Christopher J.; Go, Susie

    2014-01-01

    The Engineering Risk Assessment (ERA) team at NASA Ames Research Center utilizes dynamic models with linked physics-of-failure analyses to produce quantitative risk assessments of space exploration missions. This paper applies the ERA approach to the baseline and extended versions of the PSAM Space Thruster Challenge Problem, which investigates mission risk for a deep space ion propulsion system with time-varying thruster requirements and operations schedules. The dynamic mission is modeled using a combination of discrete and continuous-time reliability elements within the commercially available GoldSim software. Loss-of-mission (LOM) probability results are generated via Monte Carlo sampling performed by the integrated model. Model convergence studies are presented to illustrate the sensitivity of integrated LOM results to the number of Monte Carlo trials. A deterministic risk model was also built for the three baseline and extended missions using the Ames Reliability Tool (ART), and results are compared to the simulation results to evaluate the relative importance of mission dynamics. The ART model did a reasonable job of matching the simulation models for the baseline case, while a hybrid approach using offline dynamic models was required for the extended missions. This study highlighted that state-of-the-art techniques can adequately adapt to a range of dynamic problems.
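A loss-of-mission (LOM) Monte Carlo estimate of the kind described above can be sketched in a few lines. The numbers and redundancy rule below are made up for illustration (they are not the PSAM challenge problem's actual parameters, and no GoldSim or ART tooling is involved): the mission needs at least 2 of 3 ion thrusters to complete a 1000-hour burn, each thruster having an exponentially distributed lifetime.

```python
import random

def lom_probability(n_trials=100_000, seed=42):
    """Monte Carlo loss-of-mission sketch with illustrative parameters:
    2-of-3 thruster redundancy over a 1000-hour burn, constant failure
    rate of 1e-4 per hour per thruster."""
    rng = random.Random(seed)
    rate, burn_hours, lom = 1e-4, 1000.0, 0
    for _ in range(n_trials):
        # Sample three exponential lifetimes; count thrusters outliving the burn.
        surviving = sum(rng.expovariate(rate) > burn_hours for _ in range(3))
        if surviving < 2:
            lom += 1
    return lom / n_trials
```

With these numbers the analytic answer is about 0.025, so a run of 100,000 trials should land close to that; convergence studies like those in the paper amount to watching this estimate stabilize as `n_trials` grows.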

  20. Mitigating Adverse Effects of a Human Mission On Possible Martian Indigenous Ecosystems

    NASA Technical Reports Server (NTRS)

    Lupisella, Mark L.

    2000-01-01

    Although human beings are, by most standards, the most capable agents to search for and detect extraterrestrial life, we are also potentially the most harmful. While there has been substantial work regarding forward contamination with respect to robotic missions, the issue of potential adverse effects on possible indigenous Martian ecosystems, such as biological contamination, due to a human mission has remained relatively unexplored and may require our attention now as this presentation will try to demonstrate by exploring some of the relevant scientific questions, mission planning challenges, and policy issues. An informal, high-level mission planning decision tree will be discussed and is included as the next page of this abstract. Some of the questions to be considered are: To what extent could contamination due to a human presence compromise possible indigenous life forms? To what extent can we control contamination? For example, will it be local or global? What are the criteria for assessing the biological status of Mars, both regionally and globally? For example, can we adequately extrapolate from a few strategic missions such as sample return missions? What should our policies be regarding our mission planning and possible interaction with what are likely to be microbial forms of extraterrestrial life? Central to the science and mission planning issues is the role and applicability of terrestrial analogs, such as Lake Vostok for assessing drilling issues, and modeling techniques. Central to many of the policy aspects are scientific value, international law, public concern, and ethics. Exploring this overall issue responsibly requires an examination of all these aspects and how they interrelate.

  1. Overview of NASA GRCs Green Propellant Infusion Mission Thruster Testing and Plume Diagnostics

    NASA Technical Reports Server (NTRS)

    Deans, Matthew C.; Reed, Brian D.; Yim, John T.; Arrington, Lynn A.; Williams, George J.; Kojima, Jun J.; McLean, Christopher H.

    2014-01-01

    The Green Propellant Infusion Mission (GPIM) is sponsored by NASA's Space Technology Mission Directorate (STMD) Technology Demonstration Mission (TDM) office. The goal of GPIM is to advance the technology readiness level of a green propulsion system, specifically, one using the monopropellant AF-M315E, by demonstrating ground handling, spacecraft processing, and on-orbit operations. One of the risks identified for GPIM is potential contamination of sensitive spacecraft surfaces from the effluents in the plumes of AF-M315E thrusters. NASA Glenn Research Center (GRC) is conducting activities to characterize the effects of AF-M315E plume impingement and deposition. GRC has established individual plume models of the 22-N and 1-N thrusters that will be used on the GPIM spacecraft. The models describe the pressure, temperature, density, Mach number, and species concentration of the AF-M315E thruster exhaust plumes. The models are being used to assess the impingement effects of the AF-M315E thrusters on the GPIM spacecraft. The model simulations will be correlated with plume measurement data from Laboratory and Engineering Model 22-N AF-M315E thrusters. The thrusters will be tested in a small-rocket altitude facility at NASA GRC. The GRC thruster testing will be conducted at duty cycles representative of the planned GPIM maneuvers. A suite of laser-based diagnostics, including Raman spectroscopy, Rayleigh spectroscopy, Schlieren imaging, and physical probes, will be used to acquire plume measurements of AF-M315E thrusters. Plume data will include temperature, velocity, relative density, and species concentration. The plume measurement data will be compared to the corresponding simulations of the plume model. The GRC effort will establish a data set of AF-M315E plume measurements and a plume model that can be used for future AF-M315E applications.

  2. Composite load spectra for select space propulsion structural components

    NASA Technical Reports Server (NTRS)

    Newell, J. F.; Ho, H. W.; Kurth, R. E.

    1991-01-01

    The work performed to develop composite load spectra (CLS) for the Space Shuttle Main Engine (SSME) using probabilistic methods is described. Three probabilistic methods were implemented for the engine system influence model. RASCAL was chosen as the principal method, as most component load models were implemented with it. Validation of RASCAL was performed; accuracy comparable to the Monte Carlo method can be obtained if a large enough bin size is used. Generic probabilistic models were developed and implemented for load calculations using the probabilistic methods discussed above. Each engine mission, either a real flight or a test, has three mission phases: the engine start transient phase, the steady-state phase, and the engine cutoff transient phase. Power level and engine operating inlet conditions change during a mission. The load calculation module provides steady-state and quasi-steady-state calculation procedures with a duty-cycle-data option; the quasi-steady-state procedure is for engine transient phase calculations. In addition, a few generic probabilistic load models were also developed for specific conditions. These include the fixed transient spike model, the Poisson arrival transient spike model, and the rare event model. These generic probabilistic load models provide sufficient latitude for simulating loads with specific conditions. For SSME components, turbine blades, transfer ducts, the LOX post, and the high pressure oxidizer turbopump (HPOTP) discharge duct were selected for application of the CLS program. The loads include static pressure loads and dynamic pressure loads for all four components, centrifugal force for the turbine blade, temperatures for thermal loads for all four components, and structural vibration loads for the ducts and LOX posts.
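A Poisson arrival transient spike model of the kind named above can be sketched as follows. The form is assumed (exponential inter-arrival times, exponentially distributed spike magnitudes), and the function name and parameters are illustrative, not the CLS program's implementation.

```python
import random

def poisson_spike_events(duration_s, rate_hz, mean_spike, seed=0):
    """Sample (time, magnitude) pairs for transient load spikes whose
    arrivals follow a Poisson process with the given mean rate."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_hz)          # exponential inter-arrival time
        if t > duration_s:
            break
        events.append((t, rng.expovariate(1.0 / mean_spike)))
    return events
```

Over a 1000-second window at a rate of 0.05 spikes per second, one would expect on the order of 50 spikes with magnitudes averaging near `mean_spike`.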

  3. Space Mission Concept Development Using Concept Maturity Levels

    NASA Technical Reports Server (NTRS)

    Wessen, Randii R.; Borden, Chester; Ziemer, John; Kwok, Johnny

    2013-01-01

    Over the past five years, pre-project formulation experts at the Jet Propulsion Laboratory (JPL) have developed and implemented a method for measuring and communicating the maturity of space mission concepts. Mission concept development teams use this method, and associated tools, prior to concepts entering their Formulation Phases (Phase A/B). The organizing structure is the Concept Maturity Level (CML), a classification system for characterizing the various levels of a concept's maturity. The key strength of CMLs is the ability to evolve mission concepts guided by an incremental set of assessment needs. The CML definitions have been expanded into a matrix form to identify the breadth and depth of analysis needed for a concept to reach a specific level of maturity. This matrix enables improved assessment and communication by addressing the fundamental dimensions (e.g., science objectives, mission design, technical risk, project organization, cost, export compliance, etc.) associated with mission concept evolution. JPL's collaborative engineering, dedicated concept development, and proposal teams all use these and other CML-appropriate design tools to advance their mission concept designs. This paper focuses on a mission concept's early Pre-Phase A development, represented by CMLs 1-4. The scope was limited because CMLs 5 and 6 are already well defined based on the requirements documented in specific Announcements of Opportunity (AO) and Concept Study Report (CSR) guidelines, respectively, for competitive missions, and by NASA's Procedural Requirements document NPR 7120.5E for projects in their Formulation Phase.

  4. Mars orbiter conceptual systems design study

    NASA Technical Reports Server (NTRS)

    Dixon, W.; Vogl, J.

    1982-01-01

    Spacecraft system and subsystem designs were developed at the conceptual level to perform either of two Mars Orbiter missions: a Climatology Mission and an Aeronomy Mission. The objectives of these missions are to obtain and return data.

  5. Exoplanet Community Report on Direct Infrared Imaging of Exoplanets

    NASA Technical Reports Server (NTRS)

    Danchi, William C.; Lawson, Peter R.

    2009-01-01

    Direct infrared imaging and spectroscopy of exoplanets will allow for detailed characterization of the atmospheric constituents of more than 200 nearby Earth-like planets, more than is possible with any other method under consideration. A flagship mission based on larger passively cooled infrared telescopes and formation flying technologies would have the highest angular resolution of any concept under consideration. The 2008 Exoplanet Forum committee on Direct Infrared Imaging of Exoplanets recommends: (1) a vigorous technology program including component development, integrated testbeds, and end-to-end modeling in the areas of formation flying and mid-infrared nulling; (2) a probe-scale mission based on a passively cooled, structurally connected interferometer to be started within the next two to five years, for exoplanetary system characterization that is not accessible from the ground, and which would provide transformative science and lay the engineering groundwork for the flagship mission with formation flying elements. Such a mission would enable a complete exozodiacal dust survey (<1 solar system zodi) in the habitable zone of all nearby stars. This information will allow for a more efficient strategy of spectral characterization of Earth-sized planets for the flagship missions, and will also allow for optimization of the search strategy of an astrometric mission if such a mission were delayed due to cost or technology reasons. (3) Both the flagship and probe missions should be pursued with international partners if possible. Fruitful collaboration with international partners on mission concepts and relevant technology should be continued. (4) Research and Analysis (R&A) should be supported for the development of preliminary science and mission designs. Ongoing efforts to characterize the typical level of exozodiacal light around Sun-like stars with ground-based nulling technology should be continued.

  6. Rapid Cost Assessment of Space Mission Concepts Through Application of Complexity-Based Cost Indices

    NASA Technical Reports Server (NTRS)

    Peterson, Craig E.; Cutts, James; Balint, Tibor; Hall, James B.

    2008-01-01

    This slide presentation reviews the development of a rapid cost assessment model for evaluation of exploration missions through the application of complexity-based cost indices. In the fall of 2004, NASA began developing 13 documents, known as "strategic roadmaps," intended to outline a strategy for space exploration over the next 30 years. The Third Strategic Roadmap, the Strategic Roadmap for Solar System Exploration, focused on strategy for robotic exploration of the Solar System. Its development led to the investigation of a large variety of missions. However, the necessity of planning around scientific inquiry and budgetary constraints made it necessary for the roadmap development team to evaluate potential missions not only for scientific return but also for cost. Performing detailed cost studies for each of the large number of missions was impractical given the time constraints involved and the lack of detailed mission studies, so a method of rapid cost assessment was developed to allow preliminary analysis. It has been noted that there is a strong correlation between complexity and the cost and schedule of planetary missions. While these correlations were made after missions had been built and flown (successfully or otherwise), it seemed likely that a similar approach could provide at least some relative cost ranking. Cost estimation relationships (CERs) have been developed based on subsystem design choices. These CERs required more detailed information than was available, forcing the team to adopt a more high-level approach. Costing by analogy has been developed for small satellites; however, planetary exploration missions impose such varying spacecraft requirements that there is a lack of adequately comparable missions that can be used for analogy.
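A complexity-based cost index of the general flavor described above can be sketched as a weighted, normalized roll-up of subsystem complexity ratings. The subsystem names, 1-5 scoring scale, and weights below are purely illustrative assumptions, not the roadmap team's actual indices.

```python
def complexity_cost_index(ratings, weights=None):
    """Weighted average of subsystem complexity ratings (scored 1-5),
    normalized to [0, 1] for relative ranking of mission concepts.
    Names and weights are illustrative only."""
    weights = weights or {name: 1.0 for name in ratings}
    total = sum(weights[name] * score for name, score in ratings.items())
    return total / (5.0 * sum(weights.values()))

# Hypothetical mission concept scored on four illustrative subsystems:
mission = {"power": 3, "thermal": 2, "telecom": 4, "entry_descent": 5}
index = complexity_cost_index(mission)   # (3+2+4+5) / 20 = 0.7
```

An index like this supports only relative ranking between concepts; mapping it to dollars would still require calibration against historical missions, as the abstract notes.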

  7. Mesoscale resolution capability of altimetry: Present and future

    NASA Astrophysics Data System (ADS)

    Dufau, Claire; Orsztynowicz, Marion; Dibarboure, Gérald; Morrow, Rosemary; Le Traon, Pierre-Yves

    2016-07-01

    Wavenumber spectra of along-track Sea Surface Height from the most recent satellite radar altimetry missions (Jason-2, Cryosat-2, and SARAL/AltiKa) are used to determine the size of ocean dynamical features observable with the present altimetry constellation. A global analysis of the along-track 1-D mesoscale resolution capability of the present-day altimeter missions is proposed, based on a joint analysis of the spectral slopes in the mesoscale band and the error levels observed for horizontal wavelengths lower than 20 km. The global sea level spectral slope distribution provided by Xu and Fu with Jason-1 data is revisited with more recent altimeter missions, and maps of altimeter error levels are provided and discussed for each mission. Seasonal variations of both spectral slopes and altimeter error levels are also analyzed for Jason-2. SARAL/AltiKa, with its lower error levels, is shown to detect smaller structures everywhere. All missions show substantial geographical and temporal variations in their mesoscale resolution capabilities, with variations depending mostly on the error level change but also on slight regional changes in the spectral slopes. In western boundary currents where the signal to noise ratio is favorable, the along-track mesoscale resolution is approximately 40 km for SARAL/AltiKa, 45 km for Cryosat-2, and 50 km for Jason-2. Finally, a prediction of the future 2-D mesoscale sea level resolution capability of the Surface Water and Ocean Topography (SWOT) mission is given using a simulated error level.
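The basic quantity behind this kind of analysis, a wavenumber spectrum of one along-track SSH segment, can be sketched with a naive discrete Fourier transform. Real altimetry spectra detrend, window, and average many passes; the function name and units are assumptions for illustration, not the authors' processing chain.

```python
import cmath
import math

def ssh_periodogram(ssh_m, dx_km):
    """Naive DFT periodogram of a mean-removed along-track SSH segment.
    Returns (wavenumber in cycles/km, power spectral density) pairs."""
    n = len(ssh_m)
    mean = sum(ssh_m) / n
    x = [v - mean for v in ssh_m]
    spectrum = []
    for m in range(1, n // 2):
        # DFT coefficient at harmonic m (O(n^2) overall; fine for a sketch).
        c = sum(x[j] * cmath.exp(-2j * math.pi * m * j / n) for j in range(n))
        k = m / (n * dx_km)                     # cycles per km
        spectrum.append((k, (abs(c) ** 2) * dx_km / n))
    return spectrum
```

For a 5-km along-track sampling and a pure 100-km sinusoidal feature, the spectrum peaks at a wavenumber of 0.01 cycles/km; fitting a line to log-PSD versus log-k over the mesoscale band then gives the spectral slope discussed in the abstract.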

  8. SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation

    NASA Technical Reports Server (NTRS)

    Suarez, Vicente J.; Lewandowski, Edward J.; Callahan, John

    2006-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical RPS launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors and heat sources was designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, test article modes and frequencies and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.

  9. SRG110 Stirling Generator Dynamic Simulator Vibration Test Results and Analysis Correlation

    NASA Technical Reports Server (NTRS)

    Lewandowski, Edward J.; Suarez, Vicente J.; Goodnight, Thomas W.; Callahan, John

    2007-01-01

    The U.S. Department of Energy (DOE), Lockheed Martin (LM), and NASA Glenn Research Center (GRC) have been developing the Stirling Radioisotope Generator (SRG110) for use as a power system for space science missions. The launch environment enveloping potential missions results in a random input spectrum that is significantly higher than historical radioisotope power system (RPS) launch levels and is a challenge for designers. Analysis presented in prior work predicted that tailoring the compliance at the generator-spacecraft interface reduced the dynamic response of the system, thereby allowing higher launch load input levels and expanding the range of potential generator missions. To confirm analytical predictions, a dynamic simulator representing the generator structure, Stirling convertors and heat sources was designed and built for testing with and without a compliant interface. Finite element analysis was performed to guide the generator simulator and compliant interface design so that test modes and frequencies were representative of the SRG110 generator. This paper presents the dynamic simulator design, the test setup and methodology, test article modes, frequencies, and dynamic responses, and post-test analysis results. With the compliant interface, component responses to an input environment exceeding the SRG110 qualification level spectrum were all within design allowables. Post-test analysis included finite element model tuning to match test frequencies and random response analysis using the test input spectrum. Analytical results were in good overall agreement with the test results and confirmed previous predictions that the SRG110 power system may be considered for a broad range of potential missions, including those with demanding launch environments.

  10. The Evolving Community College Mission in the Context of State Governance.

    ERIC Educational Resources Information Center

    Tollefson, Terrence A.

    State-level governance of community colleges has become increasingly common in the United States, with governance decisions affecting budget appropriations, rules on how appropriations can be spent, and the missions that colleges must strive to fulfill. The most common elements of state-level community college mission statements over the past 100…

  11. Model-Based Verification and Validation of the SMAP Uplink Processes

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Dubos, Gregory F.; Tirona, Joseph; Standley, Shaun

    2013-01-01

    This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based V&V development efforts.

  12. Track structure model for damage to mammalian cell cultures during solar proton events

    NASA Technical Reports Server (NTRS)

    Cucinotta, F. A.; Wilson, J. W.; Townsend, L. W.; Shinn, J. L.; Katz, R.

    1992-01-01

    Solar proton events (SPEs) occur infrequently and unpredictably, thus representing a potential hazard to interplanetary space missions. Biological damage from SPEs will be produced principally through secondary electron production in tissue, including important contributions due to delta rays from nuclear reaction products. We review methods for estimating the biological effectiveness of SPEs using a high-energy proton model and the parametric cellular track model. Results of the model are presented for several of the historically largest flares at typical shielding levels, including body shielding.

  13. Mars Orbiter Study. Volume 2: Mission Design, Science Instrument Accommodation, Spacecraft Design

    NASA Technical Reports Server (NTRS)

    Drean, R.; Macpherson, D.; Steffy, D.; Vargas, T.; Shuman, B.; Anderson, K.; Richards, B.

    1982-01-01

    Spacecraft system and subsystem designs were developed at the conceptual level to perform either of two Mars Orbiter Missions, a Climatology Mission and an Aeronomy Mission. The objectives of these missions are to obtain and return data to increase knowledge of Mars.

  14. Province/Ministry-Coordinated Industry-University-Institute Cooperation and University Development: Based on the Experiences of Guangdong Province

    ERIC Educational Resources Information Center

    Yang, Liu

    2016-01-01

    The industry S&T missioners, industry-university-institute innovation alliances, industry-university-institute regional model bases, and other provincial-level industry-university-institute cooperation mechanisms that Guangdong Province has formed through its practical efforts play an important role in training a large batch of practical…

  15. Does Mission Matter? An Analysis of Private School Achievement Differences

    ERIC Educational Resources Information Center

    Boerema, Albert J.

    2009-01-01

    Using student achievement data from British Columbia, Canada, this study is an exploration of the differences that lie within the private school sector using hierarchical linear modeling to analyze the data. The analysis showed that when controlling for language, parents' level of educational attainment, and prior achievement, the private school…

  16. A decision model for planetary missions

    NASA Technical Reports Server (NTRS)

    Hazelrigg, G. A., Jr.; Brigadier, W. L.

    1976-01-01

    Many techniques developed for the solution of problems in economics and operations research are directly applicable to problems involving engineering trade-offs. This paper investigates the use of utility theory for decision making in planetary exploration space missions. A decision model is derived that accounts for the objectives of the mission (science), the cost of flying the mission, and the risk of mission failure. A simulation methodology for obtaining the probability distribution of science value and cost as a function of spacecraft and mission design is presented, and an example application of the decision methodology is given for various potential alternatives in a Comet Encke mission.
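
    The utility-theoretic trade described in this record, weighing science value against cost and the risk of mission failure, can be sketched as a risk-neutral expected-utility ranking. The alternatives and numbers below are invented for illustration; the paper's actual model works with full probability distributions obtained by simulation.

```python
def expected_utility(p_success, science_value, cost):
    """Risk-neutral expected utility: probability-weighted science return
    minus mission cost (both expressed in the same value units)."""
    return p_success * science_value - cost

def best_alternative(alternatives):
    """Rank mission alternatives by expected utility.

    alternatives: dict mapping name -> (p_success, science_value, cost).
    Returns the name of the highest-ranked alternative.
    """
    return max(alternatives, key=lambda n: expected_utility(*alternatives[n]))
```

    For example, a hypothetical flyby (0.95 success probability, 100 science units, 60 cost units) scores 35, while a riskier rendezvous (0.80, 200, 120) scores 40 and would be preferred.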

  17. Mission planning for autonomous systems

    NASA Technical Reports Server (NTRS)

    Pearson, G.

    1987-01-01

    Planning is a necessary task for intelligent, adaptive systems operating independently of human controllers. A mission planning system that performs task planning by decomposing a high-level mission objective into subtasks and synthesizing a plan for those tasks at varying levels of abstraction is discussed. Researchers use a blackboard architecture to partition the search space and direct the focus of attention of the planner. Using advanced planning techniques, they can control plan synthesis for the complex planning tasks involved in mission planning.

  18. The SMART Theory and Modeling Team: An Integrated Element of Mission Development and Science Analysis

    NASA Technical Reports Server (NTRS)

    Hesse, Michael; Birn, J.; Denton, Richard E.; Drake, J.; Gombosi, T.; Hoshino, M.; Matthaeus, B.; Sibeck, D.

    2005-01-01

    When targeting physical understanding of space plasmas, our focus is gradually shifting away from discovery-type investigations to missions and studies that address our basic understanding of processes we know to be important. For these studies, theory and models provide physical predictions that need to be verified or falsified by empirical evidence. Within this paradigm, a tight integration between theory, modeling, and space flight mission design and execution is essential. NASA's Magnetospheric MultiScale (MMS) mission is a pathfinder in this new era of space research. The prime objective of MMS is to understand magnetic reconnection, arguably the most fundamental of plasma processes. In particular, MMS targets the microphysical processes which permit magnetic reconnection to operate in the collisionless plasmas that permeate space and astrophysical systems. More specifically, MMS will provide closure to such elemental questions as how particles become demagnetized in the reconnection diffusion region, which effects determine the reconnection rate, and how reconnection is coupled to environmental conditions such as magnetic shear angles. Solutions to these problems have remained elusive in past and present spacecraft missions, primarily due to instrumental limitations, yet they are fundamental to the large-scale dynamics of collisionless plasmas. Owing to the lack of measurements, most of our present knowledge of these processes is based on results from modern theory and modeling studies of the reconnection process. Proper design and execution of a mission targeting magnetic reconnection must incorporate this knowledge and ensure that all relevant scales and effects can be resolved by the mission's measurements. The SMART mission has responded to this need through a tight integration between its instrument teams and its theory and modeling team. Input from theory and modeling is fed into all aspects of science mission design, and theory and modeling activities are tailored to SMART needs during mission development and science analysis. In this presentation, we will present an overview of SMART theory and modeling team activities. In particular, we will provide examples of science objectives derived from state-of-the-art models, and of recent research results that continue to be utilized in SMART mission development.

  19. Ocean Surface Topography Mission/Jason 2 Artist Concept

    NASA Image and Video Library

    2008-09-23

    An artist's concept of the Ocean Surface Topography Mission/Jason 2 Earth satellite. The Ocean Surface Topography Mission/Jason 2 is an Earth satellite designed to make observations of ocean topography for investigations into sea-level rise and the relationship between ocean circulation and climate change. The satellite also provides data on the forces behind such large-scale climate phenomena as El Niño and La Niña. The mission is a follow-on to the French-American Jason-1 mission, continuing a record of satellite sea-surface height measurements begun in 1992. http://photojournal.jpl.nasa.gov/catalog/PIA18158

  20. Scheduling Algorithm for Mission Planning and Logistics Evaluation (SAMPLE). Volume 2: Mission payloads subsystem description

    NASA Technical Reports Server (NTRS)

    Dupnick, E.; Wiggins, D.

    1980-01-01

    The scheduling algorithm for mission planning and logistics evaluation (SAMPLE) is presented. Two major subsystems are included: the mission payloads program and the set covering program. Formats and parameter definitions for the payload data set (payload model), feasible combination file, and traffic model are documented.

  1. Exact Schwarzschild-like solution in a bumblebee gravity model

    NASA Astrophysics Data System (ADS)

    Casana, R.; Cavalcante, A.; Poulis, F. P.; Santos, E. B.

    2018-05-01

    We obtain an exact vacuum solution from the gravity sector contained in the minimal standard-model extension. The theoretical model assumes a Riemann spacetime coupled to the bumblebee field, which is responsible for the spontaneous Lorentz symmetry breaking. The solution, achieved in a static and spherically symmetric scenario, establishes a Schwarzschild-like black hole. In order to study the effects of the spontaneous Lorentz symmetry breaking, we investigate some classic tests, including the advance of the perihelion, the bending of light, and Shapiro's time delay. Furthermore, we compute some upper bounds, among which the most stringent, associated with existing experimental data, provides a sensitivity at the 10^-15 level, while that for future missions reaches the 10^-19 level.

  2. Dynamical modeling approach to risk assessment for radiogenic leukemia among astronauts engaged in interplanetary space missions.

    PubMed

    Smirnova, Olga A; Cucinotta, Francis A

    2018-02-01

    A recently developed biologically motivated dynamical model for the assessment of the excess relative risk (ERR) for radiogenic leukemia among acutely/continuously irradiated humans (Smirnova, 2015, 2017) is applied to estimate the ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions. Numerous scenarios of space radiation exposure during space missions are used in the modeling studies. The dependence of the ERR for leukemia among astronauts on several mission parameters, including the dose equivalent rates of galactic cosmic rays (GCR) and large solar particle events (SPEs), the number of large SPEs, the time interval between SPEs, mission duration, the degree of astronauts' additional shielding during SPEs, the degree of their additional 12-hour daily shielding, as well as the total mission dose equivalent, is examined. The results of the estimation of ERR for radiogenic leukemia among astronauts, which are obtained in the framework of the developed dynamical model for various scenarios of space radiation exposure, are compared with the corresponding results computed with the commonly used linear model. It is revealed that the developed dynamical model, along with the linear model, can be applied to estimate ERR for radiogenic leukemia among astronauts engaged in long-term interplanetary space missions within the range of applicability of the latter. In turn, the developed dynamical model is capable of predicting the ERR for leukemia among astronauts for irradiation regimes beyond the applicability range of the linear model in emergency cases. As a supplement to the estimations of cancer incidence and death (REIC and REID) (Cucinotta et al., 2013, 2017), the developed dynamical model for the assessment of the ERR for leukemia can be employed in the pre-mission design phase for, e.g., the optimization of the regimes of astronauts' additional shielding in the course of interplanetary space missions. 
The developed model can also be used in the real-time response phase during a space mission to make decisions on the operational application of appropriate countermeasures to minimize the risk of leukemia, especially in emergency cases. Copyright © 2017 The Committee on Space Research (COSPAR). Published by Elsevier Ltd. All rights reserved.
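
    The linear benchmark model mentioned in this record takes ERR proportional to accumulated dose equivalent. A minimal sketch, assuming an invented risk coefficient and invented GCR/SPE dose figures (not the paper's fitted values), assembles a mission dose from a continuous GCR rate plus discrete, partially shielded SPEs and applies ERR = beta * D.

```python
def mission_dose_sv(gcr_rate_msv_per_day, days, spe_doses_msv, spe_shield_factor):
    """Total mission dose equivalent in sieverts: continuous GCR exposure
    plus discrete SPE doses attenuated by an additional-shielding factor
    (1.0 = unshielded, 0.0 = fully shielded)."""
    total_msv = gcr_rate_msv_per_day * days
    total_msv += sum(d * spe_shield_factor for d in spe_doses_msv)
    return total_msv / 1000.0

def err_linear(dose_sv, beta_per_sv):
    """Linear excess-relative-risk model: ERR = beta * D."""
    return beta_per_sv * dose_sv
```

    For instance, 400 days at a hypothetical 1 mSv/day of GCR plus two SPEs of 200 and 300 mSv behind shielding that halves their dose gives 0.65 Sv; with an illustrative beta of 2 per Sv the linear ERR is 1.3.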

  3. Hydrodynamic modeling in the Peace-Athabasca Delta for the upcoming Surface Water and Ocean Topography (SWOT) mission

    NASA Astrophysics Data System (ADS)

    Bergeron, J.; Carter, T.; Langlois, S.; Leconte, R.; Peters, D.; Pietroniro, A.; Russell, M.; Saint-Jean, R.; Siles, G. L.; Trudel, M.

    2017-12-01

    The upcoming Surface Water and Ocean Topography (SWOT) mission aims to retrieve water level elevations via satellite remote sensing. In anticipation of the launch, scheduled for 2021, multiple regions have been selected for calibration/validation purposes. The Peace-Athabasca Delta (PAD), a freshwater wetland complex located in northeastern Alberta, Canada, is one of those regions. The PAD comprises numerous lakes and rivers, including Lake Mamawi and the Athabasca River presented in this study. Since it is a region of interest for many projects, including this one, the region has been monitored with multiple types of observations over time: airborne LiDAR; water level, discharge, and bathymetric surveys retrieved from traditional point measurements tied to the Global Positioning System and from an acoustic Doppler current profiler; and, more recently, the airborne support instrument AirSWOT. Using a SWOT imagery simulator and a 2D hydrodynamic model (H2D2), we model the hydrologic steady-state conditions of Lake Mamawi and the Athabasca River, as well as the simulated SWOT imagery resulting from a virtual overpass. A digital terrain model derived from airborne LiDAR and bathymetric surveys, as well as water level and discharge measurements collected during the summers of 2016 and 2017, are used to provide a calibrated H2D2 model, from which simulated SWOT images are generated. The objectives of the research are to explore the capabilities of the simulated SWOT data to 1) calibrate and validate the H2D2 model over the PAD and 2) improve the water balance of the PAD in a synthetic context.

  4. A mitigation strategy for commercial aviation impact on NOx-related O3 change

    NASA Astrophysics Data System (ADS)

    Wasiuk, D. K.; Khan, M. A. H.; Shallcross, D. E.; Derwent, R. G.; Lowenberg, M. H.

    2016-07-01

    An operational mitigation strategy for commercial aircraft impact on atmospheric composition, referred to as the turboprop replacement strategy (TRS), is described in this paper. The global air traffic between 2005 and 2011 was modeled with the TRS, in which turbofan-powered aircraft were replaced with nine chosen turboprop-powered aircraft on all routes up to 1700 nautical miles (NM) in range. Under the TRS, the global number of departures and the global mission distance double, while global mission time grows by nearly a factor of 3. However, the global mission fuel and the emissions of aviation CO2, H2O, and SOx remain approximately unchanged, and the total global aviation CO, hydrocarbon (HC), and NOx emissions are reduced by 79%, 21%, and 11% on average between 2005 and 2011. The TRS lowers the global mean cruise altitude of flights up to 1700 NM by 2.7 km, which leads to a significant decrease in global mission fuel burn, mission time, distance flown, and the aircraft emissions of CO2, CO, H2O, NOx, SOx, and HC above 9.2 km. The replacement of turbofans with turboprops in regional fleets on a global scale leads to an overall reduction in levels of tropospheric O3 at the current estimated mean cruise altitude near the tropopause, where the radiative forcing of O3 is strongest. Further, the replacement strategy results in a reduction of ground-level aviation CO and NOx emissions by 33% and 29%, respectively, between 2005 and 2011.

  5. Scheduling algorithm for mission planning and logistics evaluation users' guide

    NASA Technical Reports Server (NTRS)

    Chang, H.; Williams, J. M.

    1976-01-01

    The scheduling algorithm for mission planning and logistics evaluation (SAMPLE) program is a mission planning tool composed of three subsystems: the mission payloads subsystem (MPLS), which generates a list of feasible combinations from a payload model for a given calendar year; GREEDY, a heuristic model used to find the best traffic model; and the operations simulation and resources scheduling subsystem (OSARS), which determines traffic model feasibility for available resources. SAMPLE provides the user with options to execute MPLS, GREEDY, GREEDY-OSARS, or MPLS-GREEDY-OSARS.

  6. Research study on neutral thermodynamic atmospheric model. [for space shuttle mission and abort trajectory

    NASA Technical Reports Server (NTRS)

    Hargraves, W. R.; Delulio, E. B.; Justus, C. G.

    1977-01-01

    The Global Reference Atmospheric Model is used along with the revised perturbation statistics to evaluate and graph various atmospheric statistics along a space shuttle reference mission and abort trajectory. The trajectory plots are height vs. ground range, with height from ground level to 155 km and ground range along the reentry trajectory. Cross-sectional plots, height vs. latitude or longitude, are also generated for 80 deg longitude, with heights from 30 km to 90 km and latitudes from -90 deg to +90 deg, and for 45 deg latitude, with heights from 30 km to 90 km and longitudes from 180 deg E to 180 deg W. The variables plotted are monthly average pressure, density, temperature, wind components, and wind speed, with standard deviations and the 99th inter-percentile range for each of these variables.

  7. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques, i.e., event trees and fault trees, in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.
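
    The event-tree and fault-tree quantification that the CSRM builds on reduces, for independent basic events, to simple gate probability algebra. The sketch below is generic PRA arithmetic, not the CSRM itself.

```python
def and_gate(probs):
    """Probability that ALL independent basic events occur (AND gate)."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Probability that AT LEAST ONE independent basic event occurs
    (OR gate), via the complement of none occurring."""
    p = 1.0
    for q in probs:
        p *= 1.0 - q
    return 1.0 - p
```

    For example, two independent failure modes of probability 0.1 combine through an OR gate to 0.19, not 0.2; the rare-event approximation (summing) is only valid when the probabilities are small.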

  8. Evaluations of Risks from the Lunar and Mars Radiation Environments

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee; Hayat, Matthew J.; Feiveson, Alan H.; Cucinotta, Francis A.

    2008-01-01

    Protecting astronauts from the space radiation environments requires accurate projections of radiation in future space missions. Characterization of the ionizing radiation environment is challenging because the interplanetary plasma and radiation fields are modulated by solar disturbances, and the radiation doses received by astronauts in interplanetary space are likewise influenced. The galactic cosmic radiation (GCR) flux for the next solar cycle was estimated as a function of the interplanetary deceleration potential, which has been derived from GCR flux and Climax neutron monitor rate measurements over the last 4 decades. Given the chaotic nature of solar particle event (SPE) occurrence, the mean frequency of SPEs at any given proton fluence threshold during a defined mission duration was obtained from a Poisson process model using proton fluence measurements of SPEs during the past 5 solar cycles (19-23). Analytic energy spectra of 34 historically large SPEs were constructed over broad energy ranges extending to GeV. Using an integrated space radiation model (which includes the transport codes HZETRN [1] and BRYNTRN [2], and the quantum nuclear interaction model QMSFRG [3]), the propagation and interaction properties of the energetic nucleons through various media were predicted. Risk assessment from GCR and SPEs was evaluated at specific organs inside a typical spacecraft using the CAM [4] model. The representative risk level at each event size and its standard deviation were obtained from the analysis of the 34 SPEs. Risks from different event sizes and their frequencies of occurrence in a specified mission period were evaluated with regard to acute health effects, especially during extra-vehicular activities (EVA). The results will be useful for the development of an integrated strategy for optimizing radiation protection on lunar and Mars missions. 
Keywords: Space Radiation Environments; Galactic Cosmic Radiation; Solar Particle Event; Radiation Risk; Risk Analysis; Radiation Protection.
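
    The Poisson process model for SPE occurrence described in this record can be sketched directly: with a mean event rate above a chosen fluence threshold, the probability of exactly n events in a mission window follows the Poisson pmf. The rate value used in the example is illustrative, not the fitted frequency from solar cycles 19-23.

```python
from math import exp, factorial

def prob_n_spes(rate_per_year, mission_years, n):
    """Poisson probability of exactly n SPEs above a fluence threshold
    during the mission: P(n) = lam**n * exp(-lam) / n!, lam = rate * T."""
    lam = rate_per_year * mission_years
    return lam ** n * exp(-lam) / factorial(n)

def prob_at_least_one(rate_per_year, mission_years):
    """Probability of one or more SPEs: 1 - P(0)."""
    return 1.0 - exp(-rate_per_year * mission_years)
```

    A design question such as "how likely are two or more large SPEs on a 3-year Mars mission" then reduces to summing the pmf tail for a rate estimated from the historical record.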

  9. Low-Power Operation and Plasma Characterization of a Qualification Model SPT-140 Hall Thruster for NASA Science Missions

    NASA Technical Reports Server (NTRS)

    Garner, Charles E.; Jorns, Benjamin A.; van Derventer, Steven; Hofer, Richard R.; Rickard, Ryan; Liang, Raymond; Delgado, Jorge

    2015-01-01

    Hall thruster systems based on commercial product lines can potentially lead to lower cost electric propulsion (EP) systems for deep space science missions. A 4.5-kW SPT-140 Hall thruster presently under qualification testing by SSL leverages the substantial heritage of the SPT-100 being flown on Russian and US commercial satellites. The Jet Propulsion Laboratory is exploring the use of commercial EP systems, including the SPT-140, for deep space science missions, and initiated a program to evaluate the SPT-140 in the areas of low power operation and thruster operating life. A qualification model SPT-140 designated QM002 was evaluated for operation and plasma properties along channel centerline, from 4.5 kW to 0.8 kW. Additional testing was performed on a development model SPT-140 designated DM4 to evaluate operation with a Moog proportional flow control valve (PFCV). The PFCV was commanded by an SSL engineering model PPU-140 Power Processing Unit (PPU). Performance measurements on QM002 at 0.8 kW discharge power were 50 mN of thrust at a total specific impulse of 1250 s, a total thruster efficiency of 0.38, and discharge current oscillations of under 3% of the mean current. Steady-state operation at 0.8 kW was demonstrated during a 27 h firing. The SPT-140 DM4 was operated in closed-loop control of the discharge current with the PFCV and PPU over discharge power levels of 0.8-4.5 kW. QM002 and DM4 test data indicate that the SPT-140 design is a viable candidate for NASA missions requiring power throttling down to low thruster input power.
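
    The reported 0.8-kW operating point can be cross-checked with the standard electric-propulsion relations between thrust, specific impulse, total efficiency, and propellant mass flow. This is textbook arithmetic, not SSL or JPL analysis code.

```python
G0 = 9.80665  # standard gravity, m/s^2

def total_efficiency(thrust_n, isp_s, power_w):
    """Total thruster efficiency: jet power over input power,
    eta = T * Isp * g0 / (2 * P)."""
    return thrust_n * isp_s * G0 / (2.0 * power_w)

def mass_flow_mg_s(thrust_n, isp_s):
    """Total propellant mass flow (mg/s) implied by T = mdot * Isp * g0."""
    return thrust_n / (isp_s * G0) * 1e6
```

    Plugging in the abstract's figures (50 mN, 1250 s, 800 W) returns a total efficiency of about 0.38, matching the reported value, with an implied mass flow near 4.1 mg/s.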

  10. How well can future CMB missions constrain cosmic inflation?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martin, Jérôme; Vennin, Vincent; Ringeval, Christophe, E-mail: jmartin@iap.fr, E-mail: christophe.ringeval@uclouvain.be, E-mail: vennin@iap.fr

    2014-10-01

    We study how the next generation of Cosmic Microwave Background (CMB) measurement missions (such as EPIC, LiteBIRD, PRISM and COrE) will be able to constrain the inflationary landscape in the hardest-to-disambiguate situation, in which inflation is simply described by single-field slow-roll scenarios. Considering the proposed PRISM and LiteBIRD satellite designs, we simulate mock data corresponding to five different fiducial models having values of the tensor-to-scalar ratio ranging from 10^-1 down to 10^-7. We then compute the Bayesian evidences and complexities of all Encyclopædia Inflationaris models in order to assess the constraining power of PRISM alone and of LiteBIRD complemented with the Planck 2013 data. Within slow-roll inflation, both designs have comparable constraining power and can rule out about three quarters of the inflationary scenarios, compared to one third for the Planck 2013 data alone. However, we also show that PRISM can constrain the scalar running and has the capability to detect a violation of slow roll at second order. Finally, our results suggest that describing an inflationary model by its potential shape only, without specifying a reheating temperature, will no longer be possible given the accuracy level reached by the future CMB missions.

  11. Mission Assessment of the Faraday Accelerator with Radio-frequency Assisted Discharge (FARAD)

    NASA Technical Reports Server (NTRS)

    Dankanich, John W.; Polzin, Kurt A.

    2008-01-01

    Pulsed inductive thrusters have typically been considered for future high-power missions requiring nuclear electric propulsion. These high-power systems, while promising equivalent or improved performance over state-of-the-art propulsion systems, presently have no planned missions for which they are well suited. The ability to efficiently operate an inductive thruster at lower energy and power levels may give inductive thrusters near-term applicability and mission pull. The Faraday Accelerator with Radio-frequency Assisted Discharge (FARAD) concept demonstrated potential for a high-efficiency, low-energy pulsed inductive thruster. The added benefits of energy recapture and/or pulse compression are shown to enhance the performance of the pulsed inductive propulsion system, yielding a system that can compete with and potentially outperform current state-of-the-art electric propulsion technologies. These enhancements lead to mission-level benefits associated with the use of a pulsed inductive thruster. Analyses of low-power near- to mid-term missions and higher-power far-term missions are undertaken to compare the performance of pulsed inductive thrusters with that delivered by state-of-the-art and development-level electric propulsion systems.

  12. A technique for evaluating the application of the pin-level stuck-at fault model to VLSI circuits

    NASA Technical Reports Server (NTRS)

    Palumbo, Daniel L.; Finelli, George B.

    1987-01-01

    Accurate fault models are required to conduct the experiments defined in validation methodologies for highly reliable fault-tolerant computers (e.g., computers with a probability of failure of 10^-9 for a 10-hour mission). Described is a technique by which a researcher can evaluate the capability of the pin-level stuck-at fault model to simulate true error behavior symptoms in very large scale integrated (VLSI) digital circuits. The technique is based on a statistical comparison of the error behavior resulting from faults applied at the pins of, and internal to, a VLSI circuit. As an example of an application of the technique, the error behavior of a microprocessor simulation subjected to internal stuck-at faults is compared with the error behavior which results from pin-level stuck-at faults. The error behavior is characterized by the time between errors and the duration of errors. Based on this example data, the pin-level stuck-at fault model is found to deliver less than ideal performance. However, with respect to the class of faults which cause a system crash, the pin-level stuck-at fault model is found to provide a good modeling capability.

  13. Improving LiDAR Biomass Model Uncertainty through Non-Destructive Allometry and Plot-level 3D Reconstruction with Terrestrial Laser Scanning

    NASA Astrophysics Data System (ADS)

    Stovall, A. E.; Shugart, H. H., Jr.

    2017-12-01

    Future NASA and ESA satellite missions plan to better quantify global carbon through detailed observations of forest structure, but ultimately rely on uncertain ground measurement approaches for calibration and validation. A significant amount of the uncertainty in estimating plot-level biomass can be attributed to inadequate and unrepresentative allometric relationships used to convert plot-level tree measurements to estimates of aboveground biomass. These allometric equations are known to have high errors and biases, particularly in carbon-rich forests, because they were calibrated with small and often biased samples of destructively harvested trees. To overcome this issue, a non-destructive methodology for estimating tree- and plot-level biomass has been proposed through the use of Terrestrial Laser Scanning (TLS). We investigated the potential for using TLS as a ground validation approach in LiDAR-based biomass mapping through virtual plot-level tree volume reconstruction and biomass estimation. Plot-level biomass estimates were compared at the Virginia-based Smithsonian Conservation Biology Institute's SIGEO forest with full 3D reconstruction, TLS allometry, and Jenkins et al. (2003) allometry. On average, full 3D reconstruction ultimately provided the lowest-uncertainty estimate of plot-level biomass (9.6%), followed by TLS allometry (16.9%) and the national equations (20.2%). TLS offered modest improvements to the airborne LiDAR empirical models, reducing RMSE from 16.2% to 14%. Our findings suggest TLS plot acquisitions and non-destructive allometry can play a vital role in reducing uncertainty in calibration and validation data for biomass mapping in the upcoming NASA and ESA missions.
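
    The Jenkins et al. (2003) national equations referenced in this record follow the log-linear form ln(biomass) = b0 + b1 ln(dbh). A minimal sketch of that published form follows; the coefficients in the example are shown as illustrative values, not an authoritative species-group table.

```python
from math import exp, log

def jenkins_biomass_kg(dbh_cm, b0, b1):
    """Aboveground biomass (kg) from diameter at breast height (cm)
    using the Jenkins et al. (2003) form:
        ln(biomass) = b0 + b1 * ln(dbh)
    b0, b1 are species-group coefficients fitted from destructive harvests.
    """
    return exp(b0 + b1 * log(dbh_cm))
```

    With illustrative hardwood-like coefficients (b0 = -2.48, b1 = 2.4835), a 30-cm tree comes out at roughly 390 kg; the paper's point is precisely that such generalized coefficients carry large plot-level uncertainty compared with TLS-based reconstruction.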

  14. NATO Stanag Language Proficiency Levels for Joint Missions and Its Implementations at a State Organization

    ERIC Educational Resources Information Center

    Solak, Ekrem

    2013-01-01

    Turkish Armed Forces have been participating in joint missions together with other nations for decades. Since English is the medium of instruction in these missions, participating members should have NATO Standards in terms of language proficiency levels in four skills. Therefore, this study aims to specify personnel's views and their language…

  15. Continuous assimilation of simulated Geosat altimetric sea level into an eddy-resolving numerical ocean model. I - Sea level differences. II - Referenced sea level differences

    NASA Technical Reports Server (NTRS)

    White, Warren B.; Tai, Chang-Kou; Holland, William R.

    1990-01-01

    The optimal interpolation method of Lorenc (1981) was used to conduct continuous assimilation of altimetric sea level differences from the simulated Geosat exact repeat mission (ERM) into a three-layer quasi-geostrophic eddy-resolving numerical ocean box model that simulates the statistics of mesoscale eddy activity in the western North Pacific. Assimilation was conducted continuously as the Geosat tracks appeared in simulated real time/space, with each track repeating every 17 days but occurring at different times and locations within the 17-day period, as would have occurred in a realistic nowcast situation. This interpolation method was also used to conduct the assimilation of referenced altimetric sea level differences into the same model, performing the referencing of altimetric sea level differences by using the simulated sea level. The results of this dynamical interpolation procedure are compared with those of a statistical (i.e., optimum) interpolation procedure.
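    Optimal interpolation blends a model background value with an observation using weights derived from their error variances. A minimal scalar sketch of the analysis step (hypothetical variances; the Lorenc scheme operates on full covariance matrices over the model grid and many observations at once):

```python
# Scalar optimal-interpolation (OI) analysis step: blend a model background
# sea level with an altimetric observation. Variances below are invented.

def oi_update(background, obs, var_b, var_o):
    """Analysis = background + gain * innovation, gain = var_b / (var_b + var_o)."""
    gain = var_b / (var_b + var_o)
    return background + gain * (obs - background)

# Model sea level 0.12 m, altimeter reports 0.20 m; background error variance
# twice the observation error variance, so the analysis leans toward the data.
analysis = oi_update(0.12, 0.20, var_b=2.0e-4, var_o=1.0e-4)
print(f"analysis sea level: {analysis:.3f} m")
```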

  16. Image quality validation of Sentinel 2 Level-1 products: performance status at the beginning of the constellation routine phase

    NASA Astrophysics Data System (ADS)

    Francesconi, Benjamin; Neveu-VanMalle, Marion; Espesset, Aude; Alhammoud, Bahjat; Bouzinac, Catherine; Clerc, Sébastien; Gascon, Ferran

    2017-09-01

    Sentinel-2 is an Earth Observation mission developed by the European Space Agency (ESA) in the frame of the Copernicus program of the European Commission. The mission is based on a constellation of two satellites: Sentinel-2A, launched in June 2015, and Sentinel-2B, launched in March 2017. It offers an unprecedented combination of systematic global coverage of land and coastal areas, a high revisit of five days at the equator and two days at mid-latitudes under the same viewing conditions, high spatial resolution, and a wide field of view for multispectral observations from 13 bands in the visible, near-infrared, and shortwave infrared range of the electromagnetic spectrum. The mission performances are routinely and closely monitored by the S2 Mission Performance Centre (MPC), which includes a consortium of Expert Support Laboratories (ESLs). This publication focuses on the Sentinel-2 Level-1 product quality validation activities performed by the MPC. It presents an up-to-date status of the Level-1 mission performances at the beginning of the constellation routine phase. Level-1 performance validations routinely performed cover Level-1 Radiometric Validation (Equalisation Validation, Absolute Radiometry Vicarious Validation, Absolute Radiometry Cross-Mission Validation, Multi-temporal Relative Radiometry Vicarious Validation, and SNR Validation) and Level-1 Geometric Validation (Geolocation Uncertainty Validation, Multi-spectral Registration Uncertainty Validation, and Multi-temporal Registration Uncertainty Validation). Overall, the Sentinel-2 mission is proving very successful in terms of product quality, thereby fulfilling the promises of the Copernicus program.

  17. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale, or size, of the observatory, which allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission-enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large-scale observatory features, which make passive cooling viable, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check utilizing different analysis teams and software packages, and, finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  18. Mars Gravity Field Model Development from Mars Global Surveyor Tracking Data

    NASA Technical Reports Server (NTRS)

    Lemoine, F. G.; Zuber, M. T.

    1999-01-01

    Since February 1999 the MGS spacecraft has been in a near-circular orbit at about 400 km altitude. MGS has been regularly tracked by the Deep Space Network (DSN) at X-band, and for a three-week period in February it was tracked almost continuously for an intensive gravity modeling activity that would form the basis of the orbital computations for the rest of the mission. The data collected during this calibration period and the earlier SPO and Hiatus periods have now been used to develop a new gravity field model for Mars that shows considerable new detail in both the northern and southern hemispheres. Until February, no data at 400 km altitude or lower had been acquired on any previous mission south of about 35 deg S, and all the previous data were of significantly lower quality. Low-altitude data (~170 km) were obtained over the higher latitudes of the northern hemisphere during the SPO periods, but because of the high eccentricity of the orbit nothing of similar quality was obtainable for the southern hemisphere. The new models are of spherical harmonic degree and order 70 or higher and suggest that large anomalies are often associated with the large impact features. Gravity data have also been obtained over both the northern and southern polar ice caps. The MGS orbit quality resulting from the use of these newer models is better than that of any previous Mars mission and is approaching the tens-of-meters level that it had been hoped would eventually be realizable.
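    For context, a spherical harmonic gravity model of degree and order 70 expands the gravitational potential in the standard normalized form (the notation below is the conventional one and may differ from the authors' exact conventions):

```latex
U(r,\phi,\lambda) = \frac{GM}{r}\left[1 + \sum_{l=2}^{70}\sum_{m=0}^{l}\left(\frac{R}{r}\right)^{l}\bar{P}_{lm}(\sin\phi)\left(\bar{C}_{lm}\cos m\lambda + \bar{S}_{lm}\sin m\lambda\right)\right]
```

    where R is the reference radius of Mars, the P̄_lm are normalized associated Legendre functions, and the normalized coefficients C̄_lm, S̄_lm are the quantities estimated from the DSN tracking data.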

  19. Advanced Chemical Propulsion

    NASA Technical Reports Server (NTRS)

    Alexander, Leslie, Jr.

    2006-01-01

    Advanced Chemical Propulsion (ACP) provides near-term incremental improvements in propulsion system performance and/or cost. It is an evolutionary approach to technology development that produces useful products along the way to meeting increasingly demanding mission requirements, while focusing on improving payload mass fraction to yield greater science capability. Current activities are focused on two areas: chemical propulsion component, subsystem, and manufacturing technologies that offer measurable system-level benefits; and the evaluation of high-energy storable propellants with enhanced performance for in-space application. To prioritize candidate propulsion technology alternatives, a variety of propulsion/mission analyses and trades have been conducted for SMD missions to yield sufficient data for investment planning. They include: the Advanced Chemical Propulsion Assessment; an Advanced Chemical Propulsion System Model; a LOx-LH2 small pumps conceptual design; a space storable propellants study; a spacecraft cryogenic propulsion study; an advanced pressurization and mixture ratio control study; and a pump-fed vs. pressure-fed study.

  20. Entry, Descent and Landing Systems Analysis: Exploration Feed Forward Internal Peer Review Slide Package

    NASA Technical Reports Server (NTRS)

    Dwyer Cianciolo, Alicia M. (Editor)

    2011-01-01

    NASA senior management commissioned the Entry, Descent and Landing Systems Analysis (EDL-SA) Study in 2008 to identify and roadmap the Entry, Descent and Landing (EDL) technology investments that the agency needed to successfully land large payloads at Mars for both robotic and human-scale missions. Year 1 of the study focused on technologies required for Exploration-class missions to land payloads of 10 to 50 mt. Inflatable decelerators, rigid aeroshells and supersonic retro-propulsion emerged as the top candidate technologies. In Year 2 of the study, the low-TRL technologies identified in Year 1, inflatable aeroshells and supersonic retropropulsion, were combined to create a demonstration precursor robotic mission. This part of the EDL-SA Year 2 effort, called Exploration Feed Forward (EFF), took much of the systems analysis simulation and component model development from Year 1 to the next level of detail.

  1. Entry, Descent and Landing Systems Analysis Study: Phase 2 Report on Exploration Feed-Forward Systems

    NASA Technical Reports Server (NTRS)

    Dwyer Ciancolo, Alicia M.; Davis, Jody L.; Engelund, Walter C.; Komar, D. R.; Queen, Eric M.; Samareh, Jamshid A.; Way, David W.; Zang, Thomas A.; Murch, Jeff G.; Krizan, Shawn A.; hide

    2011-01-01

    NASA senior management commissioned the Entry, Descent and Landing Systems Analysis (EDL-SA) Study in 2008 to identify and roadmap the Entry, Descent and Landing (EDL) technology investments that the agency needed to successfully land large payloads at Mars for both robotic and human-scale missions. Year 1 of the study focused on technologies required for Exploration-class missions to land payloads of 10 to 50 t. Inflatable decelerators, rigid aeroshells and supersonic retro-propulsion emerged as the top candidate technologies. In Year 2 of the study, the low-TRL technologies identified in Year 1, inflatable aeroshells and supersonic retropropulsion, were combined to create a demonstration precursor robotic mission. This part of the EDL-SA Year 2 effort, called Exploration Feed Forward (EFF), took much of the systems analysis simulation and component model development from Year 1 to the next level of detail.

  2. Jack of all trades, master of none?: an alternative to clinical psychology's market-driven mission creep.

    PubMed

    Heesacker, Martin

    2005-09-01

    C.R. Snyder and T.R. Elliott, authors of this special issue's target article, "Twenty-First Century Graduate Education in Clinical Psychology: A Four Level Matrix Model" (this issue, pp. 1033-1054), are right that scientific distinctions should sometimes be de-emphasized in service of understanding the larger scientific vision. However, they take their combining too far, arrogating unto clinical psychology elements best left to their original scholarly disciplines. Snyder and Elliott simply present the next logical step in clinical psychology's longstanding tradition of "mission creep": broadening its focus to encompass new potential markets. Instead, clinical psychology might best be served by keeping and sharpening disciplinary and subdisciplinary boundaries. The emphasis would shift from mission creep to building links with complementary disciplines and subdisciplines, to tackle issues that require true interdisciplinary scholarship. (c) 2005 Wiley Periodicals, Inc.

  3. Use of Model Payload for Europa Mission Development

    NASA Technical Reports Server (NTRS)

    Lewis, Kari; Klaasan, Ken; Susca, Sara; Oaida, Bogdan; Larson, Melora; Vanelli, Tony; Murray, Alex; Jones, Laura; Thomas, Valerie; Frank, Larry

    2016-01-01

    This paper discusses the basis for the Model Payload and how it was used to develop the mission design, observation and data acquisition strategy, needed spacecraft capabilities, spacecraft-payload interface needs, mission system requirements and operational scenarios.

  4. Plasma Vehicle Charging Analysis for Orion Flight Test 1

    NASA Technical Reports Server (NTRS)

    Lallement, L.; McDonald, T.; Norgard, J.; Scully, B.

    2014-01-01

    In preparation for the upcoming experimental test flight for the Orion crew module, considerable interest was raised over the possibility of exposure to elevated levels of plasma activity and vehicle charging both externally on surfaces and internally on dielectrics during the flight test orbital operations. Initial analysis using NASCAP-2K indicated very high levels of exposure, and this generated additional interest in refining/defining the plasma and spacecraft models used in the analysis. This refinement was pursued, resulting in the use of specific AE8 and AP8 models, rather than SCATHA models, as well as consideration of flight trajectory, time duration, and other parameters possibly affecting the levels of exposure and the magnitude of charge deposition. Analysis using these refined models strongly indicated that, for flight test operations, no special surface coatings were necessary for the thermal protection system, but would definitely be required for future GEO, trans-lunar, and extra-lunar missions...

  5. Plasma Vehicle Charging Analysis for Orion Flight Test 1

    NASA Technical Reports Server (NTRS)

    Scully, B.; Norgard, J.

    2015-01-01

    In preparation for the upcoming experimental test flight for the Orion crew module, considerable interest was raised over the possibility of exposure to elevated levels of plasma activity and vehicle charging both externally on surfaces and internally on dielectrics during the flight test orbital operations. Initial analysis using NASCAP-2K indicated very high levels of exposure, and this generated additional interest in refining/defining the plasma and spacecraft models used in the analysis. This refinement was pursued, resulting in the use of specific AE8 and AP8 models, rather than SCATHA models, as well as consideration of flight trajectory, time duration, and other parameters possibly affecting the levels of exposure and the magnitude of charge deposition. Analysis using these refined models strongly indicated that, for flight test operations, no special surface coatings were necessary for the Thermal Protection System (TPS), but would definitely be required for future GEO, trans-lunar, and extra-lunar missions.

  6. Automated Planning and Scheduling for Orbital Express (151)

    NASA Technical Reports Server (NTRS)

    Knight, Russell

    2008-01-01

    The challenging timeline for DARPA's Orbital Express mission demanded a flexible, responsive, and (above all) safe approach to mission planning. Because the mission was a technology demonstration, pertinent planning information was learned during actual mission execution. This information led to amendments to procedures, which in turn led to changes in the mission plan. In general, we used the ASPEN planner/scheduler to generate and validate the mission plans. We enhanced ASPEN to enable it to reason about uncertainty. We also developed a model generator that would read the text of a procedure and translate it into an ASPEN model. These technologies had a significant impact on the success of the Orbital Express mission.
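    The procedure-to-model translation described above can be sketched at a toy level: parse numbered steps from a procedure text and emit activity records that a planner could schedule. The text format, field names, and durations below are all invented for illustration; the real generator parsed actual mission procedures into ASPEN's modeling language.

```python
# Hypothetical sketch of translating a numbered text procedure into planner
# activity records. Format and fields are invented; not ASPEN's actual syntax.

import re

def procedure_to_activities(text):
    """Parse lines like '1. checkout duration=300s' into activity dicts."""
    activities = []
    for line in text.strip().splitlines():
        m = re.match(r"(\d+)\.\s+(\w+)\s+duration=(\d+)s", line.strip())
        if m:
            step, name, dur = m.groups()
            activities.append({"id": int(step), "name": name, "duration_s": int(dur)})
    return activities

procedure = """
1. checkout duration=300s
2. approach duration=1200s
3. capture duration=600s
"""
plan = procedure_to_activities(procedure)
print(plan)
```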

  7. Martian Feeling: An Analogue Study to Simulate a Round-Trip to Mars using the International Space Station

    NASA Astrophysics Data System (ADS)

    Felix, C. V.; Gini, A.

    When talking about human space exploration, Mars missions are always present. It is clear that sooner or later, humanity will take this adventure. Arguably the most important aspect to consider for the success of such an endeavour is the human element. The safety of the crew throughout a Martian mission is a top priority for all space agencies. Therefore, such a mission should not take place until all the risks have been fully understood and mitigated. A mission to Mars presents unique human and technological challenges in terms of isolation, confinement, autonomy, reliance on mission control, communication delays and adaptation to different gravity levels. Analogue environments provide the safest way to simulate these conditions, mitigate the risks and evaluate the effects of long-term space travel on the crew. Martian Feeling is one of nine analogue studies, from the Mars Analogue Path (MAP) report [1], proposed by the TP Analogue group of the ISU Masters class of 2010. It is an integrated analogue study which simulates the psychological, physiological and operational conditions that an international, six-person, mixed-gender crew would experience on a mission to Mars. Set both onboard the International Space Station (ISS) and on Earth, the Martian Feeling study will perform a "dress rehearsal" of a mission to Mars. The study proposes to test both human performance and operational procedures in a cost-effective manner. Since Low Earth Orbit (LEO) is more accessible than other space-based locations, an analogue study in LEO would provide the required level of realism for a simulated transit mission to Mars. The sustained presence of microgravity and other elements of true spaceflight are features of LEO that are neither currently feasible nor possible to study in terrestrial analogue sites. International collaboration, economics, legal and ethical issues were considered when the study was proposed.
As an example of international collaboration, the ISS would demonstrate an effective model for an international effort to send humans to Mars. The proposed starting date is the year 2017, before the planned retirement of the ISS, which is currently scheduled for 2020.

  8. Mission Level Autonomy for USSV

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Stirb, Robert C.; Brizzolara, Robert

    2011-01-01

    On-water demonstrations validated a wide range of mission-proven, advanced technologies at TRL 5+ that provide a total, integrated, modular approach to addressing the majority of the key needs for full mission-level autonomous, cross-platform control of USVs. A wide-baseline stereo system mounted on the ONR USSV was shown to be an effective sensing modality for tracking dynamic contacts, as a first step toward automated retrieval operations. The CASPER onboard planner/replanner successfully demonstrated real-time, on-water, resource-based analysis for mission-level goal achievement and on-the-fly opportunistic replanning. Full mixed-mode autonomy was demonstrated on-water, with seamless transitions between operator override and return to the current mission plan. Autonomous cooperative operations for fixed-asset protection and High Value Unit escort using two USVs (AMN1 and a 14 m RHIB) were demonstrated during Trident Warrior 2010 in June 2010.

  9. Handling the Diversity in the Coming Flood of InSAR Data with the InSAR Scientific Computing Environment

    NASA Astrophysics Data System (ADS)

    Rosen, P. A.; Gurrola, E. M.; Sacco, G. F.; Agram, P. S.; Lavalle, M.; Zebker, H. A.

    2014-12-01

    The NASA ESTO-developed InSAR Scientific Computing Environment (ISCE) provides a computing framework for geodetic image processing for InSAR sensors that is modular, flexible, and extensible, enabling scientists to reduce measurements directly from a diverse array of radar satellites and aircraft to new geophysical products. ISCE can serve as the core of a centralized processing center to bring Level-0 raw radar data up to Level-3 data products, but is adaptable to alternative processing approaches for science users interested in new and different ways to exploit mission data. This is accomplished through rigorous componentization of processing codes, abstraction and generalization of data models, and an XML-based input interface with multi-level prioritized control of the component configurations depending on the science processing context. The proposed NASA-ISRO SAR (NISAR) Mission would deliver data of unprecedented quantity and quality, making possible global-scale studies in climate research, natural hazards, and Earth's ecosystems. ISCE is planned to become a key element in processing projected NISAR data into higher-level data products, enabling a new class of analyses that take greater advantage of the long time and large spatial scales of these new data than current approaches. NISAR would be but one mission in a constellation of radar satellites in the future delivering such data. ISCE has been incorporated into two prototype cloud-based systems that have demonstrated its elasticity for addressing larger data processing problems in a "production" context and its ability to be controlled by individual science users on the cloud for large data problems.
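    The multi-level prioritized control of component configurations mentioned above can be pictured as a priority-ordered merge of configuration sources (defaults, then mission-level settings, then user input). The sketch below illustrates the general pattern only; it is an assumption, not ISCE's actual API, and the key names are invented.

```python
# Generic priority-ordered configuration merge: later (higher-priority)
# sources override earlier ones per key. Keys and values are invented.

def merge_by_priority(*configs):
    """Merge dicts lowest-priority first; later dicts win on key conflicts."""
    merged = {}
    for cfg in configs:            # configs ordered low -> high priority
        merged.update(cfg)
    return merged

defaults   = {"looks_azimuth": 4, "looks_range": 2, "filter_strength": 0.5}
mission    = {"filter_strength": 0.7}
user_input = {"looks_azimuth": 2}

active = merge_by_priority(defaults, mission, user_input)
print(active)
```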

  10. NASA CYGNSS Mission Overview

    NASA Astrophysics Data System (ADS)

    Ruf, C. S.; Balasubramaniam, R.; Gleason, S.; McKague, D. S.; O'Brien, A.

    2017-12-01

    The CYGNSS constellation of eight satellites was successfully launched on 15 December 2016 into a low-inclination (tropical) Earth orbit. Each satellite carries a four-channel bi-static radar receiver that measures GPS signals scattered by the ocean, from which ocean surface roughness, near-surface wind speed, and air-sea latent heat flux are estimated. The measurements are unique in several respects, most notably in their ability to penetrate through all levels of precipitation, made possible by the low frequency at which GPS operates, and in the frequent sampling of tropical cyclone intensification and of the diurnal cycle of winds, made possible by the large number of satellites. Engineering commissioning of the constellation was successfully completed in March 2017, and the mission is currently in the early phase of science operations. Level 2 science data products have been developed for near-surface (10 m referenced) ocean wind speed, ocean surface roughness (mean square slope), and latent heat flux. Level 3 gridded versions of the L2 products have also been developed. A set of Level 4 products has been developed specifically for direct tropical cyclone overpasses; these include the storm intensity (peak sustained winds) and size (radius of maximum winds), its extent (34-, 50- and 64-knot wind radii), and its integrated kinetic energy. Assimilation of CYGNSS L2 wind speed data into the HWRF hurricane weather prediction model has also been developed. An overview and the current status of the mission will be presented, together with highlights of early on-orbit performance and scientific results.
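    Producing a Level 3 gridded product from Level 2 point retrievals, as described above, amounts at its simplest to averaging L2 samples within latitude/longitude cells. A minimal sketch with invented sample values (the operational CYGNSS gridding is more sophisticated):

```python
# Toy L2 -> L3 gridding: average point wind-speed retrievals per 1-degree cell.
# Sample (lat, lon, wind m/s) tuples below are invented for illustration.

from collections import defaultdict

def grid_l2(samples, cell_deg=1.0):
    """samples: iterable of (lat, lon, wind). Returns {(cell_lat, cell_lon): mean}."""
    sums = defaultdict(lambda: [0.0, 0])
    for lat, lon, w in samples:
        key = (int(lat // cell_deg), int(lon // cell_deg))
        sums[key][0] += w
        sums[key][1] += 1
    return {k: s / n for k, (s, n) in sums.items()}

l2 = [(10.2, 140.5, 7.0), (10.7, 140.1, 9.0), (11.3, 141.8, 5.5)]
l3 = grid_l2(l2)
print(l3)
```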

  11. Medical Systems Engineering to Support Mars Mission Crew Autonomy

    NASA Technical Reports Server (NTRS)

    Antonsen, Erik; Mindock, Jennifer

    2017-01-01

    Human spaceflight missions to Mars face exceptionally challenging resource limitations that far exceed those faced before. Increasing transit times, decreasing opportunity for resupply, communications challenges, and extended time to evacuate a crew to definitive medical care dictate a level of crew autonomy in medical care that is beyond the current medical model. To approach this challenge, a medical systems engineering approach is proposed that relies on a clearly articulated Concept of Operations and risk analysis tools that are in development at NASA. This paper proposes an operational clinical model with key terminology and concepts translated to a controls theory paradigm to frame a common language between clinical and engineering teams. This common language will be used for design and validation of an exploration medical system that is fully integrated into a Mars transit vehicle. This approach merges medical simulation, human factors evaluation techniques, and human-in-the-loop testing in ground based analogs to tie medical hardware and software subsystem performance and overall medical system functionality to metrics of operational medical autonomy. Merging increases in operational clinical autonomy with a more restricted vehicle system resource scenario in interplanetary spaceflight will require an unprecedented level of medical and engineering integration. Full integration of medical capabilities into a Mars vehicle system may require a new approach to integrating medical system design and operations into the vehicle Program structure. Prior to the standing-up of a Mars Mission Program, proof of concept is proposed through the Human Research Program.

  12. Immersive Environment Technologies for Mars Exploration

    NASA Technical Reports Server (NTRS)

    Wright, John R.; Hartman, Frank

    2000-01-01

    JPL's charter includes the unmanned exploration of the Solar System. One of the tools for exploring other planets is the rover, as exemplified by Sojourner on the Mars Pathfinder mission. The light-speed turnaround time between Earth and the outer planets precludes the use of teleoperated rovers, so autonomous operations are built into the current and upcoming generation of devices. As the level of autonomy increases, the mode of operations shifts from low-level specification of activities to higher-level specification of goals. To support this higher-level activity, it is necessary to provide the operator with an effective understanding of the in-situ environment and also the tools needed to specify the higher-level goals. Immersive environments provide the sense of presence needed to achieve this goal. The use of immersive environments at JPL has two main thrusts that will be discussed in this talk. One is the generation of 3D models of the in-situ environment, in particular the merging of models from different sensors, different modes (orbital, descent, and lander), and even different missions. The other is the use of various tools to visualize the environment within which the rover will be operating, to maximize understanding by the operator. A suite of tools is under development that provides an integrated view into the environment while offering a variety of visualization modes. This allows the operator to smoothly switch from one mode to another depending on the information and presentation desired.

  13. Ground-Water Flow in the Vicinity of the Ho-Chunk Nation Communities of Indian Mission and Sand Pillow, Jackson County, Wisconsin

    USGS Publications Warehouse

    Dunning, Charles P.; Mueller, Gregory D.; Juckem, Paul F.

    2008-01-01

    An analytic element ground-water-flow model was constructed to help understand the ground-water-flow system in the vicinity of the Ho-Chunk Nation communities of Indian Mission and Sand Pillow in Jackson County, Wisconsin. Data from interpretive reports, well-drillers' construction reports, and an exploratory augering program in 2003 indicate that sand and gravel of varying thickness (0-150 feet [ft]) and porous sandstone make up a composite aquifer that overlies Precambrian crystalline rock. The geometric mean values for horizontal hydraulic conductivity were estimated from specific-capacity data to be 61.3 feet per day (ft/d) for sand and gravel, 6.6 ft/d for sandstone, and 12.0 ft/d for the composite aquifer. A ground-water flow model was constructed, the near field of which encompassed the Levis and Morrison Creeks Watershed. The flow model was coupled to the parameter-estimation program UCODE to obtain a best fit between simulated and measured values of ground-water levels and estimated Q50 flow duration (base flow). Calibration of the model with UCODE provided a ground-water recharge rate of 9 inches per year and a horizontal hydraulic conductivity of 13 ft/d for the composite aquifer. Using these calibrated parameter values, simulated heads from the model were on average within 5 ft of the measured water levels. In addition, these parameter values provided an acceptable base-flow calibration for Hay, Dickey, and Levis Creeks; the calibration was particularly close for Levis Creek, which was the most frequently measured stream in the study area. The calibrated model was used to simulate ground-water levels and to determine the direction of ground-water flow in the vicinity of Indian Mission and Sand Pillow communities. Backward particle tracking was conducted for Sand Pillow production wells under two pumping simulations to determine their 20-year contributing areas.
In the first simulation, new production wells 6, 7, and 8 were each pumped at 50 gallons per minute (gal/min). In the second simulation, new production wells 6, 7, and 8 and existing production well 5 were each pumped at 50 gal/min. The second simulation demonstrated interference between the existing production well 5 and the new production wells when all were pumping at 50 gal/min.
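    The abstract reports geometric-mean horizontal hydraulic conductivities (61.3, 6.6, and 12.0 ft/d). A geometric mean is the standard way to average conductivity estimates that span orders of magnitude; the per-well values below are invented to illustrate the calculation, not data from the study.

```python
# Geometric mean of hydraulic conductivity estimates: exp of the mean log.
# Per-well values are hypothetical.

import math

def geometric_mean(values):
    return math.exp(sum(math.log(v) for v in values) / len(values))

sand_and_gravel_K = [30.0, 55.0, 140.0]   # ft/d, invented per-well estimates
print(f"geometric mean K: {geometric_mean(sand_and_gravel_K):.1f} ft/d")
```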

  14. Probabilistic Forecast of Solar Particle Fluence for Mission Durations and Exposure Assessment in Consideration of Integral Proton Fluence at High Energies

    NASA Astrophysics Data System (ADS)

    Kim, M. Y.; Tylka, A. J.; Dietrich, W. F.; Cucinotta, F. A.

    2012-12-01

    The occasional occurrence of solar particle events (SPEs) with large amounts of energy cannot be predicted, while the expected frequency is strongly influenced by solar cycle activity. The potential for exposure to large SPEs with high energy levels is the major concern during extra-vehicular activities (EVAs) on the Moon, near-Earth objects, and the Mars surface for future long-duration space missions. We estimated the propensity for SPE occurrence with large proton fluence as a function of time within a typical future solar cycle from a non-homogeneous Poisson model, using the historical database of measurements of protons with energy > 30 MeV, Φ30. The database includes a comprehensive collection of historical data for the past 5 solar cycles. Using all the recorded proton fluences of SPEs, total fluence distributions of Φ30, Φ60, and Φ100 were simulated, ranging from the 5th to the 95th percentile, for each mission duration. In addition to the total particle intensity of SPEs, the detailed energy spectra of protons, especially at high energy levels, were recognized as extremely important for assessing the radiation cancer risk associated with energetic particles for large events. For radiation exposure assessments of major SPEs, we used the spectral functional form of a double power law in rigidity (the so-called Band function), which has provided a satisfactory representation of the combined satellite and neutron monitor data from ~10 MeV to ~10 GeV. The dependencies of exposure risk were evaluated as a function of proton fluence at a given energy threshold of 30, 60, and 100 MeV, and overall risk prediction improved as the energy threshold increased from 30 to 60 to 100 MeV. The results can be applied to the development of improved radiation protection approaches for astronauts, as well as the optimization of mission planning and shielding for future space missions.
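    The "double power law in rigidity" (Band function) mentioned above is commonly written piecewise, with low- and high-rigidity spectral indices γ_a and γ_b and a break parameter R_0; one standard statement of the form (notation may differ from the authors') is:

```latex
F(R) =
\begin{cases}
J_0\, R^{-\gamma_a}\, e^{-R/R_0}, & R \le (\gamma_b-\gamma_a)\,R_0,\\[4pt]
J_0\, R^{-\gamma_b}\,\bigl[(\gamma_b-\gamma_a)\,R_0\bigr]^{\gamma_b-\gamma_a}\, e^{\gamma_a-\gamma_b}, & R \ge (\gamma_b-\gamma_a)\,R_0,
\end{cases}
```

    which joins the two power laws continuously and smoothly at the break rigidity, so a single four-parameter fit (J_0, γ_a, γ_b, R_0) can span the ~10 MeV to ~10 GeV range cited in the abstract.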

  15. Using semantic data modeling techniques to organize an object-oriented database for extending the mass storage model

    NASA Technical Reports Server (NTRS)

    Campbell, William J.; Short, Nicholas M., Jr.; Roelofs, Larry H.; Dorfman, Erik

    1991-01-01

    A methodology for optimizing the organization of data obtained by NASA earth and space missions is discussed. The methodology uses a concept based on semantic data modeling techniques implemented in a hierarchical storage model. The modeling is used to organize objects in mass storage devices, relational database systems, and object-oriented databases. Semantic data modeling at the metadata record level is examined, including the simulation of a knowledge base and semantic metadata storage issues. The semantic data model hierarchy and its application to efficient data storage are addressed, as is the mapping of the application structure to mass storage.

  16. An Overview Of NASA's Solar Sail Propulsion Project

    NASA Technical Reports Server (NTRS)

    Garbe, Gregory; Montgomery, Edward E., IV

    2003-01-01

    Research conducted by the In-Space Propulsion (ISP) Technologies Project is at the forefront of NASA's efforts to mature propulsion technologies that will enable or enhance a variety of space science missions. The ISP Program is developing technologies from Technology Readiness Level (TRL) 3 through TRL 6. Activities under the different technology areas are selected through the NASA Research Announcement (NRA) process. The ISP Program goal is to mature a suite of reliable advanced propulsion technologies that will promote more cost-efficient missions by reducing interplanetary mission trip times, increasing scientific payload mass fraction, and allowing longer on-station operations. These propulsion technologies will also enable missions with previously inaccessible orbits (e.g., non-Keplerian, high solar latitudes). The ISP Program technology suite has been prioritized by an agency-wide study. Solar sail propulsion is one of ISP's three high-priority technology areas. Solar sail propulsion systems will be required to meet the challenge of monitoring and predicting space weather posed by the Office of Space Science's (OSS) Living with a Star (LWS) program. Near-to-mid-term mission needs include monitoring of solar activity and observations at high solar latitudes. Near-term work funded by the ISP solar sail propulsion project is centered on the quantitative demonstration of the scalability of present solar sail subsystem designs and concepts to future mission requirements through ground testing, computer modeling, and analytical simulations. This talk will review the solar sail technology roadmap, currently funded technology development work, future funding opportunities, and mission applications.

  17. Flight Control Development for the ARH-70 Armed Reconnaissance Helicopter Program

    NASA Technical Reports Server (NTRS)

    Christensen, Kevin T.; Campbell, Kip G.; Griffith, Carl D.; Ivler, Christina M.; Tischler, Mark B.; Harding, Jeffrey W.

    2008-01-01

    In July 2005, Bell Helicopter won the U.S. Army's Armed Reconnaissance Helicopter competition to produce a replacement for the OH-58 Kiowa Warrior capable of performing the armed reconnaissance mission. To meet the U.S. Army requirement that the ARH-70A have Level 1 handling qualities for the scout rotorcraft mission task elements defined by ADS-33E-PRF, Bell equipped the aircraft with its generic automatic flight control system (AFCS). Under the constraints of the tight ARH-70A schedule, the development team used modern parameter identification and control law optimization techniques to tune the AFCS gains to simultaneously meet multiple handling qualities design criteria. This paper will show how linear modeling, control law optimization, and simulation have been used to produce a Level 1 scout rotorcraft for the U.S. Army, while minimizing the amount of flight testing required for AFCS development and handling qualities evaluation of the ARH-70A.

  18. Wicked problems in space technology development at NASA

    NASA Astrophysics Data System (ADS)

    Balint, Tibor S.; Stevens, John

    2016-01-01

    Technological innovation is key to enable future space exploration missions at NASA. Technology development, however, is not only driven by performance and resource considerations, but also by a broad range of directly or loosely interconnected factors. These include, among others, strategy, policy and politics at various levels, tactics and programmatics, interactions between stakeholders, resource requirements, performance goals from component to system level, mission infusion targets, portfolio execution and tracking, and technology push or mission pull. Furthermore, at NASA, these influences occur on varying timescales and at diverse geographic locations. Such a complex and interconnected system could impede space technology innovation in this examined segment of the government environment. Hence, understanding the process through NASA's Planning, Programming, Budget and Execution cycle could benefit strategic thinking, planning and execution. Insights could be gained through suitable models, for example assessing the key drivers against the framework of Wicked Problems. This paper discusses NASA-specific space technology innovation and innovation barriers in the government environment through the characteristics of Wicked Problems; that is, they do not have right or wrong solutions, only improved outcomes that can be reached through authoritative, competitive, or collaborative means. We will also augment the Wicked Problems model to account for the temporally and spatially coupled, cyclical nature of this NASA-specific case, and propose how appropriate models could improve understanding of the key influencing factors. In turn, such understanding may subsequently lead to reducing innovation barriers and stimulating technology innovation at NASA. Furthermore, our approach can be adopted for other government-directed environments to gain insights into their structures, hierarchies, operational flow, and interconnections to facilitate circular dialogs towards preferred outcomes.

  19. Supportability Issues and Approaches for Exploration Missions

    NASA Technical Reports Server (NTRS)

    Watson, J. K.; Ivins, M. S.; Cunningham, R. A.

    2006-01-01

    Maintaining and repairing spacecraft systems hardware to achieve required levels of operational availability during long-duration exploration missions will be challenged by limited resupply opportunities, constraints on the mass and volume available for spares and other maintenance-related provisions, and extended communications times. These factors will force the adoption of new approaches to the integrated logistics support of spacecraft systems hardware. For missions beyond the Moon, all spares, equipment, and supplies must either be pre-positioned prior to the human crews' departure from Earth or carried with the crews. The mass and volume of spares must be minimized by enabling repair at the lowest hardware levels, imposing commonality and standardization across all mission elements at all hardware levels, and providing the capability to fabricate structural and mechanical spares as required. Long round-trip communications times will require increasing levels of crew autonomy for most operations, including spacecraft maintenance. Effective implementation of these approaches will only be possible when their need is recognized at the earliest stages of the program, when they are incorporated in operational concepts and programmatic requirements, and when diligence is applied in enforcing these requirements throughout system design in an integrated way across all contractors and suppliers. These approaches will be essential for the success of missions to Mars. Although limited-duration lunar missions may be successfully accomplished with more traditional approaches to supportability, those missions will offer an opportunity to refine these concepts, associated technologies, and programmatic implementation methodologies so that they can be most effectively applied to later missions.

  20. An investigation of the use of temporal decomposition in space mission scheduling

    NASA Technical Reports Server (NTRS)

    Bullington, Stanley E.; Narayanan, Venkat

    1994-01-01

    This research involves an examination of techniques for solving scheduling problems in long-duration space missions. The mission timeline is broken up into several time segments, which are then scheduled incrementally. Three methods are presented for identifying the activities that are to be attempted within these segments. The first method is a mathematical model, which is presented primarily to illustrate the structure of the temporal decomposition problem. Since the mathematical model is bound to be computationally prohibitive for realistic problems, two heuristic assignment procedures are also presented. The first heuristic method is based on dispatching rules for activity selection, and the second heuristic assigns performances of a model evenly over timeline segments. These heuristics are tested using a sample Space Station mission and a Spacelab mission. The results are compared with those obtained by scheduling the missions without any problem decomposition. The applicability of this approach to large-scale mission scheduling problems is also discussed.
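
    The dispatching-rule heuristic described above can be sketched as a greedy assignment: sort activities by an urgency measure, then place each into the earliest timeline segment with remaining capacity. This is an illustrative Python sketch under assumed data structures (name, duration, due segment), not the paper's actual procedure.

```python
def assign_activities(activities, n_segments, capacity):
    """Greedy dispatching-rule assignment of activities to timeline
    segments: sort by due segment (an earliest-due-date rule), then
    place each activity in the earliest segment, up to its due segment,
    whose total load stays within `capacity`.
    `activities` is a list of (name, duration, due_segment) tuples."""
    load = [0.0] * n_segments
    schedule = {s: [] for s in range(n_segments)}
    unscheduled = []
    for name, dur, due in sorted(activities, key=lambda a: a[2]):
        for s in range(min(due, n_segments - 1) + 1):
            if load[s] + dur <= capacity:
                load[s] += dur
                schedule[s].append(name)
                break
        else:
            unscheduled.append(name)   # no feasible segment found
    return schedule, unscheduled

schedule, unscheduled = assign_activities(
    [("A", 3, 0), ("B", 2, 0), ("C", 4, 1), ("D", 3, 1)],
    n_segments=2, capacity=5.0)
```

    Activities that fit nowhere are reported as unscheduled, mirroring the incremental segment-by-segment scheduling the abstract describes.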

  1. Distributed intelligence for ground/space systems

    NASA Technical Reports Server (NTRS)

    Aarup, Mads; Munch, Klaus Heje; Fuchs, Joachim; Hartmann, Ralf; Baud, Tim

    1994-01-01

    DI is short for Distributed Intelligence for Ground/Space Systems, and the DI Study is one in a series of ESA projects concerned with the development of new concepts and architectures for future autonomous spacecraft systems. The kick-off of DI was in January 1994 and the planned duration is three years. The background of DI is the desire to design future ground/space systems with a higher degree of autonomy than seen in today's missions. The aims of introducing autonomy in spacecraft systems are to: (1) lift the role of the spacecraft operators from routine work and basic troubleshooting to supervision; (2) ease access to and increase availability of spacecraft resources; (3) carry out basic mission planning for users; (4) enable missions which have not yet been feasible due to, e.g., propagation delays, insufficient ground station coverage, etc.; and (5) possibly reduce mission cost. The study serves to identify the feasibility of using state-of-the-art technologies in the areas of planning, scheduling, fault detection using model-based diagnosis, and knowledge processing to obtain a higher level of autonomy in ground/space systems.

  2. NASA DC-8 Mission Manager Walter Klein and Chilean Air Force Advisor Captain Saez review maps of the Antarctic Peninsula during an AirSAR 2004 mission

    NASA Image and Video Library

    2004-03-13

    NASA DC-8 Mission Manager Walter Klein and Chilean Air Force Advisor Captain Saez review maps of the Antarctic Peninsula during an AirSAR 2004 mission. AirSAR 2004 is a three-week expedition in Central and South America by an international team of scientists that is using an all-weather imaging tool, called the Airborne Synthetic Aperture Radar (AirSAR), located onboard NASA's DC-8 airborne laboratory. Scientists from many parts of the world are combining ground research with NASA's AirSAR technology to improve and expand on the quality of research they are able to conduct. These photos are from the DC-8 aircraft while flying an AirSAR mission over Antarctica. The Antarctic Peninsula is more similar to Alaska and Patagonia than to the rest of the Antarctic continent. It is drained by fast glaciers, receives abundant precipitation, and melts significantly in the summer months. In recent decades, the Peninsula has experienced significant atmospheric warming (about 2 degrees C since 1950), which has triggered a vast and spectacular retreat of its floating ice shelves, glacier reduction, a decrease in permanent snow cover and a lengthening of the melt season. As a result, the contribution to sea level from this region could be rapid and substantial. With an area of 120,000 km², or ten times that of the Patagonia ice fields, the Peninsula could contribute as much as 0.4 mm/yr of sea level rise, which would be the largest single contribution to sea level from anywhere in the world. This region is being studied by NASA using a DC-8 equipped with the Airborne Synthetic Aperture Radar developed by scientists from NASA's Jet Propulsion Laboratory. AirSAR will provide a baseline model and unprecedented mapping of the region. This data will make it possible to determine whether the warming trend is slowing, continuing or accelerating. AirSAR will also provide reliable information on ice shelf thickness to measure the contribution of the glaciers to sea level.

  3. Sleep and cognitive function of crewmembers and mission controllers working 24-h shifts during a simulated 105-day spaceflight mission

    NASA Astrophysics Data System (ADS)

    Barger, Laura K.; Wright, Kenneth P.; Burke, Tina M.; Chinoy, Evan D.; Ronda, Joseph M.; Lockley, Steven W.; Czeisler, Charles A.

    2014-01-01

    The success of long-duration space missions depends on the ability of crewmembers and mission support specialists to be alert and maintain high levels of cognitive function while operating complex, technical equipment. We examined sleep, nocturnal melatonin levels, and cognitive function of crewmembers, and the sleep and cognitive function of mission controllers, who participated in a high-fidelity 105-day simulated spaceflight mission at the Institute of Biomedical Problems (Moscow). Crewmembers were required to perform daily mission duties and work one 24-h extended-duration work shift every sixth day. Mission controllers nominally worked 24-h extended-duration shifts. Supplemental lighting was provided to crewmembers and mission controllers. Participants' sleep was estimated by wrist-actigraphy recordings. Overall, results show that crewmembers and mission controllers obtained inadequate sleep and exhibited impaired cognitive function, despite countermeasure use, while working extended-duration shifts. Crewmembers averaged 7.04±0.92 h and 6.94±1.08 h (mean±SD) of sleep on the two workdays prior to the extended-duration shifts, 1.88±0.40 h during the 24-h work shift, and then slept 10.18±0.96 h the day after the night shift. Although supplemental light was provided, crewmembers' average nocturnal melatonin levels remained elevated during extended 24-h work shifts. Naps and caffeine use were reported by crewmembers during ~86% and 45% of extended night work shifts, respectively. Even with reported use of wake-promoting countermeasures, significant impairments in cognitive function were observed. Mission controllers slept 5.63±0.95 h (mean±SD) the night prior to their extended-duration work shift. On average, 89% of night shifts included naps, with mission controllers sleeping an average of 3.4±1.0 h during the 24-h extended-duration work shift. Mission controllers also showed impaired cognitive function during extended-duration work shifts. These findings indicate that extended-duration work shifts present a significant challenge to crewmembers and mission support specialists during long-duration space mission operations. Future research is needed to evaluate the efficacy of alternative work schedules, and the development and implementation of more effective countermeasures will be required to maintain high levels of performance.

  4. FGPA Mission Assurance Center (FMAC) Support Activity at the University of New Mexico

    DTIC Science & Technology

    2013-10-31

    4.2 Additive Manufacturing ...BFM) to model the high-level behavior of the system. BFM have the additional advantage of a... theory, this extends the FPGA's physical resources infinitely. The second implication is that DPR can mitigate FPGA's high power consumption (by trading

  5. Education and Public Outreach and Engagement at NASA's Analog Missions in 2012

    NASA Technical Reports Server (NTRS)

    Watkins, Wendy L.; Janoiko, Barbara A.; Mahoney, Erin; Hermann, Nicole B.

    2013-01-01

    Analog missions are integrated, multi-disciplinary activities that test key features of future human space exploration missions in an integrated fashion to gain a deeper understanding of system-level interactions and operations early in conceptual development. These tests are often conducted in remote and extreme environments that are representative in one or more ways of future spaceflight destinations. They may also be conducted at NASA facilities, using advanced modeling and human-in-the-loop scenarios. As NASA develops a capability-driven framework to transport crew to a variety of space environments, it will use analog missions to gather requirements and develop the technologies necessary to ensure successful exploration beyond low Earth orbit. NASA's Advanced Exploration Systems (AES) Division conducts these high-fidelity integrated tests, including the coordination and execution of a robust education and public outreach (EPO) and engagement program for each mission. Conducting these mission scenarios in unique environments not only provides an opportunity to test the EPO concepts for the particular future-mission scenario, such as the best methods for conducting events with a communication time delay, but it also provides an avenue to deliver NASA's human space exploration key messages. These analogs are extremely exciting to students and the public, and they are performed in such a way that the public can feel like part of the mission. They also provide an opportunity for crew members to obtain training in education and public outreach activities similar to what they would perform in space. The analog EPO team is responsible for the coordination and execution of the events, the overall social media component for each mission, and public affairs events such as media visits and interviews. They also create new and exciting ways to engage the public, manage and create website content, coordinate video footage for missions, and integrate each activity into the mission timeline. In 2012, the AES Analog Missions Project performed three distinct missions: NASA Extreme Environment Mission Operations (NEEMO), which simulated a mission to an asteroid using an undersea laboratory; the In-Situ Resource Utilization (ISRU) Field Test, which simulated a robotic mission to the moon searching and drilling for water; and the Research and Technology Studies (RATS) integrated tests, which also simulated a mission to an asteroid. This paper will discuss the education and public engagement that occurred during these missions.

  6. Sensitivity analysis for future space missions with segmented telescopes for high-contrast imaging

    NASA Astrophysics Data System (ADS)

    Leboulleux, Lucie; Pueyo, Laurent; Sauvage, Jean-François; Mazoyer, Johan; Soummer, Remi; Fusco, Thierry; Sivaramakrishnan, Anand

    2018-01-01

    The detection and analysis of biomarkers on Earth-like planets using direct imaging will require both high-contrast imaging and spectroscopy at very close angular separations (a 10^10 star-to-planet flux ratio at a few 0.1"). This goal can only be achieved with large telescopes in space to overcome atmospheric turbulence, combined with a coronagraphic instrument with wavefront control. Large segmented space telescopes such as those studied for the LUVOIR mission will suffer segment-level instabilities and cophasing errors in addition to local mirror surface errors and other aberrations of the overall optical system. These effects contribute directly to the degradation of the final image quality and contrast. We present an analytical model that produces coronagraphic images of a segmented-pupil telescope in the presence of segment phasing aberrations expressed as Zernike polynomials. This model relies on a pair-based projection of the segmented pupil and provides results that match an end-to-end simulation with an rms error on the final contrast of ~3%. The analytical model can be applied to both static and dynamic modes, in either monochromatic or broadband light. It obviates the need for the end-to-end Monte-Carlo simulations otherwise required to build a rigorous error budget, by enabling quasi-instantaneous analytical evaluations. The ability to directly invert the analytical model provides direct constraints and tolerances on all segment-level phasing errors and aberrations.
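
    As a toy illustration of why segment-level phasing errors degrade contrast (this is not the paper's pair-based analytical model), one can propagate a 1-D segmented pupil with piston phase errors to the focal plane via a Fourier transform; aberrations reduce the on-axis peak (Strehl loss) while the total energy is conserved, so light is scattered into residual speckles.

```python
import numpy as np

def psf_intensity(pistons, seg_w=32):
    """Propagate a 1-D pupil of equal-width segments, each carrying a
    piston phase error (radians), to the focal plane: the focal-plane
    field is the zero-padded FFT of the complex pupil, and the intensity
    is its squared modulus."""
    phase = np.repeat(np.asarray(pistons, dtype=float), seg_w)
    pupil = np.exp(1j * phase)                  # unit-amplitude segmented pupil
    field = np.fft.fft(pupil, 4 * phase.size)   # zero-padded focal-plane field
    return np.abs(field)**2                     # focal-plane intensity

perfect = psf_intensity(np.zeros(8))            # phased pupil
aberrated = psf_intensity(                      # random 0.1-rad rms pistons
    0.1 * np.random.default_rng(1).standard_normal(8))
```

    The on-axis intensity is |Σ exp(iφ)|², maximized when all segments are in phase, while Parseval's theorem keeps the summed intensity fixed; the difference reappears as speckle away from the core.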

  7. Modeling to Mars: a NASA Model Based Systems Engineering Pathfinder Effort

    NASA Technical Reports Server (NTRS)

    Phojanamongkolkij, Nipa; Lee, Kristopher A.; Miller, Scott T.; Vorndran, Kenneth A.; Vaden, Karl R.; Ross, Eric P.; Powell, Bobby C.; Moses, Robert W.

    2017-01-01

    The NASA Engineering Safety Center (NESC) Systems Engineering (SE) Technical Discipline Team (TDT) initiated the Model Based Systems Engineering (MBSE) Pathfinder effort in FY16. The goals and objectives of the MBSE Pathfinder include developing and advancing MBSE capability across NASA, applying MBSE to real NASA issues, and capturing issues and opportunities surrounding MBSE. The Pathfinder effort consisted of four teams, each addressing a particular focus area. This paper focuses on Pathfinder team 1, whose focus area was architectures and mission campaigns. These efforts covered the timeframe of February 2016 through September 2016. The team comprised eight members from seven NASA Centers (Glenn Research Center, Langley Research Center, Ames Research Center, Goddard Space Flight Center IV&V Facility, Johnson Space Center, Marshall Space Flight Center, and Stennis Space Center). Collectively, the team had varying levels of knowledge, skills and expertise in systems engineering and MBSE. The team applied their existing and newly acquired system modeling knowledge and expertise to develop modeling products for a campaign (Program) of crew and cargo missions (Projects) to establish a human presence on Mars utilizing In-Situ Resource Utilization (ISRU). Pathfinder team 1 developed a subset of the modeling products that are required for a Program System Requirement Review (SRR)/System Design Review (SDR) and Project Mission Concept Review (MCR)/SRR as defined in NASA Procedural Requirements. Additionally, Team 1 was able to perform and demonstrate some trade and constraint analyses. At the end of these efforts, over twenty lessons learned and recommended next steps had been identified.

  8. A SLAM II simulation model for analyzing space station mission processing requirements

    NASA Technical Reports Server (NTRS)

    Linton, D. G.

    1985-01-01

    Space station mission processing is modeled via the SLAM II simulation language on an IBM 4381 mainframe and an IBM PC microcomputer with 620K RAM, two double-sided disk drives, and an 8087 coprocessor chip. Using a time-phased mission (payload) schedule and parameters associated with the mission, orbiter (space shuttle), and ground facility databases, estimates for ground facility utilization are computed. Simulation output associated with the science and applications database is used to assess alternative mission schedules.
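
    The core of such a model, missions arriving on a schedule and competing for a ground facility whose utilization is tallied, is a small discrete-event simulation. A minimal Python sketch (standing in for SLAM II, with hypothetical arrival times and service duration) follows.

```python
def simulate_facility(arrivals, service_time, horizon):
    """Minimal discrete-event sketch of missions queueing for a single
    ground facility. `arrivals` is a sorted list of mission arrival
    times; each mission occupies the facility for `service_time`.
    Returns the facility's utilization over [0, horizon]."""
    busy_until = 0.0   # time the facility next becomes free
    busy_time = 0.0    # accumulated occupied time within the horizon
    for t in arrivals:
        start = max(t, busy_until)            # wait if facility is occupied
        if start >= horizon:
            break
        end = min(start + service_time, horizon)
        busy_time += end - start
        busy_until = start + service_time
    return busy_time / horizon

# Hypothetical schedule: four missions over a 20-unit horizon.
util = simulate_facility([0, 2, 4, 10], service_time=3, horizon=20)
```

    A real SLAM II model would add multiple facilities, priorities, and stochastic processing times, but the utilization bookkeeping is the same.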

  9. Precise orbit determination for the most recent altimeter missions: towards the 1 mm/y stability of the radial orbit error at regional scales

    NASA Astrophysics Data System (ADS)

    Couhert, Alexandre

    The reference Ocean Surface Topography Mission/Jason-2 satellite (CNES/NASA) has been in orbit for six years (since June 2008). It extends the continuous record of highly accurate sea surface height measurements begun in 1992 by the Topex/Poseidon mission and continued in 2001 by the Jason-1 mission. The complementary missions CryoSat-2 (ESA), HY-2A (CNSA) and SARAL/AltiKa (CNES/ISRO), with lower altitudes and higher inclinations, were launched in April 2010, August 2011 and February 2013, respectively. Although these last three satellites fly in different orbits, they contribute to the altimeter constellation while enhancing the global coverage. The CNES Precision Orbit Determination (POD) Group delivers precise and homogeneous orbit solutions for these independent altimeter missions. The focus of this talk will be on the long-term stability of the orbit time series for mean sea level applications on a regional scale. We discuss various issues related to the assessment of radial orbit error trends, in particular orbit errors dependent on the tracking technique, the accuracy and stability of the reference frame, and the modeling of temporal variations of the geopotential. Strategies are then explored to meet a 1 mm/y radial orbit stability over decadal periods at regional scales, and the challenge of evaluating such an improvement is discussed.

  10. Space and Earth Sciences, Computer Systems, and Scientific Data Analysis Support, Volume 1

    NASA Technical Reports Server (NTRS)

    Estes, Ronald H. (Editor)

    1993-01-01

    This Final Progress Report covers the specific technical activities of Hughes STX Corporation for the last contract triannual period of 1 June through 30 Sep. 1993, in support of assigned task activities at Goddard Space Flight Center (GSFC). It also provides a brief summary of work throughout the contract period of performance on each active task. Technical activity is presented in Volume 1, while financial and level-of-effort data are presented in Volume 2. Technical support was provided to all Divisions and Laboratories of Goddard's Space Sciences and Earth Sciences Directorates. Types of support include: scientific programming, systems programming, computer management, mission planning, scientific investigation, data analysis, data processing, data base creation and maintenance, instrumentation development, and management services. Missions and instruments supported include: ROSAT, Astro-D, BBXRT, XTE, AXAF, GRO, COBE, WIND, UIT, SMM, STIS, HEIDI, DE, URAP, CRRES, Voyagers, ISEE, San Marco, LAGEOS, TOPEX/Poseidon, Pioneer-Venus, Galileo, Cassini, Nimbus-7/TOMS, Meteor-3/TOMS, FIFE, BOREAS, TRMM, AVHRR, and Landsat. Accomplishments include: development of computing programs for mission science and data analysis, supercomputer applications support, computer network support, computational upgrades for data archival and analysis centers, end-to-end management for mission data flow, scientific modeling and results in the fields of space and Earth physics, planning and design of GSFC VO DAAC and VO IMS, fabrication, assembly, and testing of mission instrumentation, and design of mission operations center.

  11. KSC00pp0044

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 167-foot level of the Fixed Service Structure on Launch Pad 39A, the STS-99 crew pose for a photograph during Terminal Countdown Demonstration Test (TCDT) activities. Standing left to right are Mission Specialist Janet Lynn Kavandi (Ph.D.), Commander Kevin Kregel, Mission Specialists Janice Voss (Ph.D.), Gerhard Thiele and Mamoru Mohri, and Pilot Dominic Gorie. Thiele is with the European Space Agency and Mohri is with the National Space Development Agency (NASDA) of Japan. Behind them (left) are visible the top of a solid rocket booster (white) and external tank (orange). The TCDT provides the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  12. KSC-00pp0044

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 167-foot level of the Fixed Service Structure on Launch Pad 39A, the STS-99 crew pose for a photograph during Terminal Countdown Demonstration Test (TCDT) activities. Standing left to right are Mission Specialist Janet Lynn Kavandi (Ph.D.), Commander Kevin Kregel, Mission Specialists Janice Voss (Ph.D.), Gerhard Thiele and Mamoru Mohri, and Pilot Dominic Gorie. Thiele is with the European Space Agency and Mohri is with the National Space Development Agency (NASDA) of Japan. Behind them (left) are visible the top of a solid rocket booster (white) and external tank (orange). The TCDT provides the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  13. KSC-00pp0043

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 167-foot level of the Fixed Service Structure on Launch Pad 39A, the STS-99 crew pose for a photograph during Terminal Countdown Demonstration Test (TCDT) activities. Standing left to right are Mission Specialist Janet Lynn Kavandi (Ph.D.), Commander Kevin Kregel, Mission Specialists Janice Voss (Ph.D.), Gerhard Thiele and Mamoru Mohri, and Pilot Dominic Gorie. Thiele is with the European Space Agency and Mohri is with the National Space Development Agency (NASDA) of Japan. Behind them are visible the top of a solid rocket booster (white) and external tank (orange). The TCDT provides the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  14. KSC00pp0043

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 167-foot level of the Fixed Service Structure on Launch Pad 39A, the STS-99 crew pose for a photograph during Terminal Countdown Demonstration Test (TCDT) activities. Standing left to right are Mission Specialist Janet Lynn Kavandi (Ph.D.), Commander Kevin Kregel, Mission Specialists Janice Voss (Ph.D.), Gerhard Thiele and Mamoru Mohri, and Pilot Dominic Gorie. Thiele is with the European Space Agency and Mohri is with the National Space Development Agency (NASDA) of Japan. Behind them are visible the top of a solid rocket booster (white) and external tank (orange). The TCDT provides the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  15. Seismic detectability of meteorite impacts on Europa

    NASA Astrophysics Data System (ADS)

    Tsuji, Daisuke; Teanby, Nicholas

    2016-04-01

    Europa, the second of Jupiter's Galilean satellites, has an icy outer shell, beneath which there is probably liquid water in contact with a rocky core. Europa may thus provide an example of a sub-surface habitable environment and so is an attractive target for future lander missions. In fact, the Jupiter Icy Moon Explorer (JUICE) mission has been selected for the L1 launch slot of ESA's Cosmic Vision science programme with the aim of launching in 2022 to explore Jupiter and its potentially habitable icy moons. One of the best ways to probe icy moon interiors in any future mission will be with a seismic investigation. Previously, the Apollo seismic experiment, installed by astronauts, enhanced our knowledge of the lunar interior. More recently, NASA's 2016 InSight Mars lander aims to obtain seismic data by deploying a seismometer directly onto Mars' surface. Motivated by these efforts, in this study we estimate how many meteorite impacts could be detected using a single seismic station on Europa, which will be useful for planning the next generation of outer solar system missions. To this end, we derive: (1) the current small impact flux on Europa from Jupiter impact rate models; (2) a crater diameter versus impactor energy scaling relation for ice by merging previous experiments and simulations; (3) scaling relations for seismic signals as a function of distance from an impact site for a given crater size, based on analogue explosive data obtained on Earth's icy surfaces. Finally, the resultant amplitudes are compared to the noise level of a likely seismic instrument (based on the NASA InSight mission seismometers) and the number of detectable impacts is estimated. As a result, 0.5-3.0 local/regional small impacts (i.e., direct P-waves through the ice crust) are expected to be detected per year, while global-scale impact events (i.e., PKP-waves refracted through the mantle) are rare and unlikely to be detected by a short-duration mission. 
We note that our results are only appropriate for order-of-magnitude calculations because of considerable uncertainties in the small impactor source population, internal structure, and ambient noise level. However, our results suggest that probing the deep interior using impacts will be challenging and will require an extended mission duration and low noise levels to give a reasonable chance of detection. Therefore, for future seismic exploration, faulting due to stresses in the rigid outer ice shell is likely to be a much more viable mechanism for probing the interior.
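    The order-of-magnitude estimate described above can be sketched in a few lines: given an impact flux and a power-law decay of seismic amplitude with distance, the expected detection count is the flux times the area within which signals stay above the instrument noise floor. All numerical values and the decay exponent below are illustrative assumptions, not numbers from the study.

```python
import math

def detectable_range_km(amp_1km, noise_floor, decay_exp=1.5):
    """Distance (km) at which a signal of amplitude amp_1km (referenced to
    1 km) decays to the noise floor, assuming a power-law geometric and
    attenuation decay A(r) = amp_1km * r**(-decay_exp)."""
    if amp_1km <= noise_floor:
        return 0.0
    return (amp_1km / noise_floor) ** (1.0 / decay_exp)

def expected_detections_per_year(flux_per_km2_yr, amp_1km, noise_floor,
                                 decay_exp=1.5):
    """Expected impacts detected per year by one station: impact flux times
    the area within the detectable range (flat-surface approximation)."""
    r = detectable_range_km(amp_1km, noise_floor, decay_exp)
    return flux_per_km2_yr * math.pi * r ** 2
```

A real estimate would integrate over the impactor size distribution and use the crater-energy and amplitude-distance scaling relations derived in the paper; this sketch only shows how the pieces combine.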

  16. Mars MetNet Mission - Martian Atmospheric Observational Post Network

    NASA Astrophysics Data System (ADS)

    Harri, Ari-Matti; Aleksashkin, Sergey; Arruego, Ignacio; Schmidt, Walter; Ponomarenko, Andrey; Apestigue, Victor; Genzer, Maria; Vazquez, Luis; Uspensky, Mikhail; Haukka, Harri

    2016-04-01

    A new kind of planetary exploration mission for Mars is under development in collaboration between the Finnish Meteorological Institute (FMI), Lavochkin Association (LA), Space Research Institute (IKI) and Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor [1] mission is divided into three categories: atmospheric instruments, optical devices, and composition and structure devices. Each of the payload instruments will provide significant insights into Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified and the electrical qualification model (EQM) of the payload bay has been built and successfully tested. MetNet Lander: The MetNet landing vehicles use an inflatable entry and descent system instead of the rigid heat shields and parachutes used by earlier semi-hard landing devices. This optimizes the ratio of payload mass to overall mass. The landing impact will burrow the payload container into the Martian soil, providing a more favorable thermal environment for the electronics and a suitable orientation for the telescopic boom with its external sensors and the radio link antenna. It is planned to deploy several tens of MNLs on the Martian surface, operating at least partly at the same time, to allow meteorological network science. 
Strawman Scientific Payload: The strawman payload of the two MNL precursor models includes the following instruments. Atmospheric instruments: • MetBaro pressure device • MetHumi humidity device • MetTemp temperature sensors. Optical devices: • PanCam panoramic camera • MetSIS solar irradiance sensor with the OWLS optical wireless system for data transfer • DS dust sensor. Composition and structure devices: • MOURA tri-axial magnetometer • tri-axial system accelerometer. The dynamic properties of the descent are monitored by a dedicated 3-axis accelerometer combined with a 3-axis gyrometer. The data will be sent via an auxiliary beacon antenna throughout the descent phase, starting shortly after separation from the spacecraft. The MetNet mission payload instruments are specially designed to operate under very low power conditions. The MNL's flexible solar panels provide a total of approximately 0.7-0.8 W of electric power during daylight. As this power output is insufficient to operate all instruments simultaneously, they are activated sequentially according to a specially designed cyclogram table, which adapts itself to the different environmental constraints. Mission Status: A full qualification model (QM) of the MetNet landing unit with the Precursor Mission payload is currently under functional tests. In the near future the QM unit will be exposed to environmental tests at qualification levels, including vibration, thermal balance, thermal cycling and mechanical impact shock. One complete flight unit of the entry, descent and landing system (EDLS) has been manufactured and tested at acceptance levels. Another flight-like EDLS has been exposed to most of the qualification tests, and hence may be used for flight after refurbishment. Accordingly, two flight-capable EDLS systems exist. The eventual goal is to create a network of atmospheric observational posts around the Martian surface. 
Even though the MetNet mission is focused on atmospheric science, the mission payload will also include additional kinds of geophysical instrumentation. The next step in the MetNet Precursor Mission is the demonstration of the technical robustness and scientific capabilities of the MetNet type of landing vehicle. Definition of the Precursor Mission and discussions on launch opportunities are currently under way. Baseline program development funding exists for the next five years. Flight unit manufacture of the payload bay takes about 18 months and will be commenced after the Precursor Mission has been defined. References: [1] http://metnet.fmi.fi
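    The sequential activation described above — instruments packed into activation cycles so the total draw never exceeds the roughly 0.7-0.8 W panel output — can be illustrated with a minimal first-fit scheduler. Only the instrument names come from the abstract; the power draws and budget split are hypothetical, and the real cyclogram table additionally adapts to environmental constraints.

```python
POWER_BUDGET_W = 0.75  # approximate MNL solar panel output in daylight

INSTRUMENTS = [          # (name, power draw in W) - illustrative values only
    ("MetBaro", 0.30),
    ("MetHumi", 0.25),
    ("MetTemp", 0.15),
    ("MetSIS", 0.40),
    ("DS", 0.20),
]

def schedule_cycles(instruments, budget_w):
    """Greedily pack instruments into sequential activation cycles so that
    no cycle exceeds the power budget (first-fit in list order)."""
    cycles, current, load = [], [], 0.0
    for name, power in instruments:
        if load + power > budget_w and current:
            cycles.append(current)   # close the current cycle
            current, load = [], 0.0
        current.append(name)
        load += power
    if current:
        cycles.append(current)
    return cycles
```

With the illustrative numbers above, the five instruments fit into two cycles, each within the budget.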

  17. Mars MetNet Mission Status

    NASA Astrophysics Data System (ADS)

    Harri, Ari-Matti; Aleksashkin, Sergei; Arruego, Ignacio; Schmidt, Walter; Genzer, Maria; Vazquez, Luis; Haukka, Harri

    2015-04-01

    A new kind of planetary exploration mission for Mars is under development in collaboration between the Finnish Meteorological Institute (FMI), Lavochkin Association (LA), Space Research Institute (IKI) and Instituto Nacional de Técnica Aeroespacial (INTA). The Mars MetNet mission is based on a new semi-hard landing vehicle called the MetNet Lander (MNL). The scientific payload of the Mars MetNet Precursor [1] mission is divided into three categories: atmospheric instruments, optical devices and composition and structure devices. Each of the payload instruments will provide significant insights into Martian atmospheric behavior. The key technologies of the MetNet Lander have been qualified and the electrical qualification model (EQM) of the payload bay has been built and successfully tested. 1. MetNet Lander: The MetNet landing vehicles use an inflatable entry and descent system instead of the rigid heat shields and parachutes used by earlier semi-hard landing devices. This optimizes the ratio of payload mass to overall mass. The landing impact will burrow the payload container into the Martian soil, providing a more favorable thermal environment for the electronics and a suitable orientation for the telescopic boom with its external sensors and the radio link antenna. It is planned to deploy several tens of MNLs on the Martian surface, operating at least partly at the same time, to allow meteorological network science. 2. Scientific Payload: The payload of the two MNL precursor models includes the following instruments. Atmospheric instruments: 1. MetBaro pressure device 2. MetHumi humidity device 3. MetTemp temperature sensors. Optical devices: 1. PanCam panoramic camera 2. MetSIS solar irradiance sensor with the OWLS optical wireless system for data transfer 3. DS dust sensor. The dynamic properties of the descent are monitored by a dedicated 3-axis accelerometer combined with a 3-axis gyrometer. 
The data will be sent via an auxiliary beacon antenna throughout the descent phase, starting shortly after separation from the spacecraft. The MetNet mission payload instruments are specially designed to operate under very low power conditions. The MNL's flexible solar panels provide a total of approximately 0.7-0.8 W of electric power during daylight. As this power output is insufficient to operate all instruments simultaneously, they are activated sequentially according to a specially designed cyclogram table, which adapts itself to the different environmental constraints. 3. Mission Status: A full qualification model (QM) of the MetNet landing unit with the Precursor Mission payload is currently under functional tests. In the near future the QM unit will be exposed to environmental tests at qualification levels, including vibration, thermal balance, thermal cycling and mechanical impact shock. One complete flight unit of the entry, descent and landing system (EDLS) has been manufactured and tested at acceptance levels. Another flight-like EDLS has been exposed to most of the qualification tests, and hence may be used for flight after refurbishment. Accordingly, two flight-capable EDLS systems exist. The eventual goal is to create a network of atmospheric observational posts around the Martian surface. Even though the MetNet mission is focused on atmospheric science, the mission payload will also include additional kinds of geophysical instrumentation. The next step in the MetNet Precursor Mission is to demonstrate the technical robustness and scientific capabilities of the MetNet type of landing vehicle. Definition of the Precursor Mission and discussions on launch opportunities are currently under way. Baseline program development funding exists for the next five years. Flight unit manufacture of the payload bay takes about 18 months and will be commenced after the Precursor Mission has been defined. References: [1] http://metnet.fmi.fi

  18. Blood and small intestine cell kinetics under radiation exposures: Mathematical modeling

    NASA Astrophysics Data System (ADS)

    Smirnova, Olga

    Biophysical models describing the dynamics of vital body systems (namely, hematopoiesis and the small intestinal epithelium) in mammals exposed to acute and chronic radiation are developed. These models, based on conventional biological theories, are realized as systems of nonlinear differential equations. Their variables and constant parameters have real biological meaning, which enables successful identification and verification of the models. An explanation of a number of radiobiological effects, including those of low-level long-term exposures, is proposed on the basis of the modeling results. The predictions of the models agree with the respective experimental data at both the qualitative and quantitative levels. All this testifies to the efficiency of the developed models in the investigation and prediction of radiation effects on the hematopoietic and small intestinal epithelium systems, which can be used for radiation risk assessment in long-term space missions such as a lunar colony or a Mars voyage.
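    A minimal sketch of the class of models described above: a single proliferating-cell compartment with logistic renewal and a kill term proportional to dose rate, integrated with Euler's method. The equation form and all parameter values are illustrative assumptions, not taken from the actual hematopoiesis or intestinal-epithelium models.

```python
def simulate(dose_rate, days, dt=0.01, r=0.5, K=1.0, k_kill=0.3):
    """Euler integration of dN/dt = r*N*(1 - N/K) - k_kill*dose_rate*N,
    a toy cell-kinetics model: logistic renewal minus radiation kill."""
    n = K  # start at the normal (unirradiated equilibrium) cell level
    for _ in range(int(days / dt)):
        n += dt * (r * n * (1.0 - n / K) - k_kill * dose_rate * n)
    return n
```

Under chronic exposure this toy model settles to the depressed steady state N* = K(1 - k_kill*dose_rate/r), illustrating how such equations can reproduce the reduced cell counts seen in low-level long-term exposures.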

  19. Implementing Badhwar-O'Neill Galactic Cosmic Ray Model for the Analysis of Space Radiation Exposure

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; O'Neill, Patrick M.; Slaba, Tony C.

    2014-01-01

    For the analysis of radiation risks to astronauts and the planning of exploratory space missions, an accurate energy spectrum of galactic cosmic radiation (GCR) is necessary. Characterization of the ionizing radiation environment is challenging because the interplanetary plasma and radiation fields are modulated by solar disturbances, and the radiation doses received by astronauts in interplanetary space are likewise influenced. The Badhwar-O'Neill 2011 (BO11) model of the GCR environment, in which solar modulation is represented by the GCR deceleration potential theta, has been derived by utilizing GCR measurements from balloons, satellites, and the newer NASA Advanced Composition Explorer (ACE). In the BO11 model, the solar modulation level is derived from the mean international sunspot numbers with a time delay, calibrated against actual flight instrument measurements to produce a better fit to GCR flux data during solar minima. GCR fluxes provided by the BO11 model were compared with various spacecraft measurements at 1 AU, and further comparisons were made with tissue-equivalent proportional counter measurements in low Earth orbit using the high-charge and energy transport (HZETRN) code and various GCR models. For comparison of absorbed dose and dose equivalent calculations with measurements by the Radiation Assessment Detector (RAD) at Gale crater on Mars, the intensities and energies of GCR entering the heliosphere were calculated using the BO11 model, which accounts for time-dependent attenuation of the local interstellar spectrum of each element. The BO11 model, which emphasized the last 24 solar minima, showed relatively good agreement with the RAD data for the first 200 sols, but performed less well near the solar maximum of solar cycle 24 due to subtleties in the changing heliospheric conditions. 
By performing an error analysis of the BO11 model and an optimization to reduce overall uncertainty, the resultant BO13 model improves the fit at solar maxima while remaining accurate at solar minima. The BO13 model has been implemented in the NASA Space Cancer Risk model for the assessment of radiation risks. The overall cumulative probability distribution of solar modulation parameters represents the percentile rank of the average interplanetary GCR environment, and probabilistic radiation risks can be assessed for various levels of the GCR environment to support mission design and operational planning for future manned space exploration missions.
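    The BO models are considerably more sophisticated, but the basic idea of attenuating a local interstellar spectrum (LIS) by a solar-modulation potential can be illustrated with the standard force-field approximation. The power-law LIS form and the parameter values below are illustrative assumptions, not the BO11/BO13 spectra.

```python
E0 = 0.938  # proton rest energy, GeV

def lis_flux(E_kin):
    """Toy power-law proton LIS in arbitrary units (an assumption)."""
    return 10.0 * (E_kin + E0) ** -2.7

def modulated_flux(E_kin, phi):
    """Force-field approximation: flux at 1 AU for kinetic energy E_kin [GeV]
    and modulation potential phi [GV] (Z = A = 1 for protons). The particle
    arrives with energy E_kin after losing phi inside the heliosphere."""
    E_lis = E_kin + phi
    factor = (E_kin * (E_kin + 2 * E0)) / (E_lis * (E_lis + 2 * E0))
    return lis_flux(E_lis) * factor
```

Larger phi (stronger solar activity) suppresses the low-energy flux more strongly, which is the qualitative behavior the deceleration potential captures in the BO models.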

  20. Mitigating Adverse Effects of a Human Mission on Possible Martian Indigenous Ecosystems

    NASA Technical Reports Server (NTRS)

    Lupisella, M. L.

    2000-01-01

    Although human beings are, by most standards, the most capable agents to search for and detect extraterrestrial life, we are also potentially the most harmful. While there has been substantial work regarding forward contamination with respect to robotic missions, the issue of potential adverse effects on possible indigenous Martian ecosystems, such as biological contamination, due to a human mission has remained relatively unexplored; it may require our attention now, as this presentation will try to demonstrate by exploring some of the relevant scientific questions, mission planning challenges, and policy issues. An informal, high-level mission planning decision tree will be discussed and is included as the next page of this abstract. Some of the questions to be considered are: (1) To what extent could contamination due to a human presence compromise possible indigenous life forms? (2) To what extent can we control contamination? For example, will it be local or global? (3) What are the criteria for assessing the biological status of Mars, both regionally and globally? For example, can we adequately extrapolate from a few strategic missions such as sample return missions? (4) What should our policies be regarding mission planning and possible interaction with what are likely to be microbial forms of extraterrestrial life? (5) What are the role and applicability of terrestrial analogs, such as Lake Vostok for assessing drilling issues, and of modeling techniques, which are central to the science and mission planning issues? Central to many of the policy aspects are scientific value, international law, public concern, and ethics. Exploring this overall issue responsibly requires an examination of all these aspects and how they interrelate. A chart is included, titled 'Mission Planning Decision Tree for Mitigating Adverse Effects to Possible Indigenous Martian Ecosystems due to a Human Mission'. 
It outlines what questions scientists should ask and answer before sending humans to Mars.

  1. Mitigating Adverse Effects of a Human Mission on Possible Martian Indigenous Ecosystems

    NASA Astrophysics Data System (ADS)

    Lupisella, M. L.

    2000-07-01

    Although human beings are, by most standards, the most capable agents to search for and detect extraterrestrial life, we are also potentially the most harmful. While there has been substantial work regarding forward contamination with respect to robotic missions, the issue of potential adverse effects on possible indigenous Martian ecosystems, such as biological contamination, due to a human mission has remained relatively unexplored; it may require our attention now, as this presentation will try to demonstrate by exploring some of the relevant scientific questions, mission planning challenges, and policy issues. An informal, high-level mission planning decision tree will be discussed and is included as the next page of this abstract. Some of the questions to be considered are: (1) To what extent could contamination due to a human presence compromise possible indigenous life forms? (2) To what extent can we control contamination? For example, will it be local or global? (3) What are the criteria for assessing the biological status of Mars, both regionally and globally? For example, can we adequately extrapolate from a few strategic missions such as sample return missions? (4) What should our policies be regarding mission planning and possible interaction with what are likely to be microbial forms of extraterrestrial life? (5) What are the role and applicability of terrestrial analogs, such as Lake Vostok for assessing drilling issues, and of modeling techniques, which are central to the science and mission planning issues? Central to many of the policy aspects are scientific value, international law, public concern, and ethics. Exploring this overall issue responsibly requires an examination of all these aspects and how they interrelate. A chart is included, titled 'Mission Planning Decision Tree for Mitigating Adverse Effects to Possible Indigenous Martian Ecosystems due to a Human Mission'. 
It outlines what questions scientists should ask and answer before sending humans to Mars.

  2. Model-Based Systems Engineering With the Architecture Analysis and Design Language (AADL) Applied to NASA Mission Operations

    NASA Technical Reports Server (NTRS)

    Munoz Fernandez, Michela Miche

    2014-01-01

    The potential of Model-Based Systems Engineering (MBSE) using the Architecture Analysis and Design Language (AADL) applied to space systems will be described. AADL modeling is applicable to real-time embedded systems, the types of systems NASA builds. A case study with the Juno mission to Jupiter showcases how this work would enable future missions to benefit from using these models throughout their life cycle, from design to flight operations.

  3. Big Software for SmallSats: Adapting cFS to CubeSat Missions

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan P.; Crum, Gary Alex; Sheikh, Salman; Marshall, James

    2015-01-01

    Expanding capabilities and mission objectives for SmallSats and CubeSats are driving the need for reliable, reusable, and robust flight software. While missions are becoming more complicated and the scientific goals more ambitious, the level of acceptable risk has decreased. Design challenges are further compounded by budget and schedule constraints that have not kept pace. NASA's Core Flight Software System (cFS) is an open-source solution that enables teams to build flagship-satellite-level flight software within a CubeSat schedule and budget. NASA originally developed cFS to reduce mission and schedule risk for flagship satellite missions by increasing code reuse and reliability. The Lunar Reconnaissance Orbiter, which launched in 2009, was the first of a growing list of Class B rated missions to use cFS.

  4. Dynamic analysis of space structures including elastic, multibody, and control behavior

    NASA Technical Reports Server (NTRS)

    Pinson, Larry; Soosaar, Keto

    1989-01-01

    The problem is to develop analysis methods, modeling strategies, and simulation tools to predict with assurance the on-orbit performance and integrity of large, complex space structures that cannot be verified on the ground. The problem must incorporate large reliable structural models, multi-body flexible dynamics, multi-tier controller interaction, environmental models including 1g and atmosphere, various on-board disturbances, and linkage to mission-level performance codes. All areas are in serious need of work, but the weakest link is multi-body flexible dynamics.

  5. Benchmark Problems for Space Mission Formation Flying

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Leitner, Jesse A.; Folta, David C.; Burns, Richard

    2003-01-01

    To provide a high-level focus to distributed space system flight dynamics and control research, several benchmark problems are suggested for space mission formation flying. The problems cover formation flying in low-altitude, near-circular Earth orbit; high-altitude, highly elliptical Earth orbits; and large-amplitude Lissajous trajectories about collinear libration points of the Sun-Earth/Moon system. These problems are not specific to any current or proposed mission, but instead are intended to capture high-level features that would be generic to many similar missions of interest to various agencies.
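    For the first benchmark class (formation flying in low-altitude, near-circular Earth orbit), relative motion is commonly modeled with the Clohessy-Wiltshire (Hill) equations, whose closed-form solution is easy to sketch. This is a generic textbook model, not a formulation prescribed by the benchmark problems themselves.

```python
import math

def cw_state(x0, y0, z0, vx0, vy0, vz0, n, t):
    """Closed-form Clohessy-Wiltshire (Hill) solution for relative motion
    about a circular reference orbit with mean motion n [rad/s].
    Axes: x radial, y along-track, z cross-track. Returns the state at t."""
    s, c = math.sin(n * t), math.cos(n * t)
    x = (4 - 3 * c) * x0 + (s / n) * vx0 + (2 / n) * (1 - c) * vy0
    y = (6 * (s - n * t) * x0 + y0 - (2 / n) * (1 - c) * vx0
         + ((4 * s - 3 * n * t) / n) * vy0)
    z = c * z0 + (s / n) * vz0
    vx = 3 * n * s * x0 + c * vx0 + 2 * s * vy0
    vy = -6 * n * (1 - c) * x0 - 2 * s * vx0 + (4 * c - 3) * vy0
    vz = -n * s * z0 + c * vz0
    return x, y, z, vx, vy, vz
```

A classic sanity check is the drift-free condition vy0 = -2*n*x0, which yields a bounded relative orbit that repeats every orbital period.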

  6. Noise levels and their effects on Shuttle crewmembers' performance: Operational concerns

    NASA Technical Reports Server (NTRS)

    Koros, Anton S.; Adam, Susan C.; Wheelwright, Charles D.

    1993-01-01

    When excessive, noise can result in sleep interference, fatigue, interference with verbal communication, and hearing damage. Shuttle crewmembers are exposed to noise throughout their mission. The contribution of noise to decrements in crew performance over these extended exposure durations was the focus of this study. On the STS-40/SLS-1 mission, noise levels were evaluated through the use of a sound level meter and a crew questionnaire. Crewmembers noted that sleep, concentration, and relaxation were negatively impacted by high noise levels. Speech Interference Levels (SILs), calculated from the sound level measurements, suggested that crewmembers were required to raise their voices in order to be heard. No difficulty detecting caution and warning alarms was noted. The higher-than-desirable noise levels in Spacelab were attributed to flight-specific payloads for which acoustic waivers were granted. It is recommended that current noise levels be reduced in Spacelab and the Orbiter middeck, especially as longer missions are planned for the buildup of Space Station Freedom. Levels of NC 50 are recommended in areas where speech communication is required and NC 40 in sleep areas. These levels are in accordance with the NASA Man-Systems Integration Standards. Measurements proposed for subsequent orbiter missions are discussed.

  7. Optimizing Decadal and Precursor Science on Small Solar System Bodies with Spacecraft/Rover Hybrids

    NASA Astrophysics Data System (ADS)

    Pavone, M.; Castillo, J. C.; Hoffman, J. A.; Nesnas, I. A.; Strange, N. J.

    2012-12-01

    In this paper we present a mission architecture for the systematic and affordable in-situ exploration of small Solar System bodies (such as asteroids, comets, and Martian moons). The proposed mission architecture stems from a paradigm-shifting approach whereby small bodies' low gravity is directly exploited in the design process, rather than being faced as a constraint. At a general level, a mother spacecraft (of the type of JPL's NEOSurveyor) would deploy on the surface of a small body one or several spacecraft/rover hybrids, which are small (<5 kg, ~10 W), multi-faceted robots enclosing three mutually orthogonal flywheels and surrounded by external spikes (in particular, there is no external propulsion). By accelerating/decelerating the flywheels and by exploiting the low-gravity environment, the hybrids would be capable of performing both long excursions (by hopping) and short traverses to specific locations (through a sequence of controlled "tumbles"). Their control would rely on synergistic operations with the mother spacecraft (where most of the hybrids' perception and localization functionality would be hosted), which would make the platforms minimalistic and in turn the entire mission architecture affordable. A fundamental aspect of this mission architecture is that the responsibility for primary science would be shared between the mothership and the hybrids: the mothership would provide broad-area coverage, while the hybrids would zoom in on specific areas and conduct in-situ measurements. 
Specifically, in the first part of the paper we discuss the scientific rationale behind the proposed mission architecture (including traceability matrices for both the mothership and the hybrids for a number of potential targets), we present preliminary models and laboratory experiments for the hybrids, we present first-order estimates for critical subsystems (e.g., communication, power, thermal) and a preliminary study for synergistic mission operations, and we discuss high-level mission trades (including deployment strategies). In the second part, we tailor our mission architecture to the exploration of Mars' moon Phobos. The mission aims at exploring Phobos' Stickney crater, whose spectral similarities with C-type asteroids and variety of terrain properties make it a particularly interesting exploration target to address both high-priority science for the Martian system and strategic knowledge gaps for the future Human exploration of Mars.
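    A back-of-envelope illustration of why hopping is attractive in low gravity: flat-terrain ballistic range scales as 1/g, so a modest take-off speed covers tens of meters on a body like Phobos but only centimeters on Earth. The gravity value used for Phobos is approximate and purely illustrative.

```python
import math

def hop_range(v, g, theta_deg=45.0):
    """Flat-terrain ballistic range for take-off speed v [m/s] at launch
    angle theta_deg under surface gravity g [m/s^2]: R = v^2 sin(2θ)/g."""
    theta = math.radians(theta_deg)
    return v * v * math.sin(2 * theta) / g

# A 0.5 m/s hop on Phobos (g ~ 0.0057 m/s^2) covers roughly 44 m;
# the same hop on Earth (g = 9.81 m/s^2) covers about 2.5 cm.
```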

  8. MCC level C formulation requirements. Shuttle TAEM targeting

    NASA Technical Reports Server (NTRS)

    Carman, G. L.; Montez, M. N.

    1980-01-01

    The level C requirements for the shuttle orbiter terminal area energy management (TAEM) guidance and flight control functions to be incorporated into the Mission Control Center entry profile planning processor are described. This processor is used for pre-entry evaluation of the entry-through-landing maneuvers, and includes a simplified three-degree-of-freedom model of the body rotational dynamics, which is necessary to account for the effects of attitude response on the trajectory dynamics. The simulation terminates at the TAEM-autoland interface.

  9. Methodology for conceptual remote sensing spacecraft technology: insertion analysis balancing performance, cost, and risk

    NASA Astrophysics Data System (ADS)

    Bearden, David A.; Duclos, Donald P.; Barrera, Mark J.; Mosher, Todd J.; Lao, Norman Y.

    1997-12-01

    Emerging technologies and micro-instrumentation are changing the way remote sensing spacecraft missions are developed and implemented. Government agencies responsible for procuring space systems are increasingly requesting analyses to estimate cost, performance and design impacts of advanced technology insertion for both state-of-the-art systems as well as systems to be built 5 to 10 years in the future. Numerous spacecraft technology development programs are being sponsored by Department of Defense (DoD) and National Aeronautics and Space Administration (NASA) agencies with the goal of enhancing spacecraft performance, reducing mass, and reducing cost. However, it is often the case that technology studies, in the interest of maximizing subsystem-level performance and/or mass reduction, do not anticipate synergistic system-level effects. Furthermore, even though technical risks are often identified as one of the largest cost drivers for space systems, many cost/design processes and models ignore effects of cost risk in the interest of quick estimates. To address these issues, the Aerospace Corporation developed a concept analysis methodology and associated software tools. These tools, collectively referred to as the concept analysis and design evaluation toolkit (CADET), facilitate system architecture studies and space system conceptual designs focusing on design heritage, technology selection, and associated effects on cost, risk and performance at the system and subsystem level. CADET allows: (1) quick response to technical design and cost questions; (2) assessment of the cost and performance impacts of existing and new designs/technologies; and (3) estimation of cost uncertainties and risks. These capabilities aid mission designers in determining the configuration of remote sensing missions that meet essential requirements in a cost- effective manner. 
This paper discusses the development of CADET modules and their application to several remote sensing satellite mission concepts.
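    The kind of cost-uncertainty estimate a toolkit like CADET produces can be sketched as a Monte Carlo aggregation of subsystem costs drawn from triangular (low / most-likely / high) distributions. The subsystem names and dollar figures below are hypothetical, and this sketch is not the actual CADET methodology.

```python
import random

SUBSYSTEMS = {            # ($M low, most likely, high) - illustrative values
    "payload":  (40.0, 55.0, 90.0),
    "bus":      (25.0, 30.0, 45.0),
    "software": (10.0, 14.0, 30.0),
}

def cost_percentile(pct, trials=20000, seed=1):
    """pct-th percentile of total mission cost over Monte Carlo trials,
    with each subsystem cost drawn from its triangular distribution."""
    rng = random.Random(seed)
    totals = sorted(
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in SUBSYSTEMS.values())
        for _ in range(trials)
    )
    return totals[int(pct / 100.0 * (trials - 1))]
```

Reporting a 50th- versus 80th-percentile cost is one simple way risk-adjusted estimates differ from single-point "quick estimates" that ignore cost risk.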

  10. What have we gained from GOCE, and what is still to be expected?

    NASA Astrophysics Data System (ADS)

    Pail, R.; Fecher, T.; Mayer-Gürr, T.; Rieser, D.; Schuh, W. D.; Brockmann, J. M.; Jäggi, A.; Höck, E.

    2012-04-01

    So far three releases of GOCE-only gravity field models applying the time-wise method have been computed in the frame of the ESA project "GOCE High-Level Processing Facility". They have been complemented by satellite-only combination models generated by the GOCO ("Gravity Observation Combination") consortium. Due to the fact that the processing strategy has remained practically unchanged for all releases, the continuous improvement by including more and more GOCE data can be analyzed. One of the basic features of the time-wise gravity field models (GOCE_TIM) is the fact, that no gravity field prior information is used, neither as reference model nor for constraining the solution. Therefore, the gain of knowledge on the Earth's gravity field derived purely from the GOCE mission can be evaluated. The idea of the complementary GOCO models is to improve the long to medium wavelengths of the gravity field solutions, which are rather weakly defined by GOCE orbit information, by inclusion of additional data from satellite sources such as GRACE, CHAMP and SLR, taking benefit from the individual strengths and favourable features of the individual data types. In this contribution, we will review which impact GOCE has achieved so far on global and regional gravity field modelling. Besides the gravity field modelling itself, the contributions of GOCE to several application fields, such as the computation of geodetic mean dynamic topography (MDT), and also for geophysical modelling of the lithosphere, will be highlighted. Special emphasis shall be given to the discussion to what extent the full variance-covariance information, representing very realistic error estimates of the gravity field accuracy, can be utilized. Finally, also a GOCE performance prediction shall be given. 
After the end of the extended mission phase in December 2012, several mission scenarios are currently under discussion, such as extending the mission further at the same altitude for as long as possible, or lowering the satellite by 10-20 km for a shorter period. Based on numerical simulation studies, the pros and cons of these scenarios with respect to the achievable gravity field accuracy are evaluated and quantified.

  11. Early Calibration Results of CYGNSS Mission

    NASA Astrophysics Data System (ADS)

    Balasubramaniam, R.; Ruf, C. S.; McKague, D. S.; Clarizia, M. P.; Gleason, S.

    2017-12-01

    CYGNSS, the first complete GNSS-R orbital mission of its kind, was successfully launched on Dec 15, 2016. The goal of the mission is to improve forecasts of tropical cyclone intensification by modelling the storm's inner core. The 8 micro-observatories of CYGNSS each carry a passive instrument called the Delay Doppler Mapping Instrument (DDMI). The DDMIs form a 2D representation of the forward-scattered power signal called the Delay-Doppler Map (DDM). Each DDMI outputs 4 DDMs per second, which are compressed and sent to the ground, resulting in a total of 32 sea-surface measurements produced by the CYGNSS constellation per second. These are subsequently used in the Level-2 wind retrieval algorithm to extract wind speed information. In this paper, we perform calibration and validation of CYGNSS measurements for accurate extraction of wind speed information. The calibration stage involves identifying and correcting for the dependence of the CYGNSS observables, namely the Normalised Bistatic Radar Cross Section and the Leading Edge Slope of the Integrated Delay Waveform, on instrument parameters, geometry, etc. The validation stage involves training the Geophysical Model Function against a multitude of ground-truth sources during the Atlantic hurricane season, as well as refined validation of the high wind speed data products.
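    The Level-2 retrieval step described above amounts to inverting a Geophysical Model Function (GMF) that maps an observable such as NBRCS to wind speed. A minimal sketch, using an entirely made-up lookup table (the real CYGNSS GMF is empirically trained as the abstract describes):

```python
import numpy as np

# Hypothetical, illustrative Geophysical Model Function (GMF): a tabulated
# Normalized Bistatic Radar Cross Section (NBRCS, dB) versus 10-m wind
# speed (m/s).  The real CYGNSS GMF is trained against ground truth as
# described above; these numbers are made up.
wind_table = np.array([2.0, 5.0, 10.0, 15.0, 20.0, 30.0])    # m/s
nbrcs_table = np.array([28.0, 22.0, 17.0, 14.0, 12.0, 9.5])  # dB, decreasing

def retrieve_wind(nbrcs_db):
    """Invert the monotonically decreasing GMF by interpolation."""
    # np.interp requires ascending x, so reverse both tables.
    return float(np.interp(nbrcs_db, nbrcs_table[::-1], wind_table[::-1]))

print(retrieve_wind(17.0))
```

    A weaker scattered signal (lower NBRCS) corresponds to a rougher, higher-wind sea surface, which is why the table is monotonically decreasing.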

  12. Reliability-based trajectory optimization using nonintrusive polynomial chaos for Mars entry mission

    NASA Astrophysics Data System (ADS)

    Huang, Yuechen; Li, Haiyang

    2018-06-01

    This paper presents the reliability-based sequential optimization (RBSO) method to settle the trajectory optimization problem with parametric uncertainties in entry dynamics for Mars entry mission. First, the deterministic entry trajectory optimization model is reviewed, and then the reliability-based optimization model is formulated. In addition, the modified sequential optimization method, in which the nonintrusive polynomial chaos expansion (PCE) method and the most probable point (MPP) searching method are employed, is proposed to solve the reliability-based optimization problem efficiently. The nonintrusive PCE method contributes to the transformation between the stochastic optimization (SO) and the deterministic optimization (DO) and to the approximation of trajectory solution efficiently. The MPP method, which is used for assessing the reliability of constraints satisfaction only up to the necessary level, is employed to further improve the computational efficiency. The cycle including SO, reliability assessment and constraints update is repeated in the RBSO until the reliability requirements of constraints satisfaction are satisfied. Finally, the RBSO is compared with the traditional DO and the traditional sequential optimization based on Monte Carlo (MC) simulation in a specific Mars entry mission to demonstrate the effectiveness and the efficiency of the proposed method.
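    The nonintrusive PCE step can be illustrated in miniature: sample the model, regress the outputs onto Hermite polynomials of the standard-normal input, and read statistics off the coefficients. A hedged sketch with a toy one-dimensional response, not the paper's entry-dynamics model:

```python
import math
import numpy as np
from numpy.polynomial.hermite_e import hermevander

# Illustrative nonintrusive PCE sketch (not the paper's implementation):
# approximate y = f(xi), xi ~ N(0,1), by least-squares regression on
# probabilists' Hermite polynomials He_k, then read moments off the
# coefficients using orthogonality: E[He_j He_k] = k! * delta_jk.
rng = np.random.default_rng(0)

def f(xi):                      # stand-in "expensive" model response
    return np.exp(0.3 * xi)

order = 4
xi = rng.standard_normal(200)   # nonintrusive: only model evaluations needed
A = hermevander(xi, order)      # design matrix, columns He_0 .. He_4
c, *_ = np.linalg.lstsq(A, f(xi), rcond=None)

mean = c[0]                     # E[y] = c_0
var = sum(c[k] ** 2 * math.factorial(k) for k in range(1, order + 1))
print(mean, var)                # exact values: exp(0.045) ~ 1.046, var ~ 0.103
```

    The appeal for reliability-based optimization is that, once the coefficients are fitted, moments and failure probabilities can be evaluated from the cheap polynomial surrogate instead of the full dynamics.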

  13. Coordinating teams of autonomous vehicles: an architectural perspective

    NASA Astrophysics Data System (ADS)

    Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo

    2005-05-01

    In defense-related robotics research, a mission level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance are being developed by a U.S. Army funded SBIR project (RDECOM Contract N61339-04-C-0005).

  14. Evaluation of global satellite gravity models using terrestrial gravity observations over the Kingdom of Saudi Arabia

    NASA Astrophysics Data System (ADS)

    Alothman, Abdulaziz; Elsaka, Basem

    The gravity field models from the GRACE and GOCE missions have increased our knowledge of the Earth's global gravity field. The more recent GOCE mission has provided accuracies of about 1-2 cm in the global geoid and about 1 milli-Gal in gravity anomaly. However, recovering the full wavelength range of the gravity field spectrum cannot be achieved from satellite gravimetry alone; terrestrial gravity data are also required. In this contribution, we use a gravity network of 42 first-order absolute gravity stations, observed with a LaCoste-Romberg gravimeter during the period 1967-1969 by the Ministry of Petroleum and Mineral Resources, to validate the GOCE gravity models and to gain more detailed regional gravity information. The network stations are distributed across the country with a spacing of about 200 km. The results show that the geoid heights and gravity anomalies determined from the terrestrial gravity data agree with the GOCE-based models and add information to the satellite-only gravity solutions.
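    The validation described above boils down to differencing terrestrial and model-derived gravity anomalies at common stations and summarizing the agreement. A minimal sketch with invented values (the 42-station data set itself is not reproduced in the abstract):

```python
import math

# Illustrative validation statistics: difference terrestrial gravity
# anomalies and model-derived values at common stations, then summarize.
# The station values below are invented for illustration.
terrestrial = [12.4, -8.1, 3.7, 25.0, -14.2]   # observed anomalies, mGal
model =       [11.9, -7.5, 4.1, 24.2, -15.0]   # satellite-model anomalies, mGal

diffs = [t - m for t, m in zip(terrestrial, model)]
bias = sum(diffs) / len(diffs)                           # systematic offset
rms = math.sqrt(sum(d * d for d in diffs) / len(diffs))  # overall agreement
print(bias, rms)
```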

  15. Venus Global Reference Atmospheric Model

    NASA Technical Reports Server (NTRS)

    Justh, Hilary L.

    2017-01-01

    Venus Global Reference Atmospheric Model (Venus-GRAM) is an engineering-level atmospheric model developed by MSFC that is widely used for diverse mission applications, including systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. It is not a forecast model. Outputs include density, temperature, pressure, wind components, and chemical composition, with dispersions of thermodynamic parameters, winds, and density; optional trajectory and auxiliary profile input files are supported. Venus-GRAM has been used in multiple studies and proposals, including the NASA Engineering and Safety Center (NESC) Autonomous Aerobraking study and various Discovery proposals. Released in 2005, it is available at: https://software.nasa.gov/software/MFS-32314-1.

  16. Literature Review on Modeling Cyber Networks and Evaluating Cyber Risks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kelic, Andjelka; Campbell, Philip L

    The National Infrastructure Simulations and Analysis Center (NISAC) conducted a literature review on modeling cyber networks and evaluating cyber risks. The literature review explores where modeling is used in the cyber regime and ways that consequence and risk are evaluated. The relevant literature clusters in three different spaces: network security, cyber-physical, and mission assurance. In all approaches, some form of modeling is utilized at varying levels of detail, while the ability to understand consequence varies, as do interpretations of risk. This document summarizes the different literature viewpoints and explores their applicability to securing enterprise networks.

  17. Aerocapture Technology Development for Planetary Science - Update

    NASA Technical Reports Server (NTRS)

    Munk, Michelle M.

    2006-01-01

    Within NASA's Science Mission Directorate is a technological program dedicated to improving the cost, mass, and trip time of future scientific missions throughout the Solar System. The In-Space Propulsion Technology (ISPT) Program, established in 2001, is charged with advancing propulsion systems used in space from Technology Readiness Level (TRL) 3 to TRL6, and with planning activities leading to flight readiness. The program's content has changed considerably since inception, as the program has refocused its priorities. One of the technologies that has remained in the ISPT portfolio through these changes is Aerocapture. Aerocapture is the use of a planetary body's atmosphere to slow a vehicle from hyperbolic velocity to a low-energy orbit suitable for science. Prospective use of this technology has repeatedly shown huge mass savings for missions of interest in planetary exploration, at Titan, Neptune, Venus, and Mars. With launch vehicle costs rising, these savings could be the key to mission viability. This paper provides an update on the current state of the Aerocapture technology development effort, summarizes some recent key findings, and highlights hardware developments that are ready for application to Aerocapture vehicles and entry probes alike. Description of Investments: The Aerocapture technology area within the ISPT program has utilized the expertise around NASA to perform Phase A-level studies of future missions, to identify technology gaps that need to be filled to achieve flight readiness. A 2002 study of the Titan Explorer mission concept showed that the combination of Aerocapture and a Solar Electric Propulsion system could deliver a lander and orbiter to Titan in half the time and on a smaller, less expensive launch vehicle, compared to a mission using chemical propulsion for the interplanetary injection and orbit insertion. The study also identified no component technology breakthroughs necessary to implement Aerocapture on such a mission. 
Similar studies of Aerocapture applications at Neptune, Venus, and Mars were conducted from 2003 through 2005. All showed significant performance improvements for the missions studied. Findings from these studies were used to guide the technology development tasks originally solicited in a 2002 NASA ROSS Research Announcement. The tasks are now in their final year and have provided numerous improvements in modeling and hardware, for use in proposals or new mission starts. Major Accomplishments: Since validation of the Aerocapture maneuver requires a space flight, ground developments have focused on modeling and environment prediction, materials, and sensors. Lockheed Martin has designed and built a 2-meter Carbon-Carbon aeroshell "hot structure." The article utilizes co-cured stiffening ribs and advanced insulation to achieve large scale, and up to a 40% reduction in areal density over the Genesis probe construction. This concept would be an efficient solution for probes that experience heat rates near 800-1000 W/cm^2, such as at Venus and Earth. Applied Research Associates has extensively tested a family of efficient ablative TPS materials that provide solutions for a range of heating conditions. These materials are being applied to high-temperature structures built by ATK Space Systems, led by Langley Research Center. One-meter aeroshells will be thermally tested to validate construction and demonstrate higher bondline temperatures, which can lead to mass savings of up to 30% over traditional heatshields. Ames Research Center has developed aeroshell instrumentation that could measure environmental conditions and material performance during atmospheric entry. Instruments to measure TPS recession, heat flux, and catalycity could be combined with traditional sensors to provide a "plug-and-play" system, with minimal mass and power, that would acquire flight data for model improvement and risk reduction on future missions. 
Improved atmospheric and aerothermodynamic models have also been a major focus of the program. Next Steps: Aerocapture is one of five technologies in competition for a flight validation opportunity through the New Millennium Program. If selected, a fully autonomous vehicle will perform an Aerocapture at Earth in 2010, and flight data will be used to validate the guidance system and the TPS material for science mission infusion.

  18. Robotic Access to Planetary Surfaces Capability Roadmap

    NASA Technical Reports Server (NTRS)

    2005-01-01

    A set of robotic access to planetary surfaces capability developments and supporting infrastructure has been identified, covering reference mission pulls derived from ongoing strategic planning, capability pushes to enable broader mission considerations, and facility and flight test capability needs. These developments have been described to the level of detail needed for high-level planning: content and approach, readiness and metrics, rough schedule and cost, and connectivity to mission concepts.

  19. Integrating O/S models during conceptual design, part 3

    NASA Technical Reports Server (NTRS)

    Ebeling, Charles E.

    1994-01-01

    Space vehicles, such as the Space Shuttle, require intensive ground support prior to, during, and after each mission. Maintenance is a significant part of that ground support. All space vehicles require scheduled maintenance to ensure operability and performance. In addition, components of any vehicle are not one-hundred percent reliable, so they exhibit random failures. Once detected, a failure initiates unscheduled maintenance on the vehicle. Maintenance decreases the number of missions that can be completed by keeping vehicles out of service, so that the time between the completion of one mission and the start of the next is increased. Maintenance also requires resources such as people, facilities, tooling, and spare parts. Assessing the mission capability and resource requirements of any new space vehicle, in addition to its performance specification, is necessary to predict the life cycle cost and success of the vehicle. Maintenance and logistics support has been modeled by computer simulation to estimate mission capability and resource requirements for evaluation of proposed space vehicles. The simulation was written in Simulation Language for Alternative Modeling II (SLAM II) for execution on a personal computer. For either one or a fleet of space vehicles, the model simulates the preflight maintenance checks, the mission and return to Earth, and the post-flight maintenance in preparation to be sent back into space. The model enables prediction of the number of missions possible and vehicle turn-time (the time between completion of one mission and the start of the next) given estimated values for component reliability and maintainability. The model also facilitates study of the manpower and vehicle requirements for the proposed vehicle to meet its desired mission rate. This is the 3rd part of a 3-part technical report.
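    The reliability-driven turnaround logic described above can be caricatured in a few lines of Monte Carlo, standing in for the far richer SLAM II model; every parameter value here is invented for illustration:

```python
import random

# Toy Monte Carlo of vehicle turn-time, in the spirit of (but far simpler
# than) the SLAM II maintenance model described above.  Every parameter
# value here is invented for illustration.
random.seed(1)

N_COMPONENTS = 50        # components subject to random failure
P_FAIL = 0.02            # per-component failure probability per mission
SCHEDULED_DAYS = 30      # fixed scheduled pre/post-flight maintenance
REPAIR_DAYS = 5          # unscheduled repair time per failed component

def turn_time():
    failures = sum(random.random() < P_FAIL for _ in range(N_COMPONENTS))
    return SCHEDULED_DAYS + failures * REPAIR_DAYS

times = [turn_time() for _ in range(10_000)]
avg = sum(times) / len(times)
print(avg)               # expected ~ 30 + 50 * 0.02 * 5 = 35 days
```

    Dividing a year of availability by the average turn-time then gives the kind of mission-rate estimate the abstract says the model was used to predict.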

  20. Space Operations Center system analysis study extension. Volume 4, book 1: SOC system analysis report

    NASA Technical Reports Server (NTRS)

    1982-01-01

    The Space Operations Center (SOC) orbital space station missions are analyzed. Telecommunications missions, space science, Earth sensing, and space testing missions, research and applications missions, defense missions, and satellite servicing missions are modeled and mission needs discussed. The satellite servicing missions are analyzed in detail, including construction and servicing equipment requirements, mission needs and benefits, differential drag characteristics of co-orbiting satellites, and satellite servicing transportation requirements.

  1. Trades Between Opposition and Conjunction Class Trajectories for Early Human Missions to Mars

    NASA Technical Reports Server (NTRS)

    Mattfeld, Bryan; Stromgren, Chel; Shyface, Hilary; Komar, David R.; Cirillo, William; Goodliff, Kandyce

    2014-01-01

    Candidate human missions to Mars, including NASA's Design Reference Architecture 5.0, have focused on conjunction-class missions with long crewed durations and minimum energy trajectories to reduce total propellant requirements and total launch mass. However, in order to progressively reduce risk and gain experience in interplanetary mission operations, it may be desirable that initial human missions to Mars, whether to the surface or to Mars orbit, have shorter total crewed durations and minimal stay times at the destination. Opposition-class missions have larger total energy requirements than conjunction-class missions but offer the potential for much shorter mission durations, potentially reducing risk and overall systems performance requirements. This paper presents a detailed comparison of conjunction-class and opposition-class human missions to the Mars vicinity, with a focus on how such missions could be integrated into the initial phases of a Mars exploration campaign. It presents the results of a trade study that integrates trajectory/propellant analysis, element design, logistics and sparing analysis, and risk assessment to produce a comprehensive comparison of opposition and conjunction exploration mission constructs. Included in the trade study is an assessment of the risk to the crew and the trade-offs between mission duration and element, logistics, and spares mass. The analysis of the mission trade space was conducted using four simulation and analysis tools developed by NASA. Trajectory analyses for Mars destination missions were conducted using VISITOR (Versatile ImpulSive Interplanetary Trajectory OptimizeR), an in-house tool developed by NASA Langley Research Center. Architecture elements were evaluated using EXploration Architecture Model for IN-space and Earth-to-orbit (EXAMINE), a parametric modeling tool that generates exploration architectures through an integrated systems model. 
Logistics analysis was conducted using NASA's Human Exploration Logistics Model (HELM), and sparing allocation predictions were generated via the Exploration Maintainability Analysis Tool (EMAT), which is a probabilistic simulation engine that evaluates trades in spacecraft reliability and sparing requirements based on spacecraft system maintainability and reparability.

  2. Advanced space power requirements and techniques. Task 1: Mission projections and requirements. Volume 3: Appendices. [cost estimates and computer programs]

    NASA Technical Reports Server (NTRS)

    Wolfe, M. G.

    1978-01-01

    Contents: (1) general study guidelines and assumptions; (2) launch vehicle performance and cost assumptions; (3) satellite programs 1959 to 1979; (4) initiative mission and design characteristics; (5) satellite listing; (6) spacecraft design model; (7) spacecraft cost model; (8) mission cost model; and (9) nominal and optimistic budget program cost summaries.

  3. Study for analysis of benefit versus cost of low thrust propulsion system

    NASA Technical Reports Server (NTRS)

    Hamlyn, K. M.; Robertson, R. I.; Rose, L. J.

    1983-01-01

    The benefits and costs associated with placing large space systems (LSS) in operational orbits were investigated, and a flexible computer model for analyzing these benefits and costs was developed. A mission model for LSS was identified that included both NASA/Commercial and DOD missions. This model included a total of 68 STS launches for the NASA/Commercial missions and 202 launches for the DOD missions. The mission catalog was of sufficient depth to define the structure type, mass, and acceleration limits of each LSS. Conceptual primary propulsion stage (PPS) designs for orbital transfer were developed for the three low thrust LO2/LH2 engines baselined for the study. The performance characteristics of each of these PPS designs were compared to the LSS mission catalog to create a mission capture. The costs involved in placing the LSS in their operational orbits were identified. The two primary costs were that of the PPS and that of the STS launch. The cost of the LSS itself was not included, as it is not a function of PPS performance. The basic relationships and algorithms that could be used to describe the costs were established. The benefit criteria for the mission model were also defined. These included mission capture, reliability, technical risk, development time, and growth potential. Rating guidelines were established for each parameter. For flexibility, each parameter is assigned a weighting factor.

  4. Near Earth Asteroid Scout Solar Sail Thrust and Torque Model

    NASA Technical Reports Server (NTRS)

    Heaton, Andy; Ahmad, Naeem; Miller, Kyle

    2017-01-01

    The Near Earth Asteroid (NEA) Scout is a solar sail mission whose objective is to scout at least one Near Earth Asteroid to help prepare for human missions to Near Earth Asteroids. NEA Scout will launch as a secondary payload on the first SLS-Orion mission. NEA Scout will perform a small trim maneuver shortly after deploying from the spent SLS upper stage using a cold gas propulsion system, but from that point on will depend entirely on the solar sail for thrust. As such, it is important to accurately characterize the thrust of the sail in order to achieve mission success. Additionally, the solar sail creates a relatively large solar disturbance torque that must be mitigated. For early mission design studies a flat plate model of the solar sail with a fixed center of pressure was adequate, but as mission concepts and the sail design matured, greater fidelity was required. Here we discuss the progress toward a three-dimensional sail model that includes the effects of tension and thermal deformation, derived from a large structural Finite Element Model (FEM) developed by the Langley Research Center. We have found that the deformed sail membrane affects torque much more than thrust; a flat plate model could potentially model thrust well enough to close mission design studies, but a three-dimensional solar sail model is essential to control system design. The three-dimensional solar sail model revealed that thermal deformations of unshielded booms would create unacceptably large solar disturbance torques. The original large FEM model was used in control and mission simulations, but resulted in prohibitive run times. This led us to adapt the Generalized Sail Model (GSM) of Rios-Reyes. A design reference sail model has been baselined for NEA Scout and has been used to design the mission and control system for the sailcraft. 
Additionally, since NEA Scout uses reaction wheels for attitude pointing and control, the solar torque model is essential to the successful design of the NEA Scout momentum management control system. We have also updated the estimate of diffusivity used for the aluminized sail material based on optical testing of wrinkled sail material. The model presented here represents the current state of the art of NASA's ability to model solar sail thrust and torque.
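    For reference, the early flat-plate model mentioned above reduces to a one-line force law. A sketch using an ideal-sail approximation; the area and reflectivity values are illustrative, not NEA Scout design data:

```python
import math

# Flat-plate solar sail thrust sketch: the simple early-design model the
# abstract says was later superseded by a 3-D, FEM-derived model.  The
# area and reflectivity below are illustrative, not NEA Scout design data.
P0 = 4.563e-6       # solar radiation pressure at 1 AU (absorbing plate), N/m^2
AREA = 86.0         # sail area, m^2 (approximate NEA Scout class)
REFLECTIVITY = 0.9  # assumed specular reflectivity

def sail_thrust(theta_deg, r_au=1.0):
    """Ideal-sail thrust (N): F = (P0 / r^2) * A * (1 + rho) * cos^2(theta)."""
    theta = math.radians(theta_deg)
    return (P0 / r_au ** 2) * AREA * (1 + REFLECTIVITY) * math.cos(theta) ** 2

print(sail_thrust(0.0))   # roughly 7.5e-4 N at normal incidence at 1 AU
```

    The sub-millinewton magnitude makes clear why small modeling errors in the torque, rather than the thrust, dominate the attitude control problem the abstract describes.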

  5. Modeling and Simulation for Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Seah, Chin; Trimble, Jay P.; Sims, Michael H.

    2003-01-01

    Work system analysis and design is complex and non-deterministic. In this paper we describe Brahms, a multiagent modeling and simulation environment for designing complex interactions in human-machine systems. Brahms was originally conceived as a business process design tool that simulates work practices, including social systems of work. We describe our modeling and simulation method for mission operations work system design, based on a research case study in which we used Brahms to design mission operations for a proposed discovery mission to the Moon. We then describe the results of an actual application of the method: the Brahms Mars Exploration Rover. Space mission operations are similar to operations of traditional organizations; we show that the application of Brahms for space mission operations design is relevant and transferable to other types of business processes in organizations.

  6. KSC-00pp0050

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 195-foot level on the Fixed Service Structure, Launch Pad 39A, the STS-99 crew receive instructions about emergency egress. From left (in uniform) are Mission Specialists Janice Voss (Ph.D.), Janet Lynn Kavandi (Ph.D.), Gerhard Thiele and Mamoru Mohri, Pilot Dominic Gorie and Commander Kevin Kregel. In the background can be seen the Vehicle Assembly Building at left and the waters of Banana Creek in between. The crew are taking part in Terminal Countdown Demonstration Test activities, which provide them with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  7. Systems Engineering for Space Exploration Medical Capabilities

    NASA Technical Reports Server (NTRS)

    Mindock, Jennifer; Reilly, Jeffrey; Rubin, David; Urbina, Michelle; Hailey, Melinda; Hanson, Andrea; Burba, Tyler; McGuire, Kerry; Cerro, Jeffrey; Middour, Chris; hide

    2017-01-01

    Human exploration missions that reach destinations beyond low Earth orbit, such as Mars, will present significant new challenges to crew health management. For the medical system, lack of consumable resupply, evacuation opportunities, and real-time ground support are key drivers toward greater autonomy. Recognition of the limited mission and vehicle resources available to carry out exploration missions motivates the Exploration Medical Capability (ExMC) Element's approach to enabling the necessary autonomy. The Element's work must integrate with the overall exploration mission and vehicle design efforts to successfully provide exploration medical capabilities. ExMC is applying systems engineering principles and practices to accomplish its goals. This paper discusses the structured and integrative approach that is guiding the medical system technical development. Assumptions for the required levels of care on exploration missions, medical system goals, and a Concept of Operations are early products that capture and clarify stakeholder expectations. Model-Based Systems Engineering techniques are then applied to define medical system behavior and architecture. Interfaces to other flight and ground systems, and within the medical system are identified and defined. Initial requirements and traceability are established, which sets the stage for identification of future technology development needs. An early approach for verification and validation, taking advantage of terrestrial and near-Earth exploration system analogs, is also defined to further guide system planning and development.

  8. KSC00pp0050

    NASA Image and Video Library

    2000-01-13

    KENNEDY SPACE CENTER, Fla. -- At the 195-foot level on the Fixed Service Structure, Launch Pad 39A, the STS-99 crew receive instructions about emergency egress. From left (in uniform) are Mission Specialists Janice Voss (Ph.D.), Janet Lynn Kavandi (Ph.D.), Gerhard Thiele and Mamoru Mohri, Pilot Dominic Gorie and Commander Kevin Kregel. In the background can be seen the Vehicle Assembly Building at left and the waters of Banana Creek in between. The crew are taking part in Terminal Countdown Demonstration Test activities, which provide them with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST

  9. A Risk-Based Approach for Aerothermal/TPS Analysis and Testing

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Grinstead, Jay H.; Bose, Deepak

    2007-01-01

    The current status of aerothermal and thermal protection system modeling for civilian entry missions is reviewed. For most such missions, the accuracy of our simulations is limited not by the tools and processes currently employed, but rather by reducible deficiencies in the underlying physical models. Improving the accuracy of and reducing the uncertainties in these models will enable a greater understanding of the system level impacts of a particular thermal protection system and of the system operation and risk over the operational life of the system. A strategic plan will be laid out by which key modeling deficiencies can be identified via mission-specific gap analysis. Once these gaps have been identified, the driving component uncertainties are determined via sensitivity analyses. A Monte-Carlo based methodology is presented for physics-based probabilistic uncertainty analysis of aerothermodynamics and thermal protection system material response modeling. These data are then used to advocate for and plan focused testing aimed at reducing key uncertainties. The results of these tests are used to validate or modify existing physical models. Concurrently, a testing methodology is outlined for thermal protection materials. The proposed approach is based on using the results of uncertainty/sensitivity analyses discussed above to tailor ground testing so as to best identify and quantify system performance and risk drivers. A key component of this testing is understanding the relationship between the test and flight environments. No existing ground test facility can simultaneously replicate all aspects of the flight environment, and therefore good models for traceability to flight are critical to ensure a low risk, high reliability thermal protection system design. Finally, the role of flight testing in the overall thermal protection system development strategy is discussed.
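    The Monte Carlo methodology sketched above can be illustrated with a simple stagnation-point heating correlation; the Sutton-Graves-type form and the dispersion levels are assumptions for illustration, not the paper's actual models or uncertainties:

```python
import math
import random

# Sketch of the Monte Carlo uncertainty-propagation idea using a simple
# Sutton-Graves-type stagnation heating correlation, q = k*sqrt(rho/Rn)*V^3.
# The constants and dispersion levels are assumptions for illustration.
random.seed(2)

K = 1.7415e-4       # Sutton-Graves constant for Earth air (SI units)
RN = 1.0            # effective nose radius, m
V = 7500.0          # entry velocity, m/s
RHO_NOM = 3.0e-4    # nominal freestream density, kg/m^3

samples = []
for _ in range(20_000):
    rho = RHO_NOM * (1.0 + random.gauss(0.0, 0.10))  # assumed 10% density sigma
    k = K * (1.0 + random.gauss(0.0, 0.05))          # assumed 5% model sigma
    samples.append(k * math.sqrt(max(rho, 1e-9) / RN) * V ** 3)

mean = sum(samples) / len(samples)
std = math.sqrt(sum((q - mean) ** 2 for q in samples) / len(samples))
print(mean / 1e4, std / mean)   # heat flux in W/cm^2 and relative spread
```

    In the full methodology, percentiles of such a dispersed heat-flux population, not just the mean, would drive TPS margin and the sensitivity analysis that identifies which component uncertainty to attack with testing.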

  10. Cost Analysis in a Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Felton, Larry; Newhouse, Marilyn; Bornas, Nick; Botts, Dennis; Ijames, Gayleen; Montgomery, Patty; Roth, Karl

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights; to International Space Station (ISS) payloads; to small, short duration missions; and has included long duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with a related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and size and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology, but will be executed using future technology. 
Finally the cost analysis needed to consider how to validate the resulting cost models taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process, and the associated lessons learned. These lessons can be used when planning for a new multi-mission operations center or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.

  11. Adhesive Bonding for Optical Metrology Systems in Space Applications

    NASA Astrophysics Data System (ADS)

    Gohlke, Martin; Schuldt, Thilo; Döringshoff, Klaus; Peters, Achim; Johann, Ulrich; Weise, Dennis; Braxmaier, Claus

    2015-05-01

    Laser-based metrology systems are becoming more and more attractive for space applications and are the core elements of planned missions such as LISA (NGO, eLISA) or NGGM, where laser interferometry is used for distance measurements between satellites. The GRACE-FO mission, starting in 2017, will for the first time demonstrate a Laser Ranging Instrument (LRI) in space. Laser-based metrology also includes optical clocks/references, either as an ultra-stable light source for high-sensitivity interferometry or as a scientific payload, e.g. as proposed in fundamental physics missions such as mSTAR (mini SpaceTime Asymmetry Research), a mission dedicated to performing a Kennedy-Thorndike experiment on a satellite in a low-Earth orbit. To enable the use of existing optical laboratory setups, optimization with respect to power consumption, weight, and dimensions is necessary. At the same time, the thermal and structural stability must be increased. Over the last few years we investigated adhesive bonding of optical components to thermally highly stable glass ceramics as an easy-to-handle assembly integration technology. Several setups were implemented and tested for potential later use in space applications. We realized a heterodyne LISA-related interferometer with demonstrated noise levels in the pm range for translation measurements and in the nrad range for tilt measurements, and two iodine frequency references at Elegant Breadboard (EBB) and Engineering Model (EM) level with frequency stabilities in the 10^-15 range for longer integration times. The EM setup was thermally cycled and vibration tested.

  12. NASA'S Earth Science Data Stewardship Activities

    NASA Technical Reports Server (NTRS)

    Lowe, Dawn R.; Murphy, Kevin J.; Ramapriyan, Hampapuram

    2015-01-01

    NASA has been collecting Earth observation data for over 50 years using instruments on board satellites, aircraft and ground-based systems. With the inception of the Earth Observing System (EOS) Program in 1990, NASA established the Earth Science Data and Information System (ESDIS) Project and initiated development of the Earth Observing System Data and Information System (EOSDIS). A set of Distributed Active Archive Centers (DAACs) was established at locations based on science discipline expertise. Today, EOSDIS consists of 12 DAACs and 12 Science Investigator-led Processing Systems (SIPS), processing data from the EOS missions, as well as the Suomi National Polar-orbiting Partnership mission, and other satellite and airborne missions. The DAACs archive and distribute the vast majority of data from NASA's Earth science missions, with data holdings exceeding 12 petabytes. The data held by EOSDIS are available to all users consistent with NASA's free and open data policy, which has been in effect since 1990. The EOSDIS archives consist of raw instrument data counts (level 0 data), as well as higher level standard products (e.g., geophysical parameters, products mapped to standard spatio-temporal grids, results of Earth system models using multi-instrument observations, and long time series of Earth System Data Records resulting from multiple satellite observations of a given type of phenomenon). EOSDIS data stewardship responsibilities include ensuring that the data and information content are reliable, of high quality, easily accessible, and usable for as long as they are considered to be of value.

  13. NASA's Automated Rendezvous and Docking/Capture Sensor Development and Its Applicability to the GER

    NASA Technical Reports Server (NTRS)

    Hinkel, Heather; Cryan, Scott; DSouza, Christopher; Strube, Matthew

    2014-01-01

    This paper will address how a common Automated Rendezvous and Docking/Capture (AR&D/C) sensor suite can support Global Exploration Roadmap (GER) missions, and discuss how the model of common capability development to support multiple missions can enable system capability level partnerships and further GER objectives. NASA has initiated efforts to develop AR&D/C sensors that are directly applicable to GER. NASA needs AR&D/C sensors for both the robotic and crewed segments of the Asteroid Redirect Mission (ARM). NASA recently conducted a commonality assessment of the concept of operations for the robotic Asteroid Redirect Vehicle (ARV) and the crewed mission segment using the Orion crew vehicle. The commonality assessment also considered several future exploration and science missions requiring an AR&D/C capability. Missions considered were asteroid sample return, satellite servicing, and planetary entry, descent, and landing. This assessment determined that a common sensor suite consisting of one or more visible wavelength cameras and a three-dimensional LIDAR, along with long-wavelength infrared cameras for robustness and situational awareness, could be used on each mission to eliminate the cost of multiple sensor developments and qualifications. By choosing sensor parameters at build time instead of at design time, and without having to requalify flight hardware, a specific mission can design overlapping bearing, range, relative attitude, and position measurement availability to suit its mission requirements with minimal nonrecurring engineering costs. The resulting common sensor specification provides the union of all performance requirements for each mission and represents an improvement over the current systems used for AR&D/C today. 
NASA's AR&D/C sensor development path could benefit the International Exploration Coordination Group (ISECG) and support the GER mission scenario by providing a common sensor suite upon which GER objectives could be achieved while minimizing development costs. The paper will describe the concepts of operations of these missions and how the common sensors are utilized by each mission. It will also detail the potential partnerships and contribution of the International community in the development of this common AR&D/C sensor suite.

  14. Goddard's Astrophysics Science Division Annual Report 2014

    NASA Technical Reports Server (NTRS)

    Weaver, Kimberly (Editor); Reddy, Francis (Editor); Tyler, Pat (Editor)

    2015-01-01

    The Astrophysics Science Division (ASD, Code 660) is one of the world's largest and most diverse astronomical organizations. Space flight missions are conceived, built and launched to observe the entire range of the electromagnetic spectrum, from gamma rays to centimeter waves. In addition, experiments are flown to gather data on high-energy cosmic rays, and plans are being made to detect gravitational radiation from space-borne missions. To enable these missions, we have vigorous programs of instrument and detector development. Division scientists also carry out preparatory theoretical work and subsequent data analysis and modeling. In addition to space flight missions, we have a vibrant suborbital program with numerous sounding rocket and balloon payloads in development or operation. The ASD is organized into five labs: the Astroparticle Physics Lab, the X-ray Astrophysics Lab, the Gravitational Astrophysics Lab, the Observational Cosmology Lab, and the Exoplanets and Stellar Astrophysics Lab. The High Energy Astrophysics Science Archive Research Center (HEASARC) is an Office at the Division level. Approximately 400 scientists and engineers work in ASD. Of these, 80 are civil servant scientists, while the rest are resident university-based scientists, contractors, postdoctoral fellows, graduate students, and administrative staff. We currently operate the Swift Explorer mission and the Fermi Gamma-ray Space Telescope. In addition, we provide data archiving and operational support for the XMM mission (jointly with ESA) and the Suzaku mission (with JAXA). We are also a partner with Caltech on the NuSTAR mission. The Hubble Space Telescope Project is headquartered at Goddard, and ASD provides Project Scientists to oversee operations at the Space Telescope Science Institute. 
Projects in development include the Neutron Interior Composition Explorer (NICER) mission, an X-ray timing experiment for the International Space Station; the Transiting Exoplanet Survey Satellite (TESS) Explorer mission, in collaboration with MIT (Ricker, PI); the Soft X-ray Spectrometer (SXS) for the Astro-H mission in collaboration with JAXA; and the James Webb Space Telescope (JWST). The Wide-Field Infrared Survey Telescope (WFIRST), the highest ranked mission in the 2010 decadal survey, is in a pre-phase A study, and we are supplying study scientists for that mission.

  15. Topic Modeling of NASA Space System Problem Reports: Research in Practice

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Nikora, Allen P.; Meek, Joshua; Menzies, Tim

    2016-01-01

    Problem reports at NASA are similar to bug reports: they capture defects found during test, post-launch operational anomalies, and document the investigation and corrective action of the issue. These artifacts are a rich source of lessons learned for NASA, but are expensive to analyze since problem reports consist primarily of natural language text. We apply topic modeling to a corpus of NASA problem reports to extract trends in testing and operational failures. We collected 16,669 problem reports from six NASA space flight missions and applied Latent Dirichlet Allocation topic modeling to the document corpus. We analyze the most popular topics within and across missions, and how popular topics changed over the lifetime of a mission. We find that hardware material and flight software issues are common during the integration and testing phase, while ground station software and equipment issues are more common during the operations phase. We identify a number of challenges in topic modeling for trend analysis: 1) the process of selecting topic modeling parameters lacks definitive guidance, 2) defining semantically meaningful topic labels requires nontrivial effort and domain expertise, 3) topic models derived from the combined corpus of the six missions were biased toward the larger missions, and 4) topics must be semantically distinct as well as cohesive to be useful. Nonetheless, topic modeling can identify problem themes within missions and across mission lifetimes, providing useful feedback to engineers and project managers.

  16. The Mission Planning Lab: A Visualization and Analysis Tool

    NASA Technical Reports Server (NTRS)

    Daugherty, Sarah C.; Cervantes, Benjamin W.

    2009-01-01

    Simulation and visualization are powerful decision making tools that are time-saving and cost-effective. Space missions pose testing and evaluation challenges that can be overcome through modeling, simulation, and visualization of mission parameters. The National Aeronautics and Space Administration's (NASA) Wallops Flight Facility (WFF) capitalizes on the benefits of modeling, simulation, and visualization tools through a project initiative called the Mission Planning Lab (MPL).

  17. Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Licatta, Angelo; Griffin, Devon

    2007-01-01

    The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.

  18. Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Griffin, Devon

    2008-01-01

    The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.
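The scenario-based Monte Carlo approach described above can be sketched as follows; all distributions, load magnitudes, and bone-loss rates below are invented placeholders for illustration, not the BFxRM's actual parameter values.

```python
# Toy Monte Carlo sketch of fracture probability: sample an applied
# load and a bone strength from population distributions, decay the
# strength with mission length, and count load > strength events.
import numpy as np

N = 100_000  # Monte Carlo samples

def fracture_probability(mission_days, gravity_fraction, seed=0):
    rng = np.random.default_rng(seed)
    # Hypothetical fall load scales with local gravity
    # (Moon ~0.17 g, Mars ~0.38 g); lognormal spread reflects
    # population variation in body mass and fall dynamics.
    load = gravity_fraction * rng.lognormal(np.log(5000.0), 0.4, size=N)
    # Hypothetical baseline femoral-neck strength distribution (N).
    strength0 = rng.normal(4000.0, 1000.0, size=N)
    # Hypothetical in-flight loss: ~0.05% of strength per day.
    strength = strength0 * (1.0 - 5e-4) ** mission_days
    return float(np.mean(load > strength))

p_moon_short = fracture_probability(30, 0.17)
p_moon_long = fracture_probability(180, 0.17)
p_mars_long = fracture_probability(180, 0.38)
```

Using a fixed seed gives common random numbers across scenarios, so the model reproduces the qualitative findings deterministically: probability grows with mission length and is higher in the stronger Martian gravity.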

  19. Autonomous Mission Design in Extreme Orbit Environments

    NASA Astrophysics Data System (ADS)

    Surovik, David Allen

    An algorithm for autonomous online mission design at asteroids, comets, and small moons is developed to meet the novel challenges of their complex non-Keplerian orbit environments, which render traditional methods inapplicable. The core concept of abstract reachability analysis, in which a set of impulsive maneuvering options is mapped onto a space of high-level mission outcomes, is applied to enable goal-oriented decision-making with robustness to uncertainty. These nuanced analyses are efficiently computed by utilizing a heuristic-based adaptive sampling scheme that either maximizes an objective function for autonomous planning or resolves details of interest for preliminary analysis and general study. Illustrative examples reveal the chaotic nature of small body systems through the structure of various families of reachable orbits, such as those that facilitate close-range observation of targeted surface locations or achieve soft impact upon them. In order to fulfill extensive sets of observation tasks, the single-maneuver design method is implemented in a receding-horizon framework such that a complete mission is constructed on-the-fly one piece at a time. Long-term performance and convergence are assured by augmenting the objective function with a prospect heuristic, which approximates the likelihood that a reachable end-state will benefit the subsequent planning horizon. When state and model uncertainty produce larger trajectory deviations than were anticipated, the next control horizon is advanced to allow for corrective action -- a low-frequency form of feedback control. Through Monte Carlo analysis, the planning algorithm is ultimately demonstrated to produce mission profiles that vary drastically in their physical paths but nonetheless consistently complete all goals, suggesting a high degree of flexibility. 
It is further shown that the objective function can be tuned to preferentially minimize fuel cost or mission duration, as well as to optimize performance under different levels of uncertainty by appropriately balancing the mitigation paradigms of robust planning and reactive execution.

  20. From Data-Sharing to Model-Sharing: SCEC and the Development of Earthquake System Science (Invited)

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2009-12-01

    Earthquake system science seeks to construct system-level models of earthquake phenomena and use them to predict emergent seismic behavior—an ambitious enterprise that requires a high degree of interdisciplinary, multi-institutional collaboration. This presentation will explore model-sharing structures that have been successful in promoting earthquake system science within the Southern California Earthquake Center (SCEC). These include disciplinary working groups to aggregate data into community models; numerical-simulation working groups to investigate system-specific phenomena (process modeling) and further improve the data models (inverse modeling); and interdisciplinary working groups to synthesize predictive system-level models. SCEC has developed a cyberinfrastructure, called the Community Modeling Environment, that can distribute the community models; manage large suites of numerical simulations; vertically integrate the hardware, software, and wetware needed for system-level modeling; and promote the interactions among working groups needed for model validation and refinement. Various socio-scientific structures contribute to successful model-sharing. Two of the most important are “communities of trust” and collaborations between government and academic scientists on mission-oriented objectives. The latter include improvements of earthquake forecasts and seismic hazard models and the use of earthquake scenarios in promoting public awareness and disaster management.

  1. Simulation Exploration Experience 2018 Overview

    NASA Technical Reports Server (NTRS)

    Paglialonga, Stephen; Elfrey, Priscilla; Crues, Edwin Z.

    2018-01-01

    The Simulation Exploration Experience (SEE) joins students, industry, professional associations, and faculty together for an annual modeling and simulation (M&S) challenge. SEE champions collaborative collegiate-level modeling and simulation by providing a venue for students to work in highly dispersed inter-university teams to design, develop, test, and execute simulated missions associated with space exploration. Participating teams gain valuable knowledge, skills, and increased employability by working closely with industry professionals, NASA, and faculty advisors. This presentation gives an overview of the SEE and the upcoming 2018 SEE event.

  2. Determining Coastal Mean Dynamic Topography by Geodetic Methods

    NASA Astrophysics Data System (ADS)

    Huang, Jianliang

    2017-11-01

    In geodesy, coastal mean dynamic topography (MDT) was traditionally determined by the spirit leveling technique. Advances in navigation satellite positioning (e.g., GPS) and geoid determination enable space-based leveling with an accuracy of about 3 cm at tide gauges. Recent CryoSat-2, a satellite altimetry mission with synthetic aperture radar (SAR) and SAR interferometric measurements, extends space-based leveling to the coastal ocean with the same accuracy. However, barriers remain in applying the two space-based geodetic methods for MDT determination over the coastal ocean because current geoid modeling focuses primarily on land, as a substitute for spirit leveling in realizing the vertical datum.

  3. Cost Model Comparison: A Study of Internally and Commercially Developed Cost Models in Use by NASA

    NASA Technical Reports Server (NTRS)

    Gupta, Garima

    2011-01-01

    NASA makes use of numerous cost models to accurately estimate the cost of various components of a mission - hardware, software, mission/ground operations - during the different stages of a mission's lifecycle. The purpose of this project was to survey these models and determine in which respects they are similar and in which they are different. The initial survey included a study of the cost drivers for each model, the form of each model (linear/exponential/other CER, range/point output, capable of risk/sensitivity analysis), and for what types of missions and for what phases of a mission lifecycle each model is capable of estimating cost. The models taken into consideration consisted of both those that were developed by NASA and those that were commercially developed: GSECT, NAFCOM, SCAT, QuickCost, PRICE, and SEER. Once the initial survey was completed, the next step in the project was to compare the cost models' capabilities in terms of Work Breakdown Structure (WBS) elements. This final comparison was then portrayed in a visual manner with Venn diagrams. All of the materials produced in the process of this study were then posted on the Ground Segment Team (GST) Wiki.

  4. Promoting active visits to parks: models and strategies for transdisciplinary collaboration

    Treesearch

    David M. Buchner; Paul H. Gobster

    2007-01-01

    The purpose of this paper is to discuss the shared interest of the public health and parks and recreation sectors in promoting active visits to parks. At the institutional level, both sectors have missions to promote physical activity and view parks as key components in attaining physical activity goals. While some balancing among park goals may be necessary to avoid...

  5. Observations of TOPEX/Poseidon Orbit Errors Due to Gravitational and Tidal Modeling Errors Using the Global Positioning System

    NASA Technical Reports Server (NTRS)

    Haines, B.; Christensen, E.; Guinn, J.; Norman, R.; Marshall, J.

    1995-01-01

    Satellite altimetry must measure variations in ocean topography with cm-level accuracy. The TOPEX/Poseidon mission is designed to do this by measuring the radial component of the orbit with an accuracy of 13 cm or better RMS. Recent advances, however, have improved this accuracy by about an order of magnitude.

  6. The level and determinants of mission statement use: a questionnaire survey.

    PubMed

    Desmidt, Sebastian; Prinzie, Anita; Heene, Aimé

    2008-10-01

    Although mission statements are one of the most popular management instruments, little is known about the nature and direction of the presumed relationship between mission statements and organizational performance. In particular, empirical insights into the degree of mission statement use by individual organizational members are insufficient. We address the observed knowledge gap by (a) measuring the level of mission statement use (e.g., explaining the mission statement, making linkages to extant programs or practices, communicating enthusiasm, and adapting the mission statement to the personal work situation) by individual organizational members, and (b) identifying the antecedents that influence mission statement use. Questionnaires were used to collect data from a sample of 510 nurses from three Flemish hospitals. Mission statement use was measured by means of Fairhurst's Management of Meaning Scale. Antecedents of mission statement use were derived from the Theory of Planned Behavior and the mission statement literature. The findings indicate that mission statement use is low on average. Attitude, subjective norm, perceived behavioral control, and formal involvement in mission statement communication proved to be significant determinants of mission statement use and accounted for 43% of the variance. The results of the conducted regression analyses indicate that nurses (a) who have a positive attitude towards the mission statement, (b) who perceive pressure from superiors and colleagues to use the mission statement, (c) who feel they are in control of performing such behavior, and (d) who are formally involved in the mission statement communication processes are more likely to use the mission statement. Furthermore, the results indicated that demographic characteristics are not associated with mission statement use. 
To effectively increase mission statement use, investments should focus on redesigning a work environment that stresses the importance of the organizational mission statement and provides detailed information on the ways that individual organizational members can contribute in realizing the mission statement.
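The survey's regression analysis can be illustrated with a small synthetic example; the predictors mirror the antecedents named above (attitude, subjective norm, perceived behavioral control, formal involvement), but all data and coefficients here are simulated, not the study's.

```python
# Synthetic sketch of a multiple regression explaining "mission
# statement use" from four antecedents, with variance explained (R^2)
# computed from the residuals. Coefficients are invented for the demo.
import numpy as np

rng = np.random.default_rng(1)
n = 510  # matches the survey's sample size; the data are simulated

# Columns: attitude, subjective norm, perceived behavioral control,
# formal involvement in mission statement communication.
X = rng.normal(size=(n, 4))
beta = np.array([0.5, 0.3, 0.25, 0.4])
y = X @ beta + rng.normal(scale=0.8, size=n)  # outcome: level of use

# Ordinary least squares with an intercept column.
Xd = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(Xd, y, rcond=None)
resid = y - Xd @ coef
r_squared = 1.0 - resid.var() / y.var()
```

With these placeholder effect sizes the model explains roughly half of the outcome variance, comparable in spirit to the 43% reported by the study.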

  7. Investigation of Space Based Solid State Coherent Lidar

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin

    2002-01-01

    This report describes the work performed over the period of October 1, 1997 through March 31, 2001. Under this contract, UAH/CAO participated in defining and designing the SPAce Readiness Coherent Lidar Experiment (SPARCLE) mission, and developed the instrument's optical subsystem. This work was performed in collaborative fashion with NASA/MSFC engineers at both UAH/CAO and NASA/MSFC facilities. Earlier work by the UAH/CAO had produced a preliminary top-level system design for the Shuttle lidar instrument meeting the proposed mission performance requirements and the Space Shuttle Hitchhiker canister volume constraints. The UAH/CAO system design efforts had concentrated on the optical and mechanical designs of the instrument. The instrument electronics were also addressed, and the major electronic components and their interfaces defined. The instrument design concept was mainly based on the state of the transmitter and local oscillator laser development at NASA Langley Research Center and Jet Propulsion Laboratory, and utilized several lidar-related technologies that were either developed or evaluated by the NASA/MSFC and UAH/CAO scientists. UAH/CAO has developed a comprehensive coherent lidar numerical model capable of analyzing the performance of different instrument and mission concepts. This model uses the instrument configuration, atmospheric conditions and current velocity estimation theory to provide prediction of instrument performance during different phases of operation. This model can also optimize the design parameters of the instrument.

  8. Improving the Operations of the Earth Observing One Mission via Automated Mission Planning

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.; Tran, Daniel; Rabideau, Gregg; Schaffer, Steve; Mandl, Daniel; Frye, Stuart

    2010-01-01

    We describe the modeling and reasoning about operations constraints in an automated mission planning system for an Earth-observing satellite, EO-1. We first discuss the large number of elements that can be naturally represented in an expressive planning and scheduling framework. We then describe a number of constraints that challenge the current state of the art in automated planning systems and discuss how we modeled these constraints, as well as tradeoffs in representation versus efficiency. Finally, we describe the challenges in efficiently generating operations plans for this mission. These discussions involve lessons learned from an operations model that has been in use since Fall 2004 (called R4) as well as a newer, more accurate operations model operational since June 2009 (called R5). We present analysis of the R5 software documenting a significant (greater than 50%) increase in the number of weekly observations scheduled by the EO-1 mission. We also show that the R5 mission planning system produces schedules within 15% of an upper bound on optimal schedules. This operational enhancement has created value of millions of dollars US over the projected remaining lifetime of the EO-1 mission.
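As a toy illustration of the scheduling idea only (the actual EO-1 planner handles far richer operations constraints than this), a greedy earliest-finish selection of non-overlapping observation windows:

```python
# Greedy interval scheduling sketch: pick observation windows in order
# of earliest finish time, skipping any that overlap a chosen window.
# For unweighted windows this greedy rule is provably optimal in count.
def schedule(observations):
    """observations: list of (start, end) windows.
    Returns a maximum-size set of non-overlapping windows."""
    chosen, busy_until = [], float("-inf")
    for start, end in sorted(observations, key=lambda w: w[1]):
        if start >= busy_until:  # window fits after the last chosen one
            chosen.append((start, end))
            busy_until = end
    return chosen

windows = [(0, 3), (1, 4), (3, 5), (4, 7), (5, 8)]
plan = schedule(windows)
print(plan)  # → [(0, 3), (3, 5), (5, 8)]
```

Real mission planning adds weighted priorities, resource and thermal constraints, and downlink windows, which is what makes bounding the gap to an optimal schedule (the 15% result above) nontrivial.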

  9. An operations and command system for the Extreme Ultraviolet Explorer

    NASA Technical Reports Server (NTRS)

    Muscettola, Nicola; Korsmeyer, David J.; Olson, Eric C.; Wong, Gary

    1994-01-01

    About 40% of the budget of a scientific spacecraft mission is usually consumed by Mission Operations & Data Analysis (MO&DA) with MO driving these costs. In the current practice, MO is separated from spacecraft design and comes in focus relatively late in the mission life cycle. As a result, spacecraft may be designed that are very difficult to operate. NASA centers have extensive MO expertise but often lessons learned in one mission are not exploited for other parallel or future missions. A significant reduction of MO costs is essential to ensure a continuing and growing access to space for the scientific community. We are addressing some of these issues with a highly automated payload operations and command system for an existing mission, the Extreme Ultraviolet Explorer (EUVE). EUVE is currently operated jointly by the Goddard Space Flight Center (GSFC), responsible for spacecraft operations, and the Center for Extreme Ultraviolet Astrophysics (CEA) of the University of California, Berkeley, which controls the telescopes and scientific instruments aboard the satellite. The new automated system is being developed by a team including personnel from the NASA Ames Research Center (ARC), the Jet Propulsion Laboratory (JPL) and the Center for EUV Astrophysics (CEA). An important goal of the project is to provide AI-based technology that can be easily operated by nonspecialists in AI. Another important goal is the reusability of the techniques for other missions. Models of the EUVE spacecraft need to be built both for planning/scheduling and for monitoring. In both cases, our modeling tools allow the assembly of a spacecraft model from separate sub-models of the various spacecraft subsystems. These sub-models are reusable; therefore, building mission operations systems for another small satellite mission will require choosing pre-existing modules, reparametrizing them with respect to the actual satellite telemetry information, and reassembling them in a new model. 
We briefly describe the EUVE mission and indicate why it is particularly suitable for the task. Then we briefly outline our current work in mission planning/scheduling and spacecraft and instrument health monitoring.

  10. Model implementation for dynamic computation of system cost for advanced life support

    NASA Technical Reports Server (NTRS)

    Levri, J. A.; Vaccari, D. A.

    2004-01-01

    Life support system designs for long-duration space missions have a multitude of requirements drivers, such as mission objectives, political considerations, cost, crew wellness, inherent mission attributes, as well as many other influences. Evaluation of requirements satisfaction can be difficult, particularly at an early stage of mission design. Because launch cost is a critical factor and relatively easy to quantify, it is a point of focus in early mission design. The method used to determine launch cost influences the accuracy of the estimate. This paper discusses the appropriateness of dynamic mission simulation in estimating the launch cost of a life support system. This paper also provides an abbreviated example of a dynamic simulation life support model and possible ways in which such a model might be utilized for design improvement. © 2004 COSPAR. Published by Elsevier Ltd. All rights reserved.

  11. Reliability Assessment Approach for Stirling Convertors and Generators

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Schreiber, Jeffrey G.; Zampino, Edward; Best, Timothy

    2004-01-01

    Stirling power conversion is being considered for use in a Radioisotope Power System for deep-space science missions because it offers a multifold increase in the conversion efficiency of heat to electric power. Quantifying the reliability of a Radioisotope Power System that utilizes Stirling power conversion technology is important for developing and demonstrating the capability for long-term success. A description of the Stirling power convertor is provided, along with a discussion of some of the key components. Ongoing efforts to understand component life, design variables at the component and system levels, related sources, and the nature of uncertainties are discussed. The requirement for reliability is also discussed, and some of the critical areas of concern are identified. A section on the objectives of the performance-model development and the computation of reliability is included to highlight the goals of this effort. Also, a viable physics-based reliability plan to model the design-level variable uncertainties at the component and system levels is outlined, and its potential benefits are elucidated. The plan involves the interaction of different disciplines, maintaining the physical and probabilistic correlations at all levels, and a verification process based on rational short-term tests. In addition, both top-down and bottom-up coherency are maintained to follow the physics-based design process and mission requirements. The outlined reliability assessment approach provides guidelines to improve the design and identifies governing variables to achieve high reliability in the Stirling Radioisotope Generator design.

  12. Systems Engineering Lessons Learned for Class D Missions

    NASA Technical Reports Server (NTRS)

    Rojdev, Kristina; Piatek, Irene; Moore, Josh; Calvert, Derek

    2015-01-01

    One of NASA's goals within human exploration is to determine how to get humans to Mars safely and to live and work on the Martian surface. To accomplish this goal, several smaller missions act as stepping-stones to the larger end goal. NASA uses these smaller missions to develop new technologies and to learn how to survive outside of low Earth orbit for long periods. Additionally, keeping a cadence of these missions allows the team to maintain proficiency in the complex art of bringing spacecraft to fruition. Many of these smaller missions are robotic in nature and have shorter timescales, whereas others involve crew and have longer mission timelines. Given the timelines associated with these various missions, different levels of risk and rigor need to be applied, in line with what is appropriate for each mission. Thus, NASA has four different classifications, ranging from Class A to Class D, based on the mission details. One of these projects is the Resource Prospector (RP) Mission, a multi-center and multi-institution collaborative project to search for volatiles in the polar regions of the Moon. The RP mission is classified as a Class D mission and as such has the opportunity to manage risk more tightly and therefore to accept greater levels of risk. The requirements for Class D missions were at the forefront of the design and thus presented unique challenges in vehicle development and systems engineering processes. This paper will discuss the systems engineering process at NASA and how that process is tailored for Class D missions, specifically the RP mission.

  13. Satellite servicing mission preliminary cost estimation model

    NASA Technical Reports Server (NTRS)

    1987-01-01

    The cost model presented is a preliminary methodology for determining a rough order-of-magnitude cost for implementing a satellite servicing mission. Mission implementation, in this context, encompasses all activities associated with mission design and planning, including both flight and ground crew training and systems integration (payload processing) of servicing hardware with the Shuttle. A basic assumption made in developing this cost model is that a generic set of servicing hardware has been developed and flight tested, is inventoried, and is maintained by NASA. This implies that all hardware physical and functional interfaces are well known and that recurring CITE testing is therefore not required. The development of the cost model algorithms and examples of their use are discussed.

  14. Issues and challenges in resource management and its interaction with levels 2/3 fusion with applications to real-world problems: an annotated perspective

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Kadar, Ivan; Hintz, Kenneth; Biermann, Joachim; Chong, Chee-Yee; Salerno, John; Das, Subrata

    2007-04-01

    Resource management (or process refinement) is critical for information fusion operations in that users, sensors, and platforms need to be informed, based on mission needs, on how to collect, process, and exploit data. To meet these growing concerns, a panel session was conducted at the International Society of Information Fusion Conference in 2006 to discuss the various issues surrounding the interaction of resource management with Level 2/3 situation and threat assessment. This paper briefly consolidates the discussion of the invited panelists. The common themes include: (1) addressing the user in system management, sensor control, and knowledge-based information collection; (2) determining a standard set of fusion metrics for optimization and evaluation based on the application; (3) allowing dynamic and adaptive updating to deliver timely information needs and information rates; (4) optimizing the joint objective functions at all information fusion levels based on decision-theoretic analysis; (5) providing constraints from distributed resource mission planning and scheduling; and (6) defining L2/3 situation entity definitions for knowledge discovery, modeling, and information projection.

  15. Human-Robot Interaction in High Vulnerability Domains

    NASA Technical Reports Server (NTRS)

    Gore, Brian F.

    2016-01-01

    Future NASA missions will require successful integration of the human with highly complex systems. Highly complex systems are likely to involve humans, automation, and some level of robotic assistance. These complex environments will require successful integration of the human with automation, with robots, and with human-automation-robot teams to accomplish mission-critical goals. Many challenges exist for the human performing in these types of operational environments with these kinds of systems. Systems must be designed to optimally integrate various levels of inputs and outputs based on the roles and responsibilities of the human, the automation, and the robots, ranging from direct manual control, to shared human-robot control, to no active human control (i.e., human supervisory control). It is assumed that the human will remain involved at some level. Technologies that vary based on contextual demands and on operator characteristics (workload, situation awareness) will be needed when the human integrates into these systems. Predictive models that estimate the impact of these technologies on system performance and on the human operator are also needed to meet the challenges associated with such future complex human-automation-robot systems in extreme environments.

  16. Developing Vocabularies to Improve Understanding and Use of NOAA Observing Systems

    NASA Astrophysics Data System (ADS)

    Austin, M.

    2014-12-01

    The NOAA Observing System Integrated Analysis project (NOSIA II) is an attempt to capture and tell the story of how valuable observing systems are in producing the products and services required to fulfill NOAA's diverse mission. NOAA's goals and mission areas cover a broad range of environmental data, and a complexity exists in the terms and vocabulary applied to the creation of observing-system-derived products. The NOSIA data collection focused first on decomposing NOAA's goals through the creation and acceptance of Mission Service Areas (MSAs) by NOAA senior leadership. Products and services that supported the MSAs were then identified by interviewing product producers across the NOAA organization. Product data inputs, including models, databases, and observing systems, were also identified. The NOSIA model contains over 20,000 nodes, each representing a level in a network connecting products, data sources, users, and desired outcomes. It became immediately apparent that the complexity and variety of the data collected required data management to mature the quality and content of the NOSIA model. The NOSIA Analysis Database (ADB) was developed initially to improve the consistency of terms and data types, allowing observing systems and products to be linked to NOAA's goals and mission. The ADB also allowed, for the first time, the prototyping of reports and products in an easily accessible and comprehensive format. Web-based visualizations of the relationships between products, data sources, users, and producers were generated to make the information easily understood. This includes developing ontologies/vocabularies that are used for the development of user-type-specific products for NOAA leadership, Observing System Portfolio managers, and the users of NOAA data.

  17. Orbit Determination Toolbox

    NASA Technical Reports Server (NTRS)

    Carpenter, James R.; Berry, Kevin; Gregpru. Late; Speckman, Keith; Hur-Diaz, Sun; Surka, Derek; Gaylor, Dave

    2010-01-01

    The Orbit Determination Toolbox is an orbit determination (OD) analysis tool based on MATLAB and Java that provides a flexible way to do early mission analysis. The toolbox is primarily intended for advanced mission analysis such as might be performed in concept exploration, proposal, early design phase, or rapid design center environments. The emphasis is on flexibility, but it has enough fidelity to produce credible results. Insight into all flight dynamics source code is provided. MATLAB is the primary user interface and is used for piecing together measurement and dynamic models. The Java Astrodynamics Toolbox is used as an engine for things that might be slow or inefficient in MATLAB, such as high-fidelity trajectory propagation, lunar and planetary ephemeris look-ups, precession, nutation, polar motion calculations, ephemeris file parsing, and the like. The primary analysis functions are sequential filter/smoother and batch least-squares commands that incorporate Monte-Carlo data simulation, linear covariance analysis, measurement processing, and plotting capabilities at the generic level. These functions have a user interface that is based on that of the MATLAB ODE suite. To perform a specific analysis, users write MATLAB functions that implement truth and design system models. The user provides his or her models as inputs to the filter commands. The software provides a capability to publish and subscribe to a software bus that is compliant with the NASA Goddard Mission Services Evolution Center (GMSEC) standards, to exchange data with other flight dynamics tools to simplify the flight dynamics design cycle. Using the publish and subscribe approach allows for analysts in a rapid design center environment to seamlessly incorporate changes in spacecraft and mission design into navigation analysis and vice versa.
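    The user-model pattern described above can be sketched as follows, in Python rather than MATLAB, with invented names rather than the toolbox's actual interfaces: the user writes dynamics and measurement functions, and a generic sequential (Kalman) filter consumes them.

```python
# Hedged sketch of the "user supplies truth/design models to a generic
# sequential filter" pattern. Function names and interfaces are illustrative.
import numpy as np

def dynamics(x):
    """User model: constant-velocity state [position, velocity], dt = 1."""
    F = np.array([[1.0, 1.0], [0.0, 1.0]])
    return F @ x, F

def measurement(x):
    """User model: observe position only."""
    H = np.array([[1.0, 0.0]])
    return H @ x, H

def sequential_filter(x, P, zs, Q, R):
    """Generic linear sequential filter driven by the user's models."""
    for z in zs:
        x, F = dynamics(x)                    # propagate state and covariance
        P = F @ P @ F.T + Q
        hx, H = measurement(x)                # measurement update
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - hx)
        P = (np.eye(len(x)) - K @ H) @ P
    return x, P

x0 = np.zeros(2)
P0 = np.eye(2) * 100.0                        # large initial uncertainty
zs = [np.array([1.0]), np.array([2.1]), np.array([2.9])]
xf, Pf = sequential_filter(x0, P0, zs, Q=np.eye(2) * 1e-3, R=np.eye(1) * 0.25)
```

    The design point illustrated is the separation of concerns: the filter command stays generic, while the mission-specific physics lives entirely in the user-written model functions, mirroring the MATLAB ODE suite's solver/model split.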

  18. Remote mission specialist - A study in real-time, adaptive planning

    NASA Technical Reports Server (NTRS)

    Rokey, Mark J.

    1990-01-01

    A high-level planning architecture for robotic operations is presented. The remote mission specialist integrates high-level directives with low-level primitives executable by a run-time controller for command of autonomous servicing activities. The planner has been designed to address such issues as adaptive plan generation, real-time performance, and operator intervention.

  19. Spaceborne synthetic aperture radar signal processing using FPGAs

    NASA Astrophysics Data System (ADS)

    Sugimoto, Yohei; Ozawa, Satoru; Inaba, Noriyasu

    2017-10-01

    Synthetic Aperture Radar (SAR) imagery requires image reproduction through successive signal processing of the received data before images can be browsed and information extracted. The received signal data records of the ALOS-2/PALSAR-2 are stored in the onboard mission data storage and transmitted to the ground. To balance storage usage against the capacity of the mission data communication networks, the operation duty of the PALSAR-2 is limited; this balance relies strongly on network availability. The observation operations of present spaceborne SAR systems are rigorously planned by simulating the mission data balance, given conflicting user demands. This problem should be solved so that we do not have to compromise the operations and the potential of next-generation spaceborne SAR systems. One solution is to compress the SAR data through onboard image reproduction and information extraction from the reproduced images. This is also beneficial for fast delivery of information products and for event-driven observations by constellations. The Emergence Studio (Sōhatsu kōbō in Japanese), with the Japan Aerospace Exploration Agency, is developing evaluation models of an FPGA-based signal processing system for onboard SAR image reproduction. The model, namely the "Fast L1 Processor (FLIP)," developed in 2016, can reproduce a 10 m resolution single look complex image (Level 1.1) from ALOS/PALSAR raw signal data (Level 1.0). The FLIP at 200 MHz processes data twice as fast as CPU-based computing at 3.7 GHz, and the image processed by the FLIP is in no way inferior to an image processed with 32-bit computing in MATLAB.
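    The core of SAR image reproduction is matched filtering (pulse compression). The sketch below shows the generic, textbook FFT-based range-compression step on synthetic data; it is not the FLIP's actual implementation, and the waveform parameters are invented.

```python
# Illustrative range compression: FFT-based matched filtering of a linear
# FM (chirp) pulse. Parameters are arbitrary, chosen only for the demo.
import numpy as np

fs = 1e6                                  # sample rate (assumed)
T = 1e-4                                  # pulse length (assumed)
B = 2e5                                   # chirp bandwidth (assumed)
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)   # reference transmit pulse

# Synthetic raw echo: the chirp delayed by 50 samples in a 512-sample line.
echo = np.zeros(512, dtype=complex)
echo[50:50 + len(t)] = chirp

# Matched filter in the frequency domain (correlation with the reference):
n = len(echo)
ref = np.conj(np.fft.fft(chirp, n))
compressed = np.fft.ifft(np.fft.fft(echo) * ref)
peak = int(np.argmax(np.abs(compressed)))     # the target collapses to a peak
```

    Repeating this matched-filtering step in range and azimuth is what turns raw Level 1.0 signal data into a focused single look complex image, and it is exactly this FFT-heavy, highly parallel structure that makes the processing a good fit for an FPGA.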

  20. Technology Maturation in Preparation for the Cryogenic Propellant Storage and Transfer (CPST) Technology Demonstration Mission (TDM)

    NASA Technical Reports Server (NTRS)

    Meyer, Michael L.; Doherty, Michael P.; Moder, Jeffrey P.

    2014-01-01

    In support of its goal to find an innovative path for human space exploration, NASA embarked on the Cryogenic Propellant Storage and Transfer (CPST) Project, a Technology Demonstration Mission (TDM) to test and validate key cryogenic capabilities and technologies required for future exploration elements, opening up the architecture for large in-space cryogenic propulsion stages and propellant depots. Recognizing that key Cryogenic Fluid Management (CFM) technologies anticipated for on-orbit (flight) demonstration would benefit from additional maturation to a readiness level appropriate for infusion into the design of the flight demonstration, the NASA Headquarters Space Technology Mission Directorate (STMD) authorized funding for a one-year technology maturation phase of the CPST project. The strategy, proposed by the CPST Project Manager, focused on maturation through modeling, concept studies, and ground tests of the storage and fluid transfer of CFM technology sub-elements and components that were lower than a Technology Readiness Level (TRL) of 5. A technology maturation plan (TMP) was subsequently approved which described: the CFM technologies selected for maturation, the ground testing approach to be used, quantified success criteria of the technologies, hardware and data deliverables, and a deliverable to provide an assessment of the technology readiness after completion of the test, study or modeling activity. The specific technologies selected were grouped into five major categories: thick multilayer insulation, tank applied active thermal control, cryogenic fluid transfer, propellant gauging, and analytical tool development. Based on the success of the technology maturation efforts, the CPST project was approved to proceed to flight system development.

  1. The Value of Biomedical Simulation Environments to Future Human Space Flight Missions

    NASA Technical Reports Server (NTRS)

    Mulugeta, Lealem; Myers, Jerry G.; Skytland, Nicholas G.; Platts, Steven H.

    2010-01-01

    With the ambitious goals of sending manned missions to asteroids and on to Mars, substantial work will be required to ensure the well-being of the men and women who will undertake these difficult missions. Unlike current International Space Station or Shuttle missions, astronauts will be required to endure long-term exposure to higher levels of radiation, isolation, and reduced gravity. These new operational conditions will pose health risks that are currently not well understood and perhaps unanticipated. Therefore, it is essential to develop and apply advanced tools to predict, assess, and mitigate potential hazards to astronaut health. NASA's Digital Astronaut Project (DAP) is working to develop and apply computational models of physiologic response to space flight operational conditions over various time periods and environmental circumstances. The collective application and integration of well-vetted models assessing physiology, biomechanics, and anatomy is referred to as the Digital Astronaut. The Digital Astronaut simulation environment will serve as a practical working tool for use by NASA in operational activities such as the prediction of biomedical risks and functional capabilities of astronauts. In addition to space flight operational conditions, DAP's work has direct applicability to terrestrial biomedical research by providing virtual environments for hypothesis testing and experiment design, and for reducing animal/human testing. A practical application of the Digital Astronaut to assess pre- and post-flight responses to exercise is illustrated, and the difficulty in matching true physiological responses is discussed.

  2. Evaluating Shielding Effectiveness for Reducing Space Radiation Cancer Risks

    NASA Technical Reports Server (NTRS)

    Cucinotta, Francis A.; Kim, Myung-Hee Y.; Ren, Lei

    2007-01-01

    We discuss calculations of probability distribution functions (PDFs) representing uncertainties in projecting fatal cancer risk from galactic cosmic rays (GCR) and solar particle events (SPEs). The PDFs are used in significance tests of the effectiveness of potential radiation shielding approaches. Uncertainties in risk coefficients determined from epidemiology data, dose and dose-rate reduction factors, quality factors, and physics models of radiation environments are considered in models of cancer risk PDFs. Competing mortality risks and functional correlations in radiation quality factor uncertainties are treated in the calculations. We show that the cancer risk uncertainty, defined as the ratio of the 95% confidence level (CL) to the point estimate, is about 4-fold for lunar and Mars mission risk projections. For short-stay lunar missions (<180 d), SPEs present the most significant risk, albeit one that is mitigated effectively by shielding, especially by carbon composite structures with high hydrogen content. In contrast, for long-duration lunar (>180 d) or Mars missions, GCR risks may exceed radiation risk limits, with 95% CLs exceeding 10% fatal risk for males and females on a Mars mission. For reducing GCR cancer risks, shielding materials are marginally effective because of the penetrating nature of GCR and of the secondary radiation produced in tissue by relativistic particles. At the present time, polyethylene or carbon composite shielding cannot be shown to significantly reduce risk compared to aluminum shielding, based on a significance test that accounts for radiobiology uncertainties in GCR risk projection.
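    The uncertainty metric quoted above, the ratio of the 95% CL to the point estimate, can be illustrated with a toy Monte Carlo. The lognormal uncertainty factors and the point estimate below are assumptions chosen for illustration, not the paper's actual PDFs.

```python
# Toy illustration of the "95% CL / point estimate" uncertainty ratio.
# The distributions and values are invented; only the metric is from the text.
import numpy as np

rng = np.random.default_rng(0)
point_estimate = 0.032                       # e.g., 3.2% fatal risk (assumed)

# Propagate several multiplicative uncertainty sources (quality factors,
# dose-rate factors, environment models) as independent lognormal draws.
factors = rng.lognormal(mean=0.0, sigma=0.5, size=(100_000, 3))
risk_samples = point_estimate * factors.prod(axis=1)

cl95 = np.percentile(risk_samples, 95)
uncertainty_ratio = cl95 / point_estimate    # roughly 4-fold here
```

    With these assumed spreads the ratio comes out near the ~4-fold figure the abstract quotes for lunar and Mars projections, which shows how a modest spread in each multiplicative factor compounds into a large overall risk uncertainty.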

  3. Long-range planning cost model for support of future space missions by the deep space network

    NASA Technical Reports Server (NTRS)

    Sherif, J. S.; Remer, D. S.; Buchanan, H. R.

    1990-01-01

    A simple model is suggested for making long-range planning cost estimates for Deep Space Network (DSN) support of future space missions. The model estimates total DSN preparation costs and the annual distribution of these costs for long-range budgetary planning. The cost model is based on actual DSN preparation costs from four space missions: Galileo, Voyager (Uranus), Voyager (Neptune), and Magellan. The model was tested against the four projects and gave cost estimates ranging from 18 percent above the actual total preparation costs of the projects to 25 percent below. The model was also compared to two other independent projects: Viking and Mariner Jupiter/Saturn (MJS, which later became Voyager). The model gave cost estimates ranging from 2 percent (for Viking) to 10 percent (for MJS) below the actual total preparation costs of these missions.

  4. Geosynchronous platform definition study. Volume 4, Part 1: Traffic analysis and system requirements for the baseline traffic model

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The traffic analyses and system requirements data generated in the study resulted in the development of two traffic models; the baseline traffic model and the new traffic model. The baseline traffic model provides traceability between the numbers and types of geosynchronous missions considered in the study and the entire spectrum of missions foreseen in the total national space program. The information presented pertaining to the baseline traffic model includes: (1) Definition of the baseline traffic model, including identification of specific geosynchronous missions and their payload delivery schedules through 1990; (2) Satellite location criteria, including the resulting distribution of the satellite population; (3) Geosynchronous orbit saturation analyses, including the effects of satellite physical proximity and potential electromagnetic interference; and (4) Platform system requirements analyses, including satellite and mission equipment descriptions, the options and limitations in grouping satellites, and on-orbit servicing criteria (both remotely controlled and man-attended).

  5. Optimal allocation of thermodynamic irreversibility for the integrated design of propulsion and thermal management systems

    NASA Astrophysics Data System (ADS)

    Maser, Adam Charles

    More electric aircraft systems, high power avionics, and a reduction in heat sink capacity have placed a larger emphasis on correctly satisfying aircraft thermal management requirements during conceptual design. Thermal management systems must be capable of dealing with these rising heat loads, while simultaneously meeting mission performance. Since all subsystem power and cooling requirements are ultimately traced back to the engine, the growing interactions between the propulsion and thermal management systems are becoming more significant. As a result, it is necessary to consider their integrated performance during the conceptual design of the aircraft gas turbine engine cycle to ensure that thermal requirements are met. This can be accomplished by using thermodynamic subsystem modeling and simulation while conducting the necessary design trades to establish the engine cycle. However, this approach also poses technical challenges associated with the existence of elaborate aircraft subsystem interactions. This research addresses these challenges through the creation of a parsimonious, transparent thermodynamic model of propulsion and thermal management systems performance with a focus on capturing the physics that have the largest impact on propulsion design choices. This modeling environment, known as Cycle Refinement for Aircraft Thermodynamically Optimized Subsystems (CRATOS), is capable of operating in on-design (parametric) and off-design (performance) modes and includes a system-level solver to enforce design constraints. A key aspect of this approach is the incorporation of physics-based formulations involving the concurrent usage of the first and second laws of thermodynamics, which are necessary to achieve a clearer view of the component-level losses across the propulsion and thermal management systems. 
This is facilitated by the direct prediction of the exergy destruction distribution throughout the system and the resulting quantification of available work losses over the time history of the mission. The characterization of the thermodynamic irreversibility distribution helps give the propulsion systems designer an absolute and consistent view of the tradeoffs associated with the design of the entire integrated system. Consequently, this leads directly to the question of the proper allocation of irreversibility across each of the components. The process of searching for the most favorable allocation of this irreversibility is the central theme of the research and must take into account production cost and vehicle mission performance. The production cost element is accomplished by including an engine component weight and cost prediction capability within the system model. The vehicle mission performance is obtained by directly linking the propulsion and thermal management model to a vehicle performance model and flying it through a mission profile. A canonical propulsion and thermal management systems architecture is then presented to experimentally test each element of the methodology separately: first the integrated modeling and simulation, then the irreversibility, cost, and mission performance considerations, and then finally the proper technique to perform the optimal allocation. A goal of this research is the description of the optimal allocation of system irreversibility to enable an engine cycle design with improved performance and cost at the vehicle-level. To do this, a numerical optimization was first used to minimize system-level production and operating costs by fixing the performance requirements and identifying the best settings for all of the design variables. 
There are two major drawbacks to this approach: it does not allow the designer to directly trade off the performance requirements, and it does not allow the individual component losses to factor directly into the optimization. An irreversibility allocation approach based on the economic concept of resource allocation is then compared to the numerical optimization. By posing the problem in economic terms, exergy destruction is treated as a true common currency to barter for improved efficiency, cost, and performance. This allows the designer to clearly see how changes in the irreversibility distribution impact the overall system. The inverse design is first performed through a filtered Monte Carlo to allow the designer to view the irreversibility design space. The designer can then directly perform the allocation using the exergy destruction, which helps to place the design choices on an even thermodynamic footing. Finally, two use cases are presented to show how the irreversibility allocation approach can assist the designer. The first describes a situation where the designer can better address competing system-level requirements; the second describes a different situation where the designer can choose from a number of options to improve a system in a manner that is more robust to future requirements.
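The filtered Monte Carlo step can be sketched as follows. The toy response surfaces, design variables, and limits below are invented stand-ins for the CRATOS propulsion/thermal models; only the sample-then-filter pattern is from the text.

```python
# Minimal filtered Monte Carlo sketch: sample the design space, filter by a
# mission performance constraint, then inspect the irreversibility of the
# surviving designs. All models and limits here are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
pressure_ratio = rng.uniform(10.0, 40.0, n)      # design variable (assumed)
bypass_ratio = rng.uniform(0.2, 8.0, n)          # design variable (assumed)

# Toy responses: fuel burn falls with both variables, while exergy
# destruction (irreversibility) rises with pressure ratio and falls with
# bypass ratio. Real CRATOS models would replace these expressions.
fuel_burn = 100.0 / np.sqrt(pressure_ratio) - 2.0 * np.log1p(bypass_ratio)
exergy_destroyed = 0.8 * pressure_ratio - 3.0 * bypass_ratio

# Filter step: keep only designs that meet the mission performance limit.
feasible = fuel_burn < 22.0
share = feasible.mean()                          # fraction of space retained

# The designer then views the irreversibility distribution of the survivors
# before allocating exergy destruction across components.
survivor_exergy = exergy_destroyed[feasible]
```

The value of the filter is that it converts a blind optimization into a visible trade space: the designer sees the whole feasible region's irreversibility distribution rather than a single optimizer-selected point.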

  6. Overview of Graphical User Interface for ARRBOD (Acute Radiation Risk and BRYNTRN Organ Dose Projection)

    NASA Technical Reports Server (NTRS)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2010-01-01

    Solar particle events (SPEs) pose the risk of acute radiation sickness (ARS) to astronauts, because organ doses from large SPEs may reach critical levels during extravehicular activities (EVAs) or in lightly shielded spacecraft. NASA has developed an organ dose projection model based on the baryon transport code (BRYNTRN), with an output data processing module, SUMDOSE, and a probabilistic model of acute radiation risk (ARR). BRYNTRN code operation requires extensive input preparation, and the risk projection models of organ doses and ARR take the output from BRYNTRN as an input to their calculations. With a graphical user interface (GUI) to handle input and output for BRYNTRN, these response models can be connected easily and correctly to BRYNTRN in a user-friendly way. The GUI for the Acute Radiation Risk and BRYNTRN Organ Dose (ARRBOD) projection code provides seamless integration of the input and output manipulations required for operation of the ARRBOD modules: BRYNTRN, SUMDOSE, and the ARR probabilistic response model. The ARRBOD GUI is intended for mission planners, radiation shield designers, space operations in the Mission Operations Directorate (MOD), and space biophysics researchers. Assessment of astronauts' organ doses and ARS from exposure to historically large SPEs supports mission design and operations planning to avoid ARS and stay within the current NASA short-term dose limits. The ARRBOD GUI will serve as a proof of concept for future integration of other risk projection models for human space applications. We present an overview of the ARRBOD GUI product, which is a new self-contained product, covering the major components of the overall system, subsystem interconnections, and external interfaces.

  7. Models Required to Mitigate Impacts of Space Weather on Space Systems

    NASA Technical Reports Server (NTRS)

    Barth, Janet L.

    2003-01-01

    This viewgraph presentation develops a model of the factors that need to be considered in the design and construction of spacecraft to lessen the effects of space weather on these vehicles. Topics considered include: space environments and effects, radiation environments and effects, space weather drivers, space weather models, climate models, solar proton activity, and mission design for the GOES mission. The authors conclude that space environment models need to address issues from mission planning through operations, and that a program to develop and validate authoritative space environment models for application to spacecraft design does not exist at this time.

  8. Economics of ion propulsion for large space systems

    NASA Technical Reports Server (NTRS)

    Masek, T. D.; Ward, J. W.; Rawlin, V. K.

    1978-01-01

    This study of advanced electrostatic ion thrusters for space propulsion was initiated to determine the suitability of the baseline 30-cm thruster for future missions and to identify other thruster concepts that would better satisfy mission requirements. The general scope of the study was to review mission requirements, select thruster designs to meet these requirements, assess the associated thruster technology requirements, and recommend short- and long-term technology directions that would support future thruster needs. Preliminary design concepts for several advanced thrusters were developed to assess the potential practical difficulties of a new design. This study produced useful general methodologies for assessing both planetary and earth orbit missions. For planetary missions, the assessment is in terms of payload performance as a function of propulsion system technology level. For earth orbit missions, the assessment is made on the basis of cost (cost sensitivity to propulsion system technology level).

  9. Validation of SARAL/AltiKa data in the Amazon basin

    NASA Astrophysics Data System (ADS)

    Santos da Silva, Joecila; Calmant, Stephane; Medeiros Moreira, Daniel; Oliveira, Robson; Conchy, Taina; Gennero, Marie-Claude; Seyler, Frederique

    2015-04-01

    SARAL/AltiKa is a link between past missions (since it flies on the ERS-ENVISAT orbit, with Ku-band nadir altimeters in LRM) and future missions such as SWOT's Ka-band interferometric swaths. In the present study, we compare the capability of its altimeter, AltiKa, to that of previous missions working in the Ku band, such as ENVISAT and Jason-2, in retrieving water levels over the Amazon basin. As with those preceding missions, the best results were obtained with the ICE-1 retracking algorithm. We qualitatively analyze the impact of rainfall on the loss of measurements. Since building long, multi-mission time series is of major importance both for hydro-climatic studies and for basin management, we also present an estimate of the altimeter bias so that the SARAL series of water levels can be appended to those of these previous missions.
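    The inter-mission bias estimate mentioned above can be illustrated on synthetic data: over an overlap at the same virtual station, a constant bias is estimated robustly (here as a median difference) and removed before appending the new series. This is a generic sketch of the idea, not the authors' exact procedure, and all data below are synthetic.

```python
# Hedged sketch: estimating and removing a constant inter-mission altimeter
# bias from overlapping water-level series. All values are synthetic.
import numpy as np

rng = np.random.default_rng(2)
t = np.arange(60)
stage = 5.0 + 2.0 * np.sin(2 * np.pi * t / 30.0)       # "true" water level (m)

envisat = stage + rng.normal(0.0, 0.10, t.size)         # reference series
altika = stage + 0.35 + rng.normal(0.0, 0.10, t.size)   # new, biased series

bias = np.median(altika - envisat)       # robust inter-mission bias estimate
altika_adjusted = altika - bias          # series can now be appended
```

    Using the median rather than the mean keeps the estimate robust to the occasional bad retracking result, which matters for altimetry series where rainfall and surface conditions cause measurement losses and outliers.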

  10. Space Radiation Risk Assessment

    NASA Astrophysics Data System (ADS)

    Blakely, E.

    Evaluation of potential health effects from radiation exposure during and after deep space travel is important for the future of manned missions. To date, manned missions have been limited to near-Earth orbits, with the Moon our farthest distance from Earth. Historical space radiation career exposures for astronauts from all NASA missions show that early missions involved total exposures of less than about 20 mSv. With the advent of Skylab and Mir, total career exposure levels increased to a maximum of nearly 200 mSv. Missions in deep space, with the longer durations planned, may pose greater risks due to the increased potential for exposure to complex radiation fields comprised of a broad range of radiation types and energies from cosmic and unpredictable solar sources. The first steps in the evaluation of risks are underway, with bio- and physical-dosimetric measurements on both commercial flight personnel and international space crews who have experience in near-Earth orbits, and the necessary theoretical modeling of particle-track traversal per cell, including the contributing effects of delta-rays in particle exposures. An assumption for biologic effects due to radiation exposure in deep space is that they differ quantitatively and qualitatively from those on Earth. The dose deposition and density patterns of heavy charged particles are very different from those of sparsely ionizing radiation. The potential risks resulting from exposure to radiation in deep space are cancer, non-cancer, and genetic effects. Radiation from

  11. Features of the Drag-Free-Simulator demonstrated for the Microscope-mission

    NASA Astrophysics Data System (ADS)

    List, Meike; Bremer, Stefanie; Dittus, Hansjoerg; Selig, Hanns

    The ZARM Drag-Free Simulator is being developed as a tool for comprehensive mission modeling. Environmental disturbances such as solar radiation pressure, atmospheric drag, and interactions between the satellite and the Earth's magnetic field can be taken into account via several models. Besides the gravitational field of the Earth, the influence of the Sun, the Moon, and the planets, including Pluto, can also be considered for targeted simulations. Methods of modeling and implementation will be presented. At present, effort is being made to adapt this simulation tool for the French mission MICROSCOPE, which is designed to test the equivalence principle to an accuracy of η = 10^-15. Additionally, detailed modeling of the on-board capacitive sensors is necessary for a better understanding of the real system. The current status of mission modeling will be reported.

  12. Sustainable, Reliable Mission-Systems Architecture

    NASA Technical Reports Server (NTRS)

    O'Neil, Graham; Orr, James K.; Watson, Steve

    2005-01-01

    A mission-systems architecture based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.

  13. Methodology for back-contamination risk assessment for a Mars sample return mission

    NASA Technical Reports Server (NTRS)

    Merkhofer, M. W.; Quinn, D. J.

    1977-01-01

    The risk of back-contamination from Mars Surface Sample Return (MSSR) missions is assessed. The methodology is designed to provide an assessment of the probability that a given mission design and strategy will result in accidental release of Martian organisms acquired as a result of MSSR. This is accomplished through the construction of risk models describing the mission risk elements and their impact on back-contamination probability. A conceptual framework is presented for using the risk model to evaluate mission design decisions that require a trade-off between science and planetary protection considerations.

  14. Sustainable, Reliable Mission-Systems Architecture

    NASA Technical Reports Server (NTRS)

    O'Neil, Graham; Orr, James K.; Watson, Steve

    2007-01-01

    A mission-systems architecture based on a highly modular infrastructure utilizing open-standards hardware and software interfaces as the enabling technology is essential for affordable and sustainable space exploration programs. This mission-systems architecture requires (a) robust communication between heterogeneous systems, (b) high reliability, (c) minimal mission-to-mission reconfiguration, (d) affordable development, system integration, and verification of systems, and (e) minimal sustaining engineering. This paper proposes such an architecture. Lessons learned from the Space Shuttle program and Earthbound complex engineered systems are applied to define the model. Technology projections reaching out 5 years are made to refine model details.

  15. Human Mars Mission Performance Crew Taxi Profile

    NASA Technical Reports Server (NTRS)

    Duaro, Vince A.

    1999-01-01

    Using results from the Integrated Mission Program (IMP), a simulation language and code used to model present and future Earth, Moon, or Mars missions, this report presents six case studies of a manned Mars mission. The mission profiles, timelines, propellant requirements, feasibility, and perturbation analyses are presented for two aborted, two delayed-rendezvous, and two normal-rendezvous cases for a future Mars mission.

  16. Modeling Real-Time Coordination of Distributed Expertise and Event Response in NASA Mission Control Center Operations

    NASA Astrophysics Data System (ADS)

    Onken, Jeffrey

    This dissertation introduces a multidisciplinary framework for enabling future research and analysis of alternatives for control centers for real-time operations of safety-critical systems. The multidisciplinary framework integrates functional and computational models that describe the dynamics of fundamental concepts from previously disparate engineering and psychology research disciplines, such as group performance and processes, supervisory control, situation awareness, events and delays, and expertise. The application in this dissertation is the real-time operations within the NASA Mission Control Center in Houston, TX. This dissertation operationalizes the framework into a model and simulation, which simulates the functional and computational models in the framework according to user-configured scenarios for a NASA human-spaceflight mission. The model and simulation generate data reflecting the effectiveness of the mission-control team in supporting the completion of mission objectives and in detecting, isolating, and recovering from anomalies. Accompanying the multidisciplinary framework is a proof of concept, which demonstrates the feasibility of such a framework. The proof of concept demonstrates that variability occurs where expected based on the models. The proof of concept also demonstrates that the data generated from the model and simulation are useful for analyzing and comparing MCC configuration alternatives, because an investigator can give a diverse set of scenarios to the simulation and compare the output in detail to inform decisions about the effect of MCC configurations on mission operations performance.

  17. A Review of Aerothermal Modeling for Mars Entry Missions

    NASA Technical Reports Server (NTRS)

    Wright, Michael J.; Tang, Chun Y.; Edquist, Karl T.; Hollis, Brian R.; Krasa, Paul

    2009-01-01

    The current status of aerothermal analysis for Mars entry missions is reviewed. The aeroheating environment of all Mars missions to date has been dominated by convective heating. Two primary uncertainties in our ability to predict forebody convective heating are turbulence on a blunt lifting cone and surface catalysis in a predominantly CO2 environment. Future missions, particularly crewed vehicles, will encounter additional heating from shock-layer radiation due to a combination of larger size and faster entry velocity. Localized heating due to penetrations or other singularities on the aeroshell must also be taken into account. The physical models employed to predict these phenomena are reviewed, and key uncertainties or deficiencies inherent in these models are explored. Capabilities of existing ground test facilities to support aeroheating validation are also summarized. Engineering flight data from the Viking and Pathfinder missions, which may be useful for aerothermal model validation, are discussed, and an argument is presented for obtaining additional flight data. Examples are taken from past, present, and future Mars entry missions, including the twin Mars Exploration Rovers and the Mars Science Laboratory, scheduled for launch by NASA in 2011.

  18. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    NASA Technical Reports Server (NTRS)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    - In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes.
    - The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions.
    - Applicable human reliability metrics for performing these atomic-level tasks are available.
    - The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated.
    - The PRA models are executed using data from human reliability data banks.
    - The Periodic Table is related to the PRA models via Fault Links.
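    The point about executing PRA models with human-reliability data can be illustrated with a minimal series-task calculation; the task names and human-error probabilities (HEPs) below are invented placeholders, not values from any actual data bank:

```python
from math import prod

# Hypothetical HEPs for atomic command-generation tasks, loosely in
# the style of human-reliability data banks; illustrative values only.
atomic_heps = {
    "transcribe_parameter": 3e-3,
    "select_command_mnemonic": 1e-3,
    "verify_against_procedure": 1e-2,
}

def sequence_error_prob(heps):
    """P(at least one error) for independent atomic tasks in series:
    one minus the product of the per-task success probabilities."""
    return 1.0 - prod(1.0 - p for p in heps)

p_err = sequence_error_prob(atomic_heps.values())
```

    A full PRA model would add recovery factors and dependency between tasks; the independence assumption here is the simplest possible starting point.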

  19. Using a High-Performance Planning Model to Increase Levels of Functional Effectiveness Within Professional Development.

    PubMed

    Winter, Peggi

    2016-01-01

    Nursing professional practice models continue to shape how we practice nursing by putting families and members at the heart of everything we do. Faced with enormous challenges around healthcare reform, models create frameworks for practice by unifying, uniting, and guiding our nurses. The Kaiser Permanente Practice model was developed to ensure consistency for nursing practice across the continuum. Four key pillars support this practice model and the work of nursing: quality and safety, leadership, professional development, and research/evidence-based practice. These four pillars form the foundation that makes transformational practice possible and aligns nursing with Kaiser Permanente's mission. The purpose of this article is to discuss the pillar of professional development and the components of the Nursing Professional Development: Scope and Standards of Practice model (American Nurses Association & National Nursing Staff Development Organization, 2010) and place them in a five-level development framework. This process allowed us to identify the current organizational level of practice, prioritize each nursing professional development component, and design an operational strategy to move nursing professional development toward a level of high performance. This process is suggested for nursing professional development specialists.

  20. High-Performance Optical Frequency References for Space

    NASA Astrophysics Data System (ADS)

    Schuldt, Thilo; Döringshoff, Klaus; Milke, Alexander; Sanjuan, Josep; Gohlke, Martin; Kovalchuk, Evgeny V.; Gürlebeck, Norman; Peters, Achim; Braxmaier, Claus

    2016-06-01

    A variety of future space missions rely on the availability of high-performance optical clocks, with applications in fundamental physics, geoscience, Earth observation, navigation, and ranging. Examples are the gravitational wave detector eLISA (evolved Laser Interferometer Space Antenna), the Earth gravity mission NGGM (Next Generation Gravity Mission), and missions dedicated to tests of Special Relativity, e.g. by performing a Kennedy-Thorndike experiment testing the boost dependence of the speed of light. In this context we developed optical frequency references based on Doppler-free spectroscopy of molecular iodine; compactness and mechanical and thermal stability were the main design criteria. With a setup at engineering model (EM) level we demonstrated a frequency stability of about 2×10^-14 at an integration time of 1 s and below 6×10^-15 at integration times between 100 s and 1000 s, determined from a beat-note measurement with a cavity-stabilized laser where a linear drift was removed from the data. A cavity-based frequency reference with a focus on improved long-term frequency stability is currently under development. A specific sixfold thermal shield design based on analytical methods and numerical calculations is presented.
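    Frequency stabilities like the 2×10^-14 at 1 s quoted above are conventionally reported as Allan deviations of drift-removed fractional-frequency data. A minimal sketch, using synthetic white frequency noise rather than real iodine-clock beat-note data:

```python
import numpy as np

def allan_deviation(y, m):
    """Overlapping Allan deviation of fractional-frequency samples y
    for averaging factor m (tau = m * sample interval)."""
    y = np.asarray(y, dtype=float)
    avgs = np.convolve(y, np.ones(m) / m, mode="valid")  # m-sample means
    d = avgs[m:] - avgs[:-m]                             # adjacent-mean differences
    return np.sqrt(0.5 * np.mean(d**2))

rng = np.random.default_rng(0)
y = 2e-14 * rng.standard_normal(10_000)          # synthetic white FM noise
t = np.arange(y.size)
y -= np.polyval(np.polyfit(t, y, 1), t)          # remove linear drift, as in the text
adev_1s = allan_deviation(y, m=1)                # ~2e-14 for this noise level
```

    For white frequency noise the Allan deviation averages down as tau^-0.5; flicker floors and drift make real clock data deviate from that slope at long integration times.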

  1. Integrated Vehicle Health Management (IVHM) for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Baroth, Edmund C.; Pallix, Joan

    2006-01-01

    To achieve NASA's ambitious Integrated Space Transportation Program objectives, aerospace systems will implement a variety of new concepts in health management. System-level integration of IVHM technologies for real-time control and system maintenance will have a significant impact on system safety and lifecycle costs. IVHM technologies will enhance the safety and success of complex missions despite component failures, degraded performance, operator errors, and environment uncertainty. IVHM also has the potential to reduce, or even eliminate, many of the costly inspections and operations activities required by current and future aerospace systems. This presentation will describe the array of NASA programs participating in the development of IVHM technologies for NASA missions. Future vehicle systems will use models of the system, its environment, and other intelligent agents with which they may interact. IVHM will be incorporated into future mission planners, reasoning engines, and adaptive control systems that can recommend or execute commands enabling the system to respond intelligently in real time. In the past, software errors and/or faulty sensors have been identified as significant contributors to mission failures. This presentation will also address the development and utilization of highly dependable software and sensor technologies, which are key components to ensure the reliability of IVHM systems.

  2. System model development for nuclear thermal propulsion

    NASA Technical Reports Server (NTRS)

    Walton, James T.; Hannan, Nelson A.; Perkins, Ken R.; Buksa, John H.; Worley, Brian A.; Dobranich, Dean

    1992-01-01

    A critical enabling technology in the evolutionary development of nuclear thermal propulsion (NTP) is the ability to predict the system performance under a variety of operating conditions. This is crucial for mission analysis and for control subsystem testing, as well as for the modeling of various failure modes. Performance must be accurately predicted during steady-state and transient operation, including startup, shutdown, and post-operation cooling. The development and application of verified and validated system models has the potential to reduce the design and testing effort, and the cost and time required, for the technology to reach flight-ready status. Since October 1991, the U.S. Department of Energy (DOE), Department of Defense (DOD), and NASA have initiated critical technology development efforts for NTP systems to be used on Space Exploration Initiative (SEI) missions to the Moon and Mars. This paper presents the strategy and progress of an interagency NASA/DOE/DOD team for NTP system modeling. It is the intent of the interagency team to develop several levels of computer programs to simulate various NTP systems. The first level will provide rapid, parameterized calculations of overall system performance. Succeeding computer programs will provide analysis of each component in sufficient detail to guide the design teams and experimental efforts. The computer programs will allow simulation of the entire system to allow prediction of the integrated performance. An interagency team was formed for this task to use the best capabilities available and to assure appropriate peer review.

  3. Analysis and Optimization of the Recovered ESA Huygens Mission

    NASA Astrophysics Data System (ADS)

    Kazeminejad, Bobby

    2002-06-01

    The Huygens Probe is the ESA-provided element of the joint NASA/ESA Cassini-Huygens mission to Saturn and Titan. A recently discovered design flaw in the Huygens radio receiver onboard Cassini led to a significantly different mission geometry, redesigned and implemented by both the ESA Huygens and NASA Cassini project teams. A numerical integration of the Orbiter trajectory and the Huygens descent profile, with simplified assumptions for Probe attitude and correlated aerodynamic aspects, offered the opportunity to re-calculate key mission parameters that depend on the relative geometry and motion of the bodies. This was a crucial step in verifying that science-imposed constraints were not violated. A review of existing Titan wind and atmosphere models and their physical background led to a subsequent parametric study of their impact on the supersonic entry phase, the parachute descent, and finally the body-fixed landing coordinates of the Probe. In addition to the deterministic (nominal) Probe trajectory, it is important to quantify the influence on the results of the various uncertainties that enter into the equations of motion (e.g., state vectors, physical parameters of the environment and of the Probe itself). This was done by propagating the system covariance matrix together with the nominal state vectors. A sophisticated Monte Carlo technique, developed to save computation time, was then used to determine statistical percentiles of the key parameters. The Probe-Orbiter link geometry was characterized by evaluating the link budget and the received frequency at receiver level, taking into account the spin of the Probe and the asymmetric gain pattern of the transmitting antennas. The results were then used in a mathematical model that describes the tracking capability of the receiver symbol synchronizer, allowing the loss of data during the mission to be quantified. A subsequent parametric study of different sets of mission parameters, with the goal of minimizing data losses and maximizing overall mission robustness, resulted in the recommendation to change the flyby altitude of the Orbiter from 65,000 km to 60,000 km.
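    The covariance-plus-Monte-Carlo step described above can be sketched generically: draw entry states from a nominal mean and covariance, push each sample through the descent mapping, and read off percentiles of the quantity of interest. The numbers and the `landing_longitude` mapping below are illustrative placeholders, not Huygens values:

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical nominal entry state [velocity km/s, flight-path angle deg]
# and covariance matrix.
mean = np.array([6.1, -65.0])
cov = np.array([[0.01, 0.0],
                [0.0,  0.25]])

def landing_longitude(state):
    """Toy linearized stand-in for the full descent propagation:
    maps an entry state to a body-fixed landing longitude (deg)."""
    v, gamma = state
    return 167.0 + 2.0 * (v - 6.1) + 0.1 * (gamma + 65.0)

samples = rng.multivariate_normal(mean, cov, size=20_000)
lons = np.apply_along_axis(landing_longitude, 1, samples)
p5, p50, p95 = np.percentile(lons, [5, 50, 95])  # statistical percentiles
```

    In the actual analysis each sample would be numerically integrated through entry and parachute descent, which is exactly why a computation-saving Monte Carlo scheme was worth developing.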

  4. Using the Full Cycle of GOCE Data in the Quasi-Geoid Modelling of Finland

    NASA Astrophysics Data System (ADS)

    Saari, Timo; Bilker-Koivula, Mirjam; Poutanen, Markku

    2016-08-01

    In the Dragon 3 project 10519, "Case study on heterogeneous geoid/quasigeoid based on space borne and terrestrial data combination with special consideration of GOCE mission data impact", we combined the latest GOCE models with the terrestrial gravity data of Finland and surrounding areas to calculate a quasi-geoid model for Finland. Altogether 249 geoid models with different modifications were calculated using the GOCE DIR5 models up to spherical harmonic degree and order 240 and 300, and EIGEN-6C4 up to degree and order 1000 and 2190. The calculated quasi-geoid models were compared against the ground truth in Finland with two independent GPS-levelling datasets. The best GOCE-only models gave standard deviations of 2.8 cm and 2.6 cm (DIR5 d/o 240) and 2.7 cm and 2.3 cm (DIR5 d/o 300) in Finnish territory for the NLS-FIN and EUVN-DA datasets, respectively. For the high-resolution model EIGEN-6C4 (which includes the full cycle of the GOCE data), the results were 2.4 cm and 1.8 cm (d/o 1000) and 2.5 cm and 1.7 cm (d/o 2190). The sub-2-centimetre accuracy (and near 2 cm with GOCE-only models) is an improvement over previous and current Finnish geoid models, leading to the conclusion that the GOCE mission has a great impact on regional geoid modelling.

  5. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increase geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between designing the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes allow, by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  6. Large Terrain Modeling and Visualization for Planets

    NASA Technical Reports Server (NTRS)

    Myint, Steven; Jain, Abhinandan; Cameron, Jonathan; Lim, Christopher

    2011-01-01

    Physics-based simulations are actively used in the design, testing, and operations phases of surface and near-surface planetary space missions. One of the challenges in real-time simulation is handling large multi-resolution terrain data sets, both within models and for visualization. In this paper, we describe special techniques that we have developed for visualization, paging, and data storage to deal with these large data sets. The visualization technique uses a real-time GPU-based continuous level-of-detail approach that delivers performance of multiple frames per second even for planetary-scale terrain models.
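    Continuous level-of-detail schemes of the kind described typically choose each terrain tile's resolution by projecting its geometric error into screen space. A minimal sketch under pinhole-camera assumptions; the thresholds, error pyramid, and function names are hypothetical, not the paper's GPU implementation:

```python
import math

def projected_error_px(geometric_error_m, dist_m,
                       screen_height_px=1080, fov_rad=1.047):
    """Screen-space size (pixels) of a tile's geometric error for a
    pinhole camera at distance dist_m."""
    px_per_m = screen_height_px / (2.0 * dist_m * math.tan(fov_rad / 2.0))
    return geometric_error_m * px_per_m

def select_lod(dist_m, errors_by_level, max_err_px=2.0):
    """Given per-level geometric errors (m), coarsest to finest, return
    the coarsest level whose projected error is below the threshold."""
    for level, err in enumerate(errors_by_level):
        if projected_error_px(err, dist_m) <= max_err_px:
            return level
    return len(errors_by_level) - 1  # fall back to the finest level

# Hypothetical 8-level tile pyramid with error halving per level
errors = [4096.0 / 2**k for k in range(8)]
near = select_lod(1_000.0, errors)      # close to the surface -> fine level
far = select_lod(2_000_000.0, errors)   # viewed from orbit -> coarse level
```

    The same per-tile test drives paging: tiles whose projected error is acceptable at the current viewpoint need not have their finer children resident in memory.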

  7. Mars Global Reference Atmospheric Model 2010 Version: Users Guide

    NASA Technical Reports Server (NTRS)

    Justh, H. L.

    2014-01-01

    This Technical Memorandum (TM) presents the Mars Global Reference Atmospheric Model 2010 (Mars-GRAM 2010) and its new features. Mars-GRAM is an engineering-level atmospheric model widely used for diverse mission applications. Applications include systems design, performance analysis, and operations planning for aerobraking, entry, descent and landing, and aerocapture. Additionally, this TM includes instructions on obtaining the Mars-GRAM source code and data files as well as running Mars-GRAM. It also contains sample Mars-GRAM input and output files and an example of how to incorporate Mars-GRAM as an atmospheric subroutine in a trajectory code.

  8. Open high-level data formats and software for gamma-ray astronomy

    NASA Astrophysics Data System (ADS)

    Deil, Christoph; Boisson, Catherine; Kosack, Karl; Perkins, Jeremy; King, Johannes; Eger, Peter; Mayer, Michael; Wood, Matthew; Zabalza, Victor; Knödlseder, Jürgen; Hassan, Tarek; Mohrmann, Lars; Ziegler, Alexander; Khelifi, Bruno; Dorner, Daniela; Maier, Gernot; Pedaletti, Giovanna; Rosado, Jaime; Contreras, José Luis; Lefaucheur, Julien; Brügge, Kai; Servillat, Mathieu; Terrier, Régis; Walter, Roland; Lombardi, Saverio

    2017-01-01

    In gamma-ray astronomy, a variety of data formats and proprietary software have traditionally been used, often developed for one specific mission or experiment. Especially for ground-based imaging atmospheric Cherenkov telescopes (IACTs), data and software are mostly private to the collaborations operating the telescopes. However, there is a general movement in science towards the use of open data and software. In addition, the next-generation IACT instrument, the Cherenkov Telescope Array (CTA), will be operated as an open observatory. We have created a GitHub organisation at https://github.com/open-gamma-ray-astro where we are developing high-level data format specifications. A public mailing list was set up at https://lists.nasa.gov/mailman/listinfo/open-gamma-ray-astro and a first face-to-face meeting on the IACT high-level data model and formats took place in April 2016 in Meudon (France). This open multi-mission effort will help to accelerate the development of open data formats and open-source software for gamma-ray astronomy, leading to synergies in the development of analysis codes and eventually better scientific results (reproducible, multi-mission). This write-up presents this effort for the first time, explaining the motivation and context, the available resources and process we use, as well as the status and planned next steps for the data format specifications. We hope that it will stimulate feedback and future contributions from the gamma-ray astronomy community.

  9. Digital Gunnery: How Combat Vehicle Gunnery Training Creates a Model for Training the Mission Command System.

    DTIC Science & Technology

    2017-06-09

    A thesis presented... digital systems that give commanders an unprecedented ability to understand and lead in the battlefields where they operate. Unfortunately, units

  10. Cost Analysis In A Multi-Mission Operations Environment

    NASA Technical Reports Server (NTRS)

    Newhouse, M.; Felton, L.; Bornas, N.; Botts, D.; Roth, K.; Ijames, G.; Montgomery, P.

    2014-01-01

    Spacecraft control centers have evolved from dedicated, single-mission or single-mission-type support to multi-mission, service-oriented support for operating a variety of mission types. At the same time, available money for projects is shrinking and competition for new missions is increasing. These factors drive the need for an accurate and flexible model to support estimating service costs for new or extended missions; the cost model in turn drives the need for an accurate and efficient approach to service cost analysis. The National Aeronautics and Space Administration (NASA) Huntsville Operations Support Center (HOSC) at Marshall Space Flight Center (MSFC) provides operations services to a variety of customers around the world. HOSC customers range from launch vehicle test flights to International Space Station (ISS) payloads to small, short-duration missions, and have included long-duration flagship missions. The HOSC recently completed a detailed analysis of service costs as part of the development of a complete service cost model. The cost analysis process required the team to address a number of issues. One of the primary issues involves the difficulty of reverse engineering individual mission costs in a highly efficient multi-mission environment, along with the related issue of the value of detailed metrics or data to the cost model versus the cost of obtaining accurate data. Another concern is the difficulty of balancing costs between missions of different types and sizes and extrapolating costs to different mission types. The cost analysis also had to address issues relating to providing shared, cloud-like services in a government environment, and then assigning an uncertainty or risk factor to cost estimates that are based on current technology but will be executed using future technology. Finally, the cost analysis needed to consider how to validate the resulting cost models, taking into account the non-homogeneous nature of the available cost data and the decreasing flight rate. This paper presents the issues encountered during the HOSC cost analysis process and the associated lessons learned. These lessons can be used when planning a new multi-mission operations center, or in the transformation from a dedicated control center to multi-center operations, as an aid in defining processes that support future cost analysis and estimation. The lessons can also be used by mature service-oriented, multi-mission control centers to streamline or refine their cost analysis process.

  11. Hubble Space Telescope servicing mission scientific instrument protective enclosure design requirements and contamination controls

    NASA Technical Reports Server (NTRS)

    Hansen, Patricia A.; Hughes, David W.; Hedgeland, Randy J.; Chivatero, Craig J.; Studer, Robert J.; Kostos, Peter J.

    1994-01-01

    The Scientific Instrument Protective Enclosures were designed for the Hubble Space Telescope Servicing Missions to provide a benign environment for a Scientific Instrument during ground and on-orbit activities. The Scientific Instruments required very stringent surface cleanliness and molecular outgassing levels to maintain ultraviolet performance. Data from the First Servicing Mission verified that both the Scientific Instruments and the Scientific Instrument Protective Enclosures met surface cleanliness level requirements during ground and on-orbit activities.

  12. Exploration of Force Transition in Stability Operations Using Multi-Agent Simulation

    DTIC Science & Technology

    2006-09-01

    risk, mission failure risk, and time in the context of the operational threat environment. The Pythagoras Multi-Agent Simulation and Data Farming techniques are used to investigate force-level
    Subject terms: Stability Operations, Peace Operations, Data Farming, Pythagoras, Agent-Based Model, Multi-Agent Simulation

  13. Heuristic Modeling for TRMM Lifetime Predictions

    NASA Technical Reports Server (NTRS)

    Jordan, P. S.; Sharer, P. J.; DeFazio, R. L.

    1996-01-01

    Analysis time for computing the expected mission lifetimes of proposed frequently maneuvering, tightly altitude-constrained, Earth-orbiting spacecraft has been significantly reduced by means of a heuristic modeling method implemented in a commercial off-the-shelf spreadsheet product (QuattroPro) running on a personal computer (PC). The method uses a look-up table to estimate the maneuver frequency per month as a function of the spacecraft ballistic coefficient and the solar flux index, then computes the associated fuel use with a simple engine model. Maneuver frequency data points are produced by means of a single 1-month run of traditional mission analysis software for each of the 12 to 25 data points required for the table. As the data point computations are required only at mission design start-up and on the occasion of significant mission redesigns, the dependence on time-consuming traditional modeling methods is dramatically reduced. Results to date have agreed with traditional methods to within 1 to 1.5 percent. The spreadsheet approach is applicable to a wide variety of Earth-orbiting spacecraft with tight altitude constraints. It will be particularly useful for missions such as the Tropical Rainfall Measurement Mission, scheduled for launch in 1997, whose mission lifetime calculations are heavily dependent on frequently revised solar flux predictions.
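    The look-up-table-plus-engine-model recipe described here is easy to reproduce outside a spreadsheet. The sketch below uses invented table values, spacecraft mass, delta-v, and propellant budget purely to show the mechanics: bilinear interpolation in the maneuver-frequency table feeding a rocket-equation fuel estimate.

```python
import math

# Hypothetical look-up table: maneuvers per month indexed by
# ballistic coefficient (kg/m^2) and solar flux index F10.7.
bc_axis = [50.0, 100.0]
flux_axis = [70.0, 200.0]
freq_table = [[8.0, 20.0],   # BC = 50
              [4.0, 10.0]]   # BC = 100

def bilinear(x, y, xs, ys, table):
    """Bilinear interpolation in a 2x2 table cell."""
    tx = (x - xs[0]) / (xs[1] - xs[0])
    ty = (y - ys[0]) / (ys[1] - ys[0])
    return ((1 - tx) * (1 - ty) * table[0][0] + (1 - tx) * ty * table[0][1]
            + tx * (1 - ty) * table[1][0] + tx * ty * table[1][1])

def fuel_per_maneuver(mass_kg, dv_ms, isp_s=220.0):
    """Simple engine model: propellant for one maneuver (rocket equation)."""
    return mass_kg * (1.0 - math.exp(-dv_ms / (isp_s * 9.80665)))

freq = bilinear(75.0, 135.0, bc_axis, flux_axis, freq_table)  # maneuvers/month
fuel_month = freq * fuel_per_maneuver(mass_kg=3500.0, dv_ms=0.5)
lifetime_months = 890.0 / fuel_month   # hypothetical 890 kg propellant budget
```

    The table cells would come from the single 1-month runs of the traditional mission analysis software; everything downstream of the table is just arithmetic, which is why a spreadsheet was sufficient.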

  14. Shuttle program. MCC level C formulation requirements: Shuttle TAEM guidance and flight control

    NASA Technical Reports Server (NTRS)

    Carman, G. L.

    1980-01-01

    The Level C requirements for the shuttle orbiter terminal area energy management (TAEM) guidance and flight control functions to be incorporated into the Mission Control Center entry profile planning processor are defined. This processor will be used for preentry evaluation of the entry through landing maneuvers, and will include a simplified three-degree-of-freedom model of the body rotational dynamics, which is necessary to account for the effects of attitude response on the trajectory dynamics. The simulation terminates at the TAEM-autoland interface.

  15. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, Mohammed Omair

    2012-01-01

    Our simulation was able to mimic the results of 30 tests on the actual hardware. This shows that simulations have the potential to enable early design validation, well before actual hardware exists. Although the simulations focused on data-processing procedures at the subsystem and device level, they can also be applied to system-level analysis to simulate mission scenarios and consumable tracking (e.g. power, propellant, etc.). Simulation engine plug-in developments are continually improving the product, but the handling of time-sensitive operations (like those of the remote engineering unit and bus controller) can be cumbersome.

  16. Nuclear electric propulsion mission performance for fast piloted Mars missions

    NASA Technical Reports Server (NTRS)

    Hack, K. J.; George, J. A.; Dudzinski, L. A.

    1991-01-01

    A mission study aimed at minimizing the time humans would spend in the space environment is presented. The use of nuclear electric propulsion (NEP), when combined with a suitable mission profile, can reduce the trip time to durations competitive with other propulsion systems. Specifically, a split mission profile utilizing an earth crew capture vehicle accounts for a significant portion of the trip time reduction compared to previous studies. NEP is shown to be capable of performing fast piloted missions to Mars at low power levels using near-term technology and is considered to be a viable candidate for these missions.

  17. A study of Bangladesh's sub-surface water storages using satellite products and data assimilation scheme.

    PubMed

    Khaki, M; Forootan, E; Kuhn, M; Awange, J; Papa, F; Shum, C K

    2018-06-01

    Climate change can significantly influence terrestrial water changes around the world, particularly in places that have proven to be more vulnerable, such as Bangladesh. In the past few decades, climate impacts, together with those of excessive human water use, have changed the country's water availability structure. In this study, we use multi-mission remotely sensed measurements along with a hydrological model to separately analyze groundwater and soil moisture variations for the period 2003-2013, and their interactions with rainfall in Bangladesh. To improve the model's estimates of water storages, terrestrial water storage (TWS) data obtained from the Gravity Recovery And Climate Experiment (GRACE) satellite mission are assimilated into the World-Wide Water Resources Assessment (W3RA) model using the ensemble-based sequential technique of the Square Root Analysis (SQRA) filter. We investigate the capability of the data assimilation approach to use a non-regional hydrological model for a regional case study. Based on these estimates, we investigate relationships between the model-derived sub-surface water storage changes and remotely sensed precipitation, as well as altimetry-derived river level variations in Bangladesh, by applying the empirical mode decomposition (EMD) method. A stronger correlation is found between river level heights and rainfall (78% on average) than between groundwater storage variations and rainfall (57% on average). The results indicate a significant decline in groundwater storage (∼32% reduction) for Bangladesh between 2003 and 2013, which is equivalent to an average rate of 8.73 ± 2.45 mm/year.

  18. Integrated Medical Model (IMM) 4.0 Enhanced Functionalities

    NASA Technical Reports Server (NTRS)

    Young, M.; Keenan, A. B.; Saile, L.; Boley, L. A.; Walton, M. E.; Shah, R. V.; Kerstman, E. L.; Myers, J. G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic simulation model that uses input data on 100 medical conditions to simulate expected medical events, the resources required to treat them, and the resulting impact to the mission for specific crew and mission characteristics. The newest development version, IMM v4.0, adds capabilities that remove some of the conservative assumptions that underlie the current operational version, IMM v3. While IMM v3 provides the framework to simulate whether a medical event occurred, IMM v4.0 also simulates when the event occurred during a mission timeline. This allows for more accurate estimation of mission time lost and resource utilization. In addition to the mission timeline, IMM v4.0 features two enhancements that address IMM v3 assumptions regarding medical event treatment. Medical events in IMM v3 are assigned the untreated outcome if any resource required to treat the event was unavailable. IMM v4.0 allows for partially treated outcomes that are proportional to the amount of required resources available, thus removing the dichotomous treatment assumption. An additional IMM v4.0 capability is the use of an alternative medical resource when the primary resource assigned to the condition is depleted, more accurately reflecting the real-world system. The additional capabilities defining IMM v4.0 (the mission timeline, partial treatment, and alternate drug) result in more realistic predicted mission outcomes. The primary model outcomes of IMM v4.0 for the ISS6 mission, including mission time lost, probability of evacuation, and probability of loss of crew life, are compared to those produced by the current operational version of IMM to showcase the enhanced prediction capabilities.
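A toy Monte Carlo sketch of the v4.0 ideas just described: events are placed on a mission timeline, treatment draws down a primary resource with fall-back to an alternate, and outcomes may be only partially treated when supplies run short. The condition name, occurrence rate, and supply counts are invented for illustration and bear no relation to the actual IMM data or code.

```python
import random

def simulate_mission(duration_days, conditions, supplies, rng):
    """Place medical events on a mission timeline and treat them from supplies."""
    events = []
    for name, (rate_per_day, primary, alternate, dose) in conditions.items():
        t = rng.expovariate(rate_per_day)          # first occurrence time [days]
        while t < duration_days:
            used = min(dose, supplies.get(primary, 0))
            supplies[primary] = supplies.get(primary, 0) - used
            if used < dose and alternate:          # fall back to alternate resource
                extra = min(dose - used, supplies.get(alternate, 0))
                supplies[alternate] = supplies.get(alternate, 0) - extra
                used += extra
            events.append((t, name, used / dose))  # fraction of full treatment
            t += rng.expovariate(rate_per_day)     # next occurrence
    return sorted(events)

rng = random.Random(42)
conditions = {"headache": (0.05, "analgesic", "alt_analgesic", 2)}  # invented rates
supplies = {"analgesic": 4, "alt_analgesic": 4}
events = simulate_mission(180, conditions, supplies, rng)
```

Running many such mission replicates and tallying untreated fractions and time lost is the probabilistic-simulation pattern the abstract describes; the real model additionally maps outcomes to evacuation and loss-of-crew probabilities.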

  19. Earth Observatory Satellite system definition study. Report no. 3: Design/cost tradeoff studies. Appendix A: EOS program WBS dictionary. Appendix B: EOS mission functional analysis

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The work breakdown structure (WBS) dictionary for the Earth Observatory Satellite (EOS) is defined. The various elements of the EOS program are examined, including the aggregate of hardware, computer software, services, and data required to develop, produce, test, support, and operate the space vehicle and the companion ground data management system. A functional analysis of the EOS mission is developed. The operations for three typical EOS missions, Delta, Titan, and Shuttle launched, are considered. The functions were determined for the top program elements, and mission operations (function 2.0) was expanded into level one functions. Selection of ten level one functions for further analysis into level two and level three functions was based on concern for the EOS operations and associated interfaces.

  20. Optimizing technology investments: a broad mission model approach

    NASA Technical Reports Server (NTRS)

    Shishko, R.

    2003-01-01

    A long-standing problem in NASA is how to allocate scarce technology development resources across advanced technologies in order to best support a large set of future potential missions. Within NASA, two orthogonal paradigms have received attention in recent years: the real-options approach and the broad mission model approach. This paper focuses on the latter.

  1. Orbiter/payload contamination control assessment support

    NASA Technical Reports Server (NTRS)

    Rantanen, R. O.; Strange, D. A.; Hetrick, M. A.

    1978-01-01

    The development and integration of 16 payload bay liner filters into the existing shuttle/payload contamination evaluation (SPACE) computer program is discussed as well as an initial mission profile model. As part of the mission profile model, a thermal conversion program, a temperature cycling routine, a flexible plot routine and a mission simulation of orbital flight test 3 are presented.

  2. Air Force Research Laboratory space technology strategic investment model: analysis and outcomes for warfighter capabilities

    NASA Astrophysics Data System (ADS)

    Preiss, Bruce; Greene, Lloyd; Kriebel, Jamie; Wasson, Robert

    2006-05-01

    The Air Force Research Laboratory utilizes a value model as a primary input for space technology planning and budgeting. The Space Sector at AFRL headquarters manages space technology investment across all the geographically disparate technical directorates and ensures that integrated planning is achieved across the space community. The space investment portfolio must ultimately balance near, mid, and far-term investments across all the critical space mission areas. Investment levels and growth areas can always be identified by a typical capability analysis or gap analysis, but the value model approach goes one step deeper and helps identify the potential payoff of technology investments by linking the technology directly to an existing or potential concept. The value of the technology is then viewed from the enabling performance perspective of the concept that ultimately fulfills the Air Force mission. The process of linking space technologies to future concepts and technology roadmaps will be reviewed in this paper, along with representative results from this planning cycle. The initial assumptions in this process will be identified along with the strengths and weaknesses of this planning methodology.

  3. Topex/Poseidon: A United States/France mission. Oceanography from space: The oceans and climate

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The TOPEX/POSEIDON space mission, sponsored by NASA and France's space agency, the Centre National d'Etudes Spatiales (CNES), will give new observations of the Earth from space to gain a quantitative understanding of the role of ocean currents in climate change. Rising atmospheric concentrations of carbon dioxide and other 'greenhouse gases' produced as a result of human activities could generate a global warming, followed by an associated rise in sea level. The satellite will use radar altimetry to measure sea-surface height and will be tracked by three independent systems to yield accurate topographic maps over the dimensions of entire ocean basins. The satellite data, together with the Tropical Ocean and Global Atmosphere (TOGA) program and the World Ocean Circulation Experiment (WOCE) measurements, will be analyzed by an international scientific team. By merging the satellite observations with TOGA and WOCE findings, the scientists will establish the extensive data base needed for the quantitative description and computer modeling of ocean circulation. The ocean models will eventually be coupled with atmospheric models to lay the foundation for predictions of global climate change.

  4. Space Radiation and Manned Mission: Interface Between Physics and Biology

    NASA Astrophysics Data System (ADS)

    Hei, Tom

    2012-07-01

    The natural radiation environment in space consists of a mixed field of high energy protons, heavy ions, electrons and alpha particles. Interplanetary travel, missions to the International Space Station, and any planned establishment of satellite colonies on other solar system bodies imply radiation exposure to the crew and are a major concern to space agencies. With shielding, the radiation exposure level in manned space missions is likely to be chronic, low-dose irradiation. Traditionally, our knowledge of the biological effects of cosmic radiation in deep space is almost exclusively derived from ground-based accelerator experiments with heavy ions in animal or in vitro models. Radiobiological effects of low doses of ionizing radiation are subject to modulation by various parameters including bystander effects, adaptive response, genomic instability and the genetic susceptibility of the exposed individuals. Radiation dosimetry and modeling will provide confirmatory input in areas where data are difficult to acquire experimentally. However, modeling is only as good as the quality of its input data. This lecture will discuss the interdependent nature of physics and biology in assessing the radiobiological response to space radiation.

  5. Assessment of active methods for removal of LEO debris

    NASA Astrophysics Data System (ADS)

    Hakima, Houman; Emami, M. Reza

    2018-03-01

    This paper investigates the applicability of five active methods for removal of large low Earth orbit debris. The removal methods, namely net, laser, electrodynamic tether, ion beam shepherd, and robotic arm, are selected based on a set of high-level space mission constraints. Mission-level criteria are then utilized to assess the performance of each redirection method in light of the results obtained from a Monte Carlo simulation. The simulation provides insight into the removal time, performance robustness, and propellant mass criteria for the targeted debris range. The remaining attributes are quantified based on the models provided in the literature, which take into account several important parameters pertaining to each removal method. The means of assigning attributes to each assessment criterion is discussed in detail. A systematic comparison is performed using two different assessment schemes: the Analytical Hierarchy Process and a utility-based approach. A third assessment technique, namely potential-loss analysis, is utilized to highlight the effect of risks in each removal method.
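The Analytical Hierarchy Process step mentioned above can be sketched with the common normalized-column-average approximation to the principal eigenvector of the pairwise comparison matrix. The judgement matrix over three criteria (removal time, robustness, propellant mass) is invented for illustration, not taken from the paper.

```python
def ahp_weights(matrix):
    """Priority weights from a pairwise comparison matrix (column-average approximation)."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    # Normalize each column so it sums to 1, then average across each row.
    normalized = [[matrix[i][j] / col_sums[j] for j in range(n)] for i in range(n)]
    return [sum(row) / n for row in normalized]

# Invented pairwise judgements: removal time vs. robustness vs. propellant mass.
# matrix[i][j] > 1 means criterion i is preferred over criterion j.
M = [
    [1.0,       3.0, 5.0],
    [1.0 / 3.0, 1.0, 2.0],
    [1.0 / 5.0, 0.5, 1.0],
]
weights = ahp_weights(M)  # sums to 1; "removal time" dominates in this example
```

Scoring each removal method against each criterion and taking the weighted sum gives the kind of ranking the paper compares against its utility-based and potential-loss schemes.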

  6. Autocommander: A Supervisory Controller for Integrated Guidance and Control for the 2nd Generation Reusable Launch Vehicle

    NASA Technical Reports Server (NTRS)

    Fisher, J. E.; Lawrence, D. A.; Zhu, J. J.; Jackson, Scott (Technical Monitor)

    2002-01-01

    This paper presents a hierarchical architecture for integrated guidance and control that achieves risk and cost reduction for NASA's 2nd generation reusable launch vehicle (RLV). Guidance, attitude control, and control allocation subsystems that heretofore operated independently will now work cooperatively under the coordination of a top-level autocommander. In addition to delivering improved performance from a flight mechanics perspective, the autocommander is intended to provide an autonomous supervisory control capability for traditional mission management under nominal conditions, G&C reconfiguration in response to effector saturation, and abort mode decision-making upon vehicle malfunction. This high-level functionality is to be implemented through the development of a relational database that is populated with the broad range of vehicle and mission specific data and translated into a discrete event system model for analysis, simulation, and onboard implementation. A Stateflow Autocoder software tool that translates the database into the Stateflow component of a Matlab/Simulink simulation is also presented.

  7. The NASA-JPL advanced propulsion program

    NASA Technical Reports Server (NTRS)

    Frisbee, Robert H.

    1994-01-01

    The NASA Advanced Propulsion Concepts (APC) program at the Jet Propulsion Laboratory (JPL) consists of two main areas: the first involves cooperative modeling and research activities between JPL and various universities and industry; the second involves research at universities and industry that is directly supported by JPL. The cooperative research program consists of mission studies, research and development of ion engine technology using C-60 (Buckminsterfullerene) propellant, and research and development of lithium-propellant Lorentz-force accelerator (LFA) engine technology. The university/industry-supported research includes research (modeling and proof-of-concept experiments) in advanced, long-life electric propulsion, and in fusion propulsion. These propulsion concepts were selected primarily to cover a range of applications from near-term to far-term missions. For example, the long-lived pulsed-xenon thruster research that JPL is supporting at Princeton University addresses the near-term need for efficient, long-life attitude control and station-keeping propulsion for Earth-orbiting spacecraft. The C-60-propellant ion engine has the potential for good efficiency in a relatively low specific impulse (Isp) range (10,000 - 30,000 m/s) that is optimum for relatively fast (less than 100 day) cis-lunar (LEO/GEO/Lunar) missions employing near-term, high-specific mass electric propulsion vehicles. Research and modeling on the C-60-ion engine are currently being performed by JPL (engine demonstration), Caltech (C-60 properties), MIT (plume modeling), and USC (diagnostics). The Li-propellant LFA engine also has good efficiency in the modest Isp range (40,000 - 50,000 m/s) that is optimum for near-to-mid-term megawatt-class solar- and nuclear-electric propulsion vehicles used for Mars missions transporting cargo (in support of a piloted mission).
Research and modeling on the Li-LFA engine are currently being performed by JPL (cathode development), Moscow Aviation Institute (engine testing), Thermacore (electrode development), as well as at MIT (plume modeling), and USC (diagnostics). Also, the mission performance of a nuclear-electric propulsion (NEP) Li-LFA Mars cargo vehicle is being modeled by JPL (mission analysis; thruster and power processor modeling) and the Rocketdyne Energy Technology and Engineering Center (ETEC) (power system modeling). Finally, the fusion propulsion research activities that JPL is supporting at Pennsylvania State University (PSU) and at Lawrenceville Plasma Physics (LPP) are aimed at far-term fast (less than 100 day round trip) piloted Mars missions and, in the very far term, interstellar missions.

  8. KSC00pp0074

    NASA Image and Video Library

    2000-01-14

    KENNEDY SPACE CENTER, Fla. -- At Launch Pad 39A, STS-99 Mission Specialists Gerhard Thiele (Ph.D.), of the European Space Agency (in front), and Janet Kavandi (Ph.D.) prepare to practice emergency egress procedures with a slidewire basket. Seven slidewires, with flatbottom baskets suspended from each wire, extend from the Fixed Service Structure at the orbiter access arm level. These baskets could provide an escape route for the astronauts until the final 30 seconds of the countdown in case of an emergency. The crew is taking part in Terminal Countdown Demonstration Test (TCDT) activities that provide the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST.

  10. STS-99 crew check out emergency egress equipment at launch pad during TCDT

    NASA Technical Reports Server (NTRS)

    2000-01-01

    At Launch Pad 39A, STS-99 Mission Specialists Gerhard Thiele (Ph.D.), of the European Space Agency (in front), and Janet Kavandi (Ph.D.) prepare to practice emergency egress procedures with a slidewire basket. Seven slidewires, with flatbottom baskets suspended from each wire, extend from the Fixed Service Structure at the orbiter access arm level. These baskets could provide an escape route for the astronauts until the final 30 seconds of the countdown in case of an emergency. The crew is taking part in Terminal Countdown Demonstration Test (TCDT) activities that provide the crew with simulated countdown exercises, emergency egress training, and opportunities to inspect the mission payloads in the orbiter's payload bay. STS-99 is the Shuttle Radar Topography Mission, which will chart a new course, using two antennae and a 200-foot-long section of space station-derived mast protruding from the payload bay to produce unrivaled 3-D images of the Earth's surface. The result of the Shuttle Radar Topography Mission could be close to 1 trillion measurements of the Earth's topography. Besides contributing to the production of better maps, these measurements could lead to improved water drainage modeling, more realistic flight simulators, better locations for cell phone towers, and enhanced navigation safety. Launch of Endeavour on the 11-day mission is scheduled for Jan. 31 at 12:47 p.m. EST.

  11. Formulation of consumables management models. Development approach for the mission planning processor working model

    NASA Technical Reports Server (NTRS)

    Connelly, L. C.

    1977-01-01

    The mission planning processor is a user oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA approved software development standards. This development approach: (1) promotes cost effective software development, (2) enhances the quality and reliability of the working model, (3) encourages the sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.

  12. Apollo 7 - Press Kit

    NASA Technical Reports Server (NTRS)

    1968-01-01

    Contents include the following: General release. Mission objectives. Mission description. Flight plan. Alternate missions. Experiments. Abort model. Spacecraft structure system. The Saturn 1B launch vehicle. Flight sequence. Launch preparations. Mission control center-Houston. Manned space flight network. Photographic equipment. Apollo 7 crew. Apollo 7 test program.

  13. Comparing simulations and test data of a radiation damaged charge-coupled device for the Euclid mission

    NASA Astrophysics Data System (ADS)

    Skottfelt, Jesper; Hall, David J.; Gow, Jason P. D.; Murray, Neil J.; Holland, Andrew D.; Prod'homme, Thibaut

    2017-04-01

    The visible imager instrument on board the Euclid mission is a weak-lensing experiment that depends on very precise shape measurements of distant galaxies obtained by a large charge-coupled device (CCD) array. Due to the harsh radiative environment outside the Earth's atmosphere, it is anticipated that the CCDs over the mission lifetime will be degraded to an extent that these measurements will be possible only through the correction of radiation damage effects. We have therefore created a Monte Carlo model that simulates the physical processes taking place when transferring signals through a radiation-damaged CCD. The software is based on Shockley-Read-Hall theory and is made to mimic the physical properties in the CCD as closely as possible. The code runs on a single electrode level and takes the three-dimensional trap position, potential structure of the pixel, and multilevel clocking into account. A key element of the model is that it also takes device specific simulations of electron density as a direct input, thereby avoiding making any analytical assumptions about the size and density of the charge cloud. This paper illustrates how test data and simulated data can be compared in order to further our understanding of the positions and properties of the individual radiation-induced traps.
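The emission side of the Shockley-Read-Hall process the simulator is built on can be sketched with the standard SRH time constant: a trapped electron is released within time t with probability 1 - exp(-t/τₑ), where τₑ grows exponentially with trap energy depth and falls with temperature. The trap energy, cross-section, and the prefactor constants below are illustrative textbook-scale values, not Euclid CCD calibration data.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant [eV/K]

def emission_time_constant(trap_energy_ev, cross_section_cm2, t_kelvin):
    """SRH emission time constant for an electron trap (illustrative silicon constants)."""
    v_th = 1.0e7 * math.sqrt(t_kelvin / 300.0)    # thermal velocity [cm/s]
    n_c = 2.5e19 * (t_kelvin / 300.0) ** 1.5      # effective density of states [cm^-3]
    return math.exp(trap_energy_ev / (K_B * t_kelvin)) / (cross_section_cm2 * v_th * n_c)

def release_probability(dwell_s, tau_e):
    """Probability that a trapped electron is emitted within dwell_s seconds."""
    return 1.0 - math.exp(-dwell_s / tau_e)

tau = emission_time_constant(0.44, 1.0e-15, 153.0)  # a deep trap at a Euclid-like 153 K
p = release_probability(1.0e-3, tau)                # release chance within a 1 ms transfer
```

A Monte Carlo CCD model of the kind described draws from such capture and emission probabilities per electrode and per clock phase; the paper's key refinement is replacing analytical charge-cloud assumptions with device-specific electron density simulations.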

  14. Satellite services system analysis study. Volume 1, part 2: Executive summary

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The early mission model was developed through a survey of the potential user market. Service functions were defined and a group of design reference missions were selected which represented needs for each of the service functions. Servicing concepts were developed through mission analysis and STS timeline constraint analysis. The hardware needs for accomplishing the service functions were identified, with emphasis placed on applying equipment in the current NASA inventory and that in advanced stages of planning. A more comprehensive service model was developed based on the NASA and DoD mission models segregated by mission class. The number of service events of each class was estimated based on average revisit and service assumptions. Service kits were defined as collections of equipment applicable to performing one or more service functions. Preliminary design was carried out on a selected set of hardware needed for early service missions. The organization and costing of the satellite service systems were addressed.

  15. Small Stirling dynamic isotope power system for robotic space missions

    NASA Technical Reports Server (NTRS)

    Bents, D. J.

    1992-01-01

    The design of a multihundred-watt Dynamic Isotope Power System (DIPS), based on the U.S. Department of Energy (DOE) General Purpose Heat Source (GPHS) and small (multihundred-watt) free-piston Stirling engine (FPSE), is being pursued as a potential lower cost alternative to radioisotope thermoelectric generators (RTG's). The design is targeted at the power needs of future unmanned deep space and planetary surface exploration missions ranging from scientific probes to Space Exploration Initiative precursor missions. Power level for these missions is less than a kilowatt. The incentive for any dynamic system is that it can save fuel and reduce costs and radiological hazard. Unlike DIPS based on turbomachinery conversion (e.g. Brayton), this small Stirling DIPS can be advantageously scaled to multihundred-watt unit size while preserving size and mass competitiveness with RTG's. Stirling conversion extends the competitive range for dynamic systems down to a few hundred watts--a power level not previously considered for dynamic systems. The challenge for Stirling conversion will be to demonstrate reliability and life similar to RTG experience. Since the competitive potential of FPSE as an isotope converter was first identified, work has focused on feasibility of directly integrating GPHS with the Stirling heater head. Thermal modeling of various radiatively coupled heat source/heater head geometries has been performed using data furnished by the developers of FPSE and GPHS. The analysis indicates that, for the 1050 K heater head configurations considered, GPHS fuel clad temperatures remain within acceptable operating limits. Based on these results, preliminary characterizations of multihundred-watt units have been established.

  16. Top-level modeling of an ALS system utilizing object-oriented techniques

    NASA Astrophysics Data System (ADS)

    Rodriguez, L. F.; Kang, S.; Ting, K. C.

    The possible configuration of an Advanced Life Support (ALS) System capable of supporting human life for long-term space missions continues to evolve as researchers investigate potential technologies and configurations. To facilitate the decision process, the development of acceptable, flexible, and dynamic mathematical computer modeling tools capable of system-level analysis is desirable. Object-oriented techniques have been adopted to develop a dynamic top-level model of an ALS system. This approach has several advantages; among these, object-oriented abstractions of systems are inherently modular in architecture. Thus, models can initially be somewhat simplistic, while allowing for adjustments and improvements. In addition, by coding the model in Java, the model can be implemented via the World Wide Web, greatly encouraging the utilization of the model. Systems analysis is further enabled with the utilization of a readily available backend database containing information supporting the model. The subsystem models of the ALS system model include Crew, Biomass Production, Waste Processing and Resource Recovery, Food Processing and Nutrition, and the Interconnecting Space. Each subsystem model and an overall model have been developed. Presented here is the procedure utilized to develop the modeling tool, the vision of the modeling tool, and the current focus for each of the subsystem models.
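The modular, common-interface structure described above might be sketched as subsystem objects that share a step() method and exchange mass flows through a top-level system object. The actual model was coded in Java; this Python sketch uses invented subsystem behaviors and flow rates purely to illustrate the architecture.

```python
class Subsystem:
    """Common interface: every ALS subsystem advances one time step against shared flows."""
    def __init__(self, name):
        self.name = name

    def step(self, flows):
        raise NotImplementedError

class BiomassProduction(Subsystem):
    def step(self, flows):
        flows["food"] = flows.get("food", 0.0) + 1.2          # invented kg edible/day
        flows["co2"] = max(0.0, flows.get("co2", 0.0) - 0.9)  # CO2 uptake by crops

class Crew(Subsystem):
    def step(self, flows):
        flows["food"] = max(0.0, flows.get("food", 0.0) - 1.0)  # consumption
        flows["co2"] = flows.get("co2", 0.0) + 1.0              # respiration

class ALSSystem:
    """Top-level object: owns the shared flow state and iterates the subsystems."""
    def __init__(self, subsystems):
        self.subsystems = subsystems
        self.flows = {}

    def step(self):
        for s in self.subsystems:
            s.step(self.flows)

als = ALSSystem([BiomassProduction("biomass"), Crew("crew")])
for _ in range(10):
    als.step()
```

Because each subsystem hides its internals behind the same interface, a simplistic model (as here) can later be swapped for a detailed one without touching the top-level loop, which is the modularity advantage the abstract points to.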

  17. Sulfate and Pb-210 Simulated in a Global Model Using Assimilated Meteorological Fields

    NASA Technical Reports Server (NTRS)

    Chin, Mian; Rood, Richard; Lin, S.-J.; Jacob, Daniel; Muller, Jean-Francois

    1999-01-01

    This report presents the results of distributions of tropospheric sulfate, Pb-210 and their precursors from a global 3-D model. This model is driven by assimilated meteorological fields generated by the Goddard Data Assimilation Office. Model results are compared with observations from surface sites and from multiplatform field campaigns of Pacific Exploratory Missions (PEM) and Advanced Composition Explorer (ACE). The model generally captures the seasonal variation of sulfate at the surface sites, and reproduces well the short-term in-situ observations. We will discuss the roles of various processes contributing to the sulfate levels in the troposphere, and the roles of sulfate aerosol in regional and global radiative forcing.

  18. Big Software for SmallSats: Adapting CFS to CubeSat Missions

    NASA Technical Reports Server (NTRS)

    Cudmore, Alan P.; Crum, Gary; Sheikh, Salman; Marshall, James

    2015-01-01

    Expanding capabilities and mission objectives for SmallSats and CubeSats is driving the need for reliable, reusable, and robust flight software. While missions are becoming more complicated and the scientific goals more ambitious, the level of acceptable risk has decreased. Design challenges are further compounded by budget and schedule constraints that have not kept pace. NASA's Core Flight Software System (cFS) is an open source solution which enables teams to build flagship satellite level flight software within a CubeSat schedule and budget. NASA originally developed cFS to reduce mission and schedule risk for flagship satellite missions by increasing code reuse and reliability. The Lunar Reconnaissance Orbiter, which launched in 2009, was the first of a growing list of Class B rated missions to use cFS. Large parts of cFS are now open source, which has spurred adoption outside of NASA. This paper reports on the experiences of two teams using cFS for current CubeSat missions. The performance overheads of cFS are quantified, and the reusability of code between missions is discussed. The analysis shows that cFS is well suited to use on CubeSats and demonstrates the portability and modularity of cFS code.

  19. Informing a hydrological model of the Ogooué with multi-mission remote sensing data

    NASA Astrophysics Data System (ADS)

    Kittel, Cecile M. M.; Nielsen, Karina; Tøttrup, Christian; Bauer-Gottwein, Peter

    2018-02-01

    Remote sensing provides a unique opportunity to inform and constrain a hydrological model and to increase its value as a decision-support tool. In this study, we applied a multi-mission approach to force, calibrate and validate a hydrological model of the ungauged Ogooué river basin in Africa with publicly available and free remote sensing observations. We used a rainfall-runoff model based on the Budyko framework coupled with a Muskingum routing approach. We parametrized the model using the Shuttle Radar Topography Mission digital elevation model (SRTM DEM) and forced it using precipitation from two satellite-based rainfall estimates, FEWS-RFE (Famine Early Warning System rainfall estimate) and the Tropical Rainfall Measuring Mission (TRMM) 3B42 v.7, and temperature from ECMWF ERA-Interim. We combined three different datasets to calibrate the model using an aggregated objective function with contributions from (1) historical in situ discharge observations from the period 1953-1984 at six locations in the basin, (2) radar altimetry measurements of river stages by Envisat and Jason-2 at 12 locations in the basin and (3) GRACE (Gravity Recovery and Climate Experiment) total water storage change (TWSC). Additionally, we extracted CryoSat-2 observations throughout the basin using a Sentinel-1 SAR (synthetic aperture radar) imagery water mask and used the observations for validation of the model. The use of new satellite missions, including Sentinel-1 and CryoSat-2, increased the spatial characterization of river stage. Throughout the basin, we achieved good agreement between observed and simulated discharge and the river stage, with an RMSD between simulated and observed water amplitudes at virtual stations of 0.74 m for the TRMM-forced model and 0.87 m for the FEWS-RFE-forced model. The hydrological model also captures overall total water storage change patterns, although the amplitude of storage change is generally underestimated. 
By combining hydrological modeling with multi-mission remote sensing from 10 different satellite missions, we obtain new information on an otherwise unstudied basin. The proposed model is the best current baseline characterization of hydrological conditions in the Ogooué in light of the available observations.
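    The Muskingum routing scheme named in the abstract propagates an inflow hydrograph through a river reach using two parameters: a storage time constant K and a weighting factor X. A minimal sketch follows; the parameter values K, X, and dt here are hypothetical illustrations, not the calibrated Ogooué values, and the function name is ours:

    ```python
    def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0):
        """Route an inflow series through one reach with the Muskingum method.

        inflow: list of inflow rates at equally spaced time steps.
        K: reach storage time constant (same time units as dt).
        X: weighting factor, typically 0 <= X <= 0.5.
        Returns the outflow series, assuming an initial steady state (O1 = I1).
        """
        denom = 2.0 * K * (1.0 - X) + dt
        c0 = (dt - 2.0 * K * X) / denom
        c1 = (dt + 2.0 * K * X) / denom
        c2 = (2.0 * K * (1.0 - X) - dt) / denom  # note: c0 + c1 + c2 == 1
        outflow = [inflow[0]]
        for i in range(1, len(inflow)):
            outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
        return outflow
    ```

    Because the three routing coefficients sum to one, a constant inflow passes through unchanged, while a flood pulse is attenuated and delayed; in the study this reach-by-reach routing links the Budyko rainfall-runoff cells into basin-scale discharge.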

  20. A Three-Legged Stool or Race? Governance Models for NOAA RISAs, DOI CSCs, and USDA Climate Hubs

    NASA Astrophysics Data System (ADS)

    Foster, J. G.

    2014-12-01

    NOAA's Regional Integrated Sciences and Assessments (RISA) Teams, DOI's Climate Science Centers (CSCs), and USDA's Regional Climate Hubs (RCHs) have common missions of integrating climate and related knowledge across scientific disciplines and regions to create "actionable" information that decision-makers can use to manage climate risks and impacts at state and local scales. However, the sponsoring agency programs, university investigators, and local federal officials govern each differently. The three models of program and center governance are 1) exclusively university (RISAs), 2) a hybrid of Federal government and (host) university (CSCs), and 3) exclusively Federal (Hubs). Each model has its advantages and disadvantages in terms of legal definition and authority, scientific mission requirements and strategies, flexibility and legitimacy to conduct research and to collaborate regionally with constituencies, leadership and governance approach and "friction points," staff capacity and ability to engage stakeholders, necessity to deliver products and services, bureaucratic oversight, performance evaluation, and political support at Congressional, state, and local levels. Using available sources of information and data, this paper will compare and contrast the strengths and weaknesses of these three regional applied climate science center governance models.
