Sample records for open cloud testbed

  1. A Test-Bed of Secure Mobile Cloud Computing for Military Applications

    DTIC Science & Technology

    2016-09-13

    ...searching databases. This kind of application is a typical example of mobile cloud computing (MCC), which has many applications in the military. Final report (1-Aug-2014 to 31-Jul-2016, dated 13-09-2016): A Test-bed of Secure Mobile Cloud Computing for Military Applications. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: test-bed, mobile cloud computing, security, military applications. Approved for public release; distribution unlimited.

  2. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
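A comparison like the Eucalyptus-versus-S3 study above boils down to timing bulk transfers against each storage backend. As an illustrative stand-in (using the local filesystem rather than a real object store, since endpoints and credentials are deployment-specific), a sequential-I/O micro-benchmark might look like:

```python
import os
import tempfile
import time

def io_throughput_mb_s(nbytes: int, chunk: int = 1 << 20) -> tuple[float, float]:
    """Write then read nbytes sequentially; return (write, read) MB/s."""
    payload = os.urandom(chunk)
    with tempfile.NamedTemporaryFile(delete=False) as f:
        path = f.name
        t0 = time.perf_counter()
        written = 0
        while written < nbytes:
            f.write(payload)
            written += chunk
        f.flush()
        os.fsync(f.fileno())          # include the flush-to-disk cost
        t_write = time.perf_counter() - t0
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(chunk):
            pass
    t_read = time.perf_counter() - t0
    os.unlink(path)
    mb = written / (1 << 20)
    return mb / t_write, mb / t_read

if __name__ == "__main__":
    w, r = io_throughput_mb_s(64 << 20)   # 64 MiB test file
    print(f"write: {w:.1f} MB/s  read: {r:.1f} MB/s")
```

Pointing the same timing loop at an S3-compatible endpoint instead of a temp file reproduces the shape of the paper's experiment, though real object-store benchmarks must also account for network latency and request overhead.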

  3. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests: in particular, tests that resemble as closely as possible the actual job workflows used by the experiments, exercising job scheduling at the compute element (CE), the worker-node execution environment, transfer of data to/from the local storage element (SE), and so on. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
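The synthetic-job toolset and the archiving of run statistics described above can be sketched in miniature. Everything here is hypothetical scaffolding (the real system submits through the PanDA client, whose API is not reproduced here):

```python
import random
from collections import Counter
from dataclasses import dataclass, field

# Hypothetical stand-ins: the real toolset submits via the PanDA client API.
@dataclass
class SyntheticJob:
    workflow: str          # e.g. "stage-in", "cpu-burn", "stage-out"
    site: str              # integration-testbed CE to target
    complexity: int = 1    # knob for how much of the stack the job exercises

@dataclass
class RunReport:
    outcomes: Counter = field(default_factory=Counter)

    def record(self, job: SyntheticJob, ok: bool) -> None:
        self.outcomes[(job.site, "success" if ok else "failure")] += 1

def submit_batch(jobs, report, fail_rate=0.1, rng=None):
    """Emulate injecting a batch of jobs and archiving their outcomes."""
    rng = rng or random.Random(0)
    for job in jobs:
        report.record(job, rng.random() > fail_rate)

report = RunReport()
jobs = [SyntheticJob("stage-in", "ITB_SITE_1") for _ in range(100)]
submit_batch(jobs, report)
print(dict(report.outcomes))
```

The per-site success/failure counters mirror the "regular reports for performance and reliability" the central service generates; a real deployment would replace the random outcome with the actual job exit status.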

  4. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performance aspects, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory or I/O among virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
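The VM benchmarks behind finding (1) amount to timing fixed workloads inside each solution's virtual machines. A minimal harness, with toy stand-ins for the actual CPU and memory workloads (the paper's real benchmark suite is not reproduced here), could be:

```python
import time

def timed(fn):
    """Wall-clock one invocation of fn."""
    t0 = time.perf_counter()
    fn()
    return time.perf_counter() - t0

def bench(label, fn, repeat=3):
    """Run fn repeat times and report the best wall-clock time."""
    best = min(timed(fn) for _ in range(repeat))
    print(f"{label:>8}: {best:.3f} s")
    return best

# Tiny stand-ins for CPU and memory workloads; a real comparison would
# run identical, much heavier suites inside VMs from each cloud solution.
cpu_task = lambda: sum(i * i for i in range(10**6))
mem_task = lambda: bytearray(64 << 20)   # allocate and zero 64 MiB

bench("cpu", cpu_task)
bench("memory", mem_task)
```

Running the identical script in VMs provisioned by OpenNebula, Eucalyptus, and CloudStack is what makes the cross-solution numbers comparable.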

  5. The OGC Innovation Program Testbeds - Advancing Architectures for Earth and Space Systems

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Simonis, I.; Serich, S.

    2017-12-01

    The OGC Innovation Program provides a collaborative, agile process for solving challenging science problems and advancing new technologies. Since 1999, 100 initiatives have taken place, from multi-million-dollar testbeds to small interoperability experiments. During these initiatives, sponsors and technology implementers (including academia and the private sector) come together to solve problems, produce prototypes, develop demonstrations, provide best practices, and advance the future of standards. This presentation will provide the latest system architectures that can be used for Earth and space systems as a result of OGC Testbed 13, including the following components: an elastic cloud autoscaler for Earth Observations (EO) using a WPS in an ESGF hybrid climate data research platform; accessibility of climate data for scientist and non-scientist users via on-demand models wrapped in WPS; standards-based descriptions of containerized applications for discovering processes on the cloud, including the use of linked data, a WPS extension for hybrid clouds, and links to hybrid big-data stores; OpenID and OAuth to secure OGC services, with built-in Attribute-Based Access Control (ABAC) infrastructures leveraging GeoDRM patterns; publishing and access of vector tiles, including compression and attribute options that reuse patterns from WMS, WMTS and WFS; servers providing 3D Tiles and streaming of data, including Indexed 3D Scene Layer (I3S), CityGML and Common DataBase (CDB); and asynchronous services with advanced push-notification strategies, using a filter language instead of simple topic subscriptions, that can be used across OGC services. Testbed 14 will continue advancing topics such as Big Data, security, and streaming, as well as making OGC services easier to use (e.g., RESTful APIs). The Call for Participation will be issued in December, with responses due in mid-January 2018.

  7. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical data handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing data management through qualified processors and interfaces such as LEON2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity extends RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards, which are equipped with the Gaisler GR-CAN controller and act as masters communicating with the CCIPC boards. CANopen serves as an upper application layer for CAN-based systems, is defined within the CAN-in-Automation standards, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  8. Open Source Cloud-Based Technologies for Bim

    NASA Astrophysics Data System (ADS)

    Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.

    2018-05-01

    This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is progressively engaged in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups involved in complicated Architectural, Engineering and Construction (AEC) projects. In addition, the development of Open Source Software (OSS) has been growing rapidly and its use is becoming increasingly widespread. Although BIM and Cloud technologies are extensively known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools will be used, from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages, and will be distributed freely to a large community of professionals for their use. The research work will be completed by benchmarking four Cloud-based BIM systems: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System, all of which present remarkable results.

  9. A satellite observation test bed for cloud parameterization development

    NASA Astrophysics Data System (ADS)

    Lebsock, M. D.; Suselj, K.

    2015-12-01

    We present an observational test-bed of cloud and precipitation properties derived from CloudSat, CALIPSO, and the A-Train. The focus of the test-bed is on marine boundary layer clouds, including stratocumulus and cumulus and the transition between these cloud regimes. Test-bed properties include the cloud cover and three-dimensional cloud fraction, along with the cloud water path, precipitation water content, and associated radiative fluxes. We also include the subgrid-scale distribution of cloud, precipitation, and radiative quantities, which must be diagnosed by a model parameterization. The test-bed further includes meteorological variables from the Modern Era Retrospective-analysis for Research and Applications (MERRA). MERRA variables provide the initialization and forcing datasets to run a parameterization in Single Column Model (SCM) mode. We show comparisons of an Eddy-Diffusivity/Mass-Flux (EDMF) parameterization coupled to microphysics and macrophysics packages run in SCM mode with observed clouds. Comparisons are performed regionally in areas of climatological subsidence, as well as stratified by dynamical and thermodynamical variables. Comparisons demonstrate the ability of the EDMF model to capture the observed transitions between subtropical stratocumulus and cumulus cloud regimes.
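The stratified comparisons described above reduce to binning one field by another. A sketch with synthetic data (the variable names and the linear cloud-fraction relation are invented purely for illustration):

```python
import numpy as np

def binned_mean(x, y, edges):
    """Mean of y in bins of x: the kind of dynamical stratification
    used to compare SCM output against the satellite test-bed."""
    idx = np.digitize(x, edges)
    return np.array([y[idx == k].mean() if np.any(idx == k) else np.nan
                     for k in range(1, len(edges))])

rng = np.random.default_rng(0)
omega = rng.uniform(-50, 50, 1000)     # hypothetical vertical velocity (hPa/day)
# invented relation: more subsidence -> more low cloud, plus noise
cf_obs = np.clip(0.5 + omega / 200 + rng.normal(0, 0.05, 1000), 0, 1)
edges = np.linspace(-50, 50, 11)       # 10 dynamical-regime bins
print(binned_mean(omega, cf_obs, edges))
```

Replacing the synthetic arrays with MERRA forcing variables and observed/modeled cloud fraction gives the regime-sorted comparison curves the abstract describes.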

  10. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing, which has attracted a large user base through its openness and low cost and is now widely promoted and applied. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in a university computer room. The experimental results show that OpenStack can be used to deploy a university computer-room cloud efficiently and conveniently, with stable performance and good functional value.

  11. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  12. A Business-to-Business Interoperability Testbed: An Overview

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulvatunyou, Boonserm; Ivezic, Nenad; Monica, Martin

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next-generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline future plans for the testbed development.

  13. Open-cell and closed-cell clouds off Peru

    NASA Image and Video Library

    2010-04-27

    2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean. Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by the MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells, warm air is rising in the center and sinking around the edges, so clouds appear in cell centers but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations — two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua. NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team. To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  14. The GridEcon Platform: A Business Scenario Testbed for Commercial Cloud Services

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn; Guo, Li; Fleming, Alan; Courcoubetis, Costas

    Within this paper, we present the GridEcon Platform, a testbed for designing and evaluating economics-aware services in a commercial Cloud computing setting. The Platform is based on the idea that the exact working of such services is difficult to predict in the context of a market and, therefore, an environment for evaluating its behavior in an emulated market is needed. To identify the components of the GridEcon Platform, a number of economics-aware services and their interactions have been envisioned. The two most important components of the platform are the Marketplace and the Workflow Engine. The Workflow Engine allows the simple composition of a market environment by describing the service interactions between economics-aware services. The Marketplace allows trading goods using different market mechanisms. The capabilities of these components of the GridEcon Platform in conjunction with the economics-aware services are described in this paper in detail. The validation of an implemented market mechanism and a capacity planning service using the GridEcon Platform also demonstrated the usefulness of the GridEcon Platform.
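A Marketplace that "allows trading goods using different market mechanisms" can be illustrated with one such mechanism. This greedy double-auction matcher is a sketch only, not GridEcon's actual implementation; the order fields and the midpoint clearing price are assumptions:

```python
from dataclasses import dataclass

@dataclass
class Order:
    trader: str
    price: float   # offered/asked price per unit of the traded cloud resource
    qty: int

def match(bids, asks):
    """Match the highest bid against the lowest ask while prices cross."""
    bids = sorted(bids, key=lambda o: -o.price)
    asks = sorted(asks, key=lambda o: o.price)
    trades, bi, ai = [], 0, 0
    while bi < len(bids) and ai < len(asks) and bids[bi].price >= asks[ai].price:
        q = min(bids[bi].qty, asks[ai].qty)
        # clear each trade at the midpoint of the crossing prices
        trades.append((bids[bi].trader, asks[ai].trader, q,
                       (bids[bi].price + asks[ai].price) / 2))
        bids[bi].qty -= q
        asks[ai].qty -= q
        if bids[bi].qty == 0: bi += 1
        if asks[ai].qty == 0: ai += 1
    return trades

trades = match([Order("A", 1.20, 10), Order("B", 0.90, 5)],
               [Order("C", 1.00, 8)])
print(trades)
```

In the GridEcon architecture, such a mechanism would be one pluggable component behind the Marketplace, with the Workflow Engine wiring economics-aware services around it.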

  15. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-02

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com. Copyright © 2012 Elsevier B.V. All rights reserved.

  16. Satellite Cloud and Radiative Property Processing and Distribution System on the NASA Langley ASDC OpenStack and OpenShift Cloud Platform

    NASA Astrophysics Data System (ADS)

    Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.

    2017-12-01

    Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.

  17. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and the physical and meteorological quantities that control solar radiation in the earth's atmosphere, and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program, whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  18. Solar Resource Assessment with Sky Imagery and a Virtual Testbed for Sky Imager Solar Forecasting

    NASA Astrophysics Data System (ADS)

    Kurtz, Benjamin Bernard

    In recent years, ground-based sky imagers have emerged as a promising tool for forecasting solar energy on short time scales (0 to 30 minutes ahead). Following the development of sky imager hardware and algorithms at UC San Diego, we present three new or improved algorithms for sky imager forecasting and forecast evaluation. First, we present an algorithm for measuring irradiance with a sky imager. Sky imager forecasts are often used in conjunction with other instruments for measuring irradiance, so this has the potential to decrease instrumentation costs and logistical complexity. In particular, the forecast algorithm itself often relies on knowledge of the current irradiance which can now be provided directly from the sky images. Irradiance measurements are accurate to within about 10%. Second, we demonstrate a virtual sky imager testbed that can be used for validating and enhancing the forecast algorithm. The testbed uses high-quality (but slow) simulations to produce virtual clouds and sky images. Because virtual cloud locations are known, much more advanced validation procedures are possible with the virtual testbed than with measured data. In this way, we are able to determine that camera geometry and non-uniform evolution of the cloud field are the two largest sources of forecast error. Finally, with the assistance of the virtual sky imager testbed, we develop improvements to the cloud advection model used for forecasting. The new advection schemes are 10-20% better at short time horizons.
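The advection model that the virtual testbed is used to improve starts from a frozen-advection baseline: translate the current cloud field along the wind. A toy version over a periodic grid (real imagers must additionally handle camera geometry and cloud-field evolution, the two error sources identified above):

```python
import numpy as np

def advect(cloud_mask, wind_xy, dt_steps=1):
    """Frozen-advection forecast: translate the cloud field by the wind
    vector (in pixels per step). Periodic boundaries via np.roll."""
    dx, dy = wind_xy
    return np.roll(cloud_mask, shift=(dy * dt_steps, dx * dt_steps), axis=(0, 1))

sky = np.zeros((8, 8), dtype=bool)
sky[2, 2] = True                      # a single cloud pixel
forecast = advect(sky, wind_xy=(1, 0), dt_steps=3)
print(np.argwhere(forecast))          # cloud carried 3 columns downwind
```

Scoring such forecasts against virtual skies, where the true cloud positions are known exactly, is precisely what the testbed enables and what measured data cannot provide.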

  19. Open-cell cloud formation over the Bahamas

    NASA Technical Reports Server (NTRS)

    2002-01-01

    What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  20. Open-cell and closed-cell clouds off Peru [detail]

    NASA Image and Video Library

    2017-12-08

    2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean. To view the full frame of this image go to: www.flickr.com/photos/gsfc/4557497219/ Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by the MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells, warm air is rising in the center and sinking around the edges, so clouds appear in cell centers but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations — two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua. NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team. To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  1. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  2. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud computing is becoming the norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF Spatiotemporal Innovation Center to meet these demands. This paper reports on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for resource sharing; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service will be included.

  3. Trace explosives sensor testbed (TESTbed)

    NASA Astrophysics Data System (ADS)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  4. Precipitation-generated oscillations in open cellular cloud fields.

    PubMed

    Feingold, Graham; Koren, Ilan; Wang, Hailong; Xue, Huiwen; Brewer, Wm Alan

    2010-08-12

    Cloud fields adopt many different patterns that can have a profound effect on the amount of sunlight reflected back to space, with important implications for the Earth's climate. These cloud patterns can be observed in satellite images of the Earth and often exhibit distinct cell-like structures associated with organized convection at scales of tens of kilometres. Recent evidence has shown that atmospheric aerosol particles-through their influence on precipitation formation-help to determine whether cloud fields take on closed (more reflective) or open (less reflective) cellular patterns. The physical mechanisms controlling the formation and evolution of these cells, however, are still poorly understood, limiting our ability to simulate realistically the effects of clouds on global reflectance. Here we use satellite imagery and numerical models to show how precipitating clouds produce an open cellular cloud pattern that oscillates between different, weakly stable states. The oscillations are a result of precipitation causing downward motion and outflow from clouds that were previously positively buoyant. The evaporating precipitation drives air down to the Earth's surface, where it diverges and collides with the outflows of neighbouring precipitating cells. These colliding outflows form surface convergence zones where new clouds form. In turn, the newly formed clouds produce precipitation and new colliding outflow patterns that are displaced from the previous ones. As successive cycles of this kind unfold, convergence zones alternate with divergence zones and new cloud patterns emerge to replace old ones. The result is an oscillating, self-organized system with a characteristic cell size and precipitation frequency.
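    The feedback loop sketched above, in which rain depletes the clouds that produce it and new clouds regrow, can be caricatured with a toy predator-prey system: rain ("predator") consumes cloud water ("prey") and decays in its absence. This is only an illustrative sketch, not the satellite analysis or numerical simulations used in the paper, and all coefficients are arbitrary.

```python
# Toy predator-prey caricature of the cloud/rain oscillation: cloud water C
# grows on its own, rain R grows by consuming C and decays without it.
# All rates and initial conditions are arbitrary illustrative values.
def simulate(c0=1.5, r0=0.8, dt=0.01, steps=5000):
    a, b, c, d = 1.0, 1.0, 1.0, 1.0      # growth/rain-out/feeding/decay rates
    C, R, history = c0, r0, []
    for _ in range(steps):
        dC = (a * C - b * C * R) * dt    # cloud water grows, is rained out
        dR = (c * C * R - d * R) * dt    # rain is fed by cloud, decays otherwise
        C, R = C + dC, R + dR
        history.append(C)
    return history
```

    Counting how often the simulated cloud-water trajectory crosses its equilibrium value (here C* = d/c = 1) confirms the system oscillates rather than settling to a steady state, mirroring the oscillating cell dynamics the abstract describes.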

  5. WFIRST Coronagraph Technology Development Testbeds: Status and Recent Testbed Results

    NASA Astrophysics Data System (ADS)

    Shi, Fang; An, Xin; Balasubramanian, Kunjithapatham; Cady, Eric; Gordon, Brian; Greer, Frank; Kasdin, N. Jeremy; Kern, Brian; Lam, Raymond; Marx, David; Moody, Dwight; Patterson, Keith; Poberezhskiy, Ilya; Mejia Prada, Camilo; Gersh-Range, Jessica; Eldorado Riggs, A. J.; Seo, Byoung-Joon; Shields, Joel; Sidick, Erkin; Tang, Hong; Trauger, John Terry; Truong, Tuan; White, Victor; Wilson, Daniel; Zhou, Hanying; JPL WFIRST Testbed Team, Princeton University

    2018-01-01

    As part of technology development for the WFIRST coronagraph instrument (CGI), dedicated testbeds have been built and commissioned at JPL. The coronagraph technology development testbeds include the Occulting Mask Coronagraph (OMC) testbed, the Shaped Pupil Coronagraph/Integral Field Spectrograph (SPC/IFS) testbed, and the Vacuum Surface Gauge (VSG) testbed. With a configuration similar to the WFIRST flight coronagraph instrument, the OMC testbed comprises two coronagraph modes, the Shaped Pupil Coronagraph (SPC) and the Hybrid Lyot Coronagraph (HLC), a low-order wavefront sensor (LOWFS), and an optical telescope assembly (OTA) simulator that can generate realistic line-of-sight (LoS) drift and jitter as well as the low-order wavefront error that would be induced by the WFIRST telescope's vibration and thermal changes. The SPC/IFS testbed is dedicated to testing the IFS working with a Shaped Pupil Coronagraph, while the VSG testbed measures and calibrates the deformable mirrors, a key component of WFIRST CGI's wavefront control. In this poster, we describe the testbed functions and status, and present highlights of the latest results from the OMC, SPC/IFS and VSG testbeds.

  6. Creating a Rackspace and NASA Nebula compatible cloud using the OpenStack project (Invited)

    NASA Astrophysics Data System (ADS)

    Clark, R.

    2010-12-01

    NASA and Rackspace have both contributed technology to the OpenStack project, which allows anyone to create a private Infrastructure as a Service (IaaS) cloud using open source software and commodity hardware. OpenStack is designed and developed completely in the open, with an open governance process. NASA donated Nova, which powers the compute portion of the NASA Nebula Cloud Computing Platform, and Rackspace donated Swift, which powers Rackspace Cloud Files. The project is now under continuous development by NASA, Rackspace, and hundreds of other participants. When you create a private cloud using OpenStack, you gain the ability to interact easily with your private cloud, a government cloud, and an ecosystem of public cloud providers, all through the same API.

  7. Identity federation in OpenStack - an introduction to hybrid clouds

    NASA Astrophysics Data System (ADS)

    Denis, Marek; Castro Leon, Jose; Ormancey, Emmanuel; Tedesco, Paolo

    2015-12-01

    We are evaluating the cloud identity federation available in the OpenStack ecosystem, which allows on-premise workloads to burst into remote clouds using local identities (i.e. domain accounts). Further enhancements to identity federation are a clear path to hybrid cloud architectures: virtualized infrastructures layered across independent private and public clouds.
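    In OpenStack Keystone, federated identities are translated into local users and groups through mapping rules. The fragment below is a minimal sketch of such a mapping; the issuer URL and group name are hypothetical, and the remote attribute names (`OIDC-sub`, `OIDC-iss`) depend on how the federation plugin is configured.

```json
{
  "rules": [
    {
      "remote": [
        {"type": "OIDC-sub"},
        {"type": "OIDC-iss", "any_one_of": ["https://idp.example.org"]}
      ],
      "local": [
        {"user": {"name": "{0}"}},
        {"group": {"name": "federated-users", "domain": {"name": "Default"}}}
      ]
    }
  ]
}
```

    The rule accepts assertions only from the listed issuer, maps the federated subject onto a local user name, and places that user in a pre-created group whose role assignments determine what the bursting workload may do.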

  8. Increasing the value of geospatial informatics with open approaches for Big Data

    NASA Astrophysics Data System (ADS)

    Percivall, G.; Bermudez, L. E.

    2017-12-01

    Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following use cases: Collection and Ingest (remote-sensed data processing; data stream processing); Prepare and Structure (SQL and NoSQL databases; data linking; feature identification); Analytics and Visualization (spatial-temporal analytics; machine learning; data exploration); and Modeling and Prediction (integrated environmental models; urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include: Open Cloud Computing (avoiding vendor lock-in through API interoperability and application portability); Open Source Extensions (implementing geospatial data representations in projects from Apache, LocationTech, and OSGeo, and investigating parallelization strategies for N-dimensional spatial data); Geospatial Data Representations (schemas to improve processing and analysis using geospatial concepts such as Features, Coverages and DGGS, and geospatial encodings like NetCDF and GeoPackage); Big Linked Geodata (linked data methods scaled to big geodata); and Analysis Ready Data (supporting "download as last resort" and "analytics as a service", and promoting elements common to "datacubes").

  9. Efficacy of Cloud-Radiative Perturbations in Deep Open- and Closed-Cell Stratocumulus Clouds due to Aerosol Perturbations

    NASA Astrophysics Data System (ADS)

    Possner, A.; Wang, H.; Caldeira, K.; Wood, R.; Ackerman, T. P.

    2017-12-01

    Aerosol-cloud interactions (ACIs) in marine stratocumulus remain a significant source of uncertainty in constraining the cloud-radiative effect in a changing climate. Ship tracks are undoubted manifestations of ACIs embedded within stratocumulus cloud decks and have proven to be a useful framework for studying the effect of aerosol perturbations on cloud morphology and on macrophysical, microphysical and cloud-radiative properties. However, so far most observational (Christensen et al. 2012, Chen et al. 2015) and numerical studies (Wang et al. 2011, Possner et al. 2015, Berner et al. 2015) have concentrated on ship tracks in shallow boundary layers of depths between 300 and 800 m, while most stratocumulus decks form in significantly deeper boundary layers (Muhlbauer et al. 2014). In this study we investigate the efficacy of aerosol perturbations in deep open- and closed-cell stratocumulus. Multi-day idealised cloud-resolving simulations are performed for the RF06 flight of the VOCALS-REx field campaign (Wood et al. 2011). During this flight, pockets of deep open and closed cells were observed in a 1410 m deep boundary layer. The efficacy of aerosol perturbations of varied concentration and spatial gradient in altering the cloud micro- and macrophysical state and the cloud-radiative effect is determined in both cloud regimes. Our simulations show that a continued point-source emission flux of 1.16 × 10^11 particles m^-2 s^-1 applied within a 300 × 300 m^2 gridbox induces pronounced cloud-cover changes in approximately a third of the simulated 80 × 80 km^2 domain, a weakening of the diurnal cycle in the open-cell regime, and a resulting increase in domain-mean cloud albedo of 0.2. Furthermore, we contrast the efficacy of equal-strength near-surface and above-cloud aerosol perturbations in altering the cloud state.

  10. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle to the adoption of cloud computing by healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should achieve a security level equivalent to that of the traditional computing model.
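    As a rough illustration of the protocol this work builds on, the sketch below assembles an OpenID Connect authorization-code request and decodes an ID token payload. The issuer, client ID and redirect URI are hypothetical placeholders, and a real deployment must verify the token's signature against the provider's published keys rather than decode it blindly as done here.

```python
import base64
import json
from urllib.parse import urlencode

# Hypothetical identity provider and client registration (placeholders only).
ISSUER = "https://idp.example.org"
CLIENT_ID = "di-r-portal"
REDIRECT_URI = "https://di-r.example.org/callback"

def authorization_url(state, nonce):
    """Step 1 of the OIDC authorization code flow: the URL to which the
    user agent is redirected to authenticate at the identity provider."""
    params = {
        "response_type": "code",     # authorization code flow
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile",   # the 'openid' scope is mandatory in OIDC
        "state": state,              # CSRF protection
        "nonce": nonce,              # binds the ID token to this request
    }
    return ISSUER + "/authorize?" + urlencode(params)

def decode_id_token_payload(id_token):
    """Decode the payload segment of a JWT ID token. NOTE: no signature
    verification is done here; production code MUST verify the JWS signature."""
    payload_b64 = id_token.split(".")[1]
    payload_b64 += "=" * (-len(payload_b64) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload_b64))
```

    After the user authenticates, the client exchanges the returned code at the token endpoint for an ID token whose claims (`sub`, `iss`, `aud`, `exp`) identify the user to the imaging service.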

  11. OpenID Connect as a security service in cloud-based medical imaging systems

    PubMed Central

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-01-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the “Kerberos of the cloud.” We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682

  12. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.

  13. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains: evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of the Norovirus genus. PMID:23671843
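    The extraction of open reading frames that precedes alignment can be illustrated with a minimal forward-strand ORF scanner. This is a sketch of the general technique (scan each reading frame from a start codon to the next in-frame stop codon), not the paper's Hadoop-based implementation, and it ignores the reverse strand for brevity.

```python
# Minimal open reading frame (ORF) finder: for each of the three forward
# reading frames, pair every ATG with the next in-frame stop codon.
def find_orfs(seq, min_len=6):
    """Return (start, end) index pairs of ORFs on the forward strand,
    keeping only ORFs of at least min_len nucleotides."""
    seq = seq.upper()
    stops = {"TAA", "TAG", "TGA"}
    orfs = []
    for frame in range(3):
        for i in range(frame, len(seq) - 2, 3):
            if seq[i:i + 3] == "ATG":                  # start codon found
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j + 3] in stops:          # first in-frame stop
                        if j + 3 - i >= min_len:
                            orfs.append((i, j + 3))
                        break                           # one ORF per start
    return orfs
```

    Nested starts sharing a stop codon each yield an ORF here; real pipelines typically keep only the longest and also scan the reverse complement.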

  14. Delay Tolerant Networking on NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Johnson, Sandra; Eddy, Wesley

    2016-01-01

    This presentation covers the status of the implementation of open source software realizing the Delay Tolerant Networking specifications developed through the IETF and CCSDS working groups. Interplanetary Overlay Network (ION) is open source software that implements specifications developed by these two international working groups. The GRC team deployed ION on the SCaN Testbed, a testbed located on an external pallet on the ISS. The presentation covers the architecture of the system, high-level implementation details, and issues encountered in porting ION to VxWorks.
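    The core DTN idea that ION implements, storing bundles until a contact to the next hop opens rather than dropping them when no end-to-end path exists, can be caricatured in a few lines. The contact and bundle times below are hypothetical, not taken from the SCaN Testbed.

```python
# Toy store-and-forward sketch of the DTN principle: a bundle created at
# time t waits in node storage until the next contact window at or after t.
# Times are arbitrary illustrative values.
def deliver(bundle_times, contact_opens):
    """For each bundle creation time, return the time of the first contact
    window at or after it, or None if no further contact exists."""
    deliveries = []
    for t in bundle_times:
        later = [c for c in sorted(contact_opens) if c >= t]
        deliveries.append(later[0] if later else None)
    return deliveries
```

    A bundle created just before a contact window experiences little delay, while one created just after a window waits for the whole gap; this tolerance of long, scheduled disruptions is what distinguishes DTN from conventional IP forwarding.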

  15. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.

  16. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  17. Optical interferometer testbed

    NASA Technical Reports Server (NTRS)

    Blackwood, Gary H.

    1991-01-01

    Viewgraphs on optical interferometer testbed presented at the MIT Space Research Engineering Center 3rd Annual Symposium are included. Topics covered include: space-based optical interferometer; optical metrology; sensors and actuators; real time control hardware; controlled structures technology (CST) design methodology; identification for MIMO control; FEM/ID correlation for the naked truss; disturbance modeling; disturbance source implementation; structure design: passive damping; low authority control; active isolation of lightweight mirrors on flexible structures; open loop transfer function of mirror; and global/high authority control.

  18. The Magellan Final Report on Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coghlan, Susan; Yelick, Katherine

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, from performance to usability and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact on various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  19. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which is a major limitation in many ways, from limited processing and storage capacity to restricted accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code, and presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. It is the ultimate collaborative geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, runs in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming languages and components, and is flexible in accommodating additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state

  20. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications in ways that allow cloud middleware to optimize their deployment and execution. From CERN, the Helix Nebula group has proposed the architecture for the European Open Science Cloud. They are in discussion with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation) and AARC (network authentication and authorisation), and also with the EIROforum group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategy Forum on Research Infrastructures) RIs, including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science): effectively an ICS-d specializing in high-performance computation, analytics, simulation or visualization offered by a TCS for others to use. Discussions are already underway between EPOS and EGI, EUDAT, AARC and Helix Nebula for those offerings to be

  1. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    NASA Astrophysics Data System (ADS)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2 series: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing prototypes of the FY-4 science algorithms, two science product algorithm testbeds, for imagers and sounders, have been developed by scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in the FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using a proxy imager, the Himawari-8 Advanced Himawari Imager (AHI), and sounder data obtained from the Atmospheric InfraRed Sounder, thus demonstrating the testbeds' robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near-real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. These robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  2. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

    The continuation of the EOSDIS testbed ('Testbed') has evolved from a multi-task, X-Windows-driven system into a fully functional, stand-alone data archive and distribution center accessible to all types of users and computers via the World Wide Web. Over the past months the Testbed has become a completely new system, now accessible through Netscape, Mosaic, and all other clients that can reach the World Wide Web. On October 1, 1995 we will open to the public, and we expect the statistics on the type of user, where they are located, and what they are looking for to change drastically. The most important change to the Testbed has been the Web interface. This interface will give more users access to the system and walk them through the data types more easily than before. All of the callbacks are written so that icons can be used to move around the program's interface easily. The homepage offers the user the opportunity to obtain more information about each satellite data type, along with information on free programs. These programs are grouped into categories by the type of computer they are compiled for, with instructions on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which takes them to the world map and allows them to select an area they would like coverage of by simply clicking that area of the map. This creates a new map where similar choices can be made to obtain the latitude and longitude of the region the satellite data will cover. Once a selection has been made, the search parameters page appears to be filled out. Afterwards, once the search is completed, the browse image can be requested and the images for viewing selected.
There are several other option pages

  3. HyspIRI Low Latency Concept and Benchmarks

    NASA Technical Reports Server (NTRS)

    Mandl, Dan

    2010-01-01

    Topics include HyspIRI low latency data ops concept, HyspIRI data flow, ongoing efforts, experiment with Web Coverage Processing Service (WCPS) approach to injecting new algorithms into SensorWeb, low fidelity HyspIRI IPM testbed, compute cloud testbed, open cloud testbed environment, Global Lambda Integrated Facility (GLIF) and OCC collaboration with Starlight, delay tolerant network (DTN) protocol benchmarking, and EO-1 configuration for preliminary DTN prototype.

  4. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  5. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing the creation of virtual clusters on demand, including public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, originally developed for Amazon EC2, to allow the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach to the existing Grid solution. 
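The cloud-manager extension described above is essentially a pluggable driver layer beneath a single VM engine. A minimal sketch of that pattern, with invented class and method names (not DIRAC's actual API):

```python
from abc import ABC, abstractmethod


class CloudEndpoint(ABC):
    """Abstract driver for one cloud manager. Extending the engine to a
    new manager (EC2, OCCI/OpenNebula, CloudStack, ...) means adding one
    subclass; the scheduler above only ever sees this interface."""

    @abstractmethod
    def start_vm(self, image_id: str) -> str:
        """Boot a VM from an image and return an opaque handle."""


class OcciEndpoint(CloudEndpoint):
    def start_vm(self, image_id: str) -> str:
        # Would speak OCCI to an OpenNebula front end.
        return f"occi:vm-for-{image_id}"


class CloudStackEndpoint(CloudEndpoint):
    def start_vm(self, image_id: str) -> str:
        # Would call the CloudStack API.
        return f"cloudstack:vm-for-{image_id}"


def provision(endpoint: CloudEndpoint, image_id: str) -> str:
    """Engine-side call site: cloud-manager agnostic."""
    return endpoint.start_vm(image_id)
```

Swapping `OcciEndpoint` for `CloudStackEndpoint` changes the back end without touching `provision`.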

  6. Managing autonomy levels in the SSM/PMAD testbed. [Space Station Power Management and Distribution

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry R.

    1990-01-01

    It is pointed out that when autonomous operations are mixed with those of a manual nature, concepts concerning the boundary of operations and responsibility become clouded. The space station module power management and distribution (SSM/PMAD) automation testbed has the need for such mixed-mode capabilities. The concept of managing the SSM/PMAD testbed in the presence of changing levels of autonomy is examined. A knowledge-based approach to implementing autonomy management in the distributed SSM/PMAD utilizing a centralized planning system is presented. Its knowledge relations and system-wide interactions are discussed, along with the operational nature of the currently functioning SSM/PMAD knowledge-based systems.

  7. Network testbed creation and validation

    DOEpatents

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.; Watts, Kristopher K.; Sweeney, Andrew John

    2017-03-21

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior art solutions which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.
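The build-out step (take a target network description, then instantiate virtual testbed nodes across the physical infrastructure) can be illustrated with a toy placement routine; the round-robin policy and all names here are assumptions for illustration, not the patented method:

```python
import itertools


def build_replica(target_nodes, infra_hosts):
    """Place each virtual testbed node from a target network description
    onto a physical infrastructure host, round-robin. A real build-out
    would also weigh host capacity and link topology; this only shows
    the description-to-replica mapping idea."""
    cycle = itertools.cycle(infra_hosts)
    return {node: next(cycle) for node in target_nodes}
```

Three virtual nodes over two hosts land as A, B, A; reconfiguring the testbed means recomputing this mapping, not rewiring hardware.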

  8. Fading testbed for free-space optical communications

    NASA Astrophysics Data System (ADS)

    Shrestha, Amita; Giggenbach, Dirk; Mustafa, Ahmad; Pacheco-Labrador, Jorge; Ramirez, Julio; Rein, Fabian

    2016-10-01

    Free-space optical (FSO) communication is a very attractive technology offering very high throughput without spectral regulation constraints, yet allowing small antennas (telescopes) and tap-proof communication. However, the transmitted signal has to travel through the atmosphere, where atmospheric turbulence causes scintillation of the received signal. In addition, climatic effects like fog, clouds and rain also affect the signal significantly. Moreover, FSO is a line-of-sight technology and requires precise pointing and tracking of the telescopes, without which further fading occurs. To achieve error-free transmission, various mitigation techniques like aperture averaging, adaptive optics, transmitter diversity, and sophisticated coding and modulation schemes are being investigated and implemented. Evaluating the performance of such systems under controlled conditions is very difficult in field trials, since the atmospheric situation constantly changes and the target scenario (e.g. on aircraft or satellites) is not easily accessible for test purposes. Therefore, with the motivation to test and verify systems under laboratory conditions, DLR has developed a fading testbed that can emulate most realistic channel conditions. The main principle of the fading testbed is to control the input current of a variable optical attenuator such that it attenuates the incoming signal according to a loaded power vector. The sampling frequency and mean power of the vector can optionally be changed according to requirements. This paper provides a brief introduction to the software and hardware development of the fading testbed, together with measurement results showing its accuracy and application scenarios.
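The testbed's core principle (attenuate the incoming signal according to a loaded power vector, optionally rescaled to a target mean power) can be sketched as follows; the function and its all-in-dB convention are illustrative, not DLR's actual software interface:

```python
def apply_fading(signal_dbm, power_vector_db, mean_power_db=None):
    """Apply a fading (power) vector to a constant input signal level.

    signal_dbm      : input signal power in dBm
    power_vector_db : loaded attenuation samples in dB (<= 0 for loss)
    mean_power_db   : if given, shift the vector so its mean matches
                      this target, mirroring the testbed's adjustable
                      mean power
    Returns the received power series in dBm.
    """
    if mean_power_db is not None:
        offset = mean_power_db - sum(power_vector_db) / len(power_vector_db)
        power_vector_db = [p + offset for p in power_vector_db]
    # In dB, attenuation is a simple addition per sample.
    return [signal_dbm + p for p in power_vector_db]
```

A playback rate parameter (the paper's sampling frequency) would just control how fast these samples are fed to the attenuator driver.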

  9. Network testbed creation and validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior art solutions which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.

  10. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators which aggregates all base stations' computational resources into a cloud BBU pool. Interactions between RRHs and BBUs, and resource scheduling among BBUs in the cloud, have become more frequent and complex as system scale and user requirements grow. This increases the networking demand among RRHs and BBUs and pushes toward elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum and BBU processing resources effectively to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.
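At its simplest, a global evaluation across the three strata amounts to scoring candidate allocations on radio, optical, and BBU terms together and picking the best. The following is a toy stand-in with made-up fields and weights; the paper does not specify its GES at this level of detail:

```python
def global_evaluation(candidates, weights=(1.0, 1.0, 1.0)):
    """Pick the candidate allocation with the best combined score across
    the radio, optical, and BBU strata (each field holds a normalized
    'goodness' value, higher is better). Field names and the linear
    weighting are hypothetical."""
    def score(candidate):
        keys = ("radio", "optical", "bbu")
        return sum(w * candidate[k] for w, k in zip(weights, keys))
    return max(candidates, key=score)
```

Contrast this with per-stratum optimization, which would rank on one key at a time and can pick an allocation that is poor overall.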

  11. Stewardship and management challenges within a cloud-based open data ecosystem (Invited Paper 211863)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.

  12. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
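For a single impulsive, rectilinear encounter with a point-mass perturber, the velocity kick given to a star at impact parameter b has magnitude 2GM/(Vb). The study softens this with Plummer profiles rather than point masses, but the point-mass form conveys the scaling:

```python
# Gravitational constant in units of pc * (km/s)^2 / Msun.
G = 4.30091727e-3


def velocity_impulse(perturber_mass_msun, impact_parameter_pc, relative_speed_kms):
    """|dv| = 2 G M / (V b) for an impulsive, straight-line encounter
    with a point-mass perturber, in km/s. Kicks fall off as 1/b and
    1/V, so slow, close passages by massive clumps dominate the mass
    and binding-energy loss."""
    return (2.0 * G * perturber_mass_msun
            / (relative_speed_kms * impact_parameter_pc))
```

A 1000 Msun clump passing 1 pc away at 10 km/s delivers a kick of order 1 km/s, comparable to the internal velocity dispersion of a small open cluster.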

  13. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor and modification of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  14. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancement in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities ubiquitously available which once only existed in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance for client-based desktop applications. The wide reach of WebGL and browser-based technologies mean plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  15. Single link flexible beam testbed project. Thesis

    NASA Technical Reports Server (NTRS)

    Hughes, Declan

    1992-01-01

    This thesis describes the single link flexible beam testbed at the CLaMS laboratory in terms of its hardware, software, and linear model. It presents two controllers: one consisting of a hub angle proportional-derivative (PD) feedback compensator, and one augmenting the PD compensator with a second static-gain full-state feedback loop based upon a synthesized strictly positive real (SPR) output, which increases the damping ratios of specific flexible-mode poles with respect to the PD-only case and hence reduces unwanted residual oscillation effects. Restricting the full-state feedback gains so as to produce an SPR open-loop transfer function ensures that the associated compensator has an infinite gain margin and a phase margin of at least (-90, 90) degrees. Both experimental and simulation data are evaluated in order to compare the performance of different observers when applied to the real testbed and to the linear model when uncompensated flexible modes are included.
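The hub-angle PD compensator at the core of both controllers is a one-line feedback law; the gains below are illustrative placeholders, whereas the thesis tunes them against the beam's linear model:

```python
def pd_torque(theta, theta_dot, theta_ref, kp=10.0, kd=2.0):
    """Hub-angle PD feedback: torque = Kp*(theta_ref - theta) - Kd*theta_dot.

    kp drives the hub toward the reference angle; kd adds damping on the
    hub rate. The SPR-based loop in the thesis would add a further term
    from the full state, left out of this sketch."""
    return kp * (theta_ref - theta) - kd * theta_dot
```

At rest with a 1 rad error the law commands a pure proportional torque; once on target, any residual hub rate is opposed by the derivative term.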

  16. The Algae Testbed Public-Private Partnership (ATP 3 ) framework; establishment of a national network of testbed sites to support sustainable algae production

    DOE PAGES

    McGowen, John; Knoshaug, Eric P.; Laurens, Lieve M. L.; ...

    2017-07-01

    Well-controlled experiments that directly compare seasonal algal productivities across geographically distinct locations have not been reported before. To fill this gap, six cultivation testbed facilities were chosen across the United States to evaluate different climatic zones with respect to algal biomass productivity potential. The geographical locations and climates were as follows: Southwest, desert; Western, coastal; Southeast, inland; Southeast, coastal; Pacific, tropical; and Midwest, greenhouse. The testbed facilities were equipped with identical systems for inoculum production and open pond operation and methods were standardized across all testbeds to ensure accurate measurement of physical and biological variables. The ability of the testbed sites to culture and analyze the same algal species, Nannochloropsis oceanica KA32, using identical pond operational and data collection procedures was evaluated during the same seasonal timeframe. This manuscript describes the results of a first-of-its-kind coordinated testbed validation field study while providing critical details on how geographical variations in temperature, light, and weather variables influenced algal productivity, nitrate consumption, and biomass composition. We found distinct differences in growth characteristics due to the geographic location and the resulting climatic and seasonal conditions across the sites, with the highest productivities observed at the desert Southwest and tropical Pacific regions, followed by the Western coastal region. The lowest productivities were observed at the Southeast inland and Midwest greenhouse locations. These differences in productivities among the sites correlated with the differences in pond water temperature and available solar radiation. In addition, two sites, the tropical Pacific and Southeast inland, experienced unusual events, spontaneous flocculation and unusually cold and wet (rainfall) conditions respectively, that negatively affected

  17. The Algae Testbed Public-Private Partnership (ATP 3 ) framework; establishment of a national network of testbed sites to support sustainable algae production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGowen, John; Knoshaug, Eric P.; Laurens, Lieve M. L.

    Well-controlled experiments that directly compare seasonal algal productivities across geographically distinct locations have not been reported before. To fill this gap, six cultivation testbed facilities were chosen across the United States to evaluate different climatic zones with respect to algal biomass productivity potential. The geographical locations and climates were as follows: Southwest, desert; Western, coastal; Southeast, inland; Southeast, coastal; Pacific, tropical; and Midwest, greenhouse. The testbed facilities were equipped with identical systems for inoculum production and open pond operation and methods were standardized across all testbeds to ensure accurate measurement of physical and biological variables. The ability of the testbed sites to culture and analyze the same algal species, Nannochloropsis oceanica KA32, using identical pond operational and data collection procedures was evaluated during the same seasonal timeframe. This manuscript describes the results of a first-of-its-kind coordinated testbed validation field study while providing critical details on how geographical variations in temperature, light, and weather variables influenced algal productivity, nitrate consumption, and biomass composition. We found distinct differences in growth characteristics due to the geographic location and the resulting climatic and seasonal conditions across the sites, with the highest productivities observed at the desert Southwest and tropical Pacific regions, followed by the Western coastal region. The lowest productivities were observed at the Southeast inland and Midwest greenhouse locations. These differences in productivities among the sites correlated with the differences in pond water temperature and available solar radiation. In addition, two sites, the tropical Pacific and Southeast inland, experienced unusual events, spontaneous flocculation and unusually cold and wet (rainfall) conditions respectively, that negatively affected

  18. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  19. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick Start Guide (Revision 1)

    DTIC Science & Technology

    2017-06-01

    for GIFT Cloud, the web-based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source...external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web-based application...section. Approved for public release; distribution is unlimited. 3. Requirements for GIFT Cloud GIFT Cloud is accessed via a web browser

  20. The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications

    NASA Astrophysics Data System (ADS)

    Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.

    2016-12-01

    The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day to day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects: the limited bandwidth and latency of the communication link due to the geographical location of the resources, and the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS-enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which is itself a NASA/Esri collaboration project. The MRF raster format has some cloud-aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL and Esri software. Taken together, GDAL, MRF and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.
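As an illustration of the tiled web services this stack supports, a RESTful WMTS tile request can be assembled from a path template. The base URL and layer names below are placeholders rather than real NASA endpoints, and actual servers may insert style and time dimensions into the path:

```python
def wmts_tile_url(base, layer, tile_matrix_set, z, y, x, ext="jpg"):
    """Assemble a RESTful WMTS tile request of the kind an OnEarth-style
    server answers, using the common
    {layer}/default/{TileMatrixSet}/{z}/{y}/{x}.{ext} layout.
    'default' is the style; z is the tile matrix (zoom) level and
    y/x the tile row/column."""
    return f"{base}/{layer}/default/{tile_matrix_set}/{z}/{y}/{x}.{ext}"
```

A client walks z/y/x to fetch only the tiles in view, which is what makes cloud-aware formats like MRF pair well with object stores.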

  1. The NASA/OAST telerobot testbed architecture

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.; Zimmerman, W. F.; Dolinsky, S.

    1989-01-01

    Developed in phases as a laboratory-based research testbed, the NASA/OAST Telerobot Testbed provides an environment for system test and demonstration of technology which will usefully complement, significantly enhance, or even replace manned space activities. By integrating advanced sensing, robotic manipulation and intelligent control under human-interactive supervision, the Testbed will ultimately demonstrate execution of a variety of generic tasks suggestive of space assembly, maintenance, repair, and telescience. The Testbed system features a hierarchical layered control structure compatible with the incorporation of evolving technologies as they become available. The Testbed system is physically implemented in a computing architecture which allows for ease of integration of these technologies while preserving the flexibility to test a variety of man-machine modes. The development currently in progress on the functional and implementation architectures of the NASA/OAST Testbed and capabilities planned for the coming years are presented.

  2. Cloud Based Earth Observation Data Exploitation Platforms

    NASA Astrophysics Data System (ADS)

    Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.

    2017-12-01

    In the last few years data produced daily by several private and public Earth Observation (EO) satellites reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer to scientists and application developers the means to access and use EO data in a quick and cost effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volume of data, (ii) algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, (v) collaboration tools (e.g. forums, wiki, etc.). When an EP is dedicated to a specific Theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular it offers (i) Multi-cloud data discovery, (ii) Multi-cloud data management and access and (iii) Multi-cloud application deployment. 
This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland

  3. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  4. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, future technologies for wavefront sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as fast steering mirrors, deformable mirrors, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integration and tests.

  5. NBodyLab: A Testbed for Undergraduates Utilizing a Web Interface to NEMO and MD-GRAPE2 Hardware

    NASA Astrophysics Data System (ADS)

    Johnson, V. L.; Teuben, P. J.; Penprase, B. E.

    An N-body simulation testbed called NBodyLab was developed at Pomona College as a teaching tool for undergraduates. The testbed runs under Linux and provides a web interface to selected back-end NEMO modeling and analysis tools, and several integration methods which can optionally use an MD-GRAPE2 supercomputer card in the server to accelerate calculation of particle-particle forces. The testbed provides a framework for using and experimenting with the main components of N-body simulations: data models and transformations, numerical integration of the equations of motion, analysis and visualization products, and acceleration techniques (in this case, special purpose hardware). The testbed can be used by students with no knowledge of programming or Unix, freeing such students and their instructor to spend more time on scientific experimentation. The advanced student can extend the testbed software and/or more quickly transition to the use of more advanced Unix-based toolsets such as NEMO, Starlab and model builders such as GalactICS. Cosmology students at Pomona College used the testbed to study collisions of galaxies with different speeds, masses, densities, collision angles, angular momentum, etc., attempting to simulate, for example, the Tadpole Galaxy and the Antenna Galaxies. The testbed framework is available as open-source to assist other researchers and educators. Recommendations are made for testbed enhancements.
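
    The particle-particle force loop and integrator that such a testbed wraps can be sketched as follows. This is a hypothetical, minimal Python illustration of a direct-sum kick-drift-kick leapfrog scheme (the O(N^2) force loop is the part that special-purpose hardware such as the MD-GRAPE2 accelerates); it is not NBodyLab or NEMO code, and all names are illustrative.

```python
import math

def accelerations(pos, mass, eps=0.01):
    """Direct-sum particle-particle gravitational accelerations (G = 1).
    This O(N^2) loop is what cards like the MD-GRAPE2 accelerate.
    `eps` is a softening length that tames close encounters."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = dx[0] ** 2 + dx[1] ** 2 + dx[2] ** 2 + eps ** 2
            f = mass[j] / (r2 * math.sqrt(r2))
            for k in range(3):
                acc[i][k] += f * dx[k]
    return acc

def leapfrog(pos, vel, mass, dt, steps):
    """Kick-drift-kick leapfrog, a typical N-body integration method."""
    acc = accelerations(pos, mass)
    for _ in range(steps):
        for i in range(len(pos)):
            for k in range(3):
                vel[i][k] += 0.5 * dt * acc[i][k]  # first half kick
                pos[i][k] += dt * vel[i][k]        # drift
        acc = accelerations(pos, mass)
        for i in range(len(pos)):
            for k in range(3):
                vel[i][k] += 0.5 * dt * acc[i][k]  # second half kick
    return pos, vel
```

    Because the pairwise forces are antisymmetric, total momentum is conserved to floating-point precision, which makes a convenient sanity check for a student experiment.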

  6. Mapping urban green open space in Bontang city using QGIS and cloud computing

    NASA Astrophysics Data System (ADS)

    Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar

    2018-04-01

    Digital mapping techniques are now freely and openly available, making map-based application development easier, faster, and cheaper. The rapid development of cloud computing geographic information systems (GIS) means such systems can help meet the community's need for online geospatial information. Urban Green Open Space (GOS) provides great benefits as an oxygen supplier and carbon sink, and contributes to the comfort and beauty of city life. This study proposes a platform application of GIS cloud computing (CC) for mapping the GOS of Bontang City. The GIS-CC platform uses freely available, open-source base maps. The research used a survey method to collect GOS data obtained from the Bontang City Government, while the application was developed with Quantum GIS and cloud computing. The results describe the existing GOS of Bontang City and the design of the GOS mapping application.

  7. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  8. A price- and time-slot-negotiation mechanism for Cloud service reservations.

    PubMed

    Son, Seokho; Sim, Kwang Mong

    2012-06-01

    When making reservations for Cloud services, consumers and providers need to establish service-level agreements through negotiation. Whereas it is essential for both a consumer and a provider to reach an agreement on the price of a service and when to use the service, to date, there is little or no negotiation support for both price and time-slot negotiations (PTNs) for Cloud service reservations. This paper presents a multi-issue negotiation mechanism to facilitate the following: 1) PTNs between Cloud agents and 2) tradeoff between price and time-slot utilities. Unlike many existing negotiation mechanisms in which a negotiation agent can only make one proposal at a time, agents in this work are designed to concurrently make multiple proposals in a negotiation round that generate the same aggregated utility, differing only in terms of individual price and time-slot utilities. Another novelty of this work is formulating a novel time-slot utility function that characterizes preferences for different time slots. These ideas are implemented in an agent-based Cloud testbed. Using the testbed, experiments were carried out to compare this work with related approaches. Empirical results show that PTN agents reach faster agreements and achieve higher utilities than other related approaches. A case study was carried out to demonstrate the application of the PTN mechanism for pricing Cloud resources.
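
    The tradeoff idea can be pictured with a small sketch. The Python fragment below is a hypothetical rendering, under assumed linear utility functions and equal issue weights, of how an agent could generate several (price, time-slot) proposals that share one aggregated utility; the function names and parameter choices are mine, not the paper's.

```python
def time_slot_utility(slot, preferred, horizon, beta=1.0):
    """Hypothetical time-slot utility: 1.0 at the preferred slot,
    decaying toward the edge of the reservation horizon."""
    return max(0.0, 1.0 - (abs(slot - preferred) / horizon) ** beta)

def price_utility(price, p_min, p_max):
    """Consumer-side price utility: 1 at the best price, 0 at the worst."""
    return (p_max - price) / (p_max - p_min)

def aggregate_utility(price, slot, p_min, p_max, preferred, horizon,
                      w_price=0.5, w_slot=0.5):
    """Weighted aggregation of the two issue utilities."""
    return (w_price * price_utility(price, p_min, p_max)
            + w_slot * time_slot_utility(slot, preferred, horizon))

def equal_utility_proposals(target, p_min, p_max, preferred, horizon, slots):
    """Enumerate (price, slot) proposals whose aggregated utility equals
    `target` (with equal weights of 0.5) -- the idea of concurrently
    offering multiple tradeoff proposals in one negotiation round."""
    proposals = []
    for slot in slots:
        u_slot = time_slot_utility(slot, preferred, horizon)
        u_price_needed = (target - 0.5 * u_slot) / 0.5  # invert aggregation
        if 0.0 <= u_price_needed <= 1.0:
            price = p_max - u_price_needed * (p_max - p_min)
            proposals.append((price, slot))
    return proposals
```

    Each proposal concedes on one issue and recoups on the other, so the proposing agent is indifferent among them while the counterpart can accept whichever it values most.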

  9. Secure open cloud in data transmission using reference pattern and identity with enhanced remote privacy checking

    NASA Astrophysics Data System (ADS)

    Vijay Singh, Ran; Agilandeeswari, L.

    2017-11-01

    Handling the large amounts of client data in an open cloud raises many security issues that must be addressed. A client's private data should not be revealed to other group members without the data owner's valid permission. Clients are also sometimes prevented from accessing open cloud servers by various restrictions. To overcome these security issues and restrictions related to storage, data sharing in an inter-domain network, and privacy checking, we propose a model based on identity-based cryptography for data transmission, an intermediate entity that holds the client's reference and identity and controls data transmission in an open cloud environment, and an extended remote privacy checking technique that operates on the admin side. Under the data owner's authority, the proposed model supports secure cryptographic data transmission and remote privacy checking in private, public, or instructed mode. The hardness of the Computational Diffie-Hellman assumption underlying the key exchange makes the proposed model more secure than existing models used in public cloud environments.

  10. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  11. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  12. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
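
    As a toy illustration of timing-based prognostics (not the authors' model-based algorithm), one can fit a trend to measured valve opening times and extrapolate it to a failure threshold. The linear degradation model and all names below are assumptions for the sketch.

```python
def fit_line(times, values):
    """Ordinary least-squares fit of values = a + b * times."""
    n = len(times)
    mt = sum(times) / n
    mv = sum(values) / n
    b = (sum((t - mt) * (v - mv) for t, v in zip(times, values))
         / sum((t - mt) ** 2 for t in times))
    a = mv - b * mt
    return a, b

def remaining_useful_life(timestamps, open_durations, failure_threshold):
    """Predict when the valve opening time will cross the failure
    threshold, assuming linear degradation (a strong simplification
    of a model-based fault-progression estimate)."""
    a, b = fit_line(timestamps, open_durations)
    if b <= 0:
        return float("inf")  # no degradation trend observed
    t_fail = (failure_threshold - a) / b
    return max(0.0, t_fail - timestamps[-1])
```

    A real prognoser would propagate uncertainty in the fault-progression estimate rather than report a single extrapolated crossing time.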

  13. Variable Dynamic Testbed Vehicle Dynamics Analysis

    DOT National Transportation Integrated Search

    1996-03-01

    Keywords: anti-roll bar, emulation, four-wheel steering, lateral response characteristics, simulation, variable dynamic testbed vehicle, Intelligent Vehicle Initiative (IVI). The Variable Dynamic Testbed Vehicle (VDTV) concept has been proposed as a tool...

  14. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other factors among the various modeling studies (e.g., grid configuration, meteorology, emission rates) differ as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are held identical, in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  15. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis, and is a companion to the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  16. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    .... 120322212-2212-01] Spectrum Sharing Innovation Test-Bed Pilot Program AGENCY: National Telecommunications... Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can... Spectrum Sharing Innovation Test-Bed (Test-Bed) pilot program to examine the feasibility of increased...

  17. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    The cloud radio access network (C-RAN) has become a promising architecture for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing-unit cloud have been decoupled from one another, so their resources are controlled independently. With the growing number of mobile Internet users, this traditional architecture cannot optimize and schedule resources for high-level service guarantees because of the communication barriers among the domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio frequency, optical network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated, demonstrating the efficiency of the MDRI-based proposal in terms of resource utilization, path blocking probability, network cost, and path provisioning latency compared with other provisioning schemes.
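
    The auxiliary-graph idea can be illustrated schematically: model radio, optical, and processing resources as layers of one weighted graph and select a minimum-cost end-to-end provisioning path. The sketch below is a generic shortest-path selection in Python, not the paper's RIP algorithm; node names and costs are invented.

```python
import heapq

def min_cost_path(graph, src, dst):
    """Dijkstra's algorithm over a layered auxiliary graph given as
    {node: [(neighbor, cost), ...]}.  Edge costs could encode radio,
    optical, and processing resource usage (invented weighting)."""
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue
        done.add(u)
        if u == dst:
            break
        for v, c in graph.get(u, []):
            nd = d + c
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path = [dst]
    while path[-1] != src:
        path.append(prev[path[-1]])
    return list(reversed(path)), dist[dst]

# Invented three-layer example: a remote radio head reaches a baseband
# pool over one of two optical routes.
demo = {
    "rrh": [("olt-A", 1.0), ("olt-B", 2.0)],
    "olt-A": [("bbu-pool", 5.0)],
    "olt-B": [("bbu-pool", 1.0)],
}
```

    In the paper's setting the edge weights would reflect the integrated multi-dimensional resource state rather than a single scalar cost.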

  20. New Educational Modules Using a Cyber-Distribution System Testbed

    DOE PAGES

    Xie, Jing; Bedoya, Juan Carlos; Liu, Chen-Ching; ...

    2018-03-30

    At Washington State University (WSU), a modern cyber-physical system testbed has been implemented based on an industry-grade distribution management system (DMS) that is integrated with remote terminal units (RTUs), smart meters, and a solar photovoltaic (PV) system. In addition, the real model of the Avista Utilities distribution system in Pullman, WA, is modeled in the DMS. The proposed testbed environment allows students and instructors to utilize these facilities for innovations in learning and teaching. For power engineering education, this testbed helps students understand the interaction between a cyber system and a physical distribution system through industrial-level visualization. The testbed provides a distribution system monitoring and control environment for students. Compared with a simulation-based approach, the testbed brings the students' learning environment a step closer to the real world. The educational modules allow students to learn the concepts of a cyber-physical system and an electricity market through an integrated testbed. Furthermore, the testbed provides a platform in the study mode for students to practice working on a real distribution system model. This paper describes the new educational modules based on the testbed environment. Three modules are described together with the underlying educational principles and associated projects.

  2. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communication links in order to study how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The open source IPsec software that was tested did not meet all the requirements, and software changes were suggested to meet them.

  3. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.

  4. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder using commercial routers and modems. Since many NASA assets are in low Earth orbit (LEO), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating with the Earth via a geosynchronous (GEO) communications satellite. Research results from this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.

  5. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those that are currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed, and to install some or all of the SPARSPAK package in the testbed. A suite of subroutines to extract from the data base the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.

  6. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document is the viewgraph slides of the presentation.

  7. Development of a space-systems network testbed

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit-switching communication nodes which can be adapted through software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the network.

  8. VLTI-PRIMA fringe tracking testbed

    NASA Astrophysics Data System (ADS)

    Abuter, Roberto; Rabien, Sebastian; Eisenhauer, Frank; Sahlmann, Johannes; Di Lieto, Nicola; Haug, Marcus; Wallander, Anders; Lévêque, Samuel; Ménardi, Serge; Delplancke, Françoise; Schuhler, Nicolas; Kellner, Stefan; Frahm, Robert

    2006-06-01

    One of the key components of the planned VLTI dual feed facility PRIMA is the Fringe Sensor Unit (FSU). Its basic function is the instantaneous measurement of the Optical Path Difference (OPD) between two beams. The FSU acts as the sensor for a complex control system involving optical delay lines and laser metrology with the aim of removing any OPD introduced by the atmosphere and the beam relay. We have initiated a cooperation between ESO and MPE with the purpose of systematically testing this Fringe Tracking Control System in a laboratory environment. This testbed facility is being built at MPE laboratories with the aim to simulate the VLTI and includes FSUs, OPD controller, metrology and in-house built delay lines. In this article we describe this testbed in detail, including the environmental conditions in the laboratory, and present the results of the testbed subsystem characterisation.

  9. The telerobot testbed: An architecture for remote servicing

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.

    1990-01-01

    The NASA/OAST Telerobot Testbed will reach its next increment in development by the end of FY-89. The testbed will have the capability for: force reflection in teleoperation, shared control, traded control, operator designate and relative update. These five capabilities will be shown in a module release and exchange operation using mockups of Orbital Replacement Units (ORU). This development of the testbed shows examples of the technologies needed for remote servicing, particularly under conditions of delay in transmissions to the servicing site. Here, the following topics are presented: the system architecture of the testbed which incorporates these telerobotic technologies for servicing, the implementation of the five capabilities and the operation of the ORU mockups.

  10. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of information on ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments.

  11. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of information on ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments. PMID:26690175

  12. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which makes it possible to accommodate the services in optical networks. This study extends that work to consider the multi-dimensional optimization of radio, optical and BBU processing resources in the 5G era. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical and BBU resources effectively to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated based on the MSRO architecture in terms of resource occupation rate and path provisioning latency, compared with other provisioning schemes.

  13. Development, Demonstration, and Control of a Testbed for Multiterminal HVDC System

    DOE PAGES

    Li, Yalong; Shi, Xiaojie M.; Liu, Bo; ...

    2016-10-21

    This paper presents the development of a scaled four-terminal high-voltage direct current (HVDC) testbed, including hardware structure, communication architecture, and different control schemes. The developed testbed is capable of emulating typical operation scenarios including system start-up, power variation, line contingency, and converter station failure. Some unique scenarios are also developed and demonstrated, such as online control mode transition and station re-commissioning. In particular, a dc line current control is proposed, through the regulation of a converter station at one terminal. By controlling a dc line current to zero, the transmission line can be opened by using relatively low-cost HVDC disconnects with low current interrupting capability, instead of the more expensive dc circuit breaker. Utilizing the dc line current control, an automatic line current limiting scheme is developed. As a result, when a dc line is overloaded, the line current control is automatically activated to regulate the current within the allowable maximum value.
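    The zero-current principle above can be illustrated with a toy closed-loop sketch (an illustrative assumption, not the paper's actual converter control or parameters): a PI regulator drives the current in a lumped R-L line model to zero, at which point a low-current-capability disconnect could open the line.

```python
# Toy sketch of driving a dc line current to zero with a PI regulator.
# The R-L line model and all gains are illustrative assumptions, not the
# testbed's actual converter control.

def regulate_to_zero(i0=1.0, r=1.0, l=0.1, kp=2.0, ki=20.0, dt=1e-3, steps=5000):
    """Return the line current after `steps` of PI regulation toward zero."""
    i, integ = i0, 0.0
    for _ in range(steps):
        e = 0.0 - i                # error against the zero-current reference
        integ += e * dt            # integral term accumulates the error
        v = kp * e + ki * integ    # PI voltage command at one terminal
        i += dt * (v - r * i) / l  # explicit Euler step of L di/dt = v - R i
    return i

if __name__ == "__main__":
    print(abs(regulate_to_zero()) < 1e-3)
```

    Once the regulated current is essentially zero, the disconnect interrupts negligible current, which is the premise for avoiding a full dc circuit breaker.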

  14. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    NASA Astrophysics Data System (ADS)

    Maloney, Thomas M.; Prokopius, Paul R.; Voecks, Gerald E.

    1995-01-01

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway.

  15. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to the predicted sea level rise. In a Policy Fact Sheet, "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was set in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resultant High Resolution Flood Information System addressed access and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues.

  16. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  17. The ac power system testbed

    NASA Technical Reports Server (NTRS)

    Mildice, J.; Sundberg, R.

    1987-01-01

    The objective of this program was to design, build, test, and deliver a high-frequency (20 kHz) Power System Testbed which would electrically approximate a single, separable power channel of an IOC Space Station. That program is described, including the technical background, and the results are discussed, showing that the major assumptions about the characteristics of this class of hardware (size, mass, efficiency, control, etc.) were substantially correct. This testbed equipment was completed and delivered and is being operated as part of the Space Station Power System Test Facility.

  18. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    The establishment of propfan technology readiness was addressed and candidate drive systems for propfan application were identified. Candidate testbed aircraft were investigated for suitability, and four aircraft were selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed, and the Boeing KC-135A and the Gulfstream American Gulfstream II were recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed, and cost and schedule data for the entire testbed program were generated. The program total cost was estimated, and a wind tunnel program cost and schedule were generated in support of the testbed program.

  19. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the next generation space telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing final obtained wavefronts from different tests. The Developmental Cryogenic Active Telescope Testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryocapable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments, cryogenic, low hold power actuators, or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design will be discussed.

  20. Exploration Systems Health Management Facilities and Testbed Workshop

    NASA Technical Reports Server (NTRS)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).

  1. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schrenkenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  2. Development of a Cloud Resolving Model for Heterogeneous Supercomputers

    NASA Astrophysics Data System (ADS)

    Sreepathi, S.; Norman, M. R.; Pal, A.; Hannah, W.; Ponder, C.

    2017-12-01

    A cloud resolving climate model is needed to reduce major systematic errors in climate simulations due to structural uncertainty in numerical treatments of convection, such as convective storm systems. This research describes the porting effort to enable the SAM (System for Atmosphere Modeling) cloud resolving model on heterogeneous supercomputers using GPUs (Graphical Processing Units). We have isolated a standalone configuration of SAM that is targeted to be integrated into the DOE ACME (Accelerated Climate Modeling for Energy) Earth System model. We have identified key computational kernels from the model and offloaded them to a GPU using the OpenACC programming model. Furthermore, we are investigating various optimization strategies intended to enhance GPU utilization, including loop fusion/fission, coalesced data access and loop refactoring to a higher abstraction level. We will present early performance results, lessons learned, and optimization strategies. The computational platform used in this study is the Summitdev system, an early testbed that is one generation removed from Summit, the next leadership-class supercomputer at Oak Ridge National Laboratory. The system contains 54 nodes, each with 2 IBM POWER8 CPUs and 4 NVIDIA Tesla P100 GPUs. This work is part of a larger project, the ACME-MMF component of the U.S. Department of Energy (DOE) Exascale Computing Project. The ACME-MMF approach addresses structural uncertainty in cloud processes by replacing traditional parameterizations with cloud resolving "superparameterization" within each grid cell of the global climate model. Superparameterization dramatically increases arithmetic intensity, making the MMF approach an ideal strategy to achieve good performance on emerging exascale computing architectures. The goal of the project is to integrate superparameterization into ACME, and explore its full potential to scientifically and computationally advance climate simulation and prediction.

  3. ooi: OpenStack OCCI interface

    NASA Astrophysics Data System (ADS)

    López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo

    In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
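    OCCI's text rendering identifies resource types with Category terms qualified by a scheme URI and a class. The sketch below (an illustrative assumption, not ooi's actual code) renders and parses such a Category value in the style of the OCCI HTTP rendering:

```python
# Render and parse an OCCI-style Category value, e.g.
#   compute; scheme="http://schemas.ogf.org/occi/infrastructure#"; class="kind"
# This is a minimal illustration, not the ooi implementation.

def render_category(term, scheme, cls):
    """Build a Category header value for the given term/scheme/class."""
    return f'{term}; scheme="{scheme}"; class="{cls}"'

def parse_category(header):
    """Split a Category value back into its term and attributes."""
    term, *attrs = [part.strip() for part in header.split(";")]
    parsed = {"term": term}
    for attr in attrs:
        key, _, value = attr.partition("=")
        parsed[key] = value.strip('"')  # drop the surrounding quotes
    return parsed

cat = render_category("compute", "http://schemas.ogf.org/occi/infrastructure#", "kind")
print(parse_category(cat)["term"])  # compute
```

    A round trip like this is the kind of translation an OCCI interface layer performs between HTTP headers and the middleware's internal resource model.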

  4. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards the adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs of evolving global space systems.
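    As a flavor of the bit-level framing these CCSDS protocols involve, the sketch below packs and parses a CCSDS Space Packet primary header (6 bytes: version, type, secondary-header flag, APID, sequence flags, sequence count, and data length minus one). It is a generic illustration of the packet format, not the SCaN Testbed flight software.

```python
# Pack/unpack a CCSDS Space Packet primary header (6 bytes, big-endian).
# Generic illustration of the format; not SCaN Testbed code.
import struct

def pack_primary_header(apid, seq_count, data_len,
                        version=0, ptype=0, sec_hdr=0, seq_flags=0b11):
    """Build the 6-byte primary header for a packet with data_len data bytes."""
    word1 = (version << 13) | (ptype << 12) | (sec_hdr << 11) | (apid & 0x7FF)
    word2 = (seq_flags << 14) | (seq_count & 0x3FFF)
    return struct.pack(">HHH", word1, word2, data_len - 1)  # length field is len-1

def unpack_apid(header):
    """Recover the 11-bit Application Process ID from a packed header."""
    word1, = struct.unpack(">H", header[:2])
    return word1 & 0x7FF

hdr = pack_primary_header(apid=0x123, seq_count=7, data_len=64)
print(len(hdr), hex(unpack_apid(hdr)))  # 6 0x123
```

    The same pack-fields-into-fixed-bit-positions pattern recurs across the AOS, Encapsulation Packet, and DTN bundle formats, which is why software defined radios and flight computers can implement them interchangeably.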

  5. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research of issues such as zero-g robot control, dual arm teleoperation, simulations, and hierarchical control using a high level programming language. The testbed will be used to investigate these high risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual arm teleoperation of a pair of 7 degree-of-freedom (DOF) manipulators, each with their own 6-DOF mini-master hand controllers. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low level features. High speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real time dynamic simulation of the testbed operations, allowing a quick and safe means for testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  6. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  7. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    NASA Astrophysics Data System (ADS)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit, about a half double rack, establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd generation payload, that is payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-realtime payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites, and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  8. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    The eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. It is intended to serve a continuing effort to develop means for interactions between human brain waves and computers. The software library associated with the testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes can thus be investigated in ways not previously possible.

  9. Development of Hardware-in-the-loop Microgrid Testbed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R

    2015-01-01

    A hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems is presented in this paper. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling of the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of function testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  10. Hybrid cloud: bridging of private and public cloud computing

    NASA Astrophysics Data System (ADS)

    Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol

    2018-05-01

    Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by Information Technology (IT) based startup companies to grow their business. However, most businesses' awareness of data security issues is low, since some Cloud Service Providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source, a characteristic of secure cloud computing models, and thus HCDM may address these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build an actual server and node, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of HCDM were conducted successfully; in addition to having good security, HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open-source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving major security issues of IT-based startup companies, especially in Indonesia.

  11. BEATBOX v1.0: Background Error Analysis Testbed with Box Models

    NASA Astrophysics Data System (ADS)

    Knote, Christoph; Barré, Jérôme; Eckl, Max

    2018-02-01

    The Background Error Analysis Testbed (BEATBOX) is a new data assimilation framework for box models. Based on the BOX Model eXtension (BOXMOX) to the Kinetic Pre-Processor (KPP), this framework allows users to conduct performance evaluations of data assimilation experiments, sensitivity analyses, and detailed chemical scheme diagnostics from an observation simulation system experiment (OSSE) point of view. The BEATBOX framework incorporates an observation simulator and a data assimilation system with the possibility of choosing ensemble, adjoint, or combined sensitivities. A user-friendly, Python-based interface allows for the tuning of many parameters for atmospheric chemistry and data assimilation research as well as for educational purposes, for example observation error, model covariances, ensemble size, perturbation distribution in the initial conditions, and so on. In this work, the testbed is described and two case studies are presented to illustrate the design of a typical OSSE experiment, data assimilation experiments, a sensitivity analysis, and a method for diagnosing model errors. BEATBOX is released as an open source tool for the atmospheric chemistry and data assimilation communities.
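    A toy scalar sketch of the OSSE idea described above (an illustrative assumption; this is not BEATBOX's actual interface): simulate an observation of a known truth with a prescribed observation error, then blend it with a background guess using a gain built from the background and observation error variances.

```python
# Scalar observation-simulation-and-analysis toy, in the spirit of an OSSE.
# All values and the scalar gain are illustrative assumptions, not BEATBOX code.
import random

def assimilate(truth, background, sig_b, sig_o, seed=42):
    """Simulate one observation of `truth` and update `background` with it."""
    rng = random.Random(seed)
    obs = truth + rng.gauss(0.0, sig_o)      # simulated noisy observation
    gain = sig_b**2 / (sig_b**2 + sig_o**2)  # weight from error variances
    analysis = background + gain * (obs - background)
    return analysis

analysis = assimilate(truth=10.0, background=8.0, sig_b=2.0, sig_o=0.5)
```

    Because the truth is known by construction, the experimenter can measure how much the analysis improves on the background, which is exactly the kind of diagnostic an OSSE framework automates across full chemical schemes.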

  12. Kite: status of the external metrology testbed for SIM

    NASA Astrophysics Data System (ADS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar S.; Azizi, Alireza; Moser, Steven J.; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-10-01

    Kite is a system-level testbed for the External Metrology System of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to an accuracy of tens of picometers in order to correct for thermal deformations and attitude changes of the spacecraft. Because of the need for such high-precision measurements, the Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is a redundant metrology truss in which 6 lengths are measured, but only 5 are needed to define the system. The RMS error between the redundant measurements needs to be less than 140 pm for the SIM Wide-Angle observing scenario and less than 8 pm for the Narrow-Angle observing scenario. With our current testbed layout, we have achieved an RMS of 85 pm in the Wide-Angle case, meeting the goal. For the Narrow-Angle case, we have reached 5.8 pm, but only for on-axis observations. We describe the testbed improvements made since our initial results, and outline future Kite changes that will add further effects that SIM faces, in order to make the testbed more representative of SIM.
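    The redundancy check described above can be illustrated with a toy calculation: when one more length is measured than is needed to define the system, a least-squares fit exposes the disagreement among gauges as a residual. The geometry below (five independent lengths plus one redundant gauge measuring their sum) is a simplified stand-in for the actual Kite truss, and the noise level is an arbitrary assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# 5 independent lengths fully define the toy system; a 6th, redundant
# gauge measures their sum (stand-in for the real Kite geometry).
A = np.vstack([np.eye(5), np.ones((1, 5))])
true_lengths = rng.uniform(1.0, 2.0, 5)

gauge_noise = 1e-4  # per-gauge measurement error (arbitrary units)
measured = A @ true_lengths + rng.normal(0.0, gauge_noise, 6)

# Least-squares fit of the 5 lengths to all 6 gauge readings;
# the residual RMS is the redundant-measurement disagreement
fit, *_ = np.linalg.lstsq(A, measured, rcond=None)
residuals = measured - A @ fit
rms = float(np.sqrt(np.mean(residuals**2)))
print(f"RMS disagreement among redundant gauges: {rms:.2e}")
```

    In Kite the same idea is applied with picometer-level metrology: the RMS disagreement among the six gauges is the figure compared against the 140 pm and 8 pm requirements.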

  13. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation-related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The growth in the number of applications and in the associated data communications will expose problems and issues to resolve. NASA Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time, in a sequence similar to what would occur in the NAS.
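    As a rough sketch of how such a communications load scales, the aggregate offered datalink load is simply the product of aircraft count, message size, and message rate. The message size and rate below are illustrative assumptions, not NAS-standard values.

```python
# Toy datalink load estimate for an emulated airspace: N aircraft each
# sending periodic ATC and surveillance messages. Message sizes and
# rates are illustrative assumptions, not NAS-standard values.
def offered_load_bps(n_aircraft, msg_bytes=100, msgs_per_sec=2.0):
    """Aggregate offered load in bits per second."""
    return n_aircraft * msg_bytes * 8 * msgs_per_sec

print(offered_load_bps(160))  # 160 aircraft -> 256000.0 bps
```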

  14. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.

  15. Distributed computing testbed for a remote experimental environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butner, D.N.; Casper, T.A.; Howard, B.C.

    1995-09-18

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady-state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high-speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls, and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remote participation in the operation of a large-scale experimental facility.

  16. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
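    The intercomparison of successive images reduces, at its simplest, to classifying each image and differencing the result over the time interval. The sketch below uses a toy brightness-threshold classifier in place of the typical shape function method described above; all thresholds, sizes, and the synthetic "scenes" are illustrative assumptions.

```python
import numpy as np

def cloud_fraction(image, threshold=0.5):
    """Fraction of pixels classified as cloud (bright) in a normalized
    visible-band image; a toy stand-in for the statistical classifier."""
    return float(np.mean(image > threshold))

def time_rate_of_change(img_t0, img_t1, dt_minutes=30.0):
    """Estimate d(cloud fraction)/dt per hour from two successive
    images by finite difference, as in successive-image intercomparison."""
    df = cloud_fraction(img_t1) - cloud_fraction(img_t0)
    return df / (dt_minutes / 60.0)

rng = np.random.default_rng(0)
scene0 = rng.random((64, 64))          # synthetic normalized image
scene1 = np.clip(scene0 + 0.1, 0, 1)   # scene brightens: cloud grows
print(time_rate_of_change(scene0, scene1))  # positive: cloud fraction grew
```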

  17. Starlight suppression from the starshade testbed at NGAS

    NASA Astrophysics Data System (ADS)

    Samuele, Rocco; Glassman, Tiffany; Johnson, Adam M. J.; Varshneya, Rupal; Shipley, Ann

    2009-08-01

    We report on progress at the Northrop Grumman Aerospace Systems (NGAS) starshade testbed. The starshade testbed is a 42.8 m vacuum chamber designed to replicate the Fresnel number of an equivalent full-scale starshade mission, namely the flagship New Worlds Observer (NWO) configuration. Subscale starshades manufactured by the NGAS foundry have shown 10^-7 starlight suppression at an equivalent full-mission inner working angle of 85 milliarcseconds. In this paper, we present an overview of the experimental setup, scaling relationships to an equivalent full-scale mission, and preliminary results from the testbed. We also discuss potential limitations of the current generation of starshades and improvements for the future.

  18. Hybrid Lyot coronagraph for WFIRST: high-contrast broadband testbed demonstration

    NASA Astrophysics Data System (ADS)

    Seo, Byoung-Joon; Cady, Eric; Gordon, Brian; Kern, Brian; Lam, Raymond; Marx, David; Moody, Dwight; Muller, Richard; Patterson, Keith; Poberezhskiy, Ilya; Mejia Prada, Camilo; Sidick, Erkin; Shi, Fang; Trauger, John; Wilson, Daniel

    2017-09-01

    The Hybrid Lyot Coronagraph (HLC) is one of the two operating modes of the Wide-Field InfraRed Survey Telescope (WFIRST) coronagraph instrument. Since being selected by the National Aeronautics and Space Administration (NASA) in December 2013, the coronagraph technology has been matured toward Technology Readiness Level (TRL) 6 by 2018. To demonstrate starlight suppression in the presence of the expected on-orbit input wavefront disturbances, we built a dynamic testbed at the Jet Propulsion Laboratory (JPL) in 2016. This testbed, named the Occulting Mask Coronagraph (OMC) testbed, is designed to be analogous to the WFIRST flight instrument architecture: it has both the HLC and the Shaped Pupil Coronagraph (SPC) architectures, as well as the Low Order Wavefront Sensing and Control (LOWFS/C) subsystem to sense and correct dynamic wavefront disturbances. We present up-to-date progress of the HLC mode demonstration in the OMC testbed; SPC results will be reported separately. We inject flight-like Line of Sight (LoS) and Wavefront Error (WFE) perturbations into the OMC testbed and demonstrate wavefront control using two deformable mirrors while the LOWFS/C corrects those perturbations in our vacuum testbed. As a result, we obtain repeatable convergence below 5 × 10^-9 mean contrast with 10% broadband light centered at 550 nm in the 360-degree dark hole with working angle between 3 λ/D and 9 λ/D. We present the key hardware and software used in the testbed, the performance results, and their comparison to model expectations.

  19. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.

  20. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic researchers. The SSERVI Analog Regolith Simulant Testbed gives research scientists and engineers the opportunity to conduct regolith analog research for planetary exploration. This capability is essential for understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) the scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. The testbed consolidates the tasks of acquisition, storage, and safety mitigation involved in handling large quantities of regolith simulant. Hardware and environment testing scenarios include, but are not limited to: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profiles, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academic and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  1. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  2. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology System of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal deformations and attitude changes of the spacecraft. Because of the need for such high-precision measurements, the Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system in which 6 lengths are measured, but only 5 are needed to determine the system. The agreement among the over-constrained lengths needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have reached the goal only for on-axis observations. We describe the testbed improvements made since our initial results, and outline future Kite changes that will add further effects that SIM faces, in order to make the testbed more SIM-like.

  3. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Mike; Munson, Mike; Teate, George

    2006-01-01

    A new testbed for hypersonic flight research is proposed. Known as the Phoenix air-launched small missile (ALSM) flight testbed, it was conceived to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of two unique and very capable flight assets: the United States Navy Phoenix AIM-54 long-range, guided air-to-air missile and the NASA Dryden F-15B testbed airplane. The U.S. Navy retirement of the Phoenix AIM-54 missiles from fleet operation has presented an excellent opportunity for converting this valuable flight asset into a new flight testbed. This cost-effective new platform will fill an existing gap in the test and evaluation of current and future hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform. When launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will be valuable for the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite small-payload air-launched space boosters.

  4. Turning in the Testbed

    NASA Image and Video Library

    2004-01-13

    This image, taken in the JPL In-Situ Instruments Laboratory or Testbed, shows the view from the front hazard avoidance cameras on the Mars Exploration Rover Spirit after the rover has backed up and turned 45 degrees counterclockwise.

  5. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code), which improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to run these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and the introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality code can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.
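    The scripted fault injection and time-jumping facilities can be illustrated with a minimal steppable simulator. The class below is a toy sketch, not the RAPSSTER interface; the fault name and scheduling scheme are invented for illustration.

```python
class ToySimulator:
    """Minimal sketch of a steppable simulation with scripted fault
    injection and time-jumps (illustrative; not the RAPSSTER API)."""
    def __init__(self):
        self.t = 0.0
        self.actuator_ok = True
        self.faults = {}  # scheduled fault time -> fault name

    def schedule_fault(self, t, name):
        self.faults[t] = name

    def step(self, dt=1.0):
        # Advance simulated time and trip any faults whose time has come
        self.t += dt
        for ft, name in self.faults.items():
            if self.t >= ft and name == "actuator_stuck":
                self.actuator_ok = False

    def jump(self, t):
        # Time-jump forwards or backwards; re-evaluate scheduled faults
        self.t = t
        self.actuator_ok = all(self.t < ft or name != "actuator_stuck"
                               for ft, name in self.faults.items())

sim = ToySimulator()
sim.schedule_fault(5.0, "actuator_stuck")
for _ in range(10):
    sim.step()
print(sim.actuator_ok)   # False: fault tripped at t = 5
sim.jump(2.0)            # jump backwards to before the fault
print(sim.actuator_ok)   # True
```

    A scripted test harness can then construct many such runs with varying fault times and parameters, which is the kind of unattended batch testing the abstract describes.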

  6. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building...center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source...software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift)

  7. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  8. The computational structural mechanics testbed generic structural-element processor manual

    NASA Technical Reports Server (NTRS)

    Stanley, Gary M.; Nour-Omid, Shahram

    1990-01-01

    The usage and development of structural finite element processors based on the CSM Testbed's Generic Element Processor (GEP) template is documented. By convention, such processors have names of the form ESi, where i is an integer. This manual is therefore intended for both Testbed users who wish to invoke ES processors during the course of a structural analysis, and Testbed developers who wish to construct new element processors (or modify existing ones).

  9. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  10. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    NASA Technical Reports Server (NTRS)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achieving this objective requires the development of a new class of ground-based autonomy testbeds that enable the low-cost and rapid design, test, and integration of spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space 1 comet/asteroid rendezvous mission.

  11. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  12. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

    To fulfil JINR's commitments in national and international projects involving modern information technologies such as cloud and grid computing, and to provide a modern tool for JINR users in their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover users' increasing needs for capacity, availability, and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.
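    In OpenNebula, the kind of virtual machine request described above is expressed as a VM template. The fragment below is a generic illustration of the template syntax; the image and network names are hypothetical and not taken from the JINR installation.

```
# Hypothetical OpenNebula VM template; image and network names are illustrative
NAME     = "user-vm"
CPU      = 1
VCPU     = 2
MEMORY   = 4096
DISK     = [ IMAGE = "base-image", IMAGE_UNAME = "oneadmin" ]
NIC      = [ NETWORK = "lab-net" ]
GRAPHICS = [ TYPE = "vnc", LISTEN = "0.0.0.0" ]
```

    Such a template would typically be registered and instantiated on the front-end host with the `onetemplate create` and `onetemplate instantiate` commands.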

  13. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a nearly fully automated, end-to-end satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a difficult and time-consuming one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration), it is important to reflect on these experiences and to collect the knowledge that has been gained so that improvements can be made to the existing system. These experiences are also believed to be of value to others in the robotics community. Therefore, the primary objective here is to use the Telerobot Testbed as a case study to identify real problems and technological gaps that exist in robotics and, in particular, in systems integration. Such problems have surely hindered the development of what could reasonably be called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  14. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  15. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program is also compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.

  16. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud-contaminated radiances: (1) cloud-clearing using spatially adjacent cloud-contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud-clearing extracts the radiance arising from the clear-air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field-of-view case. However, the accuracy of the clear-air radiance signal depends upon the uniformity of cloud height and optical properties across the two fields of view used in the cloud-clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud-top levels, with the accuracy near the cloud top dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud-top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model, using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews these three ways to extract profile information from cloud-contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used.
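    The first of the three approaches, adjacent-FOV cloud-clearing, can be sketched with a toy calculation. In the classic formulation, two neighboring fields of view are assumed to share the same clear-sky and cloud radiances and to differ only in cloud fraction, so a linear combination of the two recovers the clear radiance. The radiance values and cloud fractions below are illustrative, not instrument data.

```python
import numpy as np

# Toy adjacent-FOV cloud-clearing (the "N*" method): two fields of view
# share the same clear-sky and cloudy radiances but differ in cloud
# fraction, so a linear combination recovers the clear radiance.
n_channels = 5
r_clear = np.linspace(80.0, 100.0, n_channels)   # true clear radiances
r_cloud = 0.5 * r_clear                          # opaque-cloud radiances
f1, f2 = 0.2, 0.6                                # cloud fractions, FOV1 < FOV2

fov1 = (1 - f1) * r_clear + f1 * r_cloud
fov2 = (1 - f2) * r_clear + f2 * r_cloud

# Estimate N* from one "window" channel where the clear radiance is known
k = 0
n_star = (fov1[k] - r_clear[k]) / (fov2[k] - r_clear[k])

recovered = (fov1 - n_star * fov2) / (1 - n_star)
print(np.allclose(recovered, r_clear))  # True: clear column recovered
```

    In this idealized two-FOV case the recovery is exact; the abstract's caveat applies when cloud height or optical properties differ between the two fields of view, breaking the shared-radiance assumption.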

  17. In-Space Networking On NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David; Eddy, Wesley M.; Clark, Gilbert J., III; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios (SDRs) and a programmable flight computer. The purpose of the Testbed is to conduct in-space research in the areas of communication, navigation, and networking in support of NASA missions and communication infrastructure. Multiple reprogrammable elements in the end-to-end system, along with several communication paths and a semi-operational environment, provide a unique opportunity to explore networking concepts and protocols envisioned for the future Solar System Internet (SSI). This paper will provide a general description of the system's design and the networking protocols implemented and characterized on the testbed, including Encapsulation, IP over CCSDS, and Delay-Tolerant Networking (DTN). Due to the research nature of the implementation, flexibility and robustness are considered in the design to enable expansion for future adaptive and cognitive techniques. Following a detailed design discussion, lessons learned and suggestions for future missions and communication infrastructure elements will be provided. Plans for the evolving research on the SCaN Testbed as it moves towards a more adaptive, autonomous system will be discussed.

  18. Integrated Network Testbed for Energy Grid Research and Technology

    Science.gov Websites

    Under the Integrated Network Testbed for Energy Grid Research and Technology Experimentation (INTEGRATE) project, NREL and partners completed five successful technology demonstrations at the ESIF. INTEGRATE is a $6.5-million, cost

  19. A Novel UAV Electric Propulsion Testbed for Diagnostics and Prognostics

    NASA Technical Reports Server (NTRS)

    Gorospe, George E., Jr.; Kulkarni, Chetan S.

    2017-01-01

    This paper presents a novel hardware-in-the-loop (HIL) testbed for systems level diagnostics and prognostics of an electric propulsion system used in UAVs (unmanned aerial vehicles). Referencing the all-electric Edge 540T aircraft used in science and research by NASA Langley Flight Research Center, the HIL testbed includes an identical propulsion system, consisting of motors, speed controllers, and batteries. Isolated under a controlled laboratory environment, the propulsion system has been instrumented for advanced diagnostics and prognostics. To produce flight-like loading on the system, a slave motor is coupled to the motor under test (MUT) and provides variable mechanical resistance, and the capability of introducing nondestructive mechanical wear-like frictional loads on the system. This testbed enables the verification of mathematical models of each component of the propulsion system, the repeatable generation of flight-like loads on the system for fault analysis, test-to-failure scenarios, and the development of advanced system level diagnostics and prognostics methods. The capabilities of the testbed are extended through the integration of a LabVIEW-based client for the Live Virtual Constructive Distributed Environment (LVCDC) Gateway, which enables both the publishing of generated data for remotely located observers and prognosers and the synchronization of the testbed propulsion system with vehicles in the air. The developed HIL testbed gives researchers easy access to a scientifically relevant portion of the aircraft without the overhead and dangers encountered during actual flight.

  20. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Michael T.; Munson, Michael J.; Teate, George F.

    2007-01-01

    The Phoenix Air-Launched Small Missile (ALSM) flight testbed was conceived and is proposed to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of the United States Navy Phoenix AIM-54 (Hughes Aircraft Company, now Raytheon Company, Waltham, Massachusetts) long-range, guided air-to-air missile and the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (Edwards, California) F-15B (McDonnell Douglas, now the Boeing Company, Chicago, Illinois) testbed airplane. The retirement of the Phoenix AIM-54 missiles from fleet operation has presented an opportunity for converting this flight asset into a new flight testbed. This cost-effective new platform will fill the gap in the test and evaluation of hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform; when launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will assist the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite-small-payload air-launched space boosters.

  1. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  2. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  3. Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy

    PubMed Central

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-01-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command line interface, and the web-based Galaxy interface. PMID:22700313

  4. Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.

    PubMed

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-06-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.

  5. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel; Hottovy, Scott

    2017-11-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes (open versus closed cells) fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells (POCs) as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. Similar viewpoints of deep convection and self-organized criticality will also be discussed. With these new conceptual viewpoints, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions. The research of S.N.S. is partially supported by a Sloan Research Fellowship, ONR Young Investigator Award N00014-12-1-0744, and ONR MURI Grant N00014-12-1-0912.
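    The phase-transition picture can be illustrated, very loosely, with a generic bistable stochastic diffusion; the double-well potential and every parameter below are invented stand-ins, not the authors' model for atmospheric water.

```python
import math
import random

# A generic bistable stochastic-diffusion sketch, meant only to illustrate
# the "two regimes as two phases" picture in the abstract. The double-well
# potential V(q) = (q^2 - 1)^2 / 4 and all parameters are invented.
random.seed(0)

def drift(q):
    # Deterministic part: -V'(q), which pushes q toward the wells at +/-1.
    return -q * (q * q - 1.0)

def simulate(q0, steps=20000, dt=1e-3, noise=0.4):
    """Euler-Maruyama integration of dq = drift(q) dt + noise dW."""
    q = q0
    for _ in range(steps):
        q += drift(q) * dt + noise * math.sqrt(dt) * random.gauss(0.0, 1.0)
    return q

# For weak noise, trajectories started in either well tend to linger near
# q = -1 or q = +1 (the two "phases"); stronger noise blurs the regimes,
# loosely analogous to the higher-variability shallow-cumulus regime.
print(round(simulate(-1.0), 2), round(simulate(1.0), 2))
```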

  6. Smart Antenna UKM Testbed for Digital Beamforming System

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Misran, Norbahiah; Yatim, Baharudin

    2009-12-01

    A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is based on a modular design employing two novel components: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna uses the novel LIEH microstrip patch element arranged into a uniform linear array. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm in an independent manner, thus allowing the smart antenna system to be developed and tested in parallel, which reduces the design time. The DBS was developed using a high-performance floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to connect the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88-2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.
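    As a toy illustration of what a digital beamforming system of this kind computes, a conventional delay-and-sum beamformer for a four-element uniform linear array can be sketched as follows; the 2 GHz carrier and half-wavelength spacing are illustrative assumptions, not measured testbed parameters.

```python
import cmath
import math

# Toy delay-and-sum (conventional) digital beamformer for a four-element
# uniform linear array. The 2 GHz carrier (inside the testbed's
# 1.88-2.22 GHz band) and half-wavelength spacing are assumptions.
N = 4                      # number of array elements
LAM = 3e8 / 2.0e9          # wavelength at an assumed 2 GHz carrier
D = LAM / 2                # assumed half-wavelength element spacing

def steering_vector(theta_deg):
    """Per-element phase of a plane wave arriving from angle theta."""
    th = math.radians(theta_deg)
    return [cmath.exp(-2j * math.pi * D / LAM * k * math.sin(th)) for k in range(N)]

def response(weights, theta_deg):
    """Magnitude of the beamformer output for a unit wave from theta."""
    a = steering_vector(theta_deg)
    return abs(sum(w.conjugate() * ai for w, ai in zip(weights, a)))

w = steering_vector(20.0)          # steer the main beam toward 20 degrees
print(round(response(w, 20.0), 3)) # 4.0: the four elements add coherently
```

In a testbed like the one described, the same weighting would be applied in the DSP to the four digitized RF channels rather than to simulated samples.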

  7. Thermal structure analyses for CSM testbed (COMET)

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Mei, Chuh

    1994-01-01

    This document is the final report for the project entitled 'Thermal Structure Analyses for CSM Testbed (COMET),' for the period of May 16, 1992 - August 15, 1994. The project was focused on the investigation and development of finite element analysis capability of the computational structural mechanics (CSM) testbed (COMET) software system in the field of thermal structural responses. The stages of this project consisted of investigating present capabilities, developing new functions, analysis demonstrations, and research topics. The appendices of this report list the detailed documents of major accomplishments and demonstration runstreams for future references.

  8. Sensor Networking Testbed with IEEE 1451 Compatibility and Network Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Gurkan, Deniz; Yuan, X.; Benhaddou, D.; Figueroa, F.; Morris, Jonathan

    2007-01-01

    Design and implementation of a testbed for testing and verifying IEEE 1451-compatible sensor systems with network performance monitoring is of significant importance. Measurement of performance parameters, together with the implementation of decision support systems, will enhance the understanding of sensor systems with plug-and-play capabilities. The paper presents the design aspects of such a testbed environment under development at the University of Houston in collaboration with NASA Stennis Space Center's SSST (Smart Sensor System Testbed).

  9. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  10. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: the NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  11. Wavefront control performance modeling with WFIRST shaped pupil coronagraph testbed

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Nemati, Bijian; Krist, John; Cady, Eric; Kern, Brian; Poberezhskiy, Ilya

    2017-09-01

    NASA's WFIRST mission includes a coronagraph instrument (CGI) for direct imaging of exoplanets. Significant improvement in CGI model fidelity has been made recently, alongside a testbed high contrast demonstration in a simulated dynamic environment at JPL. We present our modeling method and results of comparisons to testbed's high order wavefront correction performance for the shaped pupil coronagraph. Agreement between model prediction and testbed result at better than a factor of 2 has been consistently achieved in raw contrast (contrast floor, chromaticity, and convergence), and with that comes good agreement in contrast sensitivity to wavefront perturbations and mask lateral shear.

  12. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; hide

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  13. Overview of the Telescience Testbed Program

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad; Leiner, Barry M.

    1991-01-01

    NASA's Telescience Testbed Program (TTP), conducted by the Ames Research Center, is described with particular attention to the objectives, the approach used to achieve these objectives, and the expected benefits of the program. The goal of the TTP is to gain operational experience for the Space Station Freedom and the Earth Observing System programs, using ground testbeds, and to define the information and communication systems requirements for the development and operation of these programs. The results of the TTP are expected to include the requirements for remote coaching, command and control, monitoring and maintenance, payload design, and operations management. In addition, requirements for technologies such as workstations, software, video, automation, data management, and networking will be defined.

  14. The Mini-Mast CSI testbed: Lessons learned

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-01-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.

  15. Adaptive controller for a strength testbed for aircraft structures

    NASA Astrophysics Data System (ADS)

    Laperdin, A. I.; Yurkevich, V. D.

    2017-07-01

    The problem of control system design for a strength testbed of aircraft structures is considered. A method for calculating the parameters of a proportional-integral controller (control algorithm) using the time-scale separation method for the testbed taking into account the dead time effect in the control loop is presented. An adaptive control algorithm structure is proposed which limits the amplitude of high-frequency oscillations in the control system with a change in the direction of motion of the rod of the hydraulic cylinders and provides the desired accuracy and quality of transients at all stages of structural loading history. The results of tests of the developed control system with the adaptive control algorithm on an experimental strength testbed for aircraft structures are given.

  16. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  17. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents the integrated results. Background on the program and highlights of the program results are provided. The various testbed experiments and the programmatic approach are summarized. The results are summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline; the results are then integrated across disciplines, summarizing the lessons learned overall.

  18. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  19. Advanced traffic technology test-bed.

    DOT National Transportation Integrated Search

    2004-06-01

    The goal of this project was to create a test-bed to allow the University of California to conduct advanced traffic technology research in a designated, non-public, and controlled setting. Caltrans, with its associated research facilities on UC campu...

  20. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation-related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).

  1. Design and deployment of an elastic network test-bed in IHEP data center based on SDN

    NASA Astrophysics Data System (ADS)

    Zeng, Shan; Qi, Fazhi; Chen, Gang

    2017-10-01

    High energy physics experiments produce huge amounts of raw data, while because of the sharing characteristics of the network resources, there is no guarantee of the available bandwidth for each experiment, which may cause link congestion problems. At the same time, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which can ensure the flexibility of the computing and storage resources, and more and more computing applications have been deployed on virtual machines established by OpenStack. However, under the traditional network architecture, network capacity cannot be provisioned elastically, which becomes the bottleneck restricting the flexible application of cloud computing. In order to solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we also design a high performance controller cluster based on OpenDaylight. In the end, we present our current test results.

  2. The Fizeau Interferometer Testbed

    DTIC Science & Technology

    2003-03-01

    Institute, Jay Rajagopal and Ron Allen; and at the CfA, Margarita Karovska, for their contributions to the development of the testbed and the Stellar...2000. [2] K.G. Carpenter, C.J. Schrijver, R.G. Lyon, L.G. Mundy, R.J. Allen, J.T. Armstrong, W.C. Danchi, M. Karovska, J. Marzouk, L.M. Mazzuca, D

  3. Cloud Computing for DoD

    DTIC Science & Technology

    2012-05-01

    NASA Nebula Platform •  Cloud computing pilot program at NASA Ames •  Integrates open-source components into seamless, self...Mission support •  Education and public outreach (NASA Nebula, 2010) NSF Supported Cloud Research •  Support for Cloud Computing in...Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145 •  NASA Nebula (2010). Retrieved from

  4. Physically-Retrieving Cloud and Thermodynamic Parameters from Ultraspectral IR Measurements

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Mango, Stephen A.; Huang, Hung-Lung

    2007-01-01

    A physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiance observed with ultraspectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression-retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). NPOESS Airborne Sounder Testbed Interferometer (NAST-I) retrievals from the Atlantic-THORPEX Regional Campaign are compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL). This work was motivated by the need to obtain solutions for atmospheric soundings from infrared radiances observed for every individual field of view, regardless of cloud cover, from future ultraspectral geostationary satellite sounding instruments, such as the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Hyperspectral Environmental Suite (HES). However, this retrieval approach can also be applied to the ultraspectral sounding instruments to fly on Polar satellites, such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite, the Cross-track Infrared Sounder (CrIS) on the NPOESS Preparatory Project and the following NPOESS series of satellites.
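    The "1-d variational multi-variable inversion" named above has, in its standard optimal-estimation form, an iterated Gauss-Newton solution of the following shape (generic notation, not taken from the abstract):

```latex
% Iterated (Gauss-Newton) 1-d variational solution, generic form:
%   x_a      : background state (here, the eigenvector-regression retrieval)
%   y        : observed radiance vector;  F : fast radiative transfer model
%   K_i      : Jacobian of F evaluated at iterate x_i
%   S_a, S_e : background and measurement error covariances
\[
\mathbf{x}_{i+1} = \mathbf{x}_a
 + \left(\mathbf{K}_i^{\mathsf{T}}\mathbf{S}_e^{-1}\mathbf{K}_i + \mathbf{S}_a^{-1}\right)^{-1}
   \mathbf{K}_i^{\mathsf{T}}\mathbf{S}_e^{-1}
   \left[\,\mathbf{y} - F(\mathbf{x}_i) + \mathbf{K}_i\left(\mathbf{x}_i - \mathbf{x}_a\right)\right]
\]
```

Iterating re-evaluates the Jacobian at each new state estimate, which is how the non-linearity mentioned in the abstract is accounted for.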

  5. Dynamic federation of grid and cloud storage

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble, and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509 one. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities for smoothly evolving the current computing models and for supporting new scientific computing projects that rely on massive distribution of data and would benefit from systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.
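The geography-aware replica selection performed by such a redirector can be sketched in a few lines. This is a hypothetical illustration, not Dynafed code: the endpoint URLs, coordinates, and function names are invented, and a real redirector answers with an HTTP 302 rather than returning a string.

```python
import math

# Hypothetical storage endpoints holding replicas of the same files.
ENDPOINTS = {
    "https://storage-eu.example.org/dav": (46.2, 6.1),    # near Geneva
    "https://storage-us.example.org/dav": (41.8, -87.7),  # near Chicago
}

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * 6371.0 * math.asin(math.sqrt(a))

def redirect_target(path, client_lat, client_lon):
    """Pick the geographically closest replica; a real redirector would
    answer with an HTTP 302 whose Location header is this URL."""
    best = min(ENDPOINTS,
               key=lambda u: haversine_km(client_lat, client_lon, *ENDPOINTS[u]))
    return best + path

eu_client = redirect_target("/atlas/data.root", 48.9, 2.4)    # near Paris
us_client = redirect_target("/atlas/data.root", 40.7, -74.0)  # near New York
```

Because the client only follows a redirect, any HTTP/WebDAV or S3 endpoint can be added to the federation without changes on the client side.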

  6. STORMSeq: An Open-Source, User-Friendly Pipeline for Processing Personal Genomics Data in the Cloud

    PubMed Central

    Karczewski, Konrad J.; Fernald, Guy Haskin; Martin, Alicia R.; Snyder, Michael; Tatonetti, Nicholas P.; Dudley, Joel T.

    2014-01-01

The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving, and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical-interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and takes 5–10 hours to process a full exome sequence, and approximately $30 and 3–8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  7. Optical Design of the Developmental Cryogenic Active Telescope Testbed (DCATT)

    NASA Technical Reports Server (NTRS)

    Davila, Pam; Wilson, Mark; Young, Eric W.; Lowman, Andrew E.; Redding, David C.

    1997-01-01

In the summer of 1996, three study teams developed conceptual designs and mission architectures for the Next Generation Space Telescope (NGST). Each group highlighted areas of technology development that need to be further advanced to meet the goals of the NGST mission. The most important areas for future study included: deployable structures, lightweight optics, cryogenic optics and mechanisms, passive cooling, and on-orbit closed loop wavefront sensing and control. NASA and industry are currently planning to develop a series of ground testbeds and validation flights to demonstrate many of these technologies. The Developmental Cryogenic Active Telescope Testbed (DCATT) is a system level testbed to be developed at Goddard Space Flight Center in three phases over an extended period of time. This testbed will combine an actively controlled telescope with the hardware and software elements of a closed loop wavefront sensing and control system to achieve diffraction limited imaging at 2 microns. We will present an overview of the system level requirements, a discussion of the optical design, and results of performance analyses for the Phase 1 ambient concept for DCATT.

  8. Development and validation of a low-cost mobile robotics testbed

    NASA Astrophysics Data System (ADS)

    Johnson, Michael; Hayes, Martin J.

    2012-03-01

This paper considers the design, construction and validation of a low-cost experimental robotic testbed, which allows for the localisation and tracking of multiple robotic agents in real time. The testbed system is suitable for research and education in a range of different mobile robotic applications, for validating theoretical as well as practical research work in the field of digital control, mobile robotics, graphical programming and video tracking systems. It provides a reconfigurable floor space for mobile robotic agents to operate within, while tracking the position of multiple agents in real time using the overhead vision system. The overall system provides a highly cost-effective solution to the topical problem of providing students with practical robotics experience within severe budget constraints. Several problems encountered in the design and development of the mobile robotic testbed and associated tracking system, such as radial lens distortion and the selection of robot identifier templates, are clearly addressed. The testbed performance is quantified and several experiments involving LEGO Mindstorms NXT and Merlin System MiaBot robots are discussed.

  9. Development and Validation of the Air Force Cyber Intruder Alert Testbed (CIAT)

    DTIC Science & Technology

    2016-07-27

Development and Validation of the Air Force Cyber Intruder Alert Testbed (CIAT). Contract FA8650-16-C-6722. … network analysts. Therefore, a new cyber STE focused on network analysts, called the Air Force Cyber Intruder Alert Testbed (CIAT), was developed. … Gregory Funke, Gregory Dye, Brett Borghetti

  10. Towards standard testbeds for numerical relativity

    NASA Astrophysics Data System (ADS)

    Alcubierre, Miguel; Allen, Gabrielle; Bona, Carles; Fiske, David; Goodale, Tom; Guzmán, F. Siddhartha; Hawke, Ian; Hawley, Scott H.; Husa, Sascha; Koppitz, Michael; Lechner, Christiane; Pollney, Denis; Rideout, David; Salgado, Marcelo; Schnetter, Erik; Seidel, Edward; Shinkai, Hisa-aki; Shoemaker, Deirdre; Szilágyi, Béla; Takahashi, Ryoji; Winicour, Jeff

    2004-01-01

In recent years, many different numerical evolution schemes for Einstein's equations have been proposed to address stability and accuracy problems that have plagued the numerical relativity community for decades. Some of these approaches have been tested on different spacetimes, and conclusions have been drawn based on these tests. However, differences in results originate from many sources, including not only formulations of the equations, but also gauges, boundary conditions, numerical methods and so on. We propose to build up a suite of standardized testbeds for comparing approaches to the numerical evolution of Einstein's equations, designed both to probe their strengths and weaknesses and to separate out different effects, and their causes, seen in the results. We discuss general design principles of suitable testbeds, and we present an initial round of simple tests with periodic boundary conditions. This is a pivotal first step towards building a suite of testbeds to serve numerical relativists and researchers from related fields who wish to assess the capabilities of numerical relativity codes. We present some examples of how these tests can be quite effective in revealing various limitations of different approaches, and in illustrating their differences. The tests are presently limited to vacuum spacetimes, can be run on modest computational resources, and can be used with many different approaches used in the relativity community.
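One metric such standardized tests typically report is the measured convergence order obtained from runs at three resolutions. A minimal sketch with synthetic data, assuming a second-order accurate scheme; the formula is the standard three-level Richardson estimate, not code from the paper:

```python
import math

def convergence_order(f_h, f_h2, f_h4):
    """Measured order of accuracy from one diagnostic quantity computed
    at resolutions h, h/2, and h/4 (three-level Richardson estimate)."""
    return math.log2((f_h - f_h2) / (f_h2 - f_h4))

# Synthetic output of a second-order accurate code: f(h) = f* + C*h^2.
f_exact, C = 1.0, 0.5
f_h, f_h2, f_h4 = (f_exact + C * h * h for h in (0.1, 0.05, 0.025))

order = convergence_order(f_h, f_h2, f_h4)  # recovers order 2
```

Running the same three-resolution test with different formulations, gauges, or boundary conditions is one way to separate their individual effects on the results.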

  11. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.
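The kind of closed-loop control a mounted smartphone must sustain can be sketched as a sampled feedback loop. This is an illustrative simulation only: the unit-inertia motor model, the PD gains, and the sampling period are assumptions, not values from the study.

```python
def simulate_pd(setpoint=1.0, kp=8.0, kd=4.0, dt=0.01, steps=1000):
    """Sampled PD position control of a unit-inertia motor (a double
    integrator), the kind of loop a mounted phone would close."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - pos         # measurement (e.g. phone camera/IMU)
        accel = kp * error - kd * vel  # PD control law, computed each sample
        vel += accel * dt              # advance the simulated plant one step
        pos += vel * dt
    return pos

final_position = simulate_pd()         # settles at the setpoint
```

With these gains the loop is well damped and settles at the setpoint within the simulated 10 s; on a real test-bed the sampling period is set by the phone's sensor and rendering rates.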

  12. Design of an occulter testbed at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Sirbu, Dan; Kasdin, N. Jeremy; Kim, Yunjong; Vanderbei, Robert J.

    2015-01-01

An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we are designing and building a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. Here, we present a sample design that operates at a flight Fresnel number and is thus representative of a realistic space mission. We present calculations of experimental limits arising from the finite size and propagation distance available in the testbed, limitations due to manufacturing feature size, and a non-ideal input beam. We demonstrate how the testbed is designed to be feature-size limited, and provide an estimation of the expected performance.
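The scaling idea rests on matching the Fresnel number N = R²/(λz) between flight and lab, since equal Fresnel numbers give mathematically identical diffracted shadows. A small sketch with purely illustrative numbers, not the Princeton testbed's actual dimensions:

```python
def fresnel_number(radius_m, wavelength_m, distance_m):
    """N = R^2 / (lambda * z). Equal N implies mathematically identical
    suppressed shadows under the Fresnel approximation."""
    return radius_m ** 2 / (wavelength_m * distance_m)

# Illustrative flight-like geometry: 17 m occulter radius, 40,000 km
# telescope separation, 600 nm light.
N_flight = fresnel_number(17.0, 600e-9, 4.0e7)

# Lab occulter of 2.5 mm radius: solve N = r^2/(lambda*z) for the
# propagation distance that reproduces the flight Fresnel number.
r_lab = 2.5e-3
z_lab = r_lab ** 2 / (600e-9 * N_flight)
N_lab = fresnel_number(r_lab, 600e-9, z_lab)
```

With these invented numbers the required lab propagation distance comes out below a metre, consistent with the point above that feature size, rather than bench length, becomes the limiting factor.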

  13. The Wide-Field Imaging Interferometry Testbed (WIIT): Recent Progress and Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.; Frey, Bradley J.; Leisawitz, David T.; Lyon, Richard G.; Maher, Stephen F.; Martino, Anthony J.

    2008-01-01

    Continued research with the Wide-Field Imaging Interferometry Testbed (WIIT) has achieved several important milestones. We have moved WIIT into the Advanced Interferometry and Metrology (AIM) Laboratory at Goddard, and have characterized the testbed in this well-controlled environment. The system is now completely automated and we are in the process of acquiring large data sets for analysis. In this paper, we discuss these new developments and outline our future research directions. The WIIT testbed, combined with new data analysis techniques and algorithms, provides a demonstration of the technique of wide-field interferometric imaging, a powerful tool for future space-borne interferometers.

  14. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  15. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  16. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network.
Real-time video stream packets

  17. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  18. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.

    2015-09-01

AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open-source platform, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across the mainland of China. Users can use and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.

  19. Investigating the Application of Moving Target Defenses to Network Security

    DTIC Science & Technology

    2013-08-01

developing an MTD testbed using OpenStack [14] to show that our MTD design can actually work. Building an MTD system in a cloud infrastructure will be … Information Intelligence Research. New York, USA: ACM, 2013. [14] OpenStack, "OpenStack: The Folsom release," http://www.openstack.org/software

  20. Development of Liquid Propulsion Systems Testbed at MSFC

    NASA Technical Reports Server (NTRS)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, cis-lunar and deep space, the need to create more cost-effective techniques of propulsion system design, manufacturing and test is imperative in the current budget-constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further this objective, the Marshall Space Flight Center is currently formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA-developed elements, but it can also be used to test articles developed by other government agencies, industry or academia. A joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  1. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.

  2. Automating NEURON Simulation Deployment in Cloud Resources

    PubMed Central

    Santamaria, Fidel

    2016-01-01

Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model. PMID:27655341

  3. An Experimental Testbed for Evaluation of Trust and Reputation Systems

    NASA Astrophysics Data System (ADS)

    Kerr, Reid; Cohen, Robin

    To date, trust and reputation systems have often been evaluated using methods of their designers’ own devising. Recently, we demonstrated that a number of noteworthy trust and reputation systems could be readily defeated, revealing limitations in their original evaluations. Efforts in the trust and reputation community to develop a testbed have yielded a successful competition platform, ART. This testbed, however, is less suited to general experimentation and evaluation of individual trust and reputation technologies. In this paper, we propose an experimentation and evaluation testbed based directly on that used in our investigations into security vulnerabilities in trust and reputation systems for marketplaces. We demonstrate the advantages of this design, towards the development of more thorough, objective evaluations of trust and reputation systems.

  4. Storm-based Cloud-to-Ground Lightning Probabilities and Warnings

    NASA Astrophysics Data System (ADS)

    Calhoun, K. M.; Meyer, T.; Kingfield, D.

    2017-12-01

A new cloud-to-ground (CG) lightning probability algorithm has been developed using machine-learning methods. With storm-based inputs of Earth Networks' in-cloud lightning, Vaisala's CG lightning, multi-radar/multi-sensor (MRMS) radar-derived products including the Maximum Expected Size of Hail (MESH) and Vertically Integrated Liquid (VIL), and near-storm environmental data including lapse rate and CAPE, a random forest algorithm was trained to produce probabilities of CG lightning up to one hour in advance. As part of the Prototype Probabilistic Hazard Information experiment in the Hazardous Weather Testbed in 2016 and 2017, National Weather Service forecasters were asked to use this CG lightning probability guidance to create rapidly updating probability grids and warnings for the threat of CG lightning for 0-60 minutes. The output from forecasters was shared with end-users, including emergency managers and broadcast meteorologists, as part of an integrated warning team.
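The random-forest idea can be illustrated with a toy ensemble built from scratch. This is not the operational algorithm: the feature values, labels, and depth-1 "trees" below are drastically simplified stand-ins chosen only to keep the sketch self-contained.

```python
import random

def train_stump(sample):
    """One randomized 'tree': a depth-1 rule thresholding one feature
    halfway between the class means of a bootstrap sample."""
    f = random.randrange(len(sample[0][0]))
    pos = [x[f] for x, y in sample if y == 1]
    neg = [x[f] for x, y in sample if y == 0]
    return f, (sum(pos) / len(pos) + sum(neg) / len(neg)) / 2

def forest_probability(forest, x):
    """Ensemble CG probability: fraction of stumps voting 'lightning'."""
    return sum(1 for f, thr in forest if x[f] > thr) / len(forest)

random.seed(0)
# Toy storm-based features (MESH mm, VIL kg/m^2, in-cloud flashes/min);
# label 1 means a CG strike followed within the hour. Entirely synthetic.
data = [((5, 10, 0.5), 0), ((8, 12, 1.0), 0), ((6, 9, 0.2), 0),
        ((30, 45, 12.0), 1), ((25, 50, 9.0), 1), ((35, 40, 15.0), 1)]

forest = []
while len(forest) < 50:  # bootstrap-resample, one stump per 'tree'
    sample = [random.choice(data) for _ in data]
    if len({y for _, y in sample}) == 2:  # need both classes for a split
        forest.append(train_stump(sample))

p_quiet = forest_probability(forest, (7, 11, 0.4))
p_stormy = forest_probability(forest, (32, 48, 11.0))
```

Averaging many weak, randomized rules is what turns hard yes/no splits into the calibrated-looking probabilities forecasters were given as guidance.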

  5. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  6. To Which Extent can Aerosols Affect Alpine Mixed-Phase Clouds?

    NASA Astrophysics Data System (ADS)

    Henneberg, O.; Lohmann, U.

    2017-12-01

Aerosol-cloud interactions constitute a major uncertainty in regional climate and changing weather patterns. These uncertainties are due to the multiple processes that can be triggered by aerosols, especially in mixed-phase clouds. Mixed-phase clouds most likely result in precipitation due to the formation of ice crystals, which can grow to precipitation size. Ice nucleating particles (INPs) determine how fast these clouds glaciate and form precipitation. The potential for INPs to convert supercooled liquid clouds into precipitating clouds depends on the available humidity and supercooled liquid, conditions that are determined by dynamics. Moderately high updraft velocities result in persistent mixed-phase clouds in the Swiss Alps [1], which provide an ideal testbed to investigate the effect of aerosols on precipitation in mixed-phase clouds. To address the effect of aerosols in orographic winter clouds under different dynamic conditions, we run a number of real-case ensembles with the regional climate model COSMO at a horizontal resolution of 1.1 km. Simulations with different INP concentrations within the range observed at the GAW research station Jungfraujoch in the Swiss Alps are conducted and repeated within the ensemble. Microphysical processes are described with a two-moment scheme. Enhanced INP concentrations increase the precipitation rate of a single precipitation event by up to 20%. Other precipitation events of similar strength are less affected by the INP concentration. The effect of CCNs is negligible for precipitation from orographic winter clouds in our case study. There is evidence that INPs change precipitation rate and location more effectively in stronger dynamic regimes due to the enhanced potential to transfer supercooled liquid to ice. Classifying the ensemble members according to their dynamics will quantify the interaction of aerosol effects and dynamics.
Reference [1] Lohmann et al, 2016: Persistence of orographic mixed-phase clouds, GRL

  7. CAUSES: Clouds Above the United States and Errors at the Surface

    NASA Astrophysics Data System (ADS)

    Ma, H. Y.; Klein, S. A.; Xie, S.; Morcrette, C. J.; Van Weverberg, K.; Zhang, Y.; Lo, M. H.

    2015-12-01

The Clouds Above the United States and Errors at the Surface (CAUSES) is a new joint Global Atmospheric System Studies/Regional and Global Climate model/Atmospheric System Research (GASS/RGCM/ASR) intercomparison project to evaluate the central U.S. summertime surface warm biases seen in many weather and climate models. The main focus is to identify the role of cloud, radiation, and precipitation processes in contributing to surface air temperature biases. In this project, we use a short-term hindcast approach and examine the growth of the error as a function of hindcast lead time. The study period runs from April 1 to August 31, 2011, covering the entire Midlatitude Continental Convective Clouds Experiment (MC3E) campaign. Preliminary results from several models will be presented. (http://portal.nersc.gov/project/capt/CAUSES/) (This study is funded by the RGCM and ASR programs of the U.S. Department of Energy as part of the Cloud-Associated Parameterizations Testbed. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-658017)

  8. CAUSES: Clouds Above the United States and Errors at the Surface

    NASA Astrophysics Data System (ADS)

    Ma, H. Y.; Klein, S. A.; Xie, S.; Zhang, Y.; Morcrette, C. J.; Van Weverberg, K.; Petch, J.; Lo, M. H.

    2014-12-01

The Clouds Above the United States and Errors at the Surface (CAUSES) is a new joint Global Atmospheric System Studies/Regional and Global Climate model/Atmospheric System Research (GASS/RGCM/ASR) intercomparison project to evaluate the central U.S. summertime surface warm biases seen in many weather and climate models. The main focus is to identify the role of cloud, radiation, and precipitation processes in contributing to surface air temperature biases. In this project, we use a short-term hindcast approach and examine the growth of the error as a function of hindcast lead time. The study period runs from April 1 to August 31, 2011, covering the entire Midlatitude Continental Convective Clouds Experiment (MC3E) campaign. Preliminary results from several models will be presented. (http://portal.nersc.gov/project/capt/CAUSES/) (This study is funded by the RGCM and ASR programs of the U.S. Department of Energy as part of the Cloud-Associated Parameterizations Testbed. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344. LLNL-ABS-658017)

  9. Wireless Testbed Bonsai

    DTIC Science & Technology

    2006-02-01

… wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal … deployment, we conducted spatial scaling tests of our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations, respectively … on W and its scaled version W̃. Description of the Kansei testbed: a Stargate is a single-board Linux-based computer [7]. It uses a

  10. Using the Atmospheric Radiation Measurement (ARM) Datasets to Evaluate Climate Models in Simulating Diurnal and Seasonal Variations of Tropical Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Hailong; Burleyson, Casey D.; Ma, Po-Lun

We use the long-term Atmospheric Radiation Measurement (ARM) datasets collected at the three Tropical Western Pacific (TWP) sites as a tropical testbed to evaluate the ability of the Community Atmosphere Model (CAM5) to simulate the various types of clouds, their seasonal and diurnal variations, and their impact on surface radiation. We conducted a series of CAM5 simulations at various horizontal grid spacings (around 2°, 1°, 0.5°, and 0.25°) with meteorological constraints from reanalysis. Model biases in the seasonal cycle of cloudiness are found to be weakly dependent on model resolution. Positive biases (up to 20%) in the annual mean total cloud fraction appear mostly in stratiform ice clouds. Higher-resolution simulations do reduce the positive bias in the frequency of ice clouds, but they inadvertently increase the negative biases in convective clouds and low-level liquid clouds, leading to a positive bias in annual mean shortwave fluxes at the sites, as high as 65 W m-2 in the 0.25° simulation. Such resolution-dependent biases in clouds can adversely lead to biases in ambient thermodynamic properties and, in turn, feed back on clouds. Both the CAM5 model and ARM observations show distinct diurnal cycles in total, stratiform, and convective cloud fractions; however, they are out of phase by 12 hours, and the biases vary by site. Our results suggest that biases in deep convection affect the vertical distribution and diurnal cycle of stratiform clouds through the transport of vapor and/or the detrainment of liquid and ice. We also found that the modeled grid-mean surface longwave fluxes are systematically larger than site measurements when the grid cell that an ARM site resides in is partially covered by ocean. The modeled longwave fluxes at such sites also lack a discernible diurnal cycle because the ocean part of the grid cell is warmer and less sensitive to radiative heating/cooling than land. Higher spatial resolution is more helpful in this regard.
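The bias evaluation described above reduces, at its core, to comparing co-located model and observation time series. A minimal sketch of such a comparison, with invented toy numbers rather than actual CAM5 or ARM data:

```python
import numpy as np

# Toy illustration (invented numbers, not CAM5 or ARM data) of computing
# an annual-mean model-minus-observation bias at a single site.
def mean_bias(model, obs):
    """Mean model-minus-observation difference, skipping missing values."""
    model, obs = np.asarray(model, float), np.asarray(obs, float)
    valid = ~(np.isnan(model) | np.isnan(obs))
    return float(np.mean(model[valid] - obs[valid]))

obs_cf = [62, 58, 55, 60, 65, 70, 72, 68, 63, 61, 59, 64]    # monthly obs (%)
model_cf = [75, 70, 68, 74, 80, 85, 88, 84, 78, 75, 72, 77]  # monthly model (%)
print(round(mean_bias(model_cf, obs_cf), 1))  # → 14.1
```

The same function applied per season or per hour of day would yield the seasonal-cycle and diurnal-cycle biases the study discusses.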

  11. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
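The "easily-parseable state log" idea lends itself to a line-oriented format that query scripts can filter. A hypothetical miniature (field names such as `rms_nm` are invented, not taken from the SCDU software):

```python
import json

# Hypothetical miniature of an easily-parseable state log: one JSON object
# per line, mixing testbed state with metadata, so query scripts can pull
# out runs collected under matching conditions. Field names are invented.
log = [json.dumps({"run": i, "align": "auto", "rms_nm": 3.0 + 0.1 * i})
       for i in range(3)]

def query(lines, **conditions):
    """Return parsed entries whose fields match all given conditions."""
    entries = (json.loads(line) for line in lines)
    return [e for e in entries
            if all(e.get(k) == v for k, v in conditions.items())]

print(query(log, run=1)[0]["rms_nm"])  # → 3.1
```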

  12. The Wide-Field Imaging Interferometry Testbed: Recent Progress

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.

    2010-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) at NASA's Goddard Space Flight Center was designed to demonstrate the practicality and application of techniques for wide-field spatial-spectral ("double Fourier") interferometry. WIIT is an automated system, and it is now producing substantial amounts of high-quality data from its state-of-the-art operating environment, Goddard's Advanced Interferometry and Metrology Lab. In this paper, we discuss the characterization and operation of the testbed and present the most recent results. We also outline future research directions. A companion paper within this conference discusses the development of new wide-field double Fourier data analysis algorithms.
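The double Fourier principle at the heart of WIIT can be illustrated in a few lines: the fringe pattern recorded as a function of optical path delay (an interferogram) is the Fourier transform of the source spectrum, so an FFT recovers that spectrum. The following synthetic sketch (not WIIT data or code) recovers two spectral lines:

```python
import numpy as np

# Synthetic illustration of spectral recovery from an interferogram.
n = 1024
delay = np.arange(n)                      # optical path delay samples
k1, k2 = 50, 80                           # two synthetic spectral lines
interferogram = (np.cos(2 * np.pi * k1 * delay / n)
                 + 0.5 * np.cos(2 * np.pi * k2 * delay / n))

# Fourier transform of the interferogram yields the source spectrum.
spectrum = np.abs(np.fft.rfft(interferogram)) / (n / 2)
peaks = np.argsort(spectrum)[-2:]         # the two strongest spectral bins
print(sorted(peaks.tolist()))             # → [50, 80]
```

In the wide-field case, this transform is applied per image pixel, which is what makes the technique "double" Fourier: one transform over baseline geometry, one over path delay.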

  13. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was initiated, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, the first of a three-volume set containing the results of the TTPP, is the executive summary.

  14. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity and Access Management (IAM). IAM security challenges have become apparent as companies adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to address security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has also been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model.

  15. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity and Access Management (IAM). IAM security challenges have become apparent as companies adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to address security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has also been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model. PMID:24701149

  16. Variable dynamic testbed vehicle : safety plan

    DOT National Transportation Integrated Search

    1997-02-01

    This safety document covers the entire safety process from inception to delivery of the Variable Dynamic Testbed Vehicle. In addition to addressing the process of safety on the vehicle , it should provide a basis on which to build future safety proce...

  17. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464
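The closed-loop control such a smartphone platform must run can be sketched as a simple discrete PI loop against a first-order motor model. The gains, time constant, and setpoint below are illustrative, not values from the study:

```python
# Sketch of a discrete PI feedback loop of the kind a mounted smartphone
# could run for a motor test-bed. The plant is a first-order motor-speed
# model; gains and the time constant are illustrative, not from the paper.
def simulate(setpoint=100.0, kp=0.5, ki=0.8, dt=0.01, tau=0.1, steps=2000):
    """Run the closed loop and return the final motor speed."""
    speed, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - speed
        integral += error * dt                # integral of the tracking error
        effort = kp * error + ki * integral   # PI control law
        speed += dt * (effort - speed) / tau  # first-order motor response
    return speed

print(round(simulate()))  # → 100 (settles at the setpoint)
```

In the paper's setting, the `error` term would come from the phone's camera or IMU measurements rather than a simulated plant, and maintaining stability despite sensing latency is exactly the challenge the authors address.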

  18. Laboratory MCAO Test-Bed for Developing Wavefront Sensing Concepts.

    PubMed

    Goncharov, A V; Dainty, J C; Esposito, S; Puglisi, A

    2005-07-11

    An experimental optical bench test-bed for developing new wavefront sensing concepts for Multi-Conjugate Adaptive Optics (MCAO) systems is described. The main objective is to resolve imaging problems associated with wavefront sensing of the atmospheric turbulence for future MCAO systems on Extremely Large Telescopes (ELTs). The test-bed incorporates five reference sources, two deformable mirrors (DMs) and atmospheric phase screens to simulate a scaled version of a 10-m adaptive telescope operating at the K band. A recently proposed compact tomographic wavefront sensor is employed for star-oriented DMs control in the MCAO system. The MCAO test-bed is used to verify the feasibility of the wavefront sensing concept utilizing a field lenslet array for multi-pupil imaging on a single detector. First experimental results of MCAO correction with the proposed tomographic wavefront sensor are presented and compared to the theoretical prediction based on the characteristics of the phase screens, actuator density of the DMs and the guide star configuration.

  19. Development of a flexible test-bed for robotics, telemanipulation and servicing research

    NASA Technical Reports Server (NTRS)

    Davies, Barry F.

    1989-01-01

The development of a flexible operations test-bed, based around a commercially available ASEA industrial robot, is described. The test-bed was designed to investigate fundamental human factors issues concerned with the unique problems of robotic manipulation in the hostile environment of space.

  20. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure were developed in order to facilitate interoperability and remote access to the testbed. This interface allows experiments to be controlled, monitored, and performed remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
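The data-centric idea (publishers write typed topic samples to a shared bus, and subscribers discover them by topic name rather than by point-to-point links) can be sketched with a toy stand-in. This is not the OMG DDS API used on the testbed, just an illustration of the pattern:

```python
from collections import defaultdict

# Toy stand-in for a data-centric publish/subscribe bus (not the OMG DDS
# API): no point-to-point links, so no single connection to fail.
class DataBus:
    def __init__(self):
        self._subscribers = defaultdict(list)
        self.last_sample = {}                 # topic -> most recent value

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, sample):
        self.last_sample[topic] = sample      # late joiners can still read it
        for callback in self._subscribers[topic]:
            callback(sample)

bus = DataBus()
readings = []
bus.subscribe("grid/voltage", readings.append)
bus.publish("grid/voltage", 228.4)
bus.publish("grid/voltage", 231.0)
print(readings, bus.last_sample["grid/voltage"])  # → [228.4, 231.0] 231.0
```

Real DDS adds typed topics, quality-of-service contracts, and automatic peer discovery on the network, which is what makes it attractive for the dynamically joining nodes mentioned above.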

  1. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near infrared spectroscopy (NIRS) and electroencephalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measurement and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that supports multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures that are both directly and indirectly observable. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography.

  2. Comparison of two matrix data structures for advanced CSM testbed applications

    NASA Technical Reports Server (NTRS)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
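For readers unfamiliar with the skyline (profile) scheme mentioned above, a minimal illustration: each column of a symmetric matrix is stored from its first nonzero row down to the diagonal, with pointers marking column boundaries. This is a generic sketch, not the CSM testbed's actual storage layout:

```python
import numpy as np

# Generic skyline (profile) storage sketch: store each column of a
# symmetric matrix from its first nonzero row down to the diagonal.
def to_skyline(a):
    """Return (values, column_pointers) for the upper profile of symmetric a."""
    n = a.shape[0]
    values, pointers = [], [0]
    for j in range(n):
        # first stored row: first nonzero above the diagonal, else the diagonal
        first = next(i for i in range(j + 1) if a[i, j] != 0 or i == j)
        values.extend(a[first:j + 1, j])
        pointers.append(len(values))
    return np.array(values), pointers

a = np.array([[4., 1., 0.],
              [1., 5., 2.],
              [0., 2., 6.]])
vals, ptr = to_skyline(a)
print(vals.tolist(), ptr)  # → [4.0, 1.0, 5.0, 2.0, 6.0] [0, 1, 3, 5]
```

The scheme suits the banded matrices produced by finite-element assembly, whereas general sparse storage keeps explicit row indices; the trade-off between the two is essentially what the record above evaluates.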

  3. Recent Experiments Conducted with the Wide-Field Imaging Interferometry Testbed (WIIT)

    NASA Technical Reports Server (NTRS)

    Leisawitz, David T.; Juanola-Parramon, Roser; Bolcar, Matthew; Iacchetta, Alexander S.; Maher, Stephen F.; Rinehart, Stephen A.

    2016-01-01

    The Wide-field Imaging Interferometry Testbed (WIIT) was developed at NASA's Goddard Space Flight Center to demonstrate and explore the practical limitations inherent in wide field-of-view double Fourier (spatio-spectral) interferometry. The testbed delivers high-quality interferometric data and is capable of observing spatially and spectrally complex hyperspectral test scenes. Although WIIT operates at visible wavelengths, by design the data are representative of those from a space-based far-infrared observatory. We used WIIT to observe a calibrated, independently characterized test scene of modest spatial and spectral complexity, and an astronomically realistic test scene of much greater spatial and spectral complexity. This paper describes the experimental setup, summarizes the performance of the testbed, and presents representative data.

  4. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent with the changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge which could be key in validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean with comparison to the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not been previously presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was the increased heat fluxes and induced turbulence from the warm ocean surface as expected. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  5. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies for merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts needed to integrate the testbed with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D-printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D-printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  6. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  7. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  8. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.
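The remote power-on/off/reset capability described above amounts to a small networked command interface. A hypothetical stdlib sketch (the command vocabulary is invented, not JPL's implementation):

```python
import socket
import threading

# Illustrative sketch (not the JPL implementation) of a tiny TCP command
# interface for remotely power-cycling or resetting testbed hardware.
COMMANDS = {"power_on", "power_off", "reset"}

def handle(conn):
    """Serve one request: read a command, acknowledge or reject it."""
    with conn:
        cmd = conn.recv(64).decode().strip()
        conn.sendall(b"OK" if cmd in COMMANDS else b"ERR unknown command")

def send(port, cmd):
    """Client side: send one command and return the reply."""
    with socket.socket() as c:
        c.connect(("127.0.0.1", port))
        c.sendall(cmd.encode())
        return c.recv(64).decode()

srv = socket.socket()
srv.bind(("127.0.0.1", 0))          # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]
t = threading.Thread(target=lambda: handle(srv.accept()[0]))
t.start()
print(send(port, "reset"))           # → OK
t.join()
srv.close()
```

A production version would sit behind the two gateway PCs the record describes and translate such commands into GPIB or USB operations on the actual instruments.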

  9. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some result from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition, from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering an insight to the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere, and the freeze-up followed as the warm surface lost heat to the atmosphere.
While the surface type had a pronounced impact on boundary

  10. Description of the control system design for the SSF PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Kimnach, Greg L.

    1991-01-01

    The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.

  11. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan, on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user-community focused, with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently the theme for ISGC 2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title: while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Secondly, the programme: ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF who highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  12. Description of the SSF PMAD DC testbed control system data acquisition function

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Mackin, Michael; Wright, Theodore

    1992-01-01

The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced-scale representation of the end-to-end, sources-to-loads Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for Space Station Freedom. A key capability of the testbed is its ability to be configured to address system-level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground-based systems is evolving. Initially, ground-based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher-level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF ground-based control center operations. The lower-level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher-level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. Data
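The hierarchical reporting pattern, in which lower-level controllers periodically push status up to the Power Management Controller, can be sketched as follows (controller names and telemetry fields are invented for illustration, not the testbed's actual data):

```python
# Toy sketch of hierarchical status reporting (names and fields invented).
class SwitchController:
    """Lower-level controller owning one switchgear unit."""
    def __init__(self, name, bus_voltage):
        self.name, self.bus_voltage = name, bus_voltage

    def status(self):
        return {"voltage": self.bus_voltage, "closed": True}

class PowerManagementController:
    """Higher-level controller that aggregates subordinate status."""
    def __init__(self, subordinates):
        self.subordinates = subordinates
        self.system_state = {}

    def poll(self):
        for ctrl in self.subordinates:
            self.system_state[ctrl.name] = ctrl.status()
        return self.system_state

pmc = PowerManagementController([SwitchController("MBSU-1", 120.0),
                                 SwitchController("MBSU-2", 119.5)])
state = pmc.poll()
print(sorted(state), state["MBSU-1"]["voltage"])  # → ['MBSU-1', 'MBSU-2'] 120.0
```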

  13. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.

  14. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a Docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running Docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

  15. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.
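    The cloud properties named above (water path, droplet size, optical thickness) are linked by a standard plane-parallel relation, tau = 3 * LWP / (2 * rho_water * r_e). The sketch below only illustrates that relation with typical marine stratocumulus numbers; actual MODIS retrievals invert multi-band reflectances rather than applying this formula directly.

```python
# Standard relation linking liquid water path (LWP), droplet effective
# radius r_e, and cloud optical thickness tau for a vertically
# homogeneous liquid cloud:
#   tau = 3 * LWP / (2 * rho_water * r_e)

RHO_WATER = 1000.0  # kg m^-3

def optical_thickness(lwp_g_m2, r_e_um):
    """tau for LWP in g m^-2 and effective radius in microns."""
    lwp = lwp_g_m2 * 1e-3   # g m^-2 -> kg m^-2
    r_e = r_e_um * 1e-6     # um -> m
    return 3.0 * lwp / (2.0 * RHO_WATER * r_e)

# Typical marine boundary layer stratocumulus values (illustrative).
tau = optical_thickness(100.0, 12.0)
```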

  16. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches ranging from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  17. An integrated testbed for cooperative perception with heterogeneous mobile and static sensors.

    PubMed

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches ranging from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper.

  18. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  19. Versatile simulation testbed for rotorcraft speech I/O system design

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.

    1986-01-01

    A versatile simulation testbed for the design of a rotorcraft speech I/O system is described in detail. The testbed will be used to evaluate alternative implementations of synthesized speech displays and speech recognition controls for the next generation of Army helicopters including the LHX. The message delivery logic is discussed as well as the message structure, the speech recognizer command structure and features, feedback from the recognizer, and random access to controls via speech command.

  20. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1x10^-8 contrast over broad bandwidth (10%) on both shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: deformable mirror (DM) misregistration and testbed optical wavefront error (WFE), along with the DM setting used to correct this WFE. These analyses suggested that a high contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
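    The role the control Jacobian plays can be illustrated with a toy linear model. In this sketch (synthetic matrices and sizes, not testbed data) the focal-plane field E responds to DM actuator changes da via E + J @ da, and the controller picks da by least squares; solving with a deliberately mismatched Jacobian degrades the achievable residual, mirroring the contrast-floor gap described above.

```python
import numpy as np

# Toy illustration of Jacobian-based wavefront control:
# the true field responds through J_true, but the controller
# computes its command from whatever Jacobian it believes in.

rng = np.random.default_rng(0)
n_pix, n_act = 40, 10
J_true = rng.normal(size=(n_pix, n_act))   # "measured" response matrix
E = rng.normal(size=n_pix)                 # initial focal-plane field

def control_step(E, J_control, gain=1.0):
    # Least-squares actuator command from the *control* Jacobian...
    da, *_ = np.linalg.lstsq(J_control, -gain * E, rcond=None)
    # ...but the physical field always responds through J_true.
    return E + J_true @ da

E_good = control_step(E, J_true)  # perfect model knowledge
E_bad = control_step(E, J_true + 0.3 * rng.normal(size=J_true.shape))

# The residual with the correct Jacobian is smaller than with the
# mismatched one: a Jacobian error raises the achievable floor.
```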

  1. Context-aware distributed cloud computing using CloudScheduler

    NASA Astrophysics Data System (ADS)

    Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.

    2017-10-01

    The distributed cloud using the CloudScheduler VM provisioning service is one of the longest-running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest instance of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.
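    The "locate the nearest required service" step can be sketched as a simple lookup over archived context. Everything here is hypothetical (the endpoint names, latency figures, and the flat dictionary store are invented for illustration); the point is only that a VM on an opportunistic cloud consults contextual data rather than assuming a local service exists.

```python
# Hypothetical sketch of context-aware service selection: archived
# latency measurements (the "context") drive the choice of endpoint.

SERVICE_CONTEXT = {
    # service -> {endpoint: mean round-trip latency in ms (archived)}
    "storage-federation": {
        "dynafed.cern.example": 140.0,
        "dynafed.uvic.example": 12.0,
        "dynafed.east.example": 65.0,
    },
}

def nearest_endpoint(service, context=SERVICE_CONTEXT):
    """Return the endpoint of `service` with the lowest recorded latency."""
    endpoints = context[service]
    return min(endpoints, key=endpoints.get)

best = nearest_endpoint("storage-federation")
```

In a real deployment the context store would be fed continuously by the workload management systems and monitoring services described above, and the same data would trigger the automated corrective actions.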

  2. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  3. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maloney, T.M.; Prokopius, P.R.; Voecks, G.E.

    1995-01-25

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication, installation, and testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway. (c) 1995 American Institute of Physics.

  4. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  5. The Northrop Grumman External Occulter Testbed: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Lo, Amy; Glassman, T.; Lillie, C.

    2007-05-01

    We have built a subscale testbed to demonstrate and validate the performance of the New Worlds Observer (NWO), a terrestrial planet finder external-occulter mission concept. The external occulter concept allows observations of nearby exo-Earths using two spacecraft: one carrying an occulter that is tens of meters in diameter and the other carrying a generic space telescope. The occulter is completely opaque, resembling a flower, with petals having a hypergaussian profile that enables 10^-10 intensity suppression of stars that potentially harbor terrestrial planets. The baseline flight NWO system has a 30 meter occulter flying 30,000 km in front of a 4 meter class telescope. Testing the flight configuration on the ground is not feasible, so we have matched the Fresnel number of the flight configuration (~10) using a subscale occulter. Our testbed consists of an 80 meter length evacuated tube, with a high precision occulter in the center of the tube. The occulter is 4 cm in diameter, manufactured with ¼ micron metrological accuracy and less than 2 micron tip truncation. This mimics a 30 meter occulter with millimeter figure accuracy and less than centimeter tip truncation. Our testbed is an evolving experiment, and we report here the first, preliminary, results using a single wavelength laser (532 nm) as the source.

  6. Processing Uav and LIDAR Point Clouds in Grass GIS

    NASA Astrophysics Data System (ADS)

    Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.

    2016-06-01

    Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties such as number of returns, density, or quality. We present a set of tools with applications for different types of points clouds obtained by a lidar scanner, structure from motion technique (SfM), and a low-cost 3D scanner. To take advantage of the vertical structure of multiple return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing. We implemented and compared several decimation techniques in regard to their performance and the final digital surface model (DSM). Finally, we will describe the processing of a point cloud from a low-cost 3D scanner, namely Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically Point Data Abstraction Library (PDAL), Point Cloud Library (PCL), and OpenKinect libfreenect2 library to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long term maintenance and reproducibility by the scientific community but also by the original authors themselves.

  7. Definition study for variable cycle engine testbed engine and associated test program

    NASA Technical Reports Server (NTRS)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  8. University of Florida Advanced Technologies Campus Testbed

    DOT National Transportation Integrated Search

    2017-09-21

    The University of Florida (UF) and its Transportation Institute (UFTI), the Florida Department of Transportation (FDOT) and the City of Gainesville (CoG) are cooperating to develop a smart transportation testbed on the University of Florida (UF) main...

  9. Climatic Implications of the Observed Temperature Dependence of the Liquid Water Path of Low Clouds

    NASA Technical Reports Server (NTRS)

    DelGenio, Anthony

    1999-01-01

    The uncertainty in the global climate sensitivity to an equilibrium doubling of carbon dioxide is often stated to be 1.5-4.5 K, largely due to uncertainties in cloud feedbacks. The lower end of this range is based on the assumption or prediction in some GCMs that cloud liquid water behaves adiabatically, thus implying that cloud optical thickness will increase in a warming climate if the physical thickness of clouds is invariant. Satellite observations of low-level cloud optical thickness and liquid water path have challenged this assumption, however, at low and middle latitudes. We attempt to explain the satellite results using four years of surface remote sensing data from the Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) site in the Southern Great Plains. We find that low cloud liquid water path is insensitive to temperature in winter but strongly decreases with temperature in summer. The latter occurs because surface relative humidity decreases with warming, causing cloud base to rise and clouds to geometrically thin. Meanwhile, inferred liquid water contents hardly vary with temperature, suggesting entrainment depletion. Physically, the temperature dependence appears to represent a transition from higher probabilities of stratified boundary layers at cold temperatures to a higher incidence of convective boundary layers at warm temperatures. The combination of our results and the earlier satellite findings imply that the minimum climate sensitivity should be revised upward from 1.5 K.
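    The adiabatic assumption referenced above has a simple quantitative form: liquid water content increases roughly linearly with height above cloud base, so the liquid water path grows with the square of the cloud's physical thickness, LWP_ad ~ 0.5 * c_w * h**2. The sketch below uses an assumed, typical warm-cloud value of the adiabatic condensation rate c_w (it actually varies with temperature and pressure), purely to illustrate how a rising cloud base reduces LWP.

```python
# Adiabatic liquid water path: LWP grows as thickness squared.
# C_W is an assumed, typical warm-cloud condensation rate.

C_W = 2.0e-3  # g m^-4, assumed adiabatic condensation rate

def adiabatic_lwp(thickness_m, c_w=C_W):
    """Adiabatic liquid water path (g m^-2) for cloud thickness in m."""
    return 0.5 * c_w * thickness_m ** 2

# A cloud thinning from 300 m to 250 m (cloud base rising with
# warming, as in the summer ARM observations) loses ~30% of its LWP.
lwp_thick = adiabatic_lwp(300.0)
lwp_thin = adiabatic_lwp(250.0)
```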

  10. Progress on an external occulter testbed at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Kim, Yunjong; Sirbu, Dan; Galvin, Michael; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-01-01

    An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The occulter testbed uses a 78 m optical propagation distance to realize the flight Fresnel numbers. We will use an etched silicon mask as the occulter. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present the first-light result of a sample design operating at a flight Fresnel number and the mechanical design of the testbed. We compare the experimental results with simulations that predict the ultimate contrast performance.
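    The scaling principle behind both occulter testbeds in this collection is Fresnel-number matching: an occulter of radius a at separation z, observed at wavelength lam, has N = a**2 / (lam * z), and equal N gives a mathematically identical diffracted shadow. The sketch below plugs in numbers taken from the abstracts (a 30 m diameter occulter at 30,000 km; a 4 cm diameter lab mask; a 532 nm laser); the lab separation of 39 m is an assumption, roughly half the stated 78 m propagation path.

```python
# Fresnel number: equal N between flight and lab configurations
# makes the subscale occulter's shadow representative of flight.

def fresnel_number(radius_m, wavelength_m, separation_m):
    return radius_m ** 2 / (wavelength_m * separation_m)

flight = fresnel_number(15.0, 532e-9, 3.0e7)  # 30 m dia occulter, 30,000 km
lab = fresnel_number(0.02, 532e-9, 39.0)      # 4 cm dia mask, ~39 m (assumed)
# Both come out of order ~10, matching the "flight Fresnel numbers"
# quoted in the abstracts.
```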

  11. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real-time. This contribution describes the current status of the project.

  12. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance inflight medical care and medical education, NASA has established and conducted several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence both the delivery of health care in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. 
As NASA continues to meet the challenges of space flight, the

  13. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic applications (DMA) and active transportation and demand management (ATDM) programs — leveraging AMS testbed outputs for ATDM analysis – a primer.

    DOT National Transportation Integrated Search

    2017-08-01

    The primary objective of AMS Testbed project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. Throug...

  14. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan. [supporting datasets - Dallas Testbed

    DOT National Transportation Integrated Search

    2017-07-26

    The datasets in this zip file are in support of Intelligent Transportation Systems Joint Program Office (ITS JPO) report FHWA-JPO-16-385, "Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applica...

  15. Project implementation plan : variable dynamic testbed vehicle

    DOT National Transportation Integrated Search

    1997-02-01

    This document is the project implementation plan for the Variable Dynamic Testbed Vehicle (VDTV) program, sponsored by the Jet Propulsion Laboratory for the Office of Crash Avoidance Research (OCAR) programs in support of Thrust One of the National H...

  16. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.
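    The successive-loop-closure methodology mentioned above can be sketched on a toy plant: close a fast inner rate loop first, then close a slower outer attitude loop around it. The plant here is a double integrator (theta_ddot = u) and the gains are illustrative, not the test-bed's actual design values.

```python
# Successive loop closure on a double integrator: an outer attitude
# loop commands a rate, an inner rate loop commands the actuator.

def simulate(kp_outer=2.0, kp_inner=10.0, theta_ref=1.0,
             dt=0.01, t_final=10.0):
    theta, rate = 0.0, 0.0
    for _ in range(int(t_final / dt)):
        rate_cmd = kp_outer * (theta_ref - theta)  # outer (attitude) loop
        u = kp_inner * (rate_cmd - rate)           # inner (rate) loop
        rate += u * dt                             # double-integrator plant
        theta += rate * dt
    return theta

final_theta = simulate()  # settles near the 1.0 rad reference
```

Designing the inner loop to be much faster than the outer one (here, kp_inner well above kp_outer) is what lets each loop be tuned and verified independently.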

  17. Testing cloud microphysics parameterizations in NCAR CAM5 with ISDAC and M-PACE observations

    NASA Astrophysics Data System (ADS)

    Liu, Xiaohong; Xie, Shaocheng; Boyle, James; Klein, Stephen A.; Shi, Xiangjun; Wang, Zhien; Lin, Wuyin; Ghan, Steven J.; Earle, Michael; Liu, Peter S. K.; Zelenyuk, Alla

    2011-01-01

    Arctic clouds simulated by the National Center for Atmospheric Research (NCAR) Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus and multilayer or deep frontal clouds. However, for low-level stratocumulus, the model significantly underestimates the observed cloud liquid water content in both seasons. As a result, CAM5 significantly underestimates the surface downward longwave radiative fluxes by 20-40 W m-2. Introducing a new ice nucleation parameterization slightly improves the model performance for low-level mixed-phase clouds by increasing cloud liquid water content through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen process. The CAM5 single-column model testing shows that changing the instantaneous freezing temperature of rain to form snow from -5°C to -40°C causes a large increase in modeled cloud liquid water content through the slowing down of cloud liquid and rain-related processes (e.g., autoconversion of cloud liquid to rain). The underestimation of aerosol concentrations in CAM5 in the Arctic also plays an important role in the low bias of cloud liquid water in the single-layer mixed-phase clouds. In addition, numerical issues related to the coupling of model physics and time stepping in CAM5 are responsible for the model biases and will be explored in future studies.

  18. Abstracting application deployment on Cloud infrastructures

    NASA Astrophysics Data System (ADS)

    Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.

    2017-10-01

Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as “!CHAOS: a cloud of controls” [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework, “INDIGO-DataCloud” [2], an EC H2020 project targeting among other things high-level deployment of applications on hybrid Clouds, and “Open City Platform” [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose to use an orchestration service to hide the complex deployment of the application components, and to build an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without needing knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application such as the size of a File System or the number of instances of a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
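As an illustration of the abstraction-layer idea described above (a sketch, not the project's actual code), a thin web frontend can assemble the JSON body for Heat's create-stack REST call (POST /v1/{tenant_id}/stacks), exposing only application-level knobs while the template hides the infrastructure details. The parameter names `fs_size_gb` and `db_node_count` are hypothetical stand-ins for the File System size and NoSQL cluster size mentioned in the abstract.

```python
import json

def stack_create_payload(stack_name, template, fs_size_gb, db_nodes):
    """Build the request body a web frontend would POST to Heat's stacks endpoint."""
    return json.dumps({
        "stack_name": stack_name,
        "template": template,
        # Only user-facing, application-level knobs are exposed here;
        # the Heat template itself hides the rest of the infrastructure.
        "parameters": {"fs_size_gb": fs_size_gb, "db_node_count": db_nodes},
    })

body = stack_create_payload("demo-platform", "heat_template_version: 2016-04-08", 50, 3)
```

In a real deployment the frontend would also attach an authentication token and the tenant-scoped URL; those details are omitted here.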

  19. Synergy of stereo cloud top height and ORAC optimal estimation cloud retrieval: evaluation and application to AATSR

    NASA Astrophysics Data System (ADS)

    Fisher, Daniel; Poulsen, Caroline A.; Thomas, Gareth E.; Muller, Jan-Peter

    2016-03-01

In this paper we evaluate the impact on the cloud parameter retrievals of the ORAC (Optimal Retrieval of Aerosol and Cloud) algorithm following the inclusion of stereo-derived cloud top heights as a priori information. This is performed in a mathematically rigorous way using the ORAC optimal estimation retrieval framework, which includes the facility to use such independent a priori information. Key to the use of a priori information is a characterisation of their associated uncertainty. This paper demonstrates the improvements that are possible using this approach and also considers their impact on the microphysical cloud parameters retrieved. The Along-Track Scanning Radiometer (AATSR) instrument has two views and three thermal channels, so it is well placed to demonstrate the synergy of the two techniques. The stereo retrieval is able to improve the accuracy of the retrieved cloud top height when compared to collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), particularly in the presence of boundary layer inversions and high clouds. The impact of the stereo a priori information on the microphysical cloud properties of cloud optical thickness (COT) and effective radius (RE) was evaluated and generally found to be very small for single-layer cloud conditions over open water (mean RE differences of 2.2 (±5.9) microns and mean COT differences of 0.5 (±1.8) for single-layer ice clouds over open water at elevations above 9 km, which are most strongly affected by the inclusion of the a priori).
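The role of a well-characterised a priori uncertainty can be illustrated with the simplest scalar case of optimal estimation: when both the stereo prior and the measurement-implied height carry Gaussian uncertainties, the retrieval reduces to inverse-variance weighting. The numbers below are purely illustrative, not values from the paper.

```python
# Hypothetical numbers: a stereo a priori cloud-top height and a height implied
# by the infrared measurement, each with a 1-sigma uncertainty (km).
z_a, sigma_a = 9.5, 1.0    # stereo a priori
z_m, sigma_m = 10.4, 2.0   # measurement-implied height

# Scalar optimal estimation = inverse-variance weighting of prior and measurement.
w_a, w_m = 1.0 / sigma_a**2, 1.0 / sigma_m**2
z_hat = (w_a * z_a + w_m * z_m) / (w_a + w_m)   # posterior height estimate
sigma_hat = (w_a + w_m) ** -0.5                 # posterior uncertainty

print(round(z_hat, 3), round(sigma_hat, 3))  # 9.68 0.894
```

Because the weights come from the stated uncertainties, a tightly constrained stereo height dominates the solution, and the posterior uncertainty is always smaller than either input uncertainty; this is why characterising the a priori uncertainty is key.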

  20. Cooperative Search with Autonomous Vehicles in a 3D Aquatic Testbed

    DTIC Science & Technology

    2012-01-01

Matthew Keeter, Daniel Moore, Ryan Muller, Eric Nieters, Jennifer... Many applications for autonomous vehicles involve three-dimensional domains, notably aerial and aquatic environments. Such applications include mon...

  1. Cloud Environment Automation: from infrastructure deployment to application monitoring

    NASA Astrophysics Data System (ADS)

    Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.

    2017-10-01

The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for the infrastructure maintainers. In this paper the research activity, carried out during the Open City Platform (OCP) research project [1], aimed at designing and developing an automatic tool for cloud-based IaaS deployment is presented. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It intends to research, develop and test new open, interoperable, on-demand technological solutions in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.

  2. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  3. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  4. Comparison of Point Cloud Registration Algorithms for Better Result Assessment - Towards AN Open-Source Solution

    NASA Astrophysics Data System (ADS)

    Lachat, E.; Landes, T.; Grussenmeyer, P.

    2018-05-01

Terrestrial and airborne laser scanning, photogrammetry and more generally 3D recording techniques are used in a wide range of applications. After recording several individual 3D datasets known in local systems, one of the first crucial processing steps is the registration of these data into a common reference frame. To perform such a 3D transformation, commercial and open source software as well as programs from the academic community are available. Due to some lacks in terms of computation transparency and quality assessment in these solutions, it has been decided to develop an open source algorithm which is presented in this paper. It is dedicated to the simultaneous registration of multiple point clouds as well as their georeferencing. The idea is to use this algorithm as a starting point for further implementations, involving the possibility of combining 3D data from different sources. Parallel to the presentation of the global registration methodology which has been employed, the aim of this paper is to confront the results achieved this way with the above-mentioned existing solutions. For this purpose, first results obtained with the proposed algorithm to perform the global registration of ten laser scanning point clouds are presented. An analysis of the quality criteria delivered by two software packages selected for this study, together with a reflection on these criteria, is also performed to complete the comparison of the obtained results. The final aim of this paper is to validate the current efficiency of the proposed method through these comparisons.
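The core of one registration step can be sketched (this is a generic illustration, not the authors' implementation) with the SVD-based Kabsch method, which computes the least-squares rigid transform between two sets of matched points; the post-fit RMS residual then serves as one fully transparent quality criterion of the kind the paper argues for.

```python
import numpy as np

def rigid_align(P, Q):
    """Least-squares rotation R and translation t mapping matched points P onto Q."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)                 # 3x3 cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against an improper (reflected) fit
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cQ - R @ cP

def rms_residual(P, Q, R, t):
    """Post-registration RMS point-to-point distance: a transparent quality metric."""
    return float(np.sqrt(np.mean(np.sum((P @ R.T + t - Q) ** 2, axis=1))))
```

Simultaneous registration of many clouds generalises this pairwise step into a joint least-squares problem, but the same residual-based quality reporting applies.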

  5. NOAA Testbed and Proving Ground Workshop 2012

    Science.gov Websites

Goals: Communicate results and future directions for individual testbeds, discuss the broader cross-testbed theme of "intense precipitation", identify best practices, and understand and discuss improvements.

  6. An advanced wide area chemical sensor testbed

    NASA Astrophysics Data System (ADS)

    Seeley, Juliette A.; Kelly, Michael; Wack, Edward; Ryan-Howard, Danette; Weidler, Darryl; O'Brien, Peter; Colonero, Curtis; Lakness, John; Patel, Paras

    2005-11-01

    In order to meet current and emerging needs for remote passive standoff detection of chemical agent threats, MIT Lincoln Laboratory has developed a Wide Area Chemical Sensor (WACS) testbed. A design study helped define the initial concept, guided by current standoff sensor mission requirements. Several variants of this initial design have since been proposed to target other applications within the defense community. The design relies on several enabling technologies required for successful implementation. The primary spectral component is a Wedged Interferometric Spectrometer (WIS) capable of imaging in the LWIR with spectral resolutions as narrow as 4 cm-1. A novel scanning optic will enhance the ability of this sensor to scan over large areas of concern with a compact, rugged design. In this paper, we shall discuss our design, development, and calibration process for this system as well as recent testbed measurements that validate the sensor concept.

  7. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    NASA Technical Reports Server (NTRS)

    Hron, Anna B.

    1992-01-01

This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations, and as a functional interaction tool for testbed users. Creating a simulated model of a real world system generates inevitable calibration discrepancies between them. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed, and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  8. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    NASA Astrophysics Data System (ADS)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
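The equivalent-Fresnel-number scaling mentioned above can be made concrete with a short calculation. The numbers below are hypothetical, chosen only for illustration, and are not the Princeton testbed's actual design values: the Fresnel number N = a^2/(lambda*z) of a space configuration fixes the propagation distance a scaled laboratory mask must use, which is why longer tunnels (and hence tunnel-edge diffraction) enter the design.

```python
def fresnel_number(a, wavelength, z):
    """Fresnel number N = a^2 / (lambda * z) for occulter radius a at distance z."""
    return a**2 / (wavelength * z)

def matched_distance(a_lab, wavelength, n_target):
    """Lab propagation distance that preserves a target Fresnel number for a scaled mask."""
    return a_lab**2 / (wavelength * n_target)

# Hypothetical space configuration: 25 m occulter radius, 50,000 km separation, 600 nm light.
N_space = fresnel_number(25.0, 600e-9, 5.0e7)

# A 2.5 cm lab mask must then propagate tens of metres to match that Fresnel number.
z_lab = matched_distance(0.025, 600e-9, N_space)

print(round(N_space, 1), round(z_lab, 1))  # 20.8 50.0
```

Shrinking the mask by a factor of 1000 shrinks the required distance by 1000^2 over the wavelength-scaled geometry, so realistic mission Fresnel numbers still demand long propagation paths in the laboratory.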

  9. Implementation of a virtual link between power system testbeds at Marshall Spaceflight Center and Lewis Research Center

    NASA Technical Reports Server (NTRS)

    Doreswamy, Rajiv

    1990-01-01

    The Marshall Space Flight Center (MSFC) owns and operates a space station module power management and distribution (SSM-PMAD) testbed. This system, managed by expert systems, is used to analyze and develop power system automation techniques for Space Station Freedom. The Lewis Research Center (LeRC), Cleveland, Ohio, has developed and implemented a space station electrical power system (EPS) testbed. This system and its power management controller are representative of the overall Space Station Freedom power system. A virtual link is being implemented between the testbeds at MSFC and LeRC. This link would enable configuration of SSM-PMAD as a load center for the EPS testbed at LeRC. This connection will add to the versatility of both systems, and provide an environment of enhanced realism for operation of both testbeds.

  10. Cognitive Medical Wireless Testbed System (COMWITS)

    DTIC Science & Technology

    2016-11-01

This testbed merges two ARO grants... Hardware includes an Intel Xeon Processor E5-1650v3 (6C, 3.5 GHz, Turbo, HT, 15M, 140W), an Intel Core i7-3770 (3.4 GHz quad core, 77W), and dual Intel Xeon

  11. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  12. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    DTIC Science & Technology

    2016-03-01

This document serves as the quick-start guide for GIFT Cloud, the web-based... to users with a GIFT Account at no cost. GIFT Cloud is a new implementation of GIFT. This web-based application allows learners, authors, and... GIFT Cloud is accessed via a web browser. Officially, GIFT Cloud has been tested to work on

  13. High Vertically Resolved Atmospheric and Surface/Cloud Parameters Retrieved with Infrared Atmospheric Sounding Interferometer (IASI)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, WIlliam L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the IASI on the MetOp satellite. IASI possesses an ultra-spectral resolution of 0.25/cm and a spectral coverage from 645 to 2760/cm. Ultra-spectral resolution infrared spectral radiance obtained from near nadir observations provide atmospheric, surface, and cloud property information. An advanced retrieval algorithm with a fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. This physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiance observed with ultra-spectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the cloud-free and/or clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression-retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals are achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). Preliminary retrievals of atmospheric soundings, surface properties, and cloud optical/microphysical properties with the IASI observations are obtained and presented. These retrievals will be further inter-compared with those obtained from the airborne FTS system, such as the NPOESS Airborne Sounder Testbed - Interferometer (NAST-I), dedicated dropsondes, radiosondes, and ground-based Raman Lidar.

  14. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

Herein we report on the development, sensing and control, and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  15. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

Herein we report on our contrast assessment and the development, sensing and control of the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet detection and characterization. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center has an established effort to develop VNC technologies, and an incremental sequence of testbeds to advance this approach and its critical technologies. We discuss the development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the laboratory results, optical configuration, critical technologies and the null sensing and control approach.

  16. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - San Mateo Testbed Analysis Plan [supporting datasets - San Mateo Testbed

    DOT National Transportation Integrated Search

    2017-06-26

    This zip file contains files of data to support FHWA-JPO-16-370, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  17. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents summaries of the experiments. This experiment involves the evaluation of the current Internet for file and image transfer between SIRTF instrument teams. The main issue addressed was current network response times.

  18. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

A full-scale, two-story, two-by-one bay, steel braced-frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter-story drift, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and installed nonstructural systems are investigated.

  19. Services for domain specific developments in the Cloud

    NASA Astrophysics Data System (ADS)

    Schwichtenberg, Horst; Gemuend, André

    2015-04-01

We will discuss and demonstrate the possibilities of new Cloud Services in which the complete development cycle, from programming to testing, takes place in the Cloud. Such services can also be combined with dedicated research-domain-specific services that hide the burden of accessing the available infrastructures. As an example, we will show a service that is intended to complement the services of the VERCE project's infrastructure, a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure as well as Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.

  20. Testbed-based Performance Evaluation of Attack Resilient Control for AGC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashok, Aditya; Sridhar, Siddharth; McKinnon, Archibald D.

The modern electric power grid is a complex cyber-physical system whose reliable operation is enabled by a wide-area monitoring and control infrastructure. This infrastructure, supported by an extensive communication backbone, enables several control applications functioning at multiple time scales to ensure the grid is maintained within stable operating limits. Recent events have shown that vulnerabilities in this infrastructure may be exploited to manipulate the data being exchanged. Such a scenario could cause the associated control application to mis-operate, potentially causing system-wide instabilities. There is a growing emphasis on looking beyond traditional cybersecurity solutions to mitigate such threats. In this paper we perform a testbed-based validation of one such solution - Attack Resilient Control (ARC) - on Iowa State University's PowerCyber testbed. ARC is a cyber-physical security solution that combines domain-specific anomaly detection and model-based mitigation to detect stealthy attacks on Automatic Generation Control (AGC). In this paper, we first describe the implementation architecture of the experiment on the testbed. Next, we demonstrate the capability of stealthy attack templates to cause forced under-frequency load shedding in a 3-area test system. We then validate the performance of ARC by measuring its ability to detect and mitigate these attacks. Our results reveal that ARC is efficient in detecting stealthy attacks and enables AGC to maintain system operating frequency close to its nominal value during an attack. Our studies also highlight the importance of testbed-based experimentation for evaluating the performance of cyber-physical security and control applications.
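The model-based detection idea can be sketched in heavily simplified form (this is an illustration of the general principle, not the actual ARC implementation): compare reported frequency telemetry against a model-predicted value and flag samples whose deviation exceeds a tolerance. The telemetry values and tolerance below are hypothetical.

```python
def detect_anomalies(measured, predicted, tolerance):
    """Return indices where telemetry deviates from the model prediction beyond tolerance."""
    return [i for i, (m, p) in enumerate(zip(measured, predicted))
            if abs(m - p) > tolerance]

# Hypothetical frequency telemetry (Hz): a stealthy manipulation pulls one sample low.
predicted = [60.00, 59.99, 60.01, 60.00]
measured  = [60.00, 59.99, 59.80, 60.00]

print(detect_anomalies(measured, predicted, 0.05))  # [2]
```

In the real system the "predicted" trace would come from a physics-based model of the interconnection's frequency response, and detection would trigger a mitigation path rather than a simple flag.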

  1. Flight Projects Office Information Systems Testbed (FIST)

    NASA Technical Reports Server (NTRS)

    Liggett, Patricia

    1991-01-01

    Viewgraphs on the Flight Projects Office Information Systems Testbed (FIST) are presented. The goal is to perform technology evaluation and prototyping of information systems to support SFOC and JPL flight projects in order to reduce risk in the development of operational data systems for such projects.

  2. Evolution of a Simulation Testbed into an Operational Tool

    NASA Technical Reports Server (NTRS)

    Sheth, Kapil; Bilimoria, Karl D.; Sridhar, Banavar; Sterenchuk, Mike; Niznik, Tim; O'Neill, Tom; Clymer, Alexis; Gutierrez Nolasco, Sebastian; Edholm, Kaj; Shih, Fu-Tai

    2017-01-01

This paper describes the evolution over a 20-year period of the Future ATM (Air Traffic Management) Concepts Evaluation Tool (FACET) from a National Airspace System (NAS) based simulation testbed into an operational tool. FACET was developed as a testbed for assessing futuristic ATM concepts, e.g., automated conflict detection and resolution. NAS Constraint Evaluation and Notification Tool (NASCENT) is an application, within FACET, for alerting airspace users of inefficiencies in flight operations and advising time- and fuel-saving reroutes. It is currently in use at American Airlines Integrated Operations Center in Fort Worth, TX. The concepts assessed, research conducted, and the operational capability developed, along with the NASA support and achievements, are presented in this paper.

  3. Crew-integration and Automation Testbed (CAT)Program Overview and RUX06 Introduction

    DTIC Science & Technology

    2006-09-20

    Approved for public release; distribution unlimited. Crew-integration and Automation Testbed (CAT) Program Overview and RUX06 Introduction, 26-27 July 2006. Patrick Nunez, Terry Tierney, Brian Novak. Capstone CAT experiment: evaluate the effectiveness of the CAT program in improving the performance and/or reducing the workload for a mounted

  4. SPHERES tethered formation flight testbed: advancements in enabling NASA's SPECS mission

    NASA Astrophysics Data System (ADS)

    Chung, Soon-Jo; Adams, Danielle; Saenz-Otero, Alvar; Kong, Edmund; Miller, David W.; Leisawitz, David; Lorenzini, Enrico; Sell, Steve

    2006-06-01

    This paper reports on efforts to control a tethered formation flight spacecraft array for NASA's SPECS mission using the SPHERES test-bed developed by the MIT Space Systems Laboratory. Specifically, advances in methodology and experimental results realized since the 2005 SPIE paper are emphasized. These include a new test-bed setup with a reaction wheel assembly, a novel relative attitude measurement system using force torque sensors, and modeling of non-ideal tethers to account for tether vibration modes. The nonlinear equations of motion of multi-vehicle tethered spacecraft with elastic flexible tethers are derived from Lagrange's equations. The controllability analysis indicates that both array resizing and spin-up are fully controllable by the reaction wheels and the tether motor, thereby saving thruster fuel consumption. Based upon this analysis, linear and nonlinear controllers have been successfully implemented on the tethered SPHERES testbed, and tested at the NASA MSFC's flat floor facility using two and three SPHERES configurations.

  5. TACCDAS Testbed Human Factors Evaluation Methodology,

    DTIC Science & Technology

    1980-03-01

    Test method: development of performance criteria; test participant identification; control of … The major milestones involved in the evaluation process, leading up to the evaluation of the complete testbed in the field, are identified. Test methods and … The fielded system inevitably will be different in several ways from the intended system as foreseen by the system designers. The system users provide insights into these

  6. Identifying Meteorological Controls on Open and Closed Mesoscale Cellular Convection Associated with Marine Cold Air Outbreaks

    NASA Astrophysics Data System (ADS)

    McCoy, Isabel L.; Wood, Robert; Fletcher, Jennifer K.

    2017-11-01

    Mesoscale cellular convective (MCC) clouds occur in large-scale patterns over the ocean and have important radiative effects on the climate system. An examination of time-varying meteorological conditions associated with satellite-observed open and closed MCC clouds is conducted to illustrate the influence of large-scale meteorological conditions. Marine cold air outbreaks (MCAO) influence the development of open MCC clouds and the transition from closed to open MCC clouds. MCC neural network classifications on Moderate Resolution Imaging Spectroradiometer (MODIS) data for 2008 are collocated with Clouds and the Earth's Radiant Energy System (CERES) data and ERA-Interim reanalysis to determine the radiative effects of MCC clouds and their thermodynamic environments. Closed MCC clouds are found to have much higher albedo on average than open MCC clouds for the same cloud fraction. Three meteorological control metrics are tested: sea-air temperature difference (ΔT), estimated inversion strength (EIS), and a MCAO index (M). These predictive metrics illustrate the importance of atmospheric surface forcing and static stability for open and closed MCC cloud formation. Predictive sigmoidal relations are found between M and MCC cloud frequency globally and regionally: negative for closed MCC cloud and positive for open MCC cloud. The open MCC cloud seasonal cycle is well correlated with M, while the seasonality of closed MCC clouds is well correlated with M in the midlatitudes and EIS in the tropics and subtropics. M is found to best distinguish open and closed MCC clouds on average over shorter time scales. The possibility of a MCC cloud feedback is discussed.
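
    The predictive sigmoidal relations reported between the MCAO index M and MCC cloud frequency can be illustrated with a simple logistic form; the parameters k and M0 below are illustrative placeholders, not the study's fitted values:

    ```python
    import math

    def mcc_frequency(M, k, M0):
        """Sigmoidal relation between the MCAO index M and MCC cloud frequency:
        open-cell frequency rises with M (k > 0), closed-cell frequency falls
        with M (k < 0). Parameters k and M0 are illustrative, not fitted values."""
        return 1.0 / (1.0 + math.exp(-k * (M - M0)))

    # Open MCC frequency increases with M; closed MCC frequency decreases.
    open_freq = [mcc_frequency(m, k=1.5, M0=0.0) for m in (-2, 0, 2)]
    closed_freq = [mcc_frequency(m, k=-1.5, M0=0.0) for m in (-2, 0, 2)]
    ```

    With this form, M0 marks the index value at which a given MCC type occurs half the time, which is one way such a metric can "distinguish open and closed MCC clouds on average."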

  7. James Webb Space Telescope Optical Simulation Testbed I: overview and first results

    NASA Astrophysics Data System (ADS)

    Perrin, Marshall D.; Soummer, Rémi; Choquet, Élodie; N'Diaye, Mamadou; Levecq, Olivier; Lajoie, Charles-Philippe; Ygouf, Marie; Leboulleux, Lucie; Egron, Sylvain; Anderson, Rachel; Long, Chris; Elliott, Erin; Hartig, George; Pueyo, Laurent; van der Marel, Roeland; Mountain, Matt

    2014-08-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop workbench to study aspects of wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing optomechanical testbeds for JWST (e.g. the Ball Aerospace Testbed Telescope, TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science & Operations Center. We have developed an optical design that reproduces the physics of JWST's three-mirror anastigmat using three aspheric lenses; it provides similar image quality as JWST (80% Strehl ratio) over a field equivalent to a NIRCam module, but at HeNe wavelength. A segmented deformable mirror stands in for the segmented primary mirror and allows control of the 18 segments in piston, tip, and tilt, while the secondary can be controlled in tip, tilt and x, y, z position. This will be sufficient to model many commissioning activities, to investigate field dependence and multiple field point sensing & control, to evaluate alternate sensing algorithms, and develop contingency plans. Testbed data will also be usable for cross-checking of the WFS&C Software Subsystem, and for staff training and development during JWST's five- to ten-year mission.

  8. INFORM Lab: a testbed for high-level information fusion and resource management

    NASA Astrophysics Data System (ADS)

    Valin, Pierre; Guitouni, Adel; Bossé, Eloi; Wehn, Hans; Happe, Jens

    2011-05-01

    DRDC Valcartier and MDA have created an advanced simulation testbed for evaluating the effectiveness of Network Enabled Operations in a Coastal Wide Area Surveillance situation, with algorithms provided by several universities. This INFORM Lab testbed allows experimenting with high-level distributed information fusion, dynamic resource management, and configuration management, given multiple constraints on the resources and their communications networks. This paper describes the architecture of INFORM Lab, the essential concepts of goals and situation evidence, a selected set of algorithms for distributed information fusion and dynamic resource management, as well as auto-configurable information fusion architectures. The testbed provides general services, including a multilayer plug-and-play architecture and a general multi-agent framework based on John Boyd's OODA loop. The testbed's performance is demonstrated on two types of scenarios/vignettes: (1) cooperative search-and-rescue efforts, and (2) a noncooperative smuggling scenario involving many target ships and various methods of deceit. For each mission, an appropriate subset of Canadian airborne and naval platforms is dispatched to collect situation evidence, which is fused and then used to modify the platform trajectories for the most efficient collection of further situation evidence. These platforms are fusion nodes that obey a Command and Control node hierarchy.

  9. Preliminary Design of a Galactic Cosmic Ray Shielding Materials Testbed for the International Space Station

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen; Sechkar, Edward A.; Panko, Scott R.

    2012-01-01

    The preliminary design of a testbed to evaluate the effectiveness of galactic cosmic ray (GCR) shielding materials, the MISSE Radiation Shielding Testbed (MRSMAT) is presented. The intent is to mount the testbed on the Materials International Space Station Experiment-X (MISSE-X) which is to be mounted on the International Space Station (ISS) in 2016. A key feature is the ability to simultaneously test nine samples, including standards, which are 5.25 cm thick. This thickness will enable most samples to have an areal density greater than 5 g/sq cm. It features a novel and compact GCR telescope which will be able to distinguish which cosmic rays have penetrated which shielding material, and will be able to evaluate the dose transmitted through the shield. The testbed could play a pivotal role in the development and qualification of new cosmic ray shielding technologies.

  10. MRMS Experimental Testbed for Operational Products (METOP)

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2016-12-01

    Accurate high-resolution quantitative precipitation estimation (QPE) at the continental scale is of critical importance to the nation's weather, water, and climate services. To address this need, the Multi-Radar Multi-Sensor (MRMS) system was developed at the National Severe Storms Laboratory of the National Oceanic and Atmospheric Administration; it integrates radar, gauge, model, and satellite data and provides a suite of QPE products at 1-km and 2-min resolution. The MRMS system consists of three components: 1) an operational system; 2) a real-time research system; and 3) an archive testbed. The operational system currently provides instantaneous precipitation rate, type, and 1- to 72-hr accumulations for the conterminous United States and southern Canada. The research system has a similar hardware infrastructure and data environment to the operational system, but runs newer and more advanced algorithms. The newer algorithms are tested on the research system for robustness and computational efficiency in a pseudo-operational environment before they are transitioned into operations. The archive testbed, also called the MRMS Experimental Testbed for Operational Products (METOP), consists of a large database that encompasses a wide range of hydroclimatological and geographical regimes. METOP is for the testing and refinement of the most advanced radar QPE techniques, which are often developed on specific data from limited times and locations. The archive data include quality-controlled in-situ observations for validation of the new radar QPE across all seasons and geographic regions. A number of operational QPE products derived from different sensors/models are also included in METOP for the fusion of multiple sources of complementary precipitation information. This paper introduces the METOP system.

  11. Visible Nulling Coronagraphy Testbed Development for Exoplanet Detection

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; hide

    2010-01-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few that works with filled, segmented, and sparse or diluted-aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones of sequentially higher contrasts of 10^8, 10^9, and 10^10 at an inner working angle of 2λ/D, ultimately culminating in spectrally broadband (>20%) high-contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We discuss the optical configuration, laboratory results, critical technologies, and the null sensing and control approach.

  12. Gemini Planet Imager coronagraph testbed results

    NASA Astrophysics Data System (ADS)

    Sivaramakrishnan, Anand; Soummer, Rémi; Oppenheimer, Ben R.; Carr, G. Lawrence; Mey, Jacob L.; Brenner, Doug; Mandeville, Charles W.; Zimmerman, Neil; Macintosh, Bruce A.; Graham, James R.; Saddlemyer, Les; Bauman, Brian; Carlotti, Alexis; Pueyo, Laurent; Tuthill, Peter G.; Dorrer, Christophe; Roberts, Robin; Greenbaum, Alexandra

    2010-07-01

    The Gemini Planet Imager (GPI) is an extreme-AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8 m Gemini South telescope in 2011. GPI fields a 1500-channel AO system feeding an apodized-pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self-luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  13. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report. [supporting datasets - Phoenix Testbed

    DOT National Transportation Integrated Search

    2017-07-26

    The datasets in this zip file are in support of FHWA-JPO-16-379, Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Program...

  14. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs — summary report for the Chicago testbed. [supporting datasets - Chicago Testbed

    DOT National Transportation Integrated Search

    2017-04-01

    The datasets in this zip file are in support of Intelligent Transportation Systems Joint Program Office (ITS JPO) report FHWA-JPO-16-385, "Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applica...

  15. Design and construction of a testbed for the application of real volcanic ash from the Eyjafjallajökull and Grimsvötn eruptions to microgas turbines

    NASA Astrophysics Data System (ADS)

    Weber, Konradin; Fischer, Christian; Lange, Martin; Schulz, Uwe; Naraparaju, Ravisankar; Kramer, Dietmar

    2017-04-01

    It is well known that volcanic ash clouds emitted from erupting volcanoes pose a considerable threat to aviation. Volcanic ash particles can damage the turbine blades and their thermal barrier coatings as well as the bearings of the turbine. For a detailed investigation of this damaging effect, a testbed was designed and constructed that allowed the damaging effects of real volcanic ash on a microgas turbine, specially modified for these investigations, to be studied. Using this microgas turbine had the advantage of delivering near-reality conditions, burning kerosene and operating at temperatures similar to those of large turbines, but at very low cost. The testbed consisted of a disperser for the real volcanic ash and all the equipment needed to control the microgas turbine. Moreover, in front of and behind the microgas turbine, the concentration and size distribution of the volcanic ash were measured online by optical particle counters (OPCs). The particle concentration and size distribution in the intake in front of the microgas turbine were measured by an OPC combined with an isokinetic inlet. Behind the microgas turbine, in addition to the measurement with a second OPC in the exhaust gas, ash particles were caught with an impactor to enable later electron-microscope analysis of their morphology and to verify possible melting of the ash particles. This testbed is of high importance as it allows detailed investigation of the impact of volcanic ash on jet turbines and of appropriate countermeasures.

  16. Description of New Inflatable/Rigidizable Hexapod Structure Testbed for Shape and Vibration Control

    NASA Technical Reports Server (NTRS)

    Adetona, O.; Keel, L. H.; Horta, L. G.; Cadogan, D. P.; Sapna, G. H.; Scarborough, S. E.

    2002-01-01

    Larger and more powerful space based instruments are needed to meet increasingly sophisticated scientific demand. To support this need, concepts for telescopes with apertures of 100 meters are being investigated, but the required technologies are not in hand today. Due to the capacity limits of launch vehicles, the idea of deploying, erecting, or inflating large structures in space is being considered. Recently, rigidization concepts of large inflatable structures have demonstrated the capability of weight reductions of up to 50% from current concepts with packaging efficiencies near 80%. One of the important aspects of inflatable structures is vibration mitigation and line-of-sight control. Such control tasks are possible only after actuators/sensors are properly integrated into a rigidizable concept. To study these issues, we have developed an inflatable/rigidizable hexapod structure testbed. The testbed integrates state of the art piezo-electric self-sensing actuators into an inflatable/rigidizable structure and a flat membrane reflector. Using this testbed, we plan to experimentally demonstrate achievable vibration and line-of-sight control. This paper contains a description of the testbed and an outline of the test plan.

  17. Regenerative Fuel Cell System Testbed Program for Government and Commercial Applications

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Electrochemical Technology Branch has led a multiagency effort to design, fabricate, and operate a regenerative fuel cell (RFC) system testbed. Key objectives of this program are to evaluate, characterize, and demonstrate fully integrated RFC's for space, military, and commercial applications. The Lewis-led team is implementing the program through a unique international coalition that encompasses both Government and industry participants. Construction of the 25-kW RFC testbed at the NASA facility at Edwards Air Force Base was completed in January 1995, and the system has been operational since that time.

  18. Improved Arctic Cloud and Aerosol Research and Model Parameterizations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenneth Sassen

    2007-03-01

    In this report we summarize our contributions to the Atmospheric Radiation Measurement (ARM) program supported by the Department of Energy. Our involvement commenced in 1990 during the planning stages of the design of the ARM Cloud and Radiation Testbed (CART) sites. We have worked continuously (up to 2006) on our ARM research objectives, building on our earlier findings to advance our knowledge in several areas. Below we summarize our research over this period, with an emphasis on the most recent work. We have participated in several aircraft-supported deployments at the SGP and NSA sites. In addition to deploying the Polarization Diversity Lidar (PDL) system (Sassen 1994; Noel and Sassen 2005) designed and constructed under ARM funding, we have operated other sophisticated instruments, including a W-band polarimetric Doppler radar and a midinfrared radiometer, for intercalibration and student training purposes. We have worked closely with University of North Dakota scientists, twice co-directing the Citation operations through ground-to-air communications, and serving as the CART ground-based mission coordinator with NASA aircraft during the 1996 SUCCESS/IOP campaign. We have also taken a leading role in initiating case study research involving a number of ARM co-investigators. Analyses of several case studies from these IOPs have been reported in journal articles, as we show in Table 1. The PDL has also participated in other major field projects, including FIRE II and CRYSTAL-FACE. In general, the published results of our IOP research can be divided into two categories: comprehensive cloud case study analyses that shed light on fundamental cloud processes using the unique CART IOP measurement capabilities, and the analysis of in situ data for the testing of remote sensing cloud retrieval algorithms. One of the goals of the case study approach is to provide sufficiently detailed descriptions of cloud systems from the data-rich CART environment to make them suitable for application

  19. Lightning Tracking Tool for Assessment of Total Cloud Lightning within AWIPS II

    NASA Technical Reports Server (NTRS)

    Burks, Jason E.; Stano, Geoffrey T.; Sperow, Ken

    2014-01-01

    Total lightning (intra-cloud and cloud-to-ground) has been widely researched and shown to be a valuable tool to aid real-time warning forecasters in the assessment of severe weather potential of convective storms. The trend of total lightning has been related to the strength of a storm's updraft. Therefore a rapid increase in total lightning signifies the strengthening of the parent thunderstorm. The assessment of severe weather potential occurs in a time limited environment and therefore constrains the use of total lightning. A tool has been developed at NASA's Short-term Prediction Research and Transition (SPoRT) Center to assist in quickly analyzing the total lightning signature of multiple storms. The development of this tool comes as a direct result of forecaster feedback from numerous assessments requesting a real-time display of the time series of total lightning. This tool also takes advantage of the new architecture available within the AWIPS II environment. SPoRT's lightning tracking tool has been tested in the Hazardous Weather Testbed (HWT) Spring Program and significant changes have been made based on the feedback. In addition to the updates in response to the HWT assessment, the lightning tracking tool may also be extended to incorporate other requested displays, such as the intra-cloud to cloud-to-ground ratio as well as incorporate the lightning jump algorithm.
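
    The underlying idea, that a rapid increase in total flash rate flags a strengthening updraft, is often formalized as a "2-sigma"-style lightning-jump test. The sketch below is an illustrative version of that idea, not SPoRT's operational implementation; function name and window length are assumptions:

    ```python
    import statistics

    def lightning_jump(rates, sigma_mult=2.0):
        """Flag a 'lightning jump' when the latest flash-rate change exceeds
        sigma_mult standard deviations above the mean of the recent changes.
        Illustrative 2-sigma-style check over a short history of total
        lightning flash rates (flashes/min); not the operational algorithm."""
        if len(rates) < 6:  # need some history before testing
            return False
        changes = [b - a for a, b in zip(rates, rates[1:])]
        recent, latest = changes[:-1], changes[-1]
        return latest > statistics.mean(recent) + sigma_mult * statistics.pstdev(recent)

    # A sudden surge from ~11 to 30 flashes/min trips the test.
    surging = lightning_jump([10, 11, 10, 12, 11, 30])   # True
    steady = lightning_jump([10, 11, 10, 12, 11, 12])    # False
    ```

    A time-series display like the SPoRT tracking tool makes exactly this kind of trend visible to a forecaster at a glance, without running the numbers by hand.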

  20. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our Visible Nulling Coronagraph (VNC) high-contrast result of 10^9 contrast averaged over a focal-plane region extending from 1-4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration-isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented, and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible-light nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies, and null sensing and control.

  1. EXPERT: An atmospheric re-entry test-bed

    NASA Astrophysics Data System (ADS)

    Massobrio, F.; Viotto, R.; Serpico, M.; Sansone, A.; Caporicci, M.; Muylaert, J.-M.

    2007-06-01

    In recognition of the importance of independent European access to the International Space Station (ISS) and in preparation for the future needs of exploration missions, ESA is conducting parallel activities to generate flight data using atmospheric re-entry test-beds and to identify vehicle design solutions for human and cargo transportation vehicles serving the ISS and beyond. The EXPERT (European eXPErimental Re-entry Test-bed) vehicle represents the major ongoing development in the first class of activities. Its results may in due time also benefit scientific missions to planets with an atmosphere and future reusable launcher programmes. The objective of EXPERT is to provide a test-bed for the validation of aerothermodynamic models, codes, and ground test facilities in a representative flight environment, to improve the understanding of issues related to analysis, testing, and extrapolation to flight. The vehicle will be launched on a sub-orbital trajectory using a Volna missile. The EXPERT concept is based on a symmetrical re-entry capsule whose shape is composed of simple geometrical elements. The suborbital trajectory will reach 120 km altitude and a re-entry velocity of 5-6 km/s. The capsule is 1.6 m high and 1.3 m in diameter; the overall mass is in the range of 250-350 kg, depending upon the mission parameters and the payload/instrumentation complement. A substantial number of scientific experiments are foreseen on board, from an innovative air data system to shock wave/boundary layer interaction, from sharp hot-structure characterisation to natural and induced regime transition. Currently the project is approaching completion of phase B, with Alenia Spazio leading the industrial team and CIRA coordinating the scientific payload development under ESA contract.

  2. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses…

  3. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  4. Improved Cloud resource allocation: how INDIGO-DataCloud is overcoming the current limitations in Cloud schedulers

    NASA Astrophysics Data System (ADS)

    Lopez Garcia, Alvaro; Zangrando, Lisa; Sgaravatto, Massimo; Llorens, Vincent; Vallero, Sara; Zaccolo, Valentina; Bagnasco, Stefano; Taneja, Sonia; Dal Pra, Stefano; Salomoni, Davide; Donvito, Giacinto

    2017-10-01

    Performing efficient resource provisioning is a fundamental aspect for any resource provider. Local Resource Management Systems (LRMS) have been used in data centers for decades in order to obtain the best usage of the resources, providing their fair usage and partitioning for the users. In contrast, current cloud schedulers are normally based on the immediate allocation of resources on a first-come, first-served basis, meaning that a request will fail if there are no resources (e.g. OpenStack) or it will be trivially queued, ordered by entry time (e.g. OpenNebula). Moreover, these scheduling strategies are based on a static partitioning of the resources, meaning that existing quotas cannot be exceeded even if there are idle resources allocated to other projects. This is a consequence of the fact that cloud instances are not associated with a maximum execution time, and it leads to a situation where the resources are under-utilized. The INDIGO-DataCloud project has identified these strategies as too simplistic for accommodating scientific workloads efficiently, leading to an underutilization of the resources, an undesirable situation in scientific data centers. In this work we present the work done in the scheduling area during the first year of the INDIGO project and the foreseen evolution.
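
    The contrast drawn above between immediate first-come, first-served cloud allocation and LRMS-style fair sharing can be sketched in a few lines. The data structures and policies below are illustrative simplifications, not INDIGO's or any real scheduler's implementation:

    ```python
    from collections import deque

    def fifo_allocate(requests, free_slots):
        """First-come, first-served: grant requests in arrival order until
        capacity runs out; later projects can be starved entirely."""
        granted = []
        for req in requests:
            if free_slots >= req["slots"]:
                free_slots -= req["slots"]
                granted.append(req["id"])
        return granted

    def fair_share_allocate(requests, free_slots):
        """LRMS-style fair share: round-robin across per-project queues so
        each project gets a turn while capacity remains."""
        queues = {}
        for req in requests:
            queues.setdefault(req["project"], deque()).append(req)
        granted, progress = [], True
        while progress:
            progress = False
            for q in queues.values():
                if q and free_slots >= q[0]["slots"]:
                    req = q.popleft()
                    free_slots -= req["slots"]
                    granted.append(req["id"])
                    progress = True
        return granted

    reqs = [{"id": "a1", "project": "A", "slots": 1},
            {"id": "a2", "project": "A", "slots": 1},
            {"id": "a3", "project": "A", "slots": 1},
            {"id": "b1", "project": "B", "slots": 1}]
    fifo_allocate(reqs, 3)        # ['a1', 'a2', 'a3']  (project B starved)
    fair_share_allocate(reqs, 3)  # ['a1', 'b1', 'a2']  (B gets a turn)
    ```

    The example shows why static FIFO allocation under-serves late-arriving projects: with three free slots, FIFO never reaches project B, while fair share interleaves the two queues.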

  5. Lessons learned in deploying a cloud-based knowledge platform for the Earth Science Information Partners Federation (ESIP)

    NASA Astrophysics Data System (ADS)

    Pouchard, L. C.; Depriest, A.; Huhns, M.

    2012-12-01

    Ontologies and semantic technologies are an essential infrastructure component of systems supporting knowledge integration in the Earth Sciences. Numerous earth science ontologies exist, but they are hard to discover because they tend to be hosted with the projects that develop them. There are often few quality measures and sparse metadata associated with these ontologies, such as modification dates, versioning, purpose, number of classes, and properties. Projects often develop ontologies for their own needs without considering existing ontology entities or derivations from formal and more basic ontologies. The result is mostly orthogonal ontologies, and ontologies that are not modular enough to reuse in part or adapt for new purposes, in spite of existing standards for ontology representation. Additional obstacles to sharing and reuse include a lack of maintenance once a project is completed. These obstacles prevent the full exploitation of semantic technologies in a context where they could become needed enablers for service discovery and for matching data with services. To start addressing this gap, we have deployed BioPortal, a mature, domain-independent ontology and semantic service system developed by the National Center for Biomedical Ontologies (NCBO), on the ESIP Testbed under the governance of the ESIP Semantic Web cluster. ESIP provides a forum for a broad-based, distributed community of data and information technology practitioners and stakeholders to coordinate their efforts and develop new ideas for interoperability solutions. The Testbed provides an environment where innovations and best practices can be explored and evaluated. One objective of this deployment is to provide a community platform that harnesses the organizational and cyber infrastructure provided by ESIP at minimal cost. Another objective is to host ontology services on a scalable, public cloud and investigate the business case for crowd-sourcing ontology maintenance.
We deployed the

  6. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

    National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation (SCaN) Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, which is a method to increase data throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-Earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as to standard NASA waveforms.
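    The adaptation logic behind variable coding and modulation can be sketched as choosing the most spectrally efficient DVB-S2 modulation/coding pair (MODCOD) whose required Es/N0 is met by the current link estimate. This is a minimal illustration, not the experiment's actual algorithm; the table holds a few representative DVB-S2 figures, and the 1 dB margin is an assumed value.

    ```python
    # Sketch of VCM link adaptation: pick the highest-throughput DVB-S2
    # MODCOD whose required Es/N0 (plus an assumed margin) the link meets.
    # Table values are representative of published DVB-S2 figures and are
    # included for illustration only.
    MODCODS = [
        # (name, spectral efficiency [bit/s/Hz], required Es/N0 [dB])
        ("QPSK 1/2",   0.99,  1.0),
        ("QPSK 3/4",   1.49,  4.0),
        ("8PSK 3/4",   2.23,  7.9),
        ("16APSK 3/4", 2.97, 10.2),
    ]

    def select_modcod(snr_db, margin_db=1.0):
        """Return the most efficient MODCOD that closes at this SNR, or None."""
        usable = [m for m in MODCODS if m[2] + margin_db <= snr_db]
        return max(usable, key=lambda m: m[1]) if usable else None
    ```

    In a varying channel (such as the multipath and shadowing environment described above), the transmitter would re-run this selection as the SNR estimate changes, trading robustness for throughput frame by frame.
    
    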

  7. Large Eddy Simulations of Continental Boundary Layer Clouds Observed during the RACORO Field Campaign

    NASA Astrophysics Data System (ADS)

    Endo, S.; Fridlind, A. M.; Lin, W.; Vogelmann, A. M.; Toto, T.; Liu, Y.

    2013-12-01

    Three cases of boundary layer clouds are analyzed in the FAst-physics System TEstbed and Research (FASTER) project, based on continental boundary-layer-cloud observations during the RACORO Campaign [Routine Atmospheric Radiation Measurement (ARM) Aerial Facility (AAF) Clouds with Low Optical Water Depths (CLOWD) Optical Radiative Observations] at the ARM Climate Research Facility's Southern Great Plains (SGP) site. The three 60-hour case study periods are selected to capture the temporal evolution of cumulus, stratiform, and drizzling boundary-layer cloud systems under a range of conditions, intentionally including those that are relatively more mixed or transitional in nature versus being of a purely canonical type. Multi-modal and temporally varying aerosol number size distribution profiles are derived from aircraft observations. Large eddy simulations (LESs) are performed for the three case study periods using the GISS Distributed Hydrodynamic Aerosol and Radiative Modeling Application (DHARMA) model and the WRF-FASTER model, which is the Weather Research and Forecasting (WRF) model extended with forcing ingestion and other functions to constitute a flexible LES. Both LES models capture the significant transitions of cloud-topped boundary layers in the three periods: diurnal evolution of cumulus layers repeating over multiple days, nighttime evolution/daytime diminution of thick stratus, and daytime breakup of stratus and stratocumulus clouds. Simulated transitions of thermodynamic structures of the cloud-topped boundary layers are evaluated against balloon-borne soundings and ground-based remote sensors. Aircraft observations are then used to statistically evaluate the predicted cloud droplet number size distributions under varying aerosol and cloud conditions. An ensemble approach is used to refine the model configuration for the combined use of observations with parallel LES and single-column model simulations. See Lin et al. poster for single

  8. Isolating the Liquid Cloud Response to Recent Arctic Sea Ice Variability Using Spaceborne Lidar Observations

    NASA Astrophysics Data System (ADS)

    Morrison, A. L.; Kay, J. E.; Chepfer, H.; Guzman, R.; Yettella, V.

    2018-01-01

    While the radiative influence of clouds on Arctic sea ice is known, the influence of sea ice cover on Arctic clouds is challenging to detect, separate from atmospheric circulation, and attribute to human activities. Providing observational constraints on the two-way relationship between sea ice cover and Arctic clouds is important for predicting the rate of future sea ice loss. Here we use 8 years of CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations) spaceborne lidar observations from 2008 to 2015 to analyze Arctic cloud profiles over sea ice and over open water. Using a novel surface mask to restrict our analysis to where sea ice concentration varies, we isolate the influence of sea ice cover on Arctic Ocean clouds. The study focuses on clouds containing liquid water because liquid-containing clouds are the most important cloud type for radiative fluxes and therefore for sea ice melt and growth. Summer is the only season with no observed cloud response to sea ice cover variability: liquid cloud profiles are nearly identical over sea ice and over open water. These results suggest that shortwave summer cloud feedbacks do not slow long-term summer sea ice loss. In contrast, more liquid clouds are observed over open water than over sea ice in the winter, spring, and fall in the 8 year mean and in each individual year. Observed fall sea ice loss cannot be explained by natural variability alone, which suggests that observed increases in fall Arctic cloud cover over newly open water are linked to human activities.

  9. Two-channel microwave radiometer for observations of total column precipitable water vapor and cloud liquid water path

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liljegren, J.C.

    1994-01-01

    The Atmospheric Radiation Measurement (ARM) Program is focused on improving the treatment of radiation transfer in models of the atmospheric general circulation, as well as on improving parameterizations of cloud properties and formation processes in these models (USDOE, 1990). To help achieve these objectives, ARM is deploying several two-channel microwave radiometers at the Cloud and Radiation Testbed (CART) site in Oklahoma for the purpose of obtaining long time series observations of total precipitable water vapor (PWV) and cloud liquid water path (LWP). The performance of the WVR-1100 microwave radiometer deployed by ARM at the Oklahoma CART site central facility to provide time series measurements of precipitable water vapor (PWV) and liquid water path (LWP) is presented. The instrument has proven to be durable and reliable in continuous field operation since June 1992. The accuracy of the PWV has been demonstrated to achieve the limiting accuracy of the statistical retrieval under clear-sky conditions, degrading with increasing LWP. Improvements are planned to address moisture accumulation on the Teflon window, as well as to identify the presence of clouds with LWP at or below the retrieval uncertainty.
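    In its simplest form, the statistical retrieval mentioned above is a linear regression of PWV and LWP on the two measured brightness temperatures (a vapor-sensitive channel near 23.8 GHz and a liquid-sensitive channel near 31.4 GHz). A minimal sketch follows; the coefficient values are hypothetical placeholders, since real ones are derived by regression against a training set (e.g., radiosonde climatology), not from this abstract.

    ```python
    # Sketch of a two-channel linear statistical retrieval:
    # each product is a linear combination of the two brightness
    # temperatures. Coefficients below are HYPOTHETICAL, for
    # illustration only; operational values come from regression
    # against training data.
    COEF = {
        "pwv_cm": (-0.5, 0.10, -0.04),    # (c0, c_23GHz, c_31GHz)
        "lwp_mm": (-0.02, -0.002, 0.005),
    }

    def retrieve(tb23, tb31):
        """Map two brightness temperatures (K) to PWV and LWP estimates."""
        return {name: c0 + c23 * tb23 + c31 * tb31
                for name, (c0, c23, c31) in COEF.items()}
    ```

    The "limiting accuracy of the statistical retrieval" noted in the abstract is then the regression residual of this fit, which a real instrument can approach but not beat under clear skies.
    
    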

  10. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial-off-the-shelf components including airframe, autopilot, and a small turbine engine to provide a low cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine powered platform (J-FLiC) builds on the successes using the relatively smaller, slower and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches with the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007 and those results will be included, if possible.

  11. The Cloud-Aerosol Transport System (CATS): a New Lidar for Aerosol and Cloud Profiling from the International Space Station

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; McGill, Matthew J.; Yorks, John E.; Hlavka, Dennis L.; Hart, William D.; Palm, Stephen P.; Colarco, Peter R.

    2011-01-01

    Spaceborne lidar profiling of aerosol and cloud layers has been successfully implemented during a number of prior missions, including LITE, ICESat, and CALIPSO. Each successive mission has added increased capability and further expanded the role of these unique measurements in a wide variety of applications ranging from climate, to air quality, to special event monitoring (i.e., volcanic plumes). Many researchers have come to rely on the availability of profile data from CALIPSO, especially data coincident with measurements from other A-Train sensors. The CALIOP lidar on CALIPSO continues to operate well as it enters its fifth year of operations. However, active instruments have more limited lifetimes than their passive counterparts, and we are faced with a potential gap in lidar profiling from space if the CALIOP lidar fails before a new mission is operational. The ATLID lidar on EarthCARE is not expected to launch until 2015 or later, and the lidar component of NASA's proposed Aerosols, Clouds, and Ecosystems (ACE) mission would not be until after 2020. Here we present a new aerosol and cloud lidar that was recently selected to provide profiling data from the International Space Station (ISS) starting in 2013. The Cloud-Aerosol Transport System (CATS) is a three-wavelength (1064, 532, 355 nm) elastic backscatter lidar with HSRL capability at 532 nm. Depolarization measurements will be made at all wavelengths. The primary objective of CATS is to continue the CALIPSO aerosol and cloud profile data record, ideally with overlap between both missions and EarthCARE. In addition, the near real time data capability of the ISS will enable CATS to support operational applications such as air quality and special event monitoring. The HSRL channel will provide a demonstration of technology and a data testbed for direct extinction retrievals in support of ACE mission development. An overview of the instrument and mission will be provided, along with a summary of the science

  12. The Cloud-Aerosol Transport System (CATS): A New Lidar for Aerosol and Cloud Profiling from the International Space Station

    NASA Technical Reports Server (NTRS)

    Welton, Ellsworth J.; McGill, Matthew J.; Yorks, John E.; Hlavka, Dennis L.; Hart, William D.; Palm, Stephen P.; Colarco, Peter R.

    2012-01-01

    Spaceborne lidar profiling of aerosol and cloud layers has been successfully implemented during a number of prior missions, including LITE, ICESat, and CALIPSO. Each successive mission has added increased capability and further expanded the role of these unique measurements in a wide variety of applications ranging from climate, to air quality, to special event monitoring (i.e., volcanic plumes). Many researchers have come to rely on the availability of profile data from CALIPSO, especially data coincident with measurements from other A-Train sensors. The CALIOP lidar on CALIPSO continues to operate well as it enters its fifth year of operations. However, active instruments have more limited lifetimes than their passive counterparts, and we are faced with a potential gap in lidar profiling from space if the CALIOP lidar fails before a new mission is operational. The ATLID lidar on EarthCARE is not expected to launch until 2015 or later, and the lidar component of NASA's proposed Aerosols, Clouds, and Ecosystems (ACE) mission would not be until after 2020. Here we present a new aerosol and cloud lidar that was recently selected to provide profiling data from the International Space Station (ISS) starting in 2013. The Cloud-Aerosol Transport System (CATS) is a three-wavelength (1064, 532, 355 nm) elastic backscatter lidar with HSRL capability at 532 nm. Depolarization measurements will be made at all wavelengths. The primary objective of CATS is to continue the CALIPSO aerosol and cloud profile data record, ideally with overlap between both missions and EarthCARE. In addition, the near real time (NRT) data capability of the ISS will enable CATS to support operational applications such as aerosol and air quality forecasting and special event monitoring. The HSRL channel will provide a demonstration of technology and a data testbed for direct extinction retrievals in support of ACE mission development. 
An overview of the instrument and mission will be provided, along with a

  13. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    The paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  14. Recent select Sample Analysis at Mars (SAM) Testbed analog results

    NASA Astrophysics Data System (ADS)

    Malespin, C.; McAdam, A.; Teinturier, S.; Eigenbrode, J. L.; Freissinet, C.; Knudson, C. A.; Lewis, J. M.; Millan, M.; Steele, A.; Stern, J. C.; Williams, A. J.

    2017-12-01

    The Sample Analysis at Mars (SAM) testbed (TB) is a high-fidelity replica of the flight instrument currently onboard the Curiosity rover in Gale Crater, Mars [1]. The SAM testbed is housed in a Mars environment chamber at NASA Goddard Space Flight Center (GSFC), which can replicate both thermal and environmental conditions. The testbed is used to validate and test new experimental procedures before they are implemented on Mars, but it is also used to analyze analog samples, which assists in the interpretation of results from the surface. Samples are heated using the same experimental protocol as on Mars to allow for direct comparison with Martian sampling conditions. Here we report preliminary results from select samples that were loaded into the SAM TB, including meteorites, an organically rich iron oxide, and a synthetic analog to the Martian Cumberland sample drilled by the rover at Yellowknife Bay. Each of these samples has been analyzed under SAM-like conditions using breadboard and lab instrument systems. By comparing the data from the lab systems and the SAM TB, further insight on results from Mars can be gained. References: [1] Mahaffy, P. R., et al. (2013), Science, 341(6143), 263-266, doi:10.1126/science.1237966.

  15. Experimental Studies in a Reconfigurable C4 Test-bed for Network Enabled Capability

    DTIC Science & Technology

    2006-06-01

    Cross, Dr R. Houghton, and Mr R. McMaster, Defence Technology Centre for Human Factors Integration (DTC HFI), BITlab, School of Engineering and Design. ...studies into NEC by the Human Factors Integration Defence Technology Centre (HFI-DTC). DEVELOPMENT OF THE TESTBED: In brief, the C4 test-bed

  16. A Battery Certification Testbed for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, and external-short testing; and battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and to identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  17. Wave Clouds over Ireland

    NASA Image and Video Library

    2017-12-08

    Visualization Date 2003-12-18 Clouds ripple over Ireland and Scotland in a wave pattern, similar to the pattern of waves along a seashore. The similarity is not coincidental — the atmosphere behaves like a fluid, so when it encounters an obstacle, it must move around it. This movement forms a wave, and the wave movement can continue for long distances. In this case, the waves were caused by the air moving over and around the mountains of Scotland and Ireland. As the air crested a wave, it cooled, and clouds formed. Then, as the air sank into the trough, the air warmed, and clouds did not form. This pattern repeated itself, with clouds appearing at the peak of every wave. Other types of clouds are also visible in the scene. Along the northwestern and southwestern edges of this true-color image from December 17, 2003, are normal mid-altitude clouds with fairly uniform appearances. High-altitude cirrus clouds float over these, casting their shadows on the lower clouds. Open- and closed-cell clouds formed off the coast of northwestern France, and thin contrail clouds are visible just east of these. Contrail clouds form around the particles carried in airplane exhaust. Fog is also visible in the valleys east of the Cambrian Mountains, along the border between northern/central Wales and England. This is an Aqua MODIS image. Sensor Aqua/MODIS Credit Jacques Descloitres, MODIS Rapid Response Team, NASA/GSFC For more information go to: visibleearth.nasa.gov/view_rec.php?id=6146

  18. The Living With a Star Space Environment Testbed Payload

    NASA Technical Reports Server (NTRS)

    Xapsos, Mike

    2015-01-01

    This presentation gives a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload, consisting of a space weather monitor and a carrier containing four board experiments.

  19. Exploring the nonlinear cloud and rain equation

    NASA Astrophysics Data System (ADS)

    Koren, Ilan; Tziperman, Eli; Feingold, Graham

    2017-01-01

    Marine stratocumulus cloud decks are regarded as the reflectors of the climate system, returning to space a significant part of the incoming solar radiation, thus cooling the atmosphere. Such clouds can exist in two stable modes, open and closed cells, for a wide range of environmental conditions. This emergent behavior of the system, and its sensitivity to aerosol and environmental properties, is captured by a set of nonlinear equations. Here, using linear stability analysis, we express the transition from a steady state to a limit-cycle state analytically, showing how it depends on the model parameters. We show that the control of the droplet concentration (N), the environmental carrying capacity (H0), and the cloud recovery parameter (τ) can be linked by a single nondimensional parameter (μ = √N/(ατH0)), suggesting that for deeper clouds the transition from open (oscillating) to closed (stable fixed point) cells will occur at higher droplet concentration (i.e., higher aerosol loading). The analytical calculations of the possible states, and how they are affected by changes in aerosol and the environmental variables, provide an enhanced understanding of the complex interactions of clouds and rain.
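    Typeset, the nondimensional grouping quoted in the abstract reads as follows (N, H0, and τ as defined above; α is the remaining model coefficient in the grouping):

    ```latex
    % Nondimensional control parameter from the abstract
    \mu = \frac{\sqrt{N}}{\alpha\,\tau\,H_0}
    ```

    At a fixed critical value of μ, the droplet concentration at the transition scales as N ∝ (ατH0)², which is consistent with the abstract's statement that deeper clouds (larger H0) switch from open to closed cells only at higher aerosol loading.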

  20. Remote Sensing of Clouds for Solar Forecasting Applications

    NASA Astrophysics Data System (ADS)

    Mejia, Felipe

    is explored for different products, including surface irradiance, extinction coefficients and Liquid Water Path, as a function of the number of available sky imagers (SIs) and setup distance. Increasing the number of cameras improves the accuracy of the 3-D reconstruction: For surface irradiance, the error decreases significantly up to four imagers, at which point the improvements become marginal, while k error continues to decrease with more cameras. The ideal distance between imagers was also explored: For a cloud height of 1 km, increasing distance up to 3 km (the domain length) improved the 3-D reconstruction for surface irradiance, while k error continued to decrease with increasing distance. An iterative reconstruction technique was also used to improve the results of the ART by minimizing the error between input images and reconstructed simulations. For the best case of a nine-imager deployment, the ART and iterative method resulted in 53.4% and 33.6% mean average error (MAE) for the extinction coefficients, respectively. The tomographic methods were then tested on real-world test cases in the University of California San Diego's (UCSD) solar testbed. Five UCSD sky imagers (USI) were installed across the testbed based on the best performing distances in simulations. Topographic obstruction is explored as a source of error by analyzing the increased error with obstruction in the field of view of the horizon. As more of the horizon is obstructed, the error increases. If at least a 70° field of view is available to the camera, the accuracy is within 2% of the full field of view. Errors caused by stray light are also explored by removing the circumsolar region from images and comparing the cloud reconstruction to a full image. Removing less than 30% of the circumsolar region, image and GHI errors were within 0.2% of the full image, while errors in k increased by 1%. Removing more than 30° around the sun resulted in inaccurate cloud reconstruction. 
Using four of the
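    The iterative technique named above, the algebraic reconstruction technique (ART), can be sketched as a Kaczmarz-style iteration: each camera ray contributes one linear constraint relating per-voxel extinction coefficients to a measured optical depth, and the solution is repeatedly projected onto those constraints. The toy two-voxel geometry below is an assumption for illustration, not the UCSD imagers' actual ray geometry.

    ```python
    import numpy as np

    # Kaczmarz-style ART sketch: rows of A are camera rays (path length
    # through each voxel), b holds the measured optical depths along
    # those rays, x the unknown per-voxel extinction coefficients.
    def art(A, b, n_sweeps=50, relax=0.5):
        x = np.zeros(A.shape[1])
        for _ in range(n_sweeps):
            for i in range(A.shape[0]):
                ai = A[i]
                denom = ai @ ai
                if denom > 0:
                    # Project x toward the hyperplane ai . x = b[i]
                    x += relax * (b[i] - ai @ x) / denom * ai
        return np.clip(x, 0.0, None)  # extinction is non-negative

    # Toy consistency check: two voxels observed by three rays.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])
    x_true = np.array([0.3, 0.7])
    x_hat = art(A, A @ x_true)
    ```

    In the real problem, A is large and sparse (one row per pixel per imager), and the "error between input images and reconstructed simulations" that the abstract minimizes corresponds to the residual b - A x driven down by these sweeps.
    
    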

  1. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    NASA Technical Reports Server (NTRS)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  2. Model-Based Diagnosis in a Power Distribution Test-Bed

    NASA Technical Reports Server (NTRS)

    Scarl, E.; McCall, K.

    1998-01-01

    The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.

  3. Precision Mapping of the California Connected Vehicle Testbed Corridor

    DOT National Transportation Integrated Search

    2015-11-01

    In this project the University of California Riverside mapping sensor hardware was successfully mounted on an instrumented vehicle to map a segment of the California Connected Vehicle testbed corridor on State Route 82. After calibrating the sensor p...

  4. RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.

    PubMed

    Varghese, Blesson; Patel, Ishan; Barker, Adam

    2015-01-01

    Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticians can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimal effort, and how both the resources and the data required by a job can be managed. An open-source, light-weight framework for executing R scripts using Bioconductor packages, referred to as `RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.

  5. Cloud and surface textural features in polar regions

    NASA Technical Reports Server (NTRS)

    Welch, Ronald M.; Kuo, Kwo-Sen; Sengupta, Sailes K.

    1990-01-01

    The study examines the textural signatures of clouds, ice-covered mountains, solid and broken sea ice and floes, and open water. The textural features are computed from sum and difference histogram and gray-level difference vector statistics defined at various pixel displacement distances derived from Landsat multispectral scanner data. Polar cloudiness, snow-covered mountainous regions, solid sea ice, glaciers, and open water have distinguishable texture features. This suggests that textural measures can be successfully applied to the detection of clouds over snow-covered mountains, an ability of considerable importance for the modeling of snow-melt runoff. However, broken stratocumulus cloud decks and thin cirrus over broken sea ice remain difficult to distinguish texturally. It is concluded that even with high spatial resolution imagery, it may not be possible to distinguish broken stratocumulus and thin clouds from sea ice in the marginal ice zone using the visible channel textural features alone.
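    One of the texture statistics named above, the gray-level difference vector (GLDV), can be sketched briefly: histogram the absolute gray-level differences at a fixed pixel displacement, then derive scalar features from that histogram. The specific displacement and feature set below are illustrative choices, not the paper's exact configuration.

    ```python
    import numpy as np

    # GLDV texture sketch: for displacement (dx, dy), histogram the
    # absolute gray-level differences between each pixel and its
    # displaced neighbor, then compute scalar texture features.
    def gldv_features(img, dx=1, dy=0, levels=256):
        a = img[:img.shape[0] - dy, :img.shape[1] - dx]
        b = img[dy:, dx:]
        diff = np.abs(a.astype(int) - b.astype(int)).ravel()
        p = np.bincount(diff, minlength=levels) / diff.size  # GLDV histogram
        k = np.arange(levels)
        nz = p > 0
        return {
            "contrast": float((p * k**2).sum()),
            "entropy": float(-(p[nz] * np.log2(p[nz])).sum()),
            "asm": float((p**2).sum()),  # angular second moment
        }
    ```

    Smooth surfaces such as solid sea ice concentrate the GLDV near zero difference (low contrast, high ASM), while broken cloud fields spread it out, which is the kind of separation the study exploits.
    
    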

  6. Satellite Testbed for Evaluating Cryogenic-Liquid Behavior in Microgravity

    NASA Technical Reports Server (NTRS)

    Putman, Philip Travis (Inventor)

    2017-01-01

    Provided is a testbed for conducting an experiment on a substance in a cryogenic liquid state in a microgravity environment. The testbed includes a frame with rectangular nominal dimensions, and a source section including a supply of the substance to be evaluated in the cryogenic liquid state. An experiment section includes an experiment vessel in fluid communication with the source section to receive the substance from the source section and condense the substance into the cryogenic liquid state. A sensor is adapted to sense a property of the substance in the cryogenic liquid state in the experiment vessel as part of the experiment. A bus section includes a controller configured to control delivery of the substance from the source section to the experiment vessel, and receive property data indicative of the property sensed by the sensor for subsequent evaluation on Earth.

  7. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.

  8. [Porting Radiotherapy Software of Varian to Cloud Platform].

    PubMed

    Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin

    2017-09-30

    To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, on the private cloud platform, all the Varian radiotherapy software modules were installed on virtual machines and the corresponding function configuration was completed. Finally, the software on the cloud could be accessed through a virtual desktop client. The function test results of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The cloud migration in this study is economical and practical. The project not only improves the utilization rate of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications to the field of radiation oncology.

  9. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, however; they require a new approach to control them effectively and to flow data, and extensive software had to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A major challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element both for commanding the payload and for displaying data created by the payload. 
The verification of

  10. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental test-bed to investigate the integration and control of perception in a continuously operating vision system is described. The test-bed integrates a 12 axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) be continuously operating, (2) integrate software contributions from geographically dispersed laboratories, (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects, (4) be capable of supporting diverse experiments in gaze control, visual servoing, navigation, and object surveillance, and (5) be dynamically reconfigurable.

  11. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

    The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis,Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  12. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation (see diagram) include: a) Boot Mode that is initiated at application of power to the processor card, and runs memory diagnostics. It may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode that allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode is a predefined, minimum power safehold configuration with power to experiments removed and carrier functionality minimized. It is used to troubleshoot problems that occur during flight. d) Operations Mode is used for normal experiment carrier operations. It may be entered only via ground command from Safe Mode.

  13. The Research of the Parallel Computing Development from the Angle of Cloud Computing

    NASA Astrophysics Data System (ADS)

    Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun

    2017-10-01

    Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing brings parallel computing into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes and studies the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
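    The three models mentioned above differ mainly in how parallel work is expressed: OpenMP annotates shared-memory loops, MPI passes messages between processes, and MapReduce splits a job into a map phase, a shuffle, and a reduce phase. As an illustration of the MapReduce style only (a minimal sketch, not code from the paper), the classic word-count job can be written in plain Python:

    ```python
    from collections import defaultdict

    def map_phase(documents):
        """Map: emit (word, 1) pairs from each input document."""
        for doc in documents:
            for word in doc.split():
                yield word, 1

    def shuffle(pairs):
        """Shuffle: group intermediate values by key."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Reduce: sum the counts for each word."""
        return {word: sum(counts) for word, counts in groups.items()}

    docs = ["the cloud", "the grid and the cloud"]
    counts = reduce_phase(shuffle(map_phase(docs)))
    ```

    In a real framework such as Hadoop, the map and reduce functions run on many machines and the shuffle moves data over the network; the structure of the computation, however, is exactly the one sketched here.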

  14. Laser-induced plasma cloud interaction and ice multiplication under cirrus cloud conditions.

    PubMed

    Leisner, Thomas; Duft, Denis; Möhler, Ottmar; Saathoff, Harald; Schnaiter, Martin; Henin, Stefano; Stelmaszczyk, Kamil; Petrarca, Massimo; Delagrange, Raphaëlle; Hao, Zuoqiang; Lüder, Johannes; Petit, Yannick; Rohwetter, Philipp; Kasparian, Jérôme; Wolf, Jean-Pierre; Wöste, Ludger

    2013-06-18

    Potential impacts of lightning-induced plasma on cloud ice formation and precipitation have been a subject of debate for decades. Here, we report on the interaction of laser-generated plasma channels with water and ice clouds observed in a large cloud simulation chamber. Under the conditions of a typical storm cloud, in which ice and supercooled water coexist, no direct influence of the plasma channels on ice formation or precipitation processes could be detected. Under conditions typical for thin cirrus ice clouds, however, the plasma channels induced a surprisingly strong effect of ice multiplication. Within a few minutes, the laser action led to a strong enhancement of the total ice particle number density in the chamber by up to a factor of 100, even though only a 10⁻⁹ fraction of the chamber volume was exposed to the plasma channels. The newly formed ice particles quickly reduced the water vapor pressure to ice saturation, thereby increasing the cloud optical thickness by up to three orders of magnitude. A model relying on the complete vaporization of ice particles in the laser filament and the condensation of the resulting water vapor on plasma ions reproduces our experimental findings. This surprising effect might open new perspectives for remote sensing of water vapor and ice in the upper troposphere.

  15. On the reversibility of transitions between closed and open cellular convection

    DOE PAGES

    Feingold, G.; Koren, I.; Yamaguchi, T.; ...

    2015-07-08

    The two-way transition between closed and open cellular convection is addressed in an idealized cloud-resolving modeling framework. A series of cloud-resolving simulations shows that the transition between closed and open cellular states is asymmetrical and characterized by a rapid ("runaway") transition from the closed- to the open-cell state but slower recovery to the closed-cell state. Given that precipitation initiates the closed–open cell transition and that the recovery requires a suppression of the precipitation, we apply an ad hoc time-varying drop concentration to initiate and suppress precipitation. We show that the asymmetry in the two-way transition occurs even for very rapid drop concentration replenishment. The primary barrier to recovery is the loss in turbulence kinetic energy (TKE) associated with the loss in cloud water (and associated radiative cooling) and the vertical stratification of the boundary layer during the open-cell period. In transitioning from the open to the closed state, the system faces the task of replenishing cloud water fast enough to counter precipitation losses, such that it can generate radiative cooling and TKE. It is hampered by a stable layer below cloud base that has to be overcome before water vapor can be transported more efficiently into the cloud layer. Recovery to the closed-cell state is slower when radiative cooling is inefficient such as in the presence of free tropospheric clouds or after sunrise, when it is hampered by the absorption of shortwave radiation. Tests suggest that recovery to the closed-cell state is faster when the drizzle is smaller in amount and of shorter duration, i.e., when the precipitation causes less boundary layer stratification. Cloud-resolving model results on recovery rates are supported by simulations with a simple predator–prey dynamical system analogue. It is suggested that the observed closing of open cells by ship effluent likely occurs when aerosol intrusions are large
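    The predator–prey analogue mentioned in the abstract refers to a two-variable dynamical system in which one quantity grows at the expense of the other. As a hedged illustration of that class of model only, the variable names, coefficients, and the forward-Euler scheme below are illustrative and are not the parameters or equations used in the paper:

    ```python
    def simulate(prey=1.0, pred=0.5, a=1.0, b=0.8, c=0.6, d=0.4,
                 dt=0.001, steps=20000):
        """Forward-Euler integration of a Lotka-Volterra system.

        All initial values and coefficients are illustrative choices,
        not the parameters of the paper's dynamical-system analogue.
        """
        history = [(prey, pred)]
        for _ in range(steps):
            dprey = (a - b * pred) * prey   # "prey" grows, consumed by "predator"
            dpred = (c * prey - d) * pred   # "predator" grows by consuming "prey"
            prey += dprey * dt
            pred += dpred * dt
            history.append((prey, pred))
        return history
    ```

    In cloud applications of this idea, cloud water typically plays the role of prey and drizzle the role of predator, so that heavy precipitation depletes the cloud field much as a predator bloom crashes the prey population.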

  16. Commissioning Results on the JWST Testbed Telescope

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Acton, D. Scott

    2006-01-01

    The one-meter 18 segment JWST Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate commissioning operations for the JWST Observatory. Eight different commissioning activities were tested on the TBT: telescope focus sweep, segment ID and Search, image array, global alignment, image stacking, coarse phasing, fine phasing, and multi-field phasing. This paper describes recent commissioning results from experiments performed on the TBT.

  17. Operation Duties on the F-15B Research Testbed

    NASA Technical Reports Server (NTRS)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  18. Terahertz standoff imaging testbed design and performance for concealed weapon and device identification model development

    NASA Astrophysics Data System (ADS)

    Franck, Charmaine C.; Lee, Dave; Espinola, Richard L.; Murrill, Steven R.; Jacobs, Eddie L.; Griffin, Steve T.; Petkie, Douglas T.; Reynolds, Joe

    2007-04-01

    This paper describes the design and performance of the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate's (NVESD), active 0.640-THz imaging testbed, developed in support of the Defense Advanced Research Project Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. The laboratory measurements and standoff images were acquired during the development of a NVESD and Army Research Laboratory terahertz imaging performance model. The imaging testbed is based on a 12-inch-diameter Off-Axis Elliptical (OAE) mirror designed with one focal length at 1 m and the other at 10 m. This paper will describe the design considerations of the OAE-mirror, dual-capability, active imaging testbed, as well as measurement/imaging results used to further develop the model.

  19. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, database and applications. Network usage and demands are growing at a very fast rate and, to meet the current requirements, there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because the distributed decision making for switching or routing is collocated on the same device. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate these challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate the network management related issues. A private cloud prototype test bed was set up to implement SDN on the OpenStack platform and to test and evaluate the network performance of various configurations.
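    The core idea of SDN is that a central controller installs match-action rules into switches, replacing per-device decision making. The toy flow table below is an illustrative sketch of that mechanism only (its field names and actions are invented for this example and are not part of OpenStack or any SDN product):

    ```python
    class FlowTable:
        """Toy match-action table: the highest-priority matching rule wins."""

        def __init__(self):
            self.rules = []  # (priority, match fields, action)

        def install(self, priority, match, action):
            # The controller installs rules; the switch keeps them sorted.
            self.rules.append((priority, match, action))
            self.rules.sort(key=lambda rule: -rule[0])

        def forward(self, packet):
            for _, match, action in self.rules:
                if all(packet.get(k) == v for k, v in match.items()):
                    return action
            return "send-to-controller"  # table miss: punt to the control plane
    ```

    Because the rules live in a table rather than in device firmware, a management system can reprogram the network (for example, when a virtual machine migrates) by installing or removing rules, which is exactly the kind of automation the paper argues traditional networks lack.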

  20. Implementation of a Wireless Time Distribution Testbed Protected with Quantum Key Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonior, Jason D; Evans, Philip G; Sheets, Gregory S

    2017-01-01

    Secure time transfer is critical for many time-sensitive applications. The Global Positioning System (GPS), which is often used for this purpose, has been shown to be susceptible to spoofing attacks. Quantum Key Distribution (QKD) offers a way to securely generate encryption keys at two locations. Through careful use of this information it is possible to create a system that is more resistant to spoofing attacks. In this paper we describe our work to create a testbed which utilizes QKD and traditional RF links. This testbed will be used for the development of more secure and spoofing-resistant time distribution protocols.
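    One straightforward way a shared QKD-derived key can harden time transfer is to authenticate each broadcast timestamp with a keyed MAC, so a spoofer without the key cannot forge valid messages. The sketch below illustrates that general idea only; the message format is invented for this example and the abstract does not specify the testbed's actual protocol (which would also need replay and delay protection):

    ```python
    import hashlib
    import hmac
    import struct

    def sign_timestamp(key, t_ns):
        """Authenticate a 64-bit nanosecond timestamp with a shared key."""
        msg = struct.pack(">Q", t_ns)
        return msg + hmac.new(key, msg, hashlib.sha256).digest()

    def verify_timestamp(key, blob):
        """Return the timestamp if the HMAC tag checks out, else None."""
        msg, tag = blob[:8], blob[8:]
        expected = hmac.new(key, msg, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expected):
            return None  # spoofed or corrupted broadcast
        return struct.unpack(">Q", msg)[0]
    ```

    QKD's contribution in such a scheme is key agreement: both endpoints obtain the same fresh key material without trusting the RF channel, and the classical MAC then ties each timestamp to that key.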

  1. Shaped pupil coronagraphy for WFIRST: high-contrast broadband testbed demonstration

    NASA Astrophysics Data System (ADS)

    Cady, Eric; Balasubramanian, Kunjithapatham; Gersh-Range, Jessica; Kasdin, Jeremy; Kern, Brian; Lam, Raymond; Mejia Prada, Camilo; Moody, Dwight; Patterson, Keith; Poberezhskiy, Ilya; Riggs, A. J. Eldorado; Seo, Byoung-Joon; Shi, Fang; Tang, Hong; Trauger, John; Zhou, Hanying; Zimmerman, Neil

    2017-09-01

    The Shaped Pupil Coronagraph (SPC) is one of the two operating modes of the WFIRST coronagraph instrument. The SPC provides starlight suppression in a pair of wedge-shaped regions over an 18% bandpass, and is well suited for spectroscopy of known exoplanets. To demonstrate this starlight suppression in the presence of expected on-orbit input wavefront disturbances, we have recently built a dynamic testbed at JPL analogous to the WFIRST flight instrument architecture, with both Hybrid Lyot Coronagraph (HLC) and SPC architectures and a Low Order Wavefront Sensing and Control (LOWFS/C) subsystem to apply, sense, and correct dynamic wavefront disturbances. We present our latest results of the SPC mode demonstration from the testbed, in both static and dynamic conditions, along with model comparisons. HLC results will be reported separately.

  2. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  3. The Wisconsin Snow and Cloud-Terra 2000 Experiment (WISC-T2000)

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Atmospheric scientists take to the skies this winter for the Wisconsin Snow and Cloud-Terra 2000 experiment, Feb. 25 through March 13. Scientists in WISC-T2000 will use instruments on board NASA's ER-2, a high-altitude research plane, to validate new science products from NASA's earth-observing satellite Terra, which began its five-year mission on Dec. 18, 1999. Contact Terri Gregory Public Information Coordinator Space Science and Engineering Center University of Wisconsin-Madison (608) 263-3373; fax (608) 262-5974 terri.gregory@ssec.wisc.edu Science Goals: WISC-T2000 is the third in a series of field experiments sponsored by the University of Wisconsin-Madison's Space Science and Engineering Center. The center helped develop one of the five science instruments on Terra, the Moderate-Resolution Imaging Spectroradiometer (MODIS). MODIS will make global measurements of clouds, oceans, land, and atmospheric properties in an effort to monitor and predict global climate change. Infrastructure: The ER-2 will be based at Madison's Truax Field and will fly over the upper Midwest and Oklahoma. ER-2 measurements will be coordinated with observations at the Department of Energy's Cloud and Radiation Testbed site in Oklahoma (http://www.arm.gov/), which will be engaged in a complementary cloud experiment. The center will work closely with NASA's Goddard Space Flight Center, which will collect and distribute MODIS data and science products. Additional information on the WISC-T2000 field campaign is available at the project's Web site http://cimss.ssec.wisc.edu/wisct2000/

  4. Dynamic Extension of a Virtualized Cluster by using Cloud Resources

    NASA Astrophysics Data System (ADS)

    Oberst, Oliver; Hauth, Thomas; Kernert, David; Riedel, Stephan; Quast, Günter

    2012-12-01

    The specific requirements concerning the software environment within the HEP community constrain the choice of resource providers for the outsourcing of computing infrastructure. The use of virtualization in HPC clusters and in the context of cloud resources is therefore a subject of recent developments in scientific computing. The dynamic virtualization of worker nodes in common batch systems provided by ViBatch serves each user with a dynamically virtualized subset of worker nodes on a local cluster. Now it can be transparently extended by the use of common open source cloud interfaces like OpenNebula or Eucalyptus, launching a subset of the virtual worker nodes within the cloud. This paper demonstrates how a dynamically virtualized computing cluster is combined with cloud resources by attaching remotely started virtual worker nodes to the local batch system.
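    The decision at the heart of such a hybrid setup is how many cloud worker nodes to launch when the local batch queue overflows. The policy below is a minimal illustrative sketch of that scaling decision; the function name, parameters, and thresholds are assumptions for this example and are not taken from ViBatch or the paper:

    ```python
    def cloud_vms_needed(queued_jobs, free_local_slots, max_cloud_vms,
                         slots_per_vm=4):
        """Number of cloud worker VMs to start for the jobs that the
        local cluster cannot absorb (illustrative policy only)."""
        overflow = max(0, queued_jobs - free_local_slots)
        vms = -(-overflow // slots_per_vm)  # ceiling division
        return min(vms, max_cloud_vms)      # respect the cloud quota
    ```

    A scheduler loop would evaluate such a policy periodically, start the computed number of virtual worker nodes through the cloud interface (e.g., OpenNebula or Eucalyptus), and let them register with the local batch system exactly like the locally virtualized nodes.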

  5. Integration of Cloud resources in the LHCb Distributed Computing

    NASA Astrophysics Data System (ADS)

    Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel

    2014-06-01

    This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb is using its specific Dirac extension (LHCbDirac) as an interware for its Distributed Computing. So far, it has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Service as a Service (SaaS) model.

  6. Towards Efficient Scientific Data Management Using Cloud Storage

    NASA Technical Reports Server (NTRS)

    He, Qiming

    2013-01-01

    A software prototype allows users to backup and restore data to/from both public and private cloud storage such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption), and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
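    The incremental-backup idea described above, uploading only files that changed since the last run, is commonly implemented by keeping a manifest of content hashes. The sketch below illustrates that technique with local directories standing in for cloud storage; the function, file layout, and manifest format are assumptions for this example, not the prototype's actual design, and it shows only the compression and change-detection steps (a real tool would add encryption and a cloud transport layer):

    ```python
    import hashlib
    import json
    import os
    import zlib

    def backup(src_dir, dest_dir, manifest_path):
        """Copy only new or changed files from src_dir to dest_dir,
        zlib-compressed; return the list of files backed up."""
        try:
            with open(manifest_path) as f:
                manifest = json.load(f)   # {filename: sha256 of last backup}
        except FileNotFoundError:
            manifest = {}
        copied = []
        for name in sorted(os.listdir(src_dir)):
            path = os.path.join(src_dir, name)
            if not os.path.isfile(path):
                continue
            with open(path, "rb") as f:
                data = f.read()
            digest = hashlib.sha256(data).hexdigest()
            if manifest.get(name) == digest:
                continue                  # unchanged since last run: skip
            with open(os.path.join(dest_dir, name + ".z"), "wb") as out:
                out.write(zlib.compress(data))
            manifest[name] = digest
            copied.append(name)
        with open(manifest_path, "w") as f:
            json.dump(manifest, f)
        return copied
    ```

    Running the function twice in a row uploads nothing the second time, which is where the bandwidth savings of incremental backup come from.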

  7. Optical instruments synergy in determination of optical depth of thin clouds

    NASA Astrophysics Data System (ADS)

    Viviana Vlăduţescu, Daniela; Schwartz, Stephen E.; Huang, Dong

    2018-04-01

    Optically thin clouds have a strong radiative effect and need to be represented accurately in climate models. Cloud optical depth of thin clouds was retrieved using high resolution digital photography, lidar, and a radiative transfer model. The Doppler Lidar was operated at 1.5 μm, minimizing return from Rayleigh scattering, emphasizing return from aerosols and clouds. This approach examined cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opening new avenues for examination of cloud structure and evolution.

  8. Optical Instruments Synergy in Determination of Optical Depth of Thin Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vladutescu, Daniela V.; Schwartz, Stephen E.

    Optically thin clouds have a strong radiative effect and need to be represented accurately in climate models. Cloud optical depth of thin clouds was retrieved using high resolution digital photography, lidar, and a radiative transfer model. The Doppler Lidar was operated at 1.5 μm, minimizing return from Rayleigh scattering, emphasizing return from aerosols and clouds. This approach examined cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opening new avenues for examination of cloud structure and evolution.

  9. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...

  10. RACORO Continental Boundary Layer Cloud Investigations: 3. Separation of Parameterization Biases in Single-Column Model CAM5 Simulations of Shallow Cumulus

    NASA Technical Reports Server (NTRS)

    Lin, Wuyin; Liu, Yangang; Vogelmann, Andrew M.; Fridlind, Ann; Endo, Satoshi; Song, Hua; Feng, Sha; Toto, Tami; Li, Zhijin; Zhang, Minghua

    2015-01-01

    Climatically important low-level clouds are commonly misrepresented in climate models. The FAst-physics System TEstbed and Research (FASTER) Project has constructed case studies from the Atmospheric Radiation Measurement Climate Research Facility's Southern Great Plains site during the RACORO aircraft campaign to facilitate research on model representation of boundary-layer clouds. This paper focuses on using the single-column Community Atmosphere Model version 5 (SCAM5) simulations of a multi-day continental shallow cumulus case to identify specific parameterization causes of low-cloud biases. Consistent model biases among the simulations driven by a set of alternative forcings suggest that uncertainty in the forcing plays only a relatively minor role. In-depth analysis reveals that the model's shallow cumulus convection scheme tends to significantly under-produce clouds during the times when shallow cumuli exist in the observations, while the deep convective and stratiform cloud schemes significantly over-produce low-level clouds throughout the day. The links between model biases and the underlying assumptions of the shallow cumulus scheme are further diagnosed with the aid of large-eddy simulations and aircraft measurements, and by suppressing the triggering of the deep convection scheme. It is found that the weak boundary layer turbulence simulated is directly responsible for the weak cumulus activity and the simulated boundary layer stratiform clouds. Increased vertical and temporal resolutions are shown to lead to stronger boundary layer turbulence and reduction of low-cloud biases.

  11. RACORO continental boundary layer cloud investigations. 3. Separation of parameterization biases in single-column model CAM5 simulations of shallow cumulus

    DOE PAGES

    Lin, Wuyin; Liu, Yangang; Vogelmann, Andrew M.; ...

    2015-06-19

    Climatically important low-level clouds are commonly misrepresented in climate models. The FAst-physics System TEstbed and Research (FASTER) project has constructed case studies from the Atmospheric Radiation Measurement (ARM) Climate Research Facility's Southern Great Plains site during the RACORO aircraft campaign to facilitate research on model representation of boundary-layer clouds. This paper focuses on using the single-column Community Atmosphere Model version 5 (SCAM5) simulations of a multi-day continental shallow cumulus case to identify specific parameterization causes of low-cloud biases. Consistent model biases among the simulations driven by a set of alternative forcings suggest that uncertainty in the forcing plays only a relatively minor role. In-depth analysis reveals that the model's shallow cumulus convection scheme tends to significantly under-produce clouds during the times when shallow cumuli exist in the observations, while the deep convective and stratiform cloud schemes significantly over-produce low-level clouds throughout the day. The links between model biases and the underlying assumptions of the shallow cumulus scheme are further diagnosed with the aid of large-eddy simulations and aircraft measurements, and by suppressing the triggering of the deep convection scheme. It is found that the weak boundary layer turbulence simulated is directly responsible for the weak cumulus activity and the simulated boundary layer stratiform clouds. Increased vertical and temporal resolutions are shown to lead to stronger boundary layer turbulence and reduction of low-cloud biases.

  12. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  13. Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.

    PubMed

    Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu

    2015-01-01

    The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation and property prediction in a fast and efficient manner. It has also been provided with some bioinformatics functionalities including sequence alignment, active site pose prediction and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem solving cloud environment for virtual screening studies that also provides workflow management, better usability and interaction with end users using container-based virtualization (OpenVZ).

  14. Recent Successes and Future Plans for NASA's Space Communications and Navigation Testbed on the International Space Station

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Sankovic, John M.; Johnson, Sandra K.; Lux, James P.; Chelmins, David T.

    2014-01-01

    Flexible and extensible space communications architectures and technology are essential to enable future space exploration and science activities. NASA has championed the development of the Space Telecommunications Radio System (STRS) software defined radio (SDR) standard and the application of SDR technology to reduce the costs and risks of using SDRs for space missions, and has developed an on-orbit testbed to validate these capabilities. The Space Communications and Navigation (SCaN) Testbed (previously known as the Communications, Navigation, and Networking reConfigurable Testbed (CoNNeCT)) is advancing SDR, on-board networking, and navigation technologies by conducting space experiments aboard the International Space Station. During its first year(s) on-orbit, the SCaN Testbed has achieved considerable accomplishments to better understand SDRs and their applications. The SDR platforms and software waveforms on each SDR have over 1500 hours of operation and are performing as designed. The Ka-band SDR on the SCaN Testbed is NASA's first space Ka-band transceiver and NASA's first Ka-band mission using the Space Network. This has provided exciting opportunities to operate at Ka-band and assist with on-orbit tests of NASA's newest Tracking and Data Relay Satellites (TDRS). During its first year, SCaN Testbed completed its first on-orbit SDR reconfigurations. SDR reconfigurations occur when implementing new waveforms on an SDR. SDR reconfigurations allow a radio to change minor parameters, such as data rate, or complete functionality. New waveforms which provide new capability and are reusable across different missions provide long-term value for reconfigurable platforms such as SDRs. The STRS Standard provides guidelines for new waveform development by third parties. Waveform development by organizations other than the platform provider offers NASA the ability to develop waveforms itself and reduce its dependence and costs on the platform developer. 

  15. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  16. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  17. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  18. A Testbed Environment for Buildings-to-Grid Cyber Resilience Research and Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sridhar, Siddharth; Ashok, Aditya; Mylrea, Michael E.

    The Smart Grid is characterized by the proliferation of advanced digital controllers at all levels of its operational hierarchy, from generation to end consumption. Such controllers within modern residential and commercial buildings enable grid operators to exercise fine-grained control over energy consumption through several emerging Buildings-to-Grid (B2G) applications. Though this capability promises significant benefits in terms of operational economics and improved reliability, cybersecurity weaknesses in the supporting infrastructure could be exploited to detrimental effect, which necessitates focused research efforts on two fronts: first, understanding how, and to what extent, cyber attacks in the B2G space could impact grid reliability; second, the development and validation of cyber-physical, application-specific countermeasures that are complementary to traditional infrastructure cybersecurity mechanisms for enhanced cyber attack detection and mitigation. The PNNL B2G testbed is currently being developed to address these core research needs. Specifically, the B2G testbed combines high-fidelity buildings+grid simulators with industry-grade building automation and Supervisory Control and Data Acquisition (SCADA) systems in an integrated, realistic, and reconfigurable environment capable of supporting attack-impact-detection-mitigation experimentation. In this paper, we articulate the need for research testbeds to model various B2G applications broadly by looking at the end-to-end operational hierarchy of the Smart Grid. Finally, the paper not only describes the architecture of the B2G testbed in detail, but also addresses the broad spectrum of B2G resilience research it is capable of supporting based on the smart grid operational hierarchy identified earlier.

  19. Laser-induced plasma cloud interaction and ice multiplication under cirrus cloud conditions

    PubMed Central

    Leisner, Thomas; Duft, Denis; Möhler, Ottmar; Saathoff, Harald; Schnaiter, Martin; Henin, Stefano; Stelmaszczyk, Kamil; Petrarca, Massimo; Delagrange, Raphaëlle; Hao, Zuoqiang; Lüder, Johannes; Petit, Yannick; Rohwetter, Philipp; Kasparian, Jérôme; Wolf, Jean-Pierre; Wöste, Ludger

    2013-01-01

    Potential impacts of lightning-induced plasma on cloud ice formation and precipitation have been a subject of debate for decades. Here, we report on the interaction of laser-generated plasma channels with water and ice clouds observed in a large cloud simulation chamber. Under the conditions of a typical storm cloud, in which ice and supercooled water coexist, no direct influence of the plasma channels on ice formation or precipitation processes could be detected. Under conditions typical for thin cirrus ice clouds, however, the plasma channels induced a surprisingly strong effect of ice multiplication. Within a few minutes, the laser action led to a strong enhancement of the total ice particle number density in the chamber by up to a factor of 100, even though only a 10−9 fraction of the chamber volume was exposed to the plasma channels. The newly formed ice particles quickly reduced the water vapor pressure to ice saturation, thereby increasing the cloud optical thickness by up to three orders of magnitude. A model relying on the complete vaporization of ice particles in the laser filament and the condensation of the resulting water vapor on plasma ions reproduces our experimental findings. This surprising effect might open new perspectives for remote sensing of water vapor and ice in the upper troposphere. PMID:23733936

  20. Visually guided grasping to study teleprogrammation within the BAROCO testbed

    NASA Technical Reports Server (NTRS)

    Devy, M.; Garric, V.; Delpech, M.; Proy, C.

    1994-01-01

    This paper describes vision functionalities required in future orbital laboratories; in such systems, robots will be needed in order to execute the on-board scientific experiments or servicing and maintenance tasks under the remote control of ground operators. To this end, ESA has proposed a robotic configuration called EMATS; a testbed has been developed by ESTEC in order to evaluate the potential of an EMATS-like robot to execute scientific tasks in automatic mode. In the same context, CNES developed the BAROCO testbed to investigate remote control and teleprogrammation, in which high level primitives like 'Pick Object A' are provided as basic primitives. In nominal situations, the system has a priori knowledge of the positions of all objects. These positions are not very accurate, but this knowledge is sufficient to predict the position of the object to be grasped, with respect to the manipulator frame. Vision is required in order to ensure a correct grasp and to guarantee good accuracy for the following operations. We describe our results on visually guided grasping of static objects. This may seem a very classical problem with many published results, but in many cases a realistic evaluation of accuracy is lacking, because such an evaluation requires tedious experiments. We present several results on calibration of the experimental testbed, on the recognition algorithms required to locate a 3D polyhedral object, and on the grasping itself.

  1. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  2. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, the Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of the radiative effects (REs, relative to the case of no fires) of open-fire aerosols, including black carbon (BC) and particulate organic matter (POM), from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m⁻² (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m⁻²), while fire POM induces a small effect (–0.05 and 0.04 ± 0.01 W m⁻² based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is –0.70 ± 0.05 W m⁻², resulting mainly from the fire POM effect (–0.59 ± 0.03 W m⁻²). REari (0.43 ± 0.03 W m⁻²) and REaci (–1.38 ± 0.23 W m⁻²) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and –0.82 ± 0.09 W m⁻² for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and the low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to –15 W m⁻²) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci, based on the atmosphere-only simulations. Furthermore, the global annual mean RE due to surface-albedo changes (REsac) over land areas (0.030 ± 0.10 W m⁻²) is small and statistically insignificant and is mainly due to

  3. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    DOE PAGES

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; ...

    2016-11-29

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, the Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of the radiative effects (REs, relative to the case of no fires) of open-fire aerosols, including black carbon (BC) and particulate organic matter (POM), from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m⁻² (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m⁻²), while fire POM induces a small effect (–0.05 and 0.04 ± 0.01 W m⁻² based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is –0.70 ± 0.05 W m⁻², resulting mainly from the fire POM effect (–0.59 ± 0.03 W m⁻²). REari (0.43 ± 0.03 W m⁻²) and REaci (–1.38 ± 0.23 W m⁻²) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and –0.82 ± 0.09 W m⁻² for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and the low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to –15 W m⁻²) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci, based on the atmosphere-only simulations. Furthermore, the global annual mean RE due to surface-albedo changes (REsac) over land areas (0.030 ± 0.10 W m⁻²) is small and statistically insignificant and is mainly due to

  4. Spectroscopic Binary Star Studies with the Palomar Testbed Interferometer II

    NASA Astrophysics Data System (ADS)

    Boden, A. F.; Lane, B. F.; Creech-Eakman, M.; Queloz, D.; PTI Collaboration

    1999-12-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline near-infrared interferometer located at Palomar Observatory. Following our previous work on resolving spectroscopic binary stars with PTI, we will present a number of new visual and physical orbit determinations derived from integrated reductions of PTI visibility and archival radial velocity data. The six systems for which we will present new orbit models are: 12 Boo (HD 123999), 75 Cnc (HD 78418), 47 And (HD 8374), HD 205539, BY Draconis (HDE 234677), and 3 Boo (HD 120064). Most of these systems are double-lined binary systems (SB2), and integrated astrometric/radial velocity orbit modeling provides precise fundamental parameters (mass, luminosity) and system distance determinations comparable with Hipparcos precisions. The work described in this paper was performed under contract with the National Aeronautics and Space Administration.

  5. Collaboration in a Wireless Grid Innovation Testbed by Virtual Consortium

    NASA Astrophysics Data System (ADS)

    Treglia, Joseph; Ramnarine-Rieks, Angela; McKnight, Lee

    This paper describes the formation of the Wireless Grid Innovation Testbed (WGiT) coordinated by a virtual consortium involving academic and non-academic entities. Syracuse University and Virginia Tech are primary university partners with several other academic, government, and corporate partners. Objectives include: 1) coordinating knowledge sharing, 2) defining key parameters for wireless grids network applications, 3) dynamically connecting wired and wireless devices, content and users, 4) linking to VT-CORNET, Virginia Tech Cognitive Radio Network Testbed, 5) forming ad hoc networks or grids of mobile and fixed devices without a dedicated server, 6) deepening understanding of wireless grid application, device, network, user and market behavior through academic, trade and popular publications including online media, 7) identifying policy that may enable evaluated innovations to enter US and international markets and 8) implementation and evaluation of the international virtual collaborative process.

  6. Web Solutions Inspire Cloud Computing Software

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  7. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  8. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    NASA Astrophysics Data System (ADS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  9. On-wire lithography-generated molecule-based transport junctions: a new testbed for molecular electronics.

    PubMed

    Chen, Xiaodong; Jeon, You-Moon; Jang, Jae-Won; Qin, Lidong; Huo, Fengwei; Wei, Wei; Mirkin, Chad A

    2008-07-02

    On-wire lithography (OWL) fabricated nanogaps are used as a new testbed to construct molecular transport junctions (MTJs) through the assembly of thiolated molecular wires across a nanogap formed between two Au electrodes. In addition, we show that one can use OWL to rapidly characterize an MTJ and optimize gap size for two molecular wires of different dimensions. Finally, we have used this new testbed to identify unusual temperature-dependent transport mechanisms for α,ω-dithiol-terminated oligo(phenylene ethynylene).

  10. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to explore space frontiers, technology must also advance. The need for faster data recovery and data processing is crucial, and the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification are required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs test for anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high-speed, high-density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4Mbit memory chips totaling 4Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. These high-altitude flights, at up to 70,000 feet, were effective at yielding the conditions under which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective; orders of magnitude more memory became essential. Therefore, a project was created to test more memory within a given time, with a faster processing speed, while using fewer peripherals. This paper will describe procedures used to build an
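    The monitor-and-record cycle the abstract describes (write a known pattern to the memory under test, continuously read it back, log any bit flips) can be sketched as follows. This is a hypothetical illustration using a `bytearray` as a stand-in for the SRAM device; the names and pattern value are assumptions, not CARR's actual test program.

```python
TEST_PATTERN = 0x55  # alternating-bit pattern written to every byte under test

def scrub_pass(memory: bytearray) -> list:
    """One monitoring pass: report (address, flipped-bit mask) for each upset
    byte, then rewrite (scrub) it back to the reference pattern."""
    upsets = []
    for addr, value in enumerate(memory):
        if value != TEST_PATTERN:
            upsets.append((addr, value ^ TEST_PATTERN))  # XOR isolates flipped bits
            memory[addr] = TEST_PATTERN                  # restore the cell
    return upsets

# Simulate a small slice of the device and inject a single-bit upset
sram = bytearray([TEST_PATTERN]) * 4096
sram[1234] ^= 0x08  # flip bit 3, as a particle strike might
print(scrub_pass(sram))  # → [(1234, 8)]
```

    In the flight system this loop would run continuously, with the error log written to the Zip drive for post-flight analysis.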

  11. Wavefront Control Testbed (WCT) Experiment Results

    NASA Technical Reports Server (NTRS)

    Burns, Laura A.; Basinger, Scott A.; Campion, Scott D.; Faust, Jessica A.; Feinberg, Lee D.; Hayden, William L.; Lowman, Andrew E.; Ohara, Catherine M.; Petrone, Peter P., III

    2004-01-01

    The Wavefront Control Testbed (WCT) was created to develop and test wavefront sensing and control algorithms and software for the segmented James Webb Space Telescope (JWST). Last year, we changed the system configuration from three sparse aperture segments to a filled aperture with three pie-shaped segments. With this upgrade we have performed experiments on fine phasing with line-of-sight and segment-to-segment jitter; dispersed fringe visibility and grism angle; high dynamic range tilt sensing; coarse phasing with large aberrations; and sampled sub-aperture testing. This paper reviews the results of these experiments.

  12. Application of the Semi-Empirical Force-Limiting Approach for the CoNNeCT SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Staab, Lucas D.; McNelis, Mark E.; Akers, James C.; Suarez, Vicente J.; Jones, Trevor M.

    2012-01-01

    The semi-empirical force-limiting vibration method was developed and implemented for payload testing to limit the structural impedance mismatch (high force) that occurs during shaker vibration testing. The method has since been extended for use in analytical models. The Space Communications and Navigation Testbed (SCAN Testbed), known at NASA as the Communications, Navigation, and Networking re-Configurable Testbed (CoNNeCT), project utilized force-limiting testing and analysis following the semi-empirical approach. This paper presents the steps in performing a force-limiting analysis and then compares the results to test data recovered during the CoNNeCT force-limiting random vibration qualification test that took place at NASA Glenn Research Center (GRC) in the Structural Dynamics Laboratory (SDL) from December 19, 2010 to January 7, 2011. A compilation of lessons learned and considerations for future force-limiting tests is also included.

  13. Dr. Tulga Ersal at NSF Workshop Accessible Remote Testbeds ART'15

    Science.gov Websites

    Presentation: "Enabling High-Fidelity Closed-Loop Integration of Remotely Accessible Testbeds" at the NSF-sponsored Accessible Remote Testbeds (ART'15) workshop. Related NSF-sponsored project (2010-2013): "Internet-Distributed Hardware-in-the-Loop Simulation".

  14. Point Cloud Management Through the Realization of the Intelligent Cloud Viewer Software

    NASA Astrophysics Data System (ADS)

    Costantino, D.; Angelini, M. G.; Settembrini, F.

    2017-05-01

    The paper presents a software package dedicated to the elaboration of point clouds, called Intelligent Cloud Viewer (ICV), made in-house by AESEI software (a spin-off of Politecnico di Bari), allowing the viewing of point clouds of several tens of millions of points, even on systems that are not of very high performance. Elaborations are carried out on the whole point cloud, while only part of it is displayed at a time in order to speed up rendering. It is designed for 64-bit Windows, is fully written in C++, and integrates specialized modules for computer graphics (Open Inventor by SGI, Silicon Graphics Inc.), maths (BLAS, EIGEN), computational geometry (CGAL, Computational Geometry Algorithms Library), registration and advanced algorithms for point clouds (PCL, Point Cloud Library), advanced data structures (BOOST, Basic Object Oriented Supporting Tools), etc. ICV incorporates a number of features such as cropping, transformation and georeferencing, matching, registration, decimation, sections, distance calculation between clouds, etc. It has been tested on photographic and TLS (Terrestrial Laser Scanner) data, obtaining satisfactory results. The potential of the software was tested by carrying out a photogrammetric survey of Castel del Monte, for which a laser scanner survey made from the ground by the same authors was already available. For the aerophotogrammetric survey, a flight height of approximately 1000 ft AGL (Above Ground Level) was adopted and, overall, more than 800 photos were acquired in just over 15 minutes, with an overlap of not less than 80% and a planned speed of about 90 knots.
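    One of the listed features, distance calculation between clouds, reduces to a nearest-neighbour query: for every point in one cloud, find the closest point in the other. A minimal brute-force sketch (illustrative only; the function name and tuple representation are assumptions, and production tools such as PCL use k-d trees to make this tractable for millions of points):

```python
import math

def cloud_to_cloud_distances(source, target):
    """For each (x, y, z) point in `source`, return the Euclidean distance
    to its nearest neighbour in `target`. Brute force: O(len(source) * len(target))."""
    return [min(math.dist(p, q) for q in target) for p in source]

src = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
tgt = [(1.0, 0.0, 0.0), (5.0, 0.0, 0.0)]
print(cloud_to_cloud_distances(src, tgt))  # → [1.0, 1.0]
```

    Statistics over these per-point distances (mean, max) are what viewers typically report as the cloud-to-cloud distance.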

  15. Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications

    ERIC Educational Resources Information Center

    Jung, Gueyoung

    2010-01-01

    Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…

  16. Dynamic VM Provisioning for TORQUE in a Cloud Environment

    NASA Astrophysics Data System (ADS)

    Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.

    2014-06-01

    Cloud computing in the form of Infrastructure-as-a-Service (IaaS) is attracting growing interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high throughput computing clusters.
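    The core provisioning decision, how many cloud VMs to boot when the batch queue grows, can be reduced to a small pure function. This is a hypothetical sketch of the scaling logic only (the names and the jobs-per-node assumption are illustrative, not the paper's implementation); the real tooling would additionally call the OpenStack API to boot instances and register them with TORQUE/Maui.

```python
import math

def vms_to_launch(queued_jobs: int, idle_nodes: int,
                  running_cloud_vms: int, max_cloud_vms: int,
                  jobs_per_node: int = 1) -> int:
    """VMs to boot so every queued job gets a slot, within the cloud quota."""
    nodes_needed = math.ceil(queued_jobs / jobs_per_node) - idle_nodes
    quota_headroom = max_cloud_vms - running_cloud_vms
    return max(0, min(nodes_needed, quota_headroom))

# 10 queued jobs, 2 idle nodes, 3 of at most 8 cloud VMs already running
print(vms_to_launch(10, 2, 3, 8))  # → 5
```

    A matching scale-down rule (terminate idle cloud VMs after a grace period) completes the elastic loop.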

  17. Evaluation of NCMRWF unified model vertical cloud structure with CloudSat over the Indian summer monsoon region

    NASA Astrophysics Data System (ADS)

    Jayakumar, A.; Mamgain, Ashu; Jisesh, A. S.; Mohandas, Saji; Rakhi, R.; Rajagopal, E. N.

    2016-05-01

    The representation of rainfall distribution and monsoon circulation in high-resolution versions of the NCMRWF Unified Model (NCUM-REG), used for short-range forecasting of extreme rainfall events, depends strongly on key factors such as the vertical cloud distribution, convection, and the convection/cloud relationship in the model. It is therefore highly relevant to evaluate the model's vertical structure of cloud and precipitation over the monsoon environment. To this end, we exploited the long observational record of CloudSat, conditioning it on the synoptic situation of the model simulation period. Simulations were run at 4-km grid length with the convective parameterization effectively switched off and on. Since the sample of CloudSat overpasses through the monsoon domain is small, this methodology can only qualitatively evaluate the vertical cloud structure for the model simulation period. It is envisaged that the present study will open up the possibility of further improvement of the high-resolution version of NCUM in the tropics for Indian summer monsoon rainfall events.

  18. In situ observations of Arctic cloud properties across the Beaufort Sea marginal ice zone

    NASA Astrophysics Data System (ADS)

    Corr, C.; Moore, R.; Winstead, E.; Thornhill, K. L., II; Crosbie, E.; Ziemba, L. D.; Beyersdorf, A. J.; Chen, G.; Martin, R.; Shook, M.; Corbett, J.; Smith, W. L., Jr.; Anderson, B. E.

    2016-12-01

    Clouds play an important role in Arctic climate. This is particularly true over the Arctic Ocean, where feedbacks between clouds and sea ice impact the surface radiation budget through modifications of sea-ice extent, ice thickness, cloud base height, and cloud cover. This work summarizes measurements of Arctic cloud properties made aboard the NASA C-130 aircraft over the Beaufort Sea during ARISE (Arctic Radiation - IceBridge Sea&Ice Experiment) in September 2014. The influence of surface type on cloud properties is also investigated. Specifically, liquid water content (LWC), droplet concentrations, and droplet size distributions are compared for clouds sampled over three distinct regimes in the Beaufort Sea: 1) open water, 2) the marginal ice zone (MIZ), and 3) sea ice. Regardless of surface type, nearly all clouds intercepted during ARISE were liquid-phase clouds. However, differences in droplet size distributions and concentrations were evident across the surface types; clouds over the MIZ and sea ice generally had fewer and larger droplets compared to those over open water. The potential implications these results have for understanding cloud-surface albedo climate feedbacks in the Arctic are discussed.

  19. Dynamic testbed demonstration of WFIRST coronagraph low order wavefront sensing and control (LOWFS/C)

    NASA Astrophysics Data System (ADS)

    Shi, Fang; Cady, Eric; Seo, Byoung-Joon; An, Xin; Balasubramanian, Kunjithapatham; Kern, Brian; Lam, Raymond; Marx, David; Moody, Dwight; Mejia Prada, Camilo; Patterson, Keith; Poberezhskiy, Ilya; Shields, Joel; Sidick, Erkin; Tang, Hong; Trauger, John; Truong, Tuan; White, Victor; Wilson, Daniel; Zhou, Hanying

    2017-09-01

    To maintain the required performance of the WFIRST Coronagraph in a realistic space environment, a Low Order Wavefront Sensing and Control (LOWFS/C) subsystem is necessary. LOWFS/C uses a Zernike wavefront sensor (ZWFS) with a phase-shifting disk combined with the starlight-rejecting occulting mask. For wavefront error corrections, WFIRST LOWFS/C uses a fast steering mirror (FSM) for line-of-sight (LoS) correction, a focusing mirror for focus drift correction, and one of the two deformable mirrors (DM) for other low order wavefront error (WFE) correction. As part of technology development and demonstration for the WFIRST Coronagraph, a dedicated Occulting Mask Coronagraph (OMC) testbed has been built and commissioned. With a configuration similar to the WFIRST flight coronagraph instrument, the OMC testbed consists of two coronagraph modes, Shaped Pupil Coronagraph (SPC) and Hybrid Lyot Coronagraph (HLC), a low order wavefront sensor (LOWFS), and an optical telescope assembly (OTA) simulator which can generate realistic LoS drift and jitter as well as the low order wavefront error that would be induced by the WFIRST telescope's vibration and thermal changes. In this paper, we introduce the concept of WFIRST LOWFS/C, describe the OMC testbed, and present the testbed results of LOWFS sensor performance. We also present our recent results from the dynamic coronagraph tests, in which we have demonstrated the use of LOWFS/C to maintain coronagraph contrast in the presence of WFIRST-like line-of-sight and low order wavefront disturbances.

  20. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    NASA Astrophysics Data System (ADS)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for

  1. James Webb Space Telescope Optical Simulation Testbed: Segmented Mirror Phase Retrieval Testing

    NASA Astrophysics Data System (ADS)

    Laginja, Iva; Egron, Sylvain; Brady, Greg; Soummer, Remi; Lajoie, Charles-Philippe; Bonnefois, Aurélie; Long, Joseph; Michau, Vincent; Choquet, Elodie; Ferrari, Marc; Leboulleux, Lucie; Mazoyer, Johan; N’Diaye, Mamadou; Perrin, Marshall; Petrone, Peter; Pueyo, Laurent; Sivaramakrishnan, Anand

    2018-01-01

The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a hardware simulator designed to produce JWST-like images. A model of the JWST three mirror anastigmat is realized with three lenses in the form of a Cooke Triplet, which provides JWST-like optical quality over a field equivalent to a NIRCam module, and an Iris AO segmented mirror with hexagonal elements stands in for the JWST segmented primary. This setup successfully produces images extremely similar to NIRCam images from cryotesting in terms of the PSF morphology and sampling relative to the diffraction limit. The testbed is used for staff training of the wavefront sensing and control (WFS&C) team and for independent analysis of WFS&C scenarios of the JWST. Algorithms like geometric phase retrieval (GPR) that may be used in flight and potential upgrades to JWST WFS&C will be explored. We report on the current status of the testbed after alignment, implementation of the segmented mirror, and testing of phase retrieval techniques. This optical bench complements other work at the Makidon laboratory at the Space Telescope Science Institute, including the investigation of coronagraphy for segmented aperture telescopes. Beyond JWST, we intend to use JOST for WFS&C studies for future large segmented space telescopes such as LUVOIR.

  2. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Melin, Alexander M.; Kisner, Roger A.

    2016-09-01

Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low temperature pump that utilizes magnetic bearings and that will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamics and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 C). Pumps used in the extreme environment of liquid salt reactors present many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report gives details of the mechanical design and electromagnetic design of the loop-scale embedded instrumentation and control testbed.

  3. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - AMS Testbed Detailed Requirements

    DOT National Transportation Integrated Search

    2016-04-20

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  4. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - Chicago testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  5. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - AMS Testbed Selection Criteria

    DOT National Transportation Integrated Search

    2016-06-16

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  6. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : Dallas testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-06-16

The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (mo...

  7. Airborne Subscale Transport Aircraft Research Testbed: Aircraft Model Development

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Langford, William M.; Hill, Jeffrey S.

    2005-01-01

The Airborne Subscale Transport Aircraft Research (AirSTAR) testbed being developed at NASA Langley Research Center is an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. An integral part of that testbed is a 5.5% dynamically scaled, generic transport aircraft. This remotely piloted vehicle (RPV) is powered by twin turbine engines and includes a collection of sensors, actuators, navigation, and telemetry systems. The downlink for the plane includes over 70 data channels, plus video, at rates up to 250 Hz. Uplink commands for aircraft control include over 30 data channels. The dynamic scaling requirement, which includes dimensional, weight, inertial, actuator, and data rate scaling, presents distinctive challenges in both the mechanical and electrical design of the aircraft. Discussion of these requirements and their implications for the development of the aircraft, along with risk mitigation strategies and training exercises, is included here. Also described are the first training (non-research) flights of the airframe. Additional papers address the development of a mobile operations station and an emulation and integration laboratory.

  8. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Astrophysics Data System (ADS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-09-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  9. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Technical Reports Server (NTRS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-01-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  10. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    NASA Astrophysics Data System (ADS)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate, and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the importance and impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  11. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  12. VR Simulation Testbed: Improving Surface Telerobotics for the Deep Space Gateway

    NASA Astrophysics Data System (ADS)

    Walker, M. E.; Burns, J. O.; Szafir, D. J.

    2018-02-01

    Design of a virtual reality simulation testbed for prototyping surface telerobotics. The goal is to create a framework with robust physics and kinematics to allow simulated teleoperation and supervised control of lunar rovers and rapid UI prototyping.

  13. High-performance scientific computing in the cloud

    NASA Astrophysics Data System (ADS)

    Jorissen, Kevin; Vila, Fernando; Rehr, John

    2011-03-01

    Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.

  14. A network approach to the geometric structure of shallow cloud fields

    NASA Astrophysics Data System (ADS)

    Glassmeier, F.; Feingold, G.

    2017-12-01

The representation of shallow clouds and their radiative impact is one of the largest challenges for global climate models. While the bulk properties of cloud fields, including effects of organization, are a very active area of research, the potential of the geometric arrangement of cloud fields for the development of new parameterizations has hardly been explored. Self-organized patterns are particularly evident in the cellular structure of Stratocumulus (Sc) clouds so readily visible in satellite imagery. Inspired by similar patterns in biology and physics, we approach pattern formation in Sc fields from the perspective of natural cellular networks. Our network analysis is based on large-eddy simulations of open- and closed-cell Sc cases. We find the network structure to be neither random nor characteristic of natural convection. It is independent of macroscopic cloud field properties like the Sc regime (open vs. closed) and its typical length scale (boundary layer height). The latter is a consequence of entropy maximization (Lewis's Law with parameter 0.16). The cellular pattern is on average hexagonal, where non-6-sided cells occur according to a neighbor-number distribution variance of about 2. Reflecting the continuously renewing dynamics of Sc fields, large (many-sided) cells tend to neighbor small (few-sided) cells (Aboav-Weaire Law with parameter 0.9). These macroscopic network properties emerge independent of the Sc regime because the different processes governing the evolution of closed as compared to open cells correspond to topologically equivalent network dynamics. By developing a heuristic model, we show that open- and closed-cell dynamics can both be mimicked by versions of cell division and cell disappearance and are biased towards the expansion of smaller cells. This model offers for the first time a fundamental and universal explanation for the geometric pattern of Sc clouds. It may contribute to the development of advanced Sc parameterizations.
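The neighbor-number statistics cited in this abstract (mean sidedness near 6, variance around 2, and the Aboav-Weaire parameter) can be computed from any cell-adjacency graph. The sketch below is a minimal illustration, not the authors' code; the data structure, function name, and least-squares fit of the Aboav-Weaire relation n·m(n) = (6 − a)·n + (6a + μ2) are assumptions for demonstration.

```python
from collections import defaultdict

def network_stats(adjacency):
    """Cellular-network statistics from a cell-adjacency graph.

    adjacency: dict mapping cell id -> list of neighboring cell ids
    (a cell's side count is taken as its number of neighbors).
    Returns (mean side count, neighbor-number variance mu2, Aboav-Weaire
    parameter a). Needs at least two distinct side counts for the fit.
    """
    sides = {c: len(nbrs) for c, nbrs in adjacency.items()}
    n_cells = len(sides)
    mean_n = sum(sides.values()) / n_cells
    mu2 = sum((s - mean_n) ** 2 for s in sides.values()) / n_cells

    # m(n): mean side count of the neighbors of n-sided cells
    by_n = defaultdict(list)
    for c, nbrs in adjacency.items():
        by_n[sides[c]].append(sum(sides[v] for v in nbrs) / len(nbrs))

    # Least-squares fit of n*m(n) against n; slope = 6 - a
    xs = sorted(by_n)
    ys = [n * (sum(by_n[n]) / len(by_n[n])) for n in xs]
    xbar = sum(xs) / len(xs)
    ybar = sum(ys) / len(ys)
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return mean_n, mu2, 6.0 - slope
```

On a real Sc field the adjacency would come from segmenting cloud cells in simulation or satellite data; here any symmetric neighbor graph suffices to exercise the statistics.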

  15. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

    Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limits the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.

  16. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    NASA Technical Reports Server (NTRS)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  17. Flight Testing of Guidance, Navigation and Control Systems on the Mighty Eagle Robotic Lander Testbed

    NASA Technical Reports Server (NTRS)

    Hannan, Mike; Rickman, Doug; Chavers, Greg; Adam, Jason; Becker, Chris; Eliser, Joshua; Gunter, Dan; Kennedy, Logan; O'Leary, Patrick

    2015-01-01

During 2011, a series of progressively more challenging flight tests of the Mighty Eagle autonomous terrestrial lander testbed was conducted, primarily to validate the GNC system for a proposed lunar lander. With the successful completion of this GNC validation objective, the opportunity existed to utilize the Mighty Eagle as a flying testbed for a variety of technologies. In 2012, an Autonomous Rendezvous and Capture (AR&C) algorithm was implemented in flight software and demonstrated in a series of flight tests, a hazard avoidance system was developed and flight tested, and GNC algorithms from Moon Express and a MEMS IMU were also tested. All of the testing described herein was above and beyond the original charter for the Mighty Eagle. In addition to being an excellent testbed for a wide variety of systems, the Mighty Eagle also provided a great learning opportunity for many engineers and technicians to work a flight program.

  18. Phenomenology tools on cloud infrastructures using OpenStack

    NASA Astrophysics Data System (ADS)

    Campos, I.; Fernández-del-Castillo, E.; Heinemeyer, S.; Lopez-Garcia, A.; Pahlen, F.; Borges, G.

    2013-04-01

    We present a new environment for computations in particle physics phenomenology employing recent developments in cloud computing. On this environment users can create and manage "virtual" machines on which the phenomenology codes/tools can be deployed easily in an automated way. We analyze the performance of this environment based on "virtual" machines versus the utilization of physical hardware. In this way we provide a qualitative result for the influence of the host operating system on the performance of a representative set of applications for phenomenology calculations.

  19. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    NASA Technical Reports Server (NTRS)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  20. An operational open-end file transfer protocol for mobile satellite communications

    NASA Technical Reports Server (NTRS)

    Wang, Charles; Cheng, Unjeng; Yan, Tsun-Yee

    1988-01-01

    This paper describes an operational open-end file transfer protocol which includes the connecting procedure, data transfer, and relinquishment procedure for mobile satellite communications. The protocol makes use of the frame level and packet level formats of the X.25 standard for the data link layer and network layer, respectively. The structure of a testbed for experimental simulation of this protocol over a mobile fading channel is also introduced.
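The open-end idea described in this abstract (transferring data whose total length is unknown until the relinquishment procedure) maps naturally onto the more-data (M) bit of X.25 packet-level formats. The following generator is an illustrative sketch under that assumption; the packet size, tuple representation, and function name are invented for demonstration and are not the paper's protocol definition.

```python
def packetize_open_end(chunks, max_len=128):
    """Segment an open-ended byte stream into X.25-style data packets.

    `chunks` is any iterable of bytes objects; the total length need not
    be known in advance, matching the open-end transfer idea. Each
    yielded packet is (m_bit, payload): m_bit=1 means "more data
    follows", and the final packet carries m_bit=0, signalling the
    relinquishment phase. 128 bytes is a common X.25 packet size.
    """
    buf = b""
    for chunk in chunks:
        buf += chunk
        # Hold back up to max_len bytes so a full final packet is not
        # emitted with the more-data bit set.
        while len(buf) > max_len:
            yield (1, buf[:max_len])
            buf = buf[max_len:]
    # Stream exhausted: flush the remainder with M=0 to end the transfer.
    yield (0, buf)
```

A receiver would reassemble the stream by concatenating payloads until it sees a packet with the M bit clear, then begin its relinquishment procedure.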

  1. Enhancing Security by System-Level Virtualization in Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei

Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications through consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.

  2. The Fourier-Kelvin Stellar Interferometer (FKSI): A Progress Report and Preliminary Results from Our Laboratory Testbed

    NASA Technical Reports Server (NTRS)

    Berry, Richard; Rajagopa, J.; Danchi, W. C.; Allen, R. J.; Benford, D. J.; Deming, D.; Gezari, D. Y.; Kuchner, M.; Leisawitz, D. T.; Linfield, R.

    2005-01-01

The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer for the near-infrared to mid-infrared spectral region (3-8 microns). FKSI is conceived as a scientific and technological pathfinder to TPF/DARWIN as well as SPIRIT, SPECS, and SAFIR. It will also be a high angular resolution system complementary to JWST. The scientific emphasis of the mission is on the evolution of protostellar systems, from just after the collapse of the precursor molecular cloud core, through the formation of the disk surrounding the protostar, the formation of planets in the disk, and eventual dispersal of the disk material. FKSI will also search for brown dwarfs and Jupiter-mass and smaller planets, and could also play a very powerful role in the investigation of the structure of active galactic nuclei and extragalactic star formation. We report additional studies of the imaging capabilities of the FKSI with various configurations of two to five telescopes, studies of the capabilities of FKSI assuming an increase in long wavelength response to 10 or 12 microns (depending on availability of detectors), and preliminary results from our nulling testbed.

  3. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - San Diego testbed analysis plan.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  4. Atmospheric cloud physics laboratory project study

    NASA Technical Reports Server (NTRS)

    Schultz, W. E.; Stephen, L. A.; Usher, L. H.

    1976-01-01

Engineering studies were performed for the Zero-G Cloud Physics Experiment liquid cooling and air pressure control systems. A total of four concepts for the liquid cooling system were evaluated, two of which were found to closely approach the system requirements. Thermal insulation requirements, system hardware, and control sensor locations were established. The reservoir sizes and initial temperatures were defined as well as system power requirements. In the study of the pressure control system, fluid analyses by the Atmospheric Cloud Physics Laboratory were performed to determine flow characteristics of various orifice sizes, vacuum pump adequacy, and control system performance. System parameters predicted in these analyses as a function of time include the following for various orifice sizes: (1) chamber and vacuum pump mass flow rates, (2) the number of valve openings or closures, (3) the maximum cloud chamber pressure deviation from the allowable, and (4) cloud chamber and accumulator pressure.

  5. Self-Similar Spin Images for Point Cloud Matching

    NASA Astrophysics Data System (ADS)

    Pulido, Daniel

The rapid growth of Light Detection And Ranging (Lidar) technologies that collect, process, and disseminate 3D point clouds has allowed for increasingly accurate spatial modeling and analysis of the real world. Lidar sensors can generate massive 3D point clouds of a collection area that provide highly detailed spatial and radiometric information. However, a Lidar collection can be expensive and time consuming. At the same time, the growth of crowdsourced Web 2.0 data (e.g., Flickr, OpenStreetMap) has provided researchers with a wealth of freely available data sources that cover a variety of geographic areas. Crowdsourced data can be of varying quality and density. In addition, since it is typically volunteered rather than collected as part of a dedicated experiment, when and where the data is collected is arbitrary. Integrating these two sources of geoinformation can give researchers the ability to generate products and derive intelligence that mitigate their respective disadvantages and combine their advantages. Therefore, this research will address the problem of fusing two point clouds from potentially different sources. Specifically, we will consider two problems: scale matching and feature matching. Scale matching consists of computing feature metrics of each point cloud and analyzing their distributions to determine scale differences. Feature matching consists of defining local descriptors that are invariant to common dataset distortions (e.g., rotation and translation). Additionally, after matching, the point clouds can be registered and processed further (e.g., change detection). The objective of this research is to develop novel methods to fuse and enhance two point clouds from potentially disparate sources (e.g., Lidar and crowdsourced Web 2.0 datasets). The scope of this research is to investigate both scale and feature matching between two point clouds. The specific focus of this research will be on developing a novel local descriptor
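The scale-matching step described in this record (computing per-point feature metrics for each cloud and comparing their distributions) can be sketched minimally as follows; the median nearest-neighbor spacing used here is a stand-in metric chosen for illustration, not the dissertation's actual descriptor:

```python
import numpy as np

def median_nn_spacing(points):
    """Median nearest-neighbor distance of a point cloud (O(n^2); fine for small n)."""
    diffs = points[:, None, :] - points[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    np.fill_diagonal(dists, np.inf)  # ignore self-distances
    return float(np.median(dists.min(axis=1)))

def estimate_scale_ratio(cloud_a, cloud_b):
    """Relative scale of cloud_b with respect to cloud_a, from spacing statistics."""
    return median_nn_spacing(cloud_b) / median_nn_spacing(cloud_a)
```

Because nearest-neighbor distances scale linearly with the cloud, the ratio of their medians recovers a uniform scale difference between two samplings of the same scene; real crowdsourced data, with its varying density, would call for more robust statistics.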

  6. Acquisition and Development of a Cognitive Radio Based Wireless Monitoring and Surveillance Testbed for Future Battlefield Communications Research

    DTIC Science & Technology

    2015-03-01

    for Public Release; Distribution Unlimited Final Report: Acquisition and Development of A Cognitive Radio based Wireless Monitoring and Surveillance...journals: Final Report: Acquisition and Development of A Cognitive Radio based Wireless Monitoring and Surveillance Testbed for Future Battlefield...Opeyemi Oduola, Nan Zou, Xiangfang Li, Husheng Li, Lijun Qian. Distributed Spectrum Monitoring and Surveillance using a Cognitive Radio based Testbed

  7. ASDC Advances in the Utilization of Microservices and Hybrid Cloud Environments

    NASA Astrophysics Data System (ADS)

    Baskin, W. E.; Herbert, A.; Mazaika, A.; Walter, J.

    2017-12-01

The Atmospheric Science Data Center (ASDC) is transitioning many of its software tools and applications to standalone microservices deployable in a hybrid cloud, offering benefits such as scalability and efficient environment management. This presentation features several projects the ASDC staff have implemented leveraging the OpenShift Container Application Platform and OpenStack Hybrid Cloud Environment, focusing on key tools and techniques applied to: Earth science data processing; spatial-temporal metadata generation, validation, repair, and curation; and archived-data discovery, visualization, and access.

  8. Homomorphic encryption experiments on IBM's cloud quantum computing platform

    NASA Astrophysics Data System (ADS)

    Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su

    2017-02-01

Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers seem unrealistic in the near future; the first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting it, so it is well suited to cloud computing. Here, we applied homomorphic encryption for the first time on IBM's cloud quantum computing platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.

  9. Planning and reasoning in the JPL telerobot testbed

    NASA Technical Reports Server (NTRS)

    Peters, Stephen; Mittman, David; Collins, Carol; Omeara, Jacquie; Rokey, Mark

    1990-01-01

The Telerobot Interactive Planning System was developed to serve as the highest autonomous-control level of the Telerobot Testbed. A recent prototype is described that integrates an operator interface for supervisory control, a task planner supporting disassembly and re-assembly operations, and a spatial planner for collision-free manipulator motion through the workspace. Each of these components is described in detail. Descriptions of the technical problem, approach, and lessons learned are included.

  10. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long-term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration-class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation in the flight test unit to that in a ground test unit with contaminant dosing, to determine the applicability of ground testing.

  11. X-34 Technology Testbed Demonstrator being mated with the L-1011 mothership

    NASA Image and Video Library

    1999-03-11

This is the X-34 Technology Testbed Demonstrator being mated with the L-1011 mothership. The X-34 will demonstrate key vehicle and operational technologies applicable to future low-cost reusable launch vehicles.

  12. Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.

    PubMed

    Trudgian, David C; Mirzaei, Hamid

    2012-12-07

We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively, the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.

  13. Cloud Coverage and Height Distribution from the GLAS Polar Orbiting Lidar: Comparison to Passive Cloud Retrievals

    NASA Technical Reports Server (NTRS)

Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.

    2004-01-01

The Geoscience Laser Altimeter System (GLAS) began full on-orbit operations in September 2003. A main application of the two-wavelength GLAS lidar is highly accurate detection and profiling of global cloud cover. Initial analysis indicates that cloud and aerosol layers are consistently detected on a global basis down to cross-sections of 10(exp -6) per meter. Images of the lidar data dramatically and accurately show the vertical structure of cloud and aerosol to the limit of signal attenuation. The GLAS lidar has made the most accurate measurement of global cloud coverage and height to date. In addition to the calibrated lidar signal, GLAS data products include multi-level boundaries and optical depth of all transmissive layers. Processing includes a multi-variable separation of cloud and aerosol layers. An initial application of the data results is to compare monthly cloud means from several months of GLAS observations in 2003 to existing cloud climatologies from other satellite measurements. In some cases direct comparison to passive cloud retrievals is possible. A limitation of the lidar measurements is nadir-only sampling. However, monthly means exhibit reasonably good global statistics, and coverage results outside the polar regions compare well with other measurements but show significant differences in height distribution. For polar regions, where passive cloud retrievals are problematic and where orbit track density is greatest, the GLAS results are a particular advance in cloud cover information. Direct comparison to MODIS retrievals shows better than 90% agreement in cloud detection for daytime, but less than 60% at night. Height retrievals are in much less agreement. GLAS is a part of the NASA EOS project and data products are thus openly available to the science community (see http://glo.gsfc.nasa.gov).
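At its simplest, layer detection in a lidar profile of the kind described amounts to finding contiguous runs of range bins whose backscatter exceeds a threshold. The sketch below is illustrative only; it is not the GLAS operational algorithm, whose thresholds and multi-variable cloud/aerosol separation are far more involved:

```python
import numpy as np

def detect_layers(backscatter, heights, threshold):
    """Return (base, top) heights of contiguous bins whose backscatter exceeds threshold."""
    above = backscatter > threshold
    layers, i, n = [], 0, len(backscatter)
    while i < n:
        if above[i]:
            j = i
            while j + 1 < n and above[j + 1]:
                j += 1
            layers.append((heights[i], heights[j]))  # layer base and top
            i = j + 1
        else:
            i += 1
    return layers
```

A real retrieval would set the threshold from the molecular background and noise statistics rather than a fixed value.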

  14. One-Dimensional Spacecraft Formation Flight Testbed for Terrestrial Charged Relative Motion Experiments

    NASA Astrophysics Data System (ADS)

    Seubert, Carl R.

Spacecraft operating in a desired formation offer an abundance of attractive mission capabilities. One proposed method of controlling a close formation of spacecraft is with Coulomb (electrostatic) forces. The Coulomb formation flight idea utilizes charge emission to drive the spacecraft to kilovolt-level potentials and generate adjustable, micronewton- to millinewton-level Coulomb forces for relative position control. To advance the prospects of the Coulomb formation flight concept, this dissertation presents the design and implementation of a unique one-dimensional testbed. The disturbances of the testbed are identified and reduced below 1 mN. This noise level offers a near-frictionless platform that is used to perform relative motion actuation with electrostatics in a terrestrial atmospheric environment. Potentials up to 30 kV are used to actuate a cart over a translational range of motion of 40 cm. A challenge to both theoretical and hardware-implemented electrostatic actuation developments is correctly modeling the forces between finite charged bodies outside a vacuum. To remedy this, studies of Earth-orbit plasmas and Coulomb force theory are used to derive and propose a model of the Coulomb force between finite spheres in close proximity in a plasma. This plasma force model is then used as a basis for a candidate terrestrial force model. The plasma-like parameters of this terrestrial model are estimated using charged-motion data from fixed-potential, single-direction experiments on the testbed. The testbed is advanced to the level of autonomous feedback position control using solely Coulomb force actuation. This allows relative motion repositioning on a flat and level track as well as an inclined track that mimics the dynamics of two charged spacecraft aligned with the principal orbit axis. This controlled motion is accurately predicted by simulations using the terrestrial force model.
This demonstrates similarities between the partial
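For point charges in a Debye-shielded plasma potential, the Coulomb force this record discusses has a standard closed form, F = k q1 q2 / d^2 * exp(-d/lambda) * (1 + d/lambda). The sketch below uses that point-charge form only; the finite-sphere and terrestrial corrections developed in the dissertation are not reproduced here:

```python
import math

K_C = 8.9875517923e9  # Coulomb constant, N m^2 / C^2

def shielded_coulomb_force(q1, q2, d, debye_length):
    """Debye-shielded Coulomb force (N) between point charges q1, q2 (C) at separation d (m)."""
    return K_C * q1 * q2 / d**2 * math.exp(-d / debye_length) * (1.0 + d / debye_length)
```

In the limit of a large Debye length the expression reduces to the unshielded inverse-square law; a short Debye length suppresses the force exponentially.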

  15. Climatic Implications of the Observed Temperature Dependence of the Liquid Water Path of Low Clouds in the Southern Great Plains

    NASA Technical Reports Server (NTRS)

    DelGenio, Anthony

    1999-01-01

Satellite observations of low-level clouds have challenged the assumption that adiabatic liquid water content combined with constant physical thickness will lead to a negative cloud optics feedback in a decadal climate change. We explore the reasons for the satellite results using four years of surface remote sensing data from the Atmospheric Radiation Measurement Program Cloud and Radiation Testbed site in the Southern Great Plains of the United States. We find that low cloud liquid water path is approximately invariant with temperature in winter but decreases strongly with temperature in summer, consistent with the satellite inferences at this latitude. This behavior occurs because liquid water content shows no detectable temperature dependence while cloud physical thickness decreases with warming. Thinning of clouds with warming is observed on seasonal, synoptic, and diurnal time scales; it is most obvious in the warm sectors of baroclinic waves. Although cloud top is observed to descend slightly with warming, the primary cause of thinning is the ascent of cloud base due to the reduction in surface relative humidity and the concomitant increase in the lifting condensation level of surface air. Low cloud liquid water path is not observed to be a continuous function of temperature. Rather, the behavior we observe is best explained as a transition in the frequency of occurrence of different boundary layer types. At cold temperatures, a mixture of stratified and convective boundary layers is observed, leading to a broad distribution of liquid water path values, while at warm temperatures, only convective boundary layers with small liquid water paths, some of them decoupled, are observed. Our results, combined with the earlier satellite inferences, imply that the commonly quoted 1.5°C lower limit for the equilibrium global climate sensitivity to a doubling of CO2, which is based on models with near-adiabatic liquid water behavior and constant physical thickness

  16. Climatic Implications of the Observed Temperature Dependence of the Liquid Water Path of Low Clouds in the Southern Great Plains

    NASA Technical Reports Server (NTRS)

    DelGenio, Anthony D.; Wolf, Audrey B.

    1999-01-01

Satellite observations of low-level clouds have challenged the assumption that adiabatic liquid water content combined with constant physical thickness will lead to a negative cloud optics feedback in a decadal climate change. We explore the reasons for the satellite results using four years of surface remote sensing data from the Atmospheric Radiation Measurement Program Cloud and Radiation Testbed site in the Southern Great Plains of the United States. We find that low cloud liquid water path is approximately invariant with temperature in winter but decreases strongly with temperature in summer, consistent with the satellite inferences at this latitude. This behavior occurs because liquid water content shows no detectable temperature dependence while cloud physical thickness decreases with warming. Thinning of clouds with warming is observed on seasonal, synoptic, and diurnal time scales; it is most obvious in the warm sectors of baroclinic waves. Although cloud top is observed to descend slightly with warming, the primary cause of thinning is the ascent of cloud base due to the reduction in surface relative humidity and the concomitant increase in the lifting condensation level of surface air. Low cloud liquid water path is not observed to be a continuous function of temperature. Rather, the behavior we observe is best explained as a transition in the frequency of occurrence of different boundary layer types: at cold temperatures, a mixture of stratified and convective boundary layers is observed, leading to a broad distribution of liquid water path values, while at warm temperatures, only convective boundary layers with small liquid water paths, some of them decoupled, are observed. Our results, combined with the earlier satellite inferences, imply that the commonly quoted 1.5°C lower limit for the equilibrium global climate sensitivity to a doubling of CO2, which is based on models with near-adiabatic liquid water behavior and constant physical thickness

  17. Cloud Response to Arctic Sea Ice Loss and Implications for Feedbacks in the CESM1 Climate Model

    NASA Astrophysics Data System (ADS)

    Morrison, A.; Kay, J. E.; Chepfer, H.; Guzman, R.; Bonazzola, M.

    2017-12-01

Clouds have the potential to accelerate or slow the rate of Arctic sea ice loss through their radiative influence on the surface. Cloud feedbacks can therefore play into Arctic warming as clouds respond to changes in sea ice cover. As the Arctic moves toward an ice-free state, understanding how cloud-sea ice relationships change in response to sea ice loss is critical for predicting the future climate trajectory. From satellite observations we know the effect of present-day sea ice cover on clouds, but how will clouds respond to sea ice loss as the Arctic transitions to a seasonally open water state? In this study we use a lidar simulator to first evaluate cloud-sea ice relationships in the Community Earth System Model (CESM1) against present-day observations (2006-2015). In the current climate, the cloud response to sea ice is well-represented in CESM1: we see no summer cloud response to changes in sea ice cover, but more fall clouds over open water than over sea ice. Since CESM1 is credible for the current Arctic climate, we next assess if our process-based understanding of Arctic cloud feedbacks related to sea ice loss is relevant for understanding future Arctic clouds. In the future Arctic, summer cloud structure continues to be insensitive to surface conditions. As the Arctic warms in the fall, however, the boundary layer deepens and cloud fraction increases over open ocean during each consecutive decade from 2020-2100. This study will also explore seasonal changes in cloud properties such as opacity and liquid water path. Results thus far suggest that a positive fall cloud-sea ice feedback exists in the present-day and future Arctic climate.

  18. SPHERES as Formation Flight Algorithm Development and Validation Testbed: Current Progress and Beyond

    NASA Technical Reports Server (NTRS)

    Kong, Edmund M.; Saenz-Otero, Alvar; Nolet, Simon; Berkovitz, Dustin S.; Miller, David W.; Sell, Steve W.

    2004-01-01

The MIT-SSL SPHERES testbed provides a facility for the development of algorithms necessary for the success of Distributed Satellite Systems (DSS). The initial development contemplated formation flight and docking control algorithms; SPHERES now supports the study of metrology, control, autonomy, artificial intelligence, and communications algorithms and their effects on DSS projects. To support this wide range of topics, the SPHERES design contemplated the need to support multiple researchers, as reflected in both the hardware and software designs. The SPHERES operational plan further facilitates the development of algorithms by multiple researchers, while the operational locations incrementally increase the ability of the tests to operate in a representative environment. In this paper, an overview of the SPHERES testbed is first presented. The SPHERES testbed serves as a model of the design philosophies that allow varied research to be carried out on such a facility. The implementation of these philosophies is further highlighted in the three programs currently scheduled for testing onboard the International Space Station (ISS) and the three proposed for a re-flight mission: Mass Property Identification, Autonomous Rendezvous and Docking, and TPF Multiple Spacecraft Formation Flight in the first flight; and Precision Optical Pointing, Tethered Formation Flight, and Mars Orbit Sample Retrieval for the re-flight mission.

  19. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
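The performance metrics such a benchmarking framework evaluates can be sketched in a few lines; the metric names and the (true_fault, diagnosed_fault) scenario encoding below are illustrative assumptions, not the actual ADAPT benchmarking interface:

```python
def diagnosis_metrics(scenarios):
    """Score a diagnostic algorithm over (true_fault, diagnosed_fault) pairs; None = nominal."""
    faulty = [(t, d) for t, d in scenarios if t is not None]
    nominal = [(t, d) for t, d in scenarios if t is None]
    return {
        "detection_rate": sum(d is not None for _, d in faulty) / len(faulty),
        "isolation_rate": sum(d == t for t, d in faulty) / len(faulty),
        "false_alarm_rate": sum(d is not None for _, d in nominal) / len(nominal),
    }
```

Separating detection (any fault flagged) from isolation (the correct fault named) mirrors the distinction the paper draws between detecting and isolating anomalies.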

  20. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs : summary report for the Chicago testbed.

    DOT National Transportation Integrated Search

    2017-04-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  1. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the Chicago Testbed

    DOT National Transportation Integrated Search

    2017-04-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  2. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - Pasadena testbed analysis plan : final report.

    DOT National Transportation Integrated Search

    2016-06-30

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  3. Space Station technology testbed: 2010 deep space transport

    NASA Technical Reports Server (NTRS)

    Holt, Alan C.

    1993-01-01

A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high-priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight or more crewmembers. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  4. Design and construction of a 76m long-travel laser enclosure for a space occulter testbed

    NASA Astrophysics Data System (ADS)

    Galvin, Michael; Kim, Yunjong; Kasdin, N. Jeremy; Sirbu, Dan; Vanderbei, Robert; Echeverri, Dan; Sagolla, Giuseppe; Rousing, Andreas; Balasubramanian, Kunjithapatham; Ryan, Daniel; Shaklan, Stuart; Lisman, Doug

    2016-07-01

Princeton University is upgrading its space occulter testbed. In particular, we are lengthening it to 76 m to achieve flight-like Fresnel numbers. This much longer testbed required an all-new enclosure design, in which we prioritized modularity and the use of commercial off-the-shelf (COTS) and semi-COTS components. Technical challenges encountered included an unexpected slow beam drift and black-paint selection. Herein we describe the design and construction of this long-travel laser enclosure.

  5. Identification of Program Signatures from Cloud Computing System Telemetry Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.

Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
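As a toy illustration of how telemetry feature vectors can act as program "signatures", the sketch below classifies a sample by its nearest per-program mean vector; it is a stand-in for, not a reconstruction of, the classification approach used in the study:

```python
import numpy as np

def centroid_signatures(samples_by_program):
    """Mean telemetry feature vector per program (a toy 'signature')."""
    return {name: np.mean(vecs, axis=0) for name, vecs in samples_by_program.items()}

def identify_program(sample, signatures):
    """Label of the signature nearest to the sample in Euclidean distance."""
    return min(signatures, key=lambda name: np.linalg.norm(sample - signatures[name]))
```

The record's observation that accuracy depends on the distinctiveness of the telemetry corresponds here to how well separated the centroids are relative to the within-program scatter.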

  6. High-resolution photography of clouds from the surface: Retrieval of optical depth of thin clouds down to centimeter scales: High-Resolution Photography of Clouds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, Stephen E.; Huang, Dong; Vladutescu, Daniela Viviana

This article describes the approach and presents initial results, for a period of several minutes in north central Oklahoma, of an examination of clouds by high-resolution digital photography from the surface looking vertically upward. A commercially available camera having 35-mm-equivalent focal length up to 1200 mm (nominal resolution as fine as 6 µrad, which corresponds to 9 mm for cloud height 1.5 km) is used to obtain a measure of zenith radiance of a 30 m × 30 m domain as a two-dimensional image consisting of 3456 × 3456 pixels (12 million pixels). Downwelling zenith radiance varies substantially within single images and between successive images obtained at 4-s intervals. Variation in zenith radiance found on scales down to about 10 cm is attributed to variation in cloud optical depth (COD). Attention here is directed primarily to optically thin clouds, COD less than about 2. A radiation transfer model, used to relate downwelling zenith radiance to COD and to relate the counts in the camera image to zenith radiance, permits determination of COD on a pixel-by-pixel basis. COD for thin clouds determined in this way exhibits considerable variation, for example, an order of magnitude within 15 m, a factor of 2 within 4 m, and 25% (0.12 to 0.15) over 14 cm. In conclusion, this approach, which examines cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opens new avenues for examination of cloud structure and evolution.
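Pixel-by-pixel COD retrieval of the kind described amounts to inverting a forward-model relation between zenith radiance and COD. In the sketch below, COD_GRID and RADIANCE_GRID are hypothetical table values assumed monotonic over the thin-cloud range (COD < about 2) treated in the record, not the paper's actual radiative transfer output:

```python
import numpy as np

# Hypothetical forward-model table: downwelling zenith radiance (arbitrary units) vs. COD.
COD_GRID = np.array([0.0, 0.25, 0.5, 1.0, 1.5, 2.0])
RADIANCE_GRID = np.array([0.05, 0.12, 0.20, 0.33, 0.42, 0.48])

def retrieve_cod(radiance):
    """Invert the radiance -> COD table by linear interpolation, applied per pixel."""
    return np.interp(radiance, RADIANCE_GRID, COD_GRID)
```

Applied to a full 3456 × 3456 image array, this vectorized inversion yields a COD map at the camera's native resolution; beyond the thin-cloud range the radiance-COD relation is no longer monotonic and a simple lookup breaks down.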

  7. High-resolution photography of clouds from the surface: Retrieval of optical depth of thin clouds down to centimeter scales: High-Resolution Photography of Clouds

    DOE PAGES

    Schwartz, Stephen E.; Huang, Dong; Vladutescu, Daniela Viviana

    2017-03-08

This article describes the approach and presents initial results, for a period of several minutes in north central Oklahoma, of an examination of clouds by high-resolution digital photography from the surface looking vertically upward. A commercially available camera having 35-mm-equivalent focal length up to 1200 mm (nominal resolution as fine as 6 µrad, which corresponds to 9 mm for cloud height 1.5 km) is used to obtain a measure of zenith radiance of a 30 m × 30 m domain as a two-dimensional image consisting of 3456 × 3456 pixels (12 million pixels). Downwelling zenith radiance varies substantially within single images and between successive images obtained at 4-s intervals. Variation in zenith radiance found on scales down to about 10 cm is attributed to variation in cloud optical depth (COD). Attention here is directed primarily to optically thin clouds, COD less than about 2. A radiation transfer model, used to relate downwelling zenith radiance to COD and to relate the counts in the camera image to zenith radiance, permits determination of COD on a pixel-by-pixel basis. COD for thin clouds determined in this way exhibits considerable variation, for example, an order of magnitude within 15 m, a factor of 2 within 4 m, and 25% (0.12 to 0.15) over 14 cm. In conclusion, this approach, which examines cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opens new avenues for examination of cloud structure and evolution.

  8. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.

  9. Inter-comparison of soil moisture sensors from the soil moisture active passive marena Oklahoma in situ sensor testbed (SMAP-MOISST)

    USDA-ARS?s Scientific Manuscript database

    The diversity of in situ soil moisture network protocols and instrumentation led to the development of a testbed for comparing in situ soil moisture sensors. Located in Marena, Oklahoma on the Oklahoma State University Range Research Station, the testbed consists of four base stations. Each station ...

  10. Large-scale, high-performance and cloud-enabled multi-model analytics experiments in the context of the Earth System Grid Federation

    NASA Astrophysics Data System (ADS)

    Fiore, S.; Płóciennik, M.; Doutriaux, C.; Blanquer, I.; Barbera, R.; Williams, D. N.; Anantharaj, V. G.; Evans, B. J. K.; Salomoni, D.; Aloisio, G.

    2017-12-01

    The increasing resolution of comprehensive Earth System Models is rapidly leading to very large volumes of climate simulation output that pose significant scientific data management challenges in terms of data sharing, processing, analysis, visualization, preservation, curation, and archiving. Large-scale global experiments for Climate Model Intercomparison Projects (CMIP) have led to the development of the Earth System Grid Federation (ESGF), a federated data infrastructure which has been serving the CMIP5 experiment, providing access to 2 PB of data for the IPCC Assessment Reports. In such a context, running a multi-model data analysis experiment is very challenging, as it requires the availability of a large amount of data related to multiple climate model simulations and scientific data management tools for large-scale data analytics. To address these challenges, a case study on climate model intercomparison data analysis has been defined and implemented in the context of the EU H2020 INDIGO-DataCloud project. The case study has been tested and validated on CMIP5 datasets in the context of a large-scale, international testbed involving several ESGF sites (LLNL, ORNL and CMCC), one orchestrator site (PSNC) and one hosting INDIGO PaaS services (UPV). Additional ESGF sites, such as NCI (Australia) and a couple more in Europe, are also joining the testbed. The added value of the proposed solution is summarized in the following: it implements a server-side paradigm which limits data movement; it relies on a High-Performance Data Analytics (HPDA) stack to address performance; it exploits the INDIGO PaaS layer to support flexible, dynamic and automated deployment of software components; it provides user-friendly web access based on the INDIGO Future Gateway; and finally it integrates, complements and extends the support currently available through ESGF. Overall it provides a new "tool" for climate scientists to run multi-model experiments.

  11. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) need only a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage, so that the system is applicable to any region in the world, and allow the integration of different sensor systems as well as of other hazard types and use cases beyond tsunami early warning. Current advances of the TRIDEC Cloud platform are summarized in this presentation.

  12. Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds

    NASA Astrophysics Data System (ADS)

    Li, Rui; Chen, Lei; Li, Wen-Syan

    Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for Internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Current Hadoop (and many other systems) requires users to configure the cloud infrastructure via programs and APIs, and that configuration is fixed for the duration of the run. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, and aims at maximum throughput using a minimum set of processors.
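    The data-driven idea behind such a workload manager can be sketched in a few lines: periodically measure per-operator throughput and shift processors toward the bottleneck operator. The names and the one-processor-at-a-time rebalancing policy below are assumptions for illustration; the chapter's actual algorithm is more involved.

    ```python
    # Illustrative sketch of throughput-driven rebalancing, loosely in the
    # spirit of CloudWeaver. Policy and operator names are hypothetical.
    def rebalance(allocation, throughput):
        """Move one processor from the fastest operator to the slowest,
        provided the donor keeps at least one processor."""
        slowest = min(throughput, key=throughput.get)
        fastest = max(throughput, key=throughput.get)
        if slowest != fastest and allocation[fastest] > 1:
            allocation = dict(allocation)       # leave the input untouched
            allocation[fastest] -= 1
            allocation[slowest] += 1
        return allocation

    alloc = {"scan": 4, "join": 2, "aggregate": 2}
    rates = {"scan": 900.0, "join": 150.0, "aggregate": 400.0}  # tuples/sec
    alloc = rebalance(alloc, rates)  # "join" gains a processor from "scan"
    ```

    Run after every measurement interval, a loop like this converges toward allocations where no single operator starves the pipeline, which is the adaptive behaviour the abstract describes.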

  13. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. Giving cloud users an effective way to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed on top of an open source distributed file system. In it, massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users write scripts in IPython Notebook web pages through the web browser to process data, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation for IO, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook.
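    A toy example of the kind of per-pixel script a user might run in such a notebook environment: real scripts on the platform would use NumPy/GDAL against rasters in the cloud store, while this pure-Python NDVI computation only illustrates the pattern (read bands, compute an index pixel by pixel). The band values are invented.

    ```python
    # Hypothetical notebook-style script: per-pixel NDVI from two bands.
    def ndvi(red_band, nir_band):
        """NDVI = (NIR - red) / (NIR + red), computed per pixel."""
        out = []
        for red, nir in zip(red_band, nir_band):
            denom = nir + red
            out.append((nir - red) / denom if denom else 0.0)
        return out

    red = [0.10, 0.08, 0.30]   # toy reflectance values
    nir = [0.50, 0.40, 0.32]
    vals = ndvi(red, nir)      # high values flag dense vegetation
    ```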

  14. Photonically enabled Ka-band radar and infrared sensor subscale testbed

    NASA Astrophysics Data System (ADS)

    Lohr, Michele B.; Sova, Raymond M.; Funk, Kevin B.; Airola, Marc B.; Dennis, Michael L.; Pavek, Richard E.; Hollenbeck, Jennifer S.; Garrison, Sean K.; Conard, Steven J.; Terry, David H.

    2014-10-01

    A subscale radio frequency (RF) and infrared (IR) testbed using novel RF-photonics techniques for generating radar waveforms is currently under development at The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to study target scenarios in a laboratory setting. The linearity of Maxwell's equations allows the use of millimeter wavelengths and scaled-down target models to emulate full-scale RF scene effects. Coupled with passive IR and visible sensors, target motions and heating, and a processing and algorithm development environment, this testbed provides a means to flexibly and cost-effectively generate and analyze multi-modal data for a variety of applications, including verification of digital model hypotheses, investigation of correlated phenomenology, and aiding system capabilities assessment. In this work, concept feasibility is demonstrated for simultaneous RF, IR, and visible sensor measurements of heated, precessing, conical targets and of a calibration cylinder. Initial proof-of-principle results are shown of the Ka-band subscale radar, which models S-band for 1/10th scale targets, using stretch processing and Xpatch models.

  15. Improving data workflow systems with cloud services and use of open data for bioinformatics research.

    PubMed

    Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich

    2017-04-16

    Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.

  16. Passive Thermal Design Approach for the Space Communications and Navigation (SCaN) Testbed Experiment on the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Siamidis, John; Yuko, Jim

    2014-01-01

    The Space Communications and Navigation (SCaN) Program Office at NASA Headquarters oversees all of NASA's space communications activities. SCaN manages and directs the ground-based facilities and services provided by the Deep Space Network (DSN), Near Earth Network (NEN), and the Space Network (SN). Through the SCaN Program Office, NASA GRC developed a Software Defined Radio (SDR) testbed experiment (SCaN testbed experiment) for use on the International Space Station (ISS). It comprises three different SDR radios: the Jet Propulsion Laboratory (JPL) radio, the Harris Corporation radio, and the General Dynamics Corporation radio. The SCaN testbed experiment provides an on-orbit, adaptable, Space Telecommunications Radio System (STRS)-based SDR facility to conduct a suite of experiments to advance the SDR and STRS standards, reduce risk (Technology Readiness Level (TRL) advancement) for candidate Constellation future space flight hardware and software, and demonstrate space communication links critical to future NASA exploration missions. The SCaN testbed project provides NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable, software defined radio platforms and the STRS Architecture. The SCaN testbed is resident on the P3 Express Logistics Carrier (ELC) on the exterior truss of the International Space Station (ISS). The SCaN testbed payload launched on the Japanese Aerospace Exploration Agency (JAXA) H-II Transfer Vehicle (HTV) and was installed on the ISS P3 ELC at the inboard RAM P3 site. Daily operations and testing are managed out of NASA GRC in the Telescience Support Center (TSC).

  17. Cloud Infrastructure & Applications - CloudIA

    NASA Astrophysics Data System (ADS)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and reports our early experiences in building a private cloud using an existing infrastructure.

  18. Frequency and causes of failed MODIS cloud property retrievals for liquid phase clouds over global oceans.

    PubMed

    Cho, Hyoun-Myoung; Zhang, Zhibo; Meyer, Kerry; Lebsock, Matthew; Platnick, Steven; Ackerman, Andrew S; Di Girolamo, Larry; C-Labonnote, Laurent; Cornet, Céline; Riedi, Jerome; Holz, Robert E

    2015-05-16

    Moderate Resolution Imaging Spectroradiometer (MODIS) retrieves cloud droplet effective radius (r_e) and optical thickness (τ) by projecting observed cloud reflectances onto a precomputed look-up table (LUT). When observations fall outside of the LUT, the retrieval is considered "failed" because no combination of τ and r_e within the LUT can explain the observed cloud reflectances. In this study, the frequency and potential causes of failed MODIS retrievals for marine liquid phase (MLP) clouds are analyzed based on 1 year of Aqua MODIS Collection 6 products and collocated CALIOP and CloudSat observations. The retrieval based on the 0.86 µm and 2.1 µm MODIS channel combination has an overall failure rate of about 16% (10% for the 0.86 µm and 3.7 µm combination). The failure rates are lower over stratocumulus regimes and higher over the broken trade wind cumulus regimes. The leading type of failure is the "r_e too large" failure, accounting for 60%-85% of all failed retrievals. The rest is mostly due to "r_e too small" or τ retrieval failures. Enhanced retrieval failure rates are found when MLP cloud pixels are partially cloudy or have high subpixel inhomogeneity, are located at special Sun-satellite viewing geometries such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles, or are subject to cloud masking, cloud overlapping, and/or cloud phase retrieval issues. The majority (more than 84%) of failed retrievals along the CALIPSO track can be attributed to one or more of these potential reasons. The collocated CloudSat radar reflectivity observations reveal that the remaining failed retrievals are often precipitating. It remains an open question whether the extremely large r_e values observed in these clouds are the consequence of true cloud microphysics or due to artifacts not included in this study.
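    The "outside the look-up table" failure notion above can be made concrete: an observed reflectance pair is declared failed when no (τ, r_e) grid point reproduces it within tolerance. The tiny LUT and tolerance below are fabricated for illustration; operational MODIS LUTs span many more grid points and use proper forward-model interpolation rather than nearest-point matching.

    ```python
    # Hypothetical sketch of LUT-based retrieval with failure detection.
    def classify(obs, lut, tol=0.02):
        """Return the best-fitting (tau, r_e), or None when the observation
        falls outside the LUT (a 'failed' retrieval)."""
        best, best_dist = None, float("inf")
        for (tau, r_e), (refl_086, refl_21) in lut.items():
            d = max(abs(obs[0] - refl_086), abs(obs[1] - refl_21))
            if d < best_dist:
                best, best_dist = (tau, r_e), d
        return best if best_dist <= tol else None

    LUT = {  # (tau, r_e in um) -> (0.86 um reflectance, 2.1 um reflectance)
        (2, 8): (0.18, 0.14), (2, 16): (0.17, 0.09),
        (8, 8): (0.45, 0.30), (8, 16): (0.43, 0.20),
    }
    ok = classify((0.44, 0.21), LUT)      # lands near the (8, 16) grid point
    failed = classify((0.44, 0.45), LUT)  # no grid point is close: failed
    ```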

  19. Frequency and causes of failed MODIS cloud property retrievals for liquid phase clouds over global oceans

    PubMed Central

    Cho, Hyoun‐Myoung; Meyer, Kerry; Lebsock, Matthew; Platnick, Steven; Ackerman, Andrew S.; Di Girolamo, Larry; C.‐Labonnote, Laurent; Cornet, Céline; Riedi, Jerome; Holz, Robert E.

    2015-01-01

    Moderate Resolution Imaging Spectroradiometer (MODIS) retrieves cloud droplet effective radius (r_e) and optical thickness (τ) by projecting observed cloud reflectances onto a precomputed look-up table (LUT). When observations fall outside of the LUT, the retrieval is considered "failed" because no combination of τ and r_e within the LUT can explain the observed cloud reflectances. In this study, the frequency and potential causes of failed MODIS retrievals for marine liquid phase (MLP) clouds are analyzed based on 1 year of Aqua MODIS Collection 6 products and collocated CALIOP and CloudSat observations. The retrieval based on the 0.86 µm and 2.1 µm MODIS channel combination has an overall failure rate of about 16% (10% for the 0.86 µm and 3.7 µm combination). The failure rates are lower over stratocumulus regimes and higher over the broken trade wind cumulus regimes. The leading type of failure is the "r_e too large" failure, accounting for 60%–85% of all failed retrievals. The rest is mostly due to "r_e too small" or τ retrieval failures. Enhanced retrieval failure rates are found when MLP cloud pixels are partially cloudy or have high subpixel inhomogeneity, are located at special Sun-satellite viewing geometries such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles, or are subject to cloud masking, cloud overlapping, and/or cloud phase retrieval issues. The majority (more than 84%) of failed retrievals along the CALIPSO track can be attributed to one or more of these potential reasons. The collocated CloudSat radar reflectivity observations reveal that the remaining failed retrievals are often precipitating. It remains an open question whether the extremely large r_e values observed in these clouds are the consequence of true cloud microphysics or due to artifacts not included in this study. PMID:27656330

  20. A study on strategic provisioning of cloud computing services.

    PubMed

    Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.

  1. A Study on Strategic Provisioning of Cloud Computing Services

    PubMed Central

    Rejaul Karim Chowdhury, Md

    2014-01-01

    Cloud computing is currently emerging as an ever-changing, growing paradigm that models “everything-as-a-service.” Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified. PMID:25032243

  2. The Orlando TDWR testbed and airborne wind shear data comparison results

    NASA Technical Reports Server (NTRS)

    Campbell, Steven; Berke, Anthony; Matthews, Michael

    1992-01-01

    The focus of this talk is on comparing terminal Doppler Weather Radar (TDWR) and airborne wind shear data in computing a microburst hazard index called the F factor. The TDWR is a ground-based system for detecting wind shear hazards to aviation in the terminal area. The Federal Aviation Administration will begin deploying TDWR units near 45 airports in late 1992. As part of this development effort, M.I.T. Lincoln Laboratory operates a TDWR testbed radar in Orlando, FL, under F.A.A. support. During the past two years, a series of flight tests has been conducted with instrumented aircraft penetrating microburst events while under testbed radar surveillance. These tests were carried out with a Cessna Citation 2 aircraft operated by the University of North Dakota (UND) Center for Aerospace Sciences in 1990, and a Boeing 737 operated by NASA Langley Research Center in 1991. A large data base of approximately 60 instrumented microburst penetrations has been obtained from these flights.
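    For context, a common form of the F factor in the wind-shear literature is F = (dWx/dt)/g − Wh/V, where dWx/dt is the rate of change of horizontal wind along the flight path, Wh the vertical wind (positive up), and V the airspeed; positive F indicates performance-decreasing shear. The sketch below uses that textbook form with invented values, and the talk's exact formulation may differ.

    ```python
    # Hedged sketch of the F-factor microburst hazard index.
    G = 9.81  # gravitational acceleration, m/s^2

    def f_factor(dwx_dt, w_vertical, airspeed):
        """F = (dWx/dt)/g - Wh/V; positive values degrade aircraft performance."""
        return dwx_dt / G - w_vertical / airspeed

    # Example: 0.3 m/s^2 horizontal shear with a 6 m/s downdraft at 75 m/s.
    f = f_factor(0.3, -6.0, 75.0)
    hazardous = f > 0.1   # ~0.1 is often cited as a hazard threshold
    ```

    A downdraft (negative Wh) and an increasing headwind-to-tailwind change both push F upward, which is why microbursts combine into such a severe hazard.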

  3. TUNABLE IRRADIATION TESTBED

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wootan, David W.; Casella, Andrew M.; Asner, David M.

    PNNL has developed and continues to develop innovative methods for characterizing irradiated materials from nuclear reactors and particle accelerators for various clients and collaborators around the world. The continued development of these methods, in addition to the ability to perform unique scientific investigations of the effects of radiation on materials, could be greatly enhanced with easy access to irradiation facilities. A Tunable Irradiation Testbed with customized targets (a 30 MeV, 1 mA cyclotron or similar coupled to a unique target system) is shown to provide a much more flexible and cost-effective source of irradiating particles than a test reactor or isotopic source. The configuration investigated was a single shielded building with multiple beam lines from a small, flexible, high-flux irradiation source. Potential applications investigated were the characterization of radiation damage to materials applicable to advanced reactors, fusion reactors, and legacy waste (via neutron spectra tailored to HTGR, molten salt, LWR, LMR, and fusion environments); 252Cf replacement; characterization of radiation damage to materials of interest to High Energy Physics to enable the neutrino program; and research into production of short-lived isotopes for potential medical and other applications.

  4. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs — Calibration Report for San Mateo Testbed.

    DOT National Transportation Integrated Search

    2016-08-22

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  5. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - calibration report for Dallas testbed : final report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  6. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - calibration Report for Phoenix Testbed : Final Report.

    DOT National Transportation Integrated Search

    2016-10-01

    The primary objective of this project is to develop multiple simulation Testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  7. Analysis, modeling, and simulation (AMS) testbed development and evaluation to support dynamic mobility applications (DMA) and active transportation and demand management (ATDM) programs - evaluation summary for the San Diego testbed

    DOT National Transportation Integrated Search

    2017-08-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  8. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - San Mateo Testbed Analysis Plan : Final Report.

    DOT National Transportation Integrated Search

    2016-06-29

    The primary objective of this project is to develop multiple simulation testbeds/transportation models to evaluate the impacts of DMA connected vehicle applications and the active and dynamic transportation management (ATDM) strategies. The outputs (...

  9. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs - Evaluation Report for the San Diego Testbed

    DOT National Transportation Integrated Search

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  10. The ISB Cancer Genomics Cloud: A Flexible Cloud-Based Platform for Cancer Genomics Research.

    PubMed

    Reynolds, Sheila M; Miller, Michael; Lee, Phyliss; Leinonen, Kalle; Paquette, Suzanne M; Rodebaugh, Zack; Hahn, Abigail; Gibbs, David L; Slagel, Joseph; Longabaugh, William J; Dhankani, Varsha; Reyes, Madelyn; Pihl, Todd; Backus, Mark; Bookman, Matthew; Deflaux, Nicole; Bingham, Jonathan; Pot, David; Shmulevich, Ilya

    2017-11-01

    The ISB Cancer Genomics Cloud (ISB-CGC) is one of three pilot projects funded by the National Cancer Institute to explore new approaches to computing on large cancer datasets in a cloud environment. With a focus on Data as a Service, the ISB-CGC offers multiple avenues for accessing and analyzing The Cancer Genome Atlas, TARGET, and other important references such as GENCODE and COSMIC using the Google Cloud Platform. The open approach allows researchers to choose approaches best suited to the task at hand: from analyzing terabytes of data using complex workflows, to developing new analysis methods in common languages such as Python, R, and SQL, to using an interactive web application to create synthetic patient cohorts and to explore the wealth of available genomic data. Links to resources and documentation can be found at www.isb-cgc.org. Cancer Res; 77(21); e7-10. ©2017 American Association for Cancer Research.

  11. Integrating Cloud-Computing-Specific Model into Aircraft Design

    NASA Astrophysics Data System (ADS)

    Zhimin, Tian; Qi, Lin; Guangwen, Yang

    Cloud Computing is becoming increasingly relevant, as it will enable the companies involved in spreading this technology to open the door to Web 3.0. The new categories of services it introduces will gradually replace many types of computational resources currently in use. From this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services are provided. The paper integrates a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics), UG, CATIA, and so on.

  12. Cloud streets in Davis Strait

    NASA Image and Video Library

    2017-12-08

    The late winter sun shone brightly on a stunning scene of clouds and ice in the Davis Strait in late February, 2013. The Moderate Resolution Imaging Spectroradiometer aboard NASA’s Aqua satellite captured this true-color image on February 22 at 1625 UTC. The Davis Strait connects the Labrador Sea (part of the Atlantic Ocean) in the south with Baffin Bay to the north, and separates Canada, to the west, from Greenland to the east. Strong, steady winds frequently blow southward from the colder Baffin Bay to the warmer waters of the Labrador Sea. Over ice, the air is dry and no clouds form. However, as the Arctic air moves over the warmer, open water, the rising moist air and the temperature differential give rise to lines of clouds. In this image, the clouds are aligned in a beautiful, parallel pattern. Known as “cloud streets”, this pattern is formed in a low-level wind, with the clouds aligning in the direction of the wind. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team

  13. Development of performance specifications for collision avoidance systems for lane change crashes. Task 6, interim report : testbed systems design and associated facilities

    DOT National Transportation Integrated Search

    2001-11-01

    This report documents the design of an on-road testbed vehicle. The purposes of this testbed are twofold: (1) Establish a foundation for estimating lane change collision avoidance effectiveness, and (2) provide information pertinent to setting perfor...

  14. Benefits of Model Updating: A Case Study Using the Micro-Precision Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Neat, Gregory W.; Kissil, Andrew; Joshi, Sanjay S.

    1997-01-01

    This paper presents a case study on the benefits of model updating using the Micro-Precision Interferometer (MPI) testbed, a full-scale model of a future spaceborne optical interferometer located at JPL.

  15. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that the distribution of satellite imagery by means of the Internet is feasible, and that the archiving of large data sets can be accomplished with low-cost storage systems serving multiple users.

  16. Instrument for Aircraft-Icing and Cloud-Physics Measurements

    NASA Technical Reports Server (NTRS)

    Lilie, Lyle; Bouley, Dan; Sivo, Chris

    2006-01-01

    The figure shows a compact, rugged, simple sensor head that is part of an instrumentation system for making measurements to characterize the severity of aircraft-icing conditions and/or to perform research on cloud physics. The quantities that are calculated from measurement data acquired by this system and that are used to quantify the severity of icing conditions include sizes of cloud water drops, cloud liquid water content (LWC), cloud ice water content (IWC), and cloud total water content (TWC). The sensor head is mounted on the outside of an aircraft, positioned and oriented to intercept the ambient airflow. The sensor head consists of an open housing that is heated in a controlled manner to keep it free of ice and that contains four hot-wire elements. The hot-wire sensing elements have different shapes and sizes and, therefore, exhibit different measurement efficiencies with respect to droplet size and water phase (liquid, frozen, or mixed). Three of the hot-wire sensing elements are oriented across the airflow so as to intercept incoming cloud water. For each of these elements, the LWC or TWC affects the power required to maintain a constant temperature in the presence of cloud water.

  17. Development of a Rotor-Body Coupled Analysis for an Active Mount Aeroelastic Rotor Testbed. Degree awarded by George Washington Univ., May 1996

    NASA Technical Reports Server (NTRS)

    Wilbur, Matthew L.

    1998-01-01

    At the Langley Research Center an active mount rotorcraft testbed is being developed for use in the Langley Transonic Dynamics Tunnel. This testbed, the second generation version of the Aeroelastic Rotor Experimental System (ARES-II), can impose rotor hub motions and measure the response so that rotor-body coupling phenomena may be investigated. An analytical method for coupling an aeroelastically scaled model rotor system to the ARES-II is developed in the current study. Models of the testbed and the rotor system are developed in independent analyses, and an impedance-matching approach is used to couple the rotor system to the testbed. The development of the analytical models and the coupling method is examined, and individual and coupled results are presented for the testbed and rotor system. Coupled results are presented with and without applied hub motion, and system loads and displacements are examined. The results show that a closed-loop control system is necessary to achieve desired hub motions, that proper modeling requires including the loads at the rotor hub and rotor control system, and that the strain-gauge balance placed in the rotating system of the ARES-II provided the best loads results.

  18. Military application of flat panel displays in the Vetronics Technology Testbed prototype vehicle

    NASA Astrophysics Data System (ADS)

    Downs, Greg; Roller, Gordon; Brendle, Bruce E., Jr.; Tierney, Terrance

    2000-08-01

    The ground combat vehicle crew of tomorrow must be able to perform their mission more effectively and efficiently if they are to maintain dominance over ever more lethal enemy forces. Increasing performance, however, becomes even more challenging when the soldier is subject to reduced crew sizes, a never-ending requirement to adapt to ever-evolving technologies, and the demand to assimilate an overwhelming array of battlefield data. This, combined with the requirement to fight with equal effectiveness at any time of the day or night in all types of weather conditions, makes it clear that this crew of tomorrow will need timely, innovative solutions to overcome this multitude of barriers if they are to achieve their objectives. To this end, the U.S. Army is pursuing advanced crew stations with human-computer interfaces that will allow the soldier to take full advantage of emerging technologies and make efficient use of the battlefield information available to him in a program entitled 'Vetronics Technology Testbed.' Two critical components of the testbed are a complement of panoramic indirect vision displays to permit drive-by-wire and multi-function displays for managing lethality, mobility, survivability, situational awareness, and command and control of the vehicle. These displays are being developed and built by Computing Devices Canada, Ltd. This paper addresses the objectives of the testbed program and the technical requirements and design of the displays.

  19. Statistical Analyses of Satellite Cloud Object Data From CERES. Part 4; Boundary-layer Cloud Objects During 1998 El Nino

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce A.; Parker, Lindsay

    2006-01-01

    Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions, among different size categories, and among small geographic areas. Both the frequencies of occurrence and statistical distributions of cloud physical properties are analyzed. In terms of frequencies of occurrence, stratocumulus clouds dominate the entire boundary-layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has a much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic

  20. The Cloud Area Padovana: from pilot to production

    NASA Astrophysics Data System (ADS)

    Andreetto, P.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Sgaravatto, M.; Traldi, S.; Verlato, M.; Zangrando, L.

    2017-10-01

    The Cloud Area Padovana has been running for almost two years. This is an OpenStack-based scientific cloud, spread across two different sites: the INFN Padova Unit and the INFN Legnaro National Labs. The hardware resources have been scaled horizontally and vertically, by upgrading some hypervisors and by adding new ones: currently it provides about 1100 cores. Some in-house developments were also integrated in the OpenStack dashboard, such as a tool for user and project registrations with direct support for the INFN-AAI Identity Provider as a new option for user authentication. In collaboration with the EU-funded INDIGO-DataCloud project, integration with Docker-based containers has been experimented with and will be available in production soon. This computing facility now satisfies the computational and storage demands of more than 70 users affiliated with about 20 research projects. We present here the architecture of this Cloud infrastructure and the tools and procedures used to operate it. We also focus on the lessons learnt in these two years, describing the problems that were found and the corrective actions that had to be applied. We also discuss the chosen strategy for upgrades, which combines the need to promptly integrate new OpenStack developments, the demand to reduce the downtimes of the infrastructure, and the need to limit the effort required for such updates. We also discuss how this Cloud infrastructure is being used. In particular we focus on two big physics experiments which are intensively exploiting this computing facility: CMS and SPES. CMS deployed on the cloud a complex computational infrastructure, composed of several user interfaces for job submission in the Grid environment/local batch queues or for interactive processes; this is fully integrated with the local Tier-2 facility. To avoid a static allocation of the resources, an elastic cluster, based on cernVM, has been configured: it allows one to automatically create and

  1. Using Cloud-based Storage Technologies for Earth Science Data

    NASA Astrophysics Data System (ADS)

    Michaelis, A.; Readey, J.; Votava, P.

    2016-12-01

    Cloud-based infrastructure may offer several key benefits: scalability, built-in redundancy, and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to serve earth science data. The system described is not only cost effective, but shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
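
    One common way to serve array data from an object store, sketched below under stated assumptions (the key layout, dataset name, and helper functions are hypothetical illustrations of the general chunking approach, not the described system's implementation), is to map fixed-size chunks of each dataset onto object keys, so a range read touches only the objects that cover it and those requests can be issued in parallel:

```python
# Hypothetical sketch: addressing chunks of an n-dimensional dataset as
# object-store keys, the core idea behind serving HDF5/NetCDF4-style data
# from object storage instead of a POSIX file system.

def chunk_key(dataset_id, chunk_index):
    """Build the object key for one chunk of a dataset."""
    return f"{dataset_id}/chunks/{'_'.join(map(str, chunk_index))}"

def chunks_for_slice(start, stop, chunk_size):
    """Which chunk indices along one axis cover the slice [start, stop)?"""
    return list(range(start // chunk_size, (stop - 1) // chunk_size + 1))

# A read of elements 1000..5000 of a 1-D dataset chunked every 2048 elements
# touches only three objects, which a client can fetch concurrently:
needed = chunks_for_slice(1000, 5000, 2048)
keys = [chunk_key("temperature-v1", (c,)) for c in needed]
print(keys)  # ['temperature-v1/chunks/0', 'temperature-v1/chunks/1', 'temperature-v1/chunks/2']
```

    Because each chunk is an independent object, analytics tasks can scale horizontally without contending for a single file handle, which is one reason object storage can outperform shared file systems for this workload.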

  2. Operational processing and cloud boundary detection from micro pulse lidar data

    NASA Technical Reports Server (NTRS)

    Campbell, James R.; Hlavka, Dennis L.; Spinhirne, James D.; Scott, V. Stanley., III; Turner, David D.

    1998-01-01

    -owned systems have served as the basis for this development. With two operating at the Southern Great Plains Cloud and Radiation Testbed site (SGP CART) since December 1993 and another at the Manus Island Atmospheric Radiation and Cloud Station (TWP ARCS) location in the tropical western Pacific since February 1997, the ARM archive contains over 4 years of observations. In addition, high-resolution systems are planned to come on-line at the North Slope, AK CART site shortly, with another scheduled to follow at the TWP ARCS-II, diversifying this archive with more extensive observations.

  3. CSNS computing environment Based on OpenStack

    NASA Astrophysics Data System (ADS)

    Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu

    2017-10-01

    Cloud computing allows more flexible configuration of IT resources and optimized hardware utilization, and can provide computing service according to real need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of the cloud computing platform based on OpenStack are demonstrated, covering the system framework, network, storage, and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.

  4. CloudDOE: a user-friendly tool for deploying Hadoop clouds and analyzing high-throughput sequencing data with MapReduce.

    PubMed

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.

  5. CloudDOE: A User-Friendly Tool for Deploying Hadoop Clouds and Analyzing High-Throughput Sequencing Data with MapReduce

    PubMed Central

    Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung

    2014-01-01

    may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343
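
    CloudDOE's role is to deploy Hadoop, which executes the MapReduce programming model at scale; as a self-contained illustration of that model (a local toy sketch, not CloudDOE code; the k-mer example and function names are illustrative), k-mer counting, a common sequencing task, can be expressed as a map, a shuffle, and a reduce:

```python
# Toy, in-memory illustration of the MapReduce model that Hadoop runs
# distributed across a cluster.
from collections import defaultdict

def map_phase(records, mapper):
    """Apply the mapper to each record, yielding (key, value) pairs."""
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    """Group all values by key, as Hadoop's shuffle/sort stage does."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    """Apply the reducer to each key's grouped values."""
    return {key: reducer(key, values) for key, values in groups.items()}

# Count 3-mers across sequencing reads: map emits (kmer, 1), reduce sums.
reads = ["ACGTAC", "GTACGT"]
mapper = lambda read: ((read[i:i + 3], 1) for i in range(len(read) - 2))
reducer = lambda key, values: sum(values)

counts = reduce_phase(shuffle(map_phase(reads, mapper)), reducer)
print(counts["GTA"])  # 2: the k-mer appears once in each read
```

    The same mapper/reducer pair, packaged for Hadoop, would run unchanged over terabytes of reads; the framework supplies the distribution, shuffling, and fault tolerance.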

  6. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically, to raise the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and to gather high-quality research data suitable for verifying the technologies and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport-class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments, along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  7. An active co-phasing imaging testbed with segmented mirrors

    NASA Astrophysics Data System (ADS)

    Zhao, Weirui; Cao, Genrui

    2011-06-01

    An active co-phasing imaging testbed with highly accurate optical adjustment and control at the nanometer scale was set up to validate the algorithms of piston and tip-tilt error sensing and real-time adjustment. A modular design was adopted. The primary mirror was spherical and divided into three sub-mirrors. One of them was fixed and served as the reference segment; the others were each adjustable relative to the fixed segment in three degrees of freedom (piston, tip, and tilt) using sensitive micro-displacement actuators with a range of 15 mm and a resolution of 3 nm. The method of two-dimensional dispersed fringe analysis was used to sense the piston error between adjacent segments over a range of 200 μm with a repeatability of 2 nm, and the tip-tilt error was obtained with the method of centroid sensing. Co-phased imaging could be realized by correcting the errors measured above with the sensitive micro-displacement actuators driven by a computer. The process of co-phasing error sensing and correcting could be monitored in real time by a scrutiny module set in this testbed. A FISBA interferometer was introduced to evaluate the co-phasing performance, and finally a total residual surface error of about 50 nm rms was achieved.

  8. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET) to fly on the International Space Station to test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  9. The NSA/SHEBA Cloud & Radiation Comparison Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Janet M. Intrieri; Matthew D. Shupe

    2004-08-23

    Cloud and radiation data from two distinctly different Arctic areas are analyzed to study the differences between coastal Alaskan and open Arctic Ocean region clouds and their respective influence on the surface radiation budget. The cloud and radiation datasets were obtained from 1) the DOE North Slope of Alaska (NSA) facility in the coastal town of Barrow, Alaska, and 2) the SHEBA field program, which was conducted from an icebreaker frozen in, and drifting with, the sea-ice for one year in the Western Arctic Ocean. Radar, lidar, radiometer, and sounding measurements from both locations were used to produce annual cycles of cloud occurrence and height, atmospheric temperature and humidity, surface longwave and shortwave broadband fluxes, surface albedo, and cloud radiative forcing. In general, both regions revealed a similar annual trend of cloud occurrence fraction with minimum values in winter (60-75%) and maximum values during spring, summer and fall (80-90%). However, the annual average cloud occurrence fraction for SHEBA (76%) was lower than the 6-year average cloud occurrence at NSA (92%). Both Arctic areas also showed similar annual cycle trends of cloud forcing with clouds warming the surface through most of the year and a period of surface cooling during the summer, when cloud shading effects overwhelm cloud greenhouse effects. The greatest difference between the two regions was observed in the magnitude of the cloud cooling effect (i.e., shortwave cloud forcing), which was significantly stronger at NSA and lasted for a longer period of time than at SHEBA. This is predominantly due to the longer and stronger melt season at NSA (i.e., albedo values that are much lower coupled with Sun angles that are somewhat higher) than the melt season observed over the ice pack at SHEBA.
Longwave cloud forcing values were comparable between the two sites indicating a general similarity in cloudiness and atmospheric temperature and humidity structure between the

  10. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview of research into a low-cost hypersonic research flight test capability, intended to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major flight demonstrator X-planes, is provided. The major objectives included: developing an air-launched missile booster research testbed; accurately delivering research payloads to hypersonic test conditions through programmable guidance; low cost; a high flight rate (a minimum of two flights per year); and utilizing surplus air-launched missiles and NASA aircraft.

  11. Star formation in evolving molecular clouds

    NASA Astrophysics Data System (ADS)

    Völschow, M.; Banerjee, R.; Körtgen, B.

    2017-09-01

    Molecular clouds are the principal stellar nurseries of our universe; they thus remain a focus of both observational and theoretical studies. From observations, some of the key properties of molecular clouds are well known but many questions regarding their evolution and star formation activity remain open. While numerical simulations feature a large number and complexity of involved physical processes, this plethora of effects may hide the fundamentals that determine the evolution of molecular clouds and enable the formation of stars. Purely analytical models, on the other hand, tend to suffer from rough approximations or a lack of completeness, limiting their predictive power. In this paper, we present a model that incorporates central concepts of astrophysics as well as reliable results from recent simulations of molecular clouds and their evolutionary paths. Based on that, we construct a self-consistent semi-analytical framework that describes the formation, evolution, and star formation activity of molecular clouds, including a number of feedback effects to account for the complex processes inside those objects. The final equation system is solved numerically but at much lower computational expense than, for example, hydrodynamical descriptions of comparable systems. The model presented in this paper agrees well with a broad range of observational results, showing that molecular cloud evolution can be understood as an interplay between accretion, global collapse, star formation, and stellar feedback.
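
    The general flavor of such a semi-analytical framework, an equation system integrated numerically at low cost, can be conveyed with a deliberately simplified toy (this is not the authors' equation system; the reservoir structure, parameter values, and efficiency are illustrative assumptions): one gas reservoir fed by accretion and drained by star formation at a fixed efficiency per free-fall time.

```python
# Toy two-reservoir cloud model (hypothetical, for illustration only):
# gas mass grows by accretion and is converted to stellar mass at a
# fixed per-free-fall-time efficiency, integrated with Euler steps.

def evolve_cloud(m_gas, accretion_rate, eff, t_ff, dt, steps):
    m_stars = 0.0
    for _ in range(steps):
        sfr = eff * m_gas / t_ff          # star formation rate
        m_gas += (accretion_rate - sfr) * dt
        m_stars += sfr * dt
    return m_gas, m_stars

# 10^4 solar masses of gas, accreting 100 Msun per unit time, 2% efficiency
# per free-fall time, integrated over 10 time units:
m_gas, m_stars = evolve_cloud(m_gas=1e4, accretion_rate=1e2, eff=0.02,
                              t_ff=1.0, dt=0.01, steps=1000)
# Mass is conserved: gas + stars equals the initial gas plus total inflow.
assert abs((m_gas + m_stars) - (1e4 + 1e2 * 10.0)) < 1e-6
```

    Integrating a handful of coupled ordinary differential equations like this costs microseconds, which is the computational advantage the abstract contrasts with full hydrodynamical simulations.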

  12. Optimizing the resource usage in Cloud based environments: the Synergy approach

    NASA Astrophysics Data System (ADS)

    Zangrando, L.; Llorens, V.; Sgaravatto, M.; Verlato, M.

    2017-10-01

    Managing resource allocation in a cloud-based data centre serving multiple virtual organizations is a challenging issue. In fact, while batch systems are able to allocate resources to different user groups according to specific shares imposed by the data centre administrator, without a static partitioning of such resources, this is not so straightforward in the most common cloud frameworks, e.g. OpenStack. In the current OpenStack implementation, it is only possible to grant fixed quotas to the different user groups, and these quotas cannot be exceeded by one group even if there are unused resources allocated to other groups. Moreover, in the existing OpenStack implementation, when there are no resources available, new requests are simply rejected: it is then up to the client to re-issue the request later. The recently started EU-funded INDIGO-DataCloud project is addressing this issue through “Synergy”, a new advanced scheduling service targeted at OpenStack. Synergy adopts a fair-share model for resource provisioning which guarantees that resources are distributed among users following the fair-share policies defined by the administrator, also taking into account the past usage of such resources. We present the architecture of Synergy, the status of its implementation, some preliminary results, and the foreseen evolution of the service.
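
    The fair-share idea can be sketched with a formula common in batch schedulers (offered as a hedged illustration of the model the abstract describes, not Synergy's actual code; the function, group names, and numbers are hypothetical): a queued request's priority grows with its group's target share and shrinks with the group's recent historical usage, so underserved groups are served first.

```python
# Hypothetical fair-share prioritization sketch (not Synergy's implementation).

def fair_share_priority(target_share, recent_usage, total_usage, decay=2.0):
    """Classic fair-share form: priority = decay^(-usage_fraction / share).

    A group that has used exactly its share gets priority 1/decay; a group
    that has used nothing gets 1.0, the maximum.
    """
    if total_usage == 0:
        return 1.0
    usage_fraction = recent_usage / total_usage
    return decay ** (-usage_fraction / target_share)

# Two groups with equal 50% shares; "cms" has consumed 80% of recent
# core-hours, so a queued "spes" request now outranks a queued "cms" one:
p_cms = fair_share_priority(0.5, 80.0, 100.0)
p_spes = fair_share_priority(0.5, 20.0, 100.0)
assert p_spes > p_cms
```

    Because priorities are recomputed as usage accrues, shares are enforced over time without statically partitioning the resources, which is exactly the behavior fixed OpenStack quotas cannot provide.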

  13. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and the characteristics of an optical system. Genetic algorithms were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction; the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
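
    The polynomial-coefficient estimation described above can be illustrated with a toy genetic algorithm; this is a generic sketch under assumptions (invented target coefficients, a noiseless fitness defined directly on phase samples, truncation selection with annealed Gaussian mutation), not the method or parameters used in the paper.

```python
# Illustrative toy GA: recover the coefficients of a 1-D polynomial phase
# from noiseless samples of the phase (not the SIBOA code).
import random

random.seed(0)
TRUE_COEFFS = [0.5, -1.0, 2.0]                 # phase(x) = 0.5 - x + 2x^2
xs = [i / 10 for i in range(11)]
target = [sum(c * x**k for k, c in enumerate(TRUE_COEFFS)) for x in xs]

def fitness(coeffs):
    """Negative squared error between candidate and measured phase samples."""
    return -sum((sum(c * x**k for k, c in enumerate(coeffs)) - t) ** 2
                for x, t in zip(xs, target))

def mutate(coeffs, scale):
    return [c + random.gauss(0.0, scale) for c in coeffs]

# (10 + 30) evolution: keep the 10 fittest, refill with mutants drawn from
# them, and anneal the mutation scale so the search settles near the optimum.
pop = [[random.uniform(-3, 3) for _ in range(3)] for _ in range(40)]
for gen in range(300):
    pop.sort(key=fitness, reverse=True)
    scale = 0.2 * 0.99 ** gen
    pop = pop[:10] + [mutate(random.choice(pop[:10]), scale) for _ in range(30)]

best = max(pop, key=fitness)
assert fitness(best) > -0.5   # squared error far below a random start
```

    Elitism (carrying the 10 fittest forward unchanged) guarantees the best fitness never worsens, which is why a GA is attractive when the error surface of a misaligned optical system is poorly behaved.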

  14. Design and Prototyping of a Satellite Antenna Slew Testbed

    DTIC Science & Technology

    2013-12-01

    …polycarbonate plastic; PC: personal computer; PD: proportional derivative; PDO: process data objects; PVC: polyvinyl chloride; PVT: position… part. The CAD model of the current design iteration can be exported to a 3D printer to produce a plastic prototype of the testbed assembly. The 3D… extruded shaft) and connected by set screws, as shown in Figure 8. The set screws translate the force from motor to gears to shaft, thus creating an

  15. Evaluating the feasibility of global climate models to simulate cloud cover effect controlled by Marine Stratocumulus regime transitions

    NASA Astrophysics Data System (ADS)

    Goren, Tom; Muelmenstaedt, Johannes; Rosenfeld, Daniel; Quaas, Johannes

    2017-04-01

    Marine stratocumulus clouds (MSC) occur in two main cloud regimes, open and closed cells, that differ significantly in their cloud cover. Closed cells gradually get cleansed of high CCN concentrations in a process that involves initiation of drizzle that breaks the full cloud cover into open cells. The drizzle creates downdrafts that organize the convection along converging gust fronts, which in turn produce stronger updrafts that can sustain more cloud water, compensating for the depletion of cloud water by the rain. In addition, stronger updrafts allow the clouds to grow relatively deep before rain starts to deplete their cloud water. Therefore, lower droplet concentrations and stronger rain would lead to lower cloud fraction, but not necessarily also to lower liquid water path (LWP). The fundamental relationships between these key variables derived from global climate model (GCM) simulations are analyzed with respect to observations in order to determine whether the GCM parameterizations can represent well the governing physical mechanisms of MSC regime transitions. The results are used to evaluate the feasibility of GCMs for estimating aerosol cloud-mediated radiative forcing upon MSC regime transitions, which is responsible for the largest aerosol cloud-mediated radiative forcing.

  16. Metabolizing Data in the Cloud.

    PubMed

    Warth, Benedikt; Levin, Nadine; Rinehart, Duane; Teijaro, John; Benton, H Paul; Siuzdak, Gary

    2017-06-01

    Cloud-based bioinformatic platforms address the fundamental demands of creating a flexible scientific environment, facilitating data processing and general accessibility independent of a country's affluence. These platforms have a multitude of advantages, as demonstrated by omics technologies, helping to support both government and scientific mandates for a more open environment. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Hybrid Cloud Computing Environment for EarthCube and Geoscience Community

    NASA Astrophysics Data System (ADS)

    Yang, C. P.; Qin, H.

    2016-12-01

    The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronizing and bursting between private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base or customized virtual machine images, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance, based on communications between ECITE and the participant projects; the scientists or IT technicians in those projects then launch one or more virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents, or data without worrying about the heterogeneity in structure and operations among different cloud platforms.

  18. Cloud-Scale Numerical Modeling of the Arctic Boundary Layer

    NASA Technical Reports Server (NTRS)

    Krueger, Steven K.

    1998-01-01

    The interactions between sea ice, open ocean, atmospheric radiation, and clouds over the Arctic Ocean exert a strong influence on global climate. Uncertainties in the formulation of interactive air-sea-ice processes in global climate models (GCMs) result in large differences between the Arctic, and global, climates simulated by different models. Arctic stratus clouds are not well-simulated by GCMs, yet exert a strong influence on the surface energy budget of the Arctic. Leads (channels of open water in sea ice) have significant impacts on the large-scale budgets during the Arctic winter, when they contribute about 50 percent of the surface fluxes over the Arctic Ocean, but cover only 1 to 2 percent of its area. Convective plumes generated by wide leads may penetrate the surface inversion and produce condensate that spreads up to 250 km downwind of the lead, and may significantly affect the longwave radiative fluxes at the surface and thereby the sea ice thickness. The effects of leads and boundary layer clouds must be accurately represented in climate models to allow possible feedbacks between them and the sea ice thickness. The FIRE III Arctic boundary layer clouds field program, in conjunction with the SHEBA ice camp and the ARM North Slope of Alaska and Adjacent Arctic Ocean site, will offer an unprecedented opportunity to greatly improve our ability to parameterize the important effects of leads and boundary layer clouds in GCMs.

  19. A Cloud Microphysics Model for the Gas Giant Planets

    NASA Astrophysics Data System (ADS)

    Palotai, Csaba J.; Le Beau, Raymond P.; Shankar, Ramanakumar; Flom, Abigail; Lashley, Jacob; McCabe, Tyler

    2016-10-01

    Recent studies have significantly increased the quality and the number of observed meteorological features on the jovian planets, revealing banded cloud structures and discrete features. Our current understanding of the formation and decay of those clouds also shapes the conceptual models of the underlying atmospheric dynamics. The full interpretation of the new observational data set and the related theories requires modeling these features in a general circulation model (GCM). Here, we present details of our bulk cloud microphysics model that was designed to simulate clouds in the Explicit Planetary Hybrid-Isentropic Coordinate (EPIC) GCM for the jovian planets. The cloud module includes hydrological cycles for each condensable species, consisting of interactive vapor, cloud and precipitation phases, and it also accounts for latent heating and cooling throughout the transfer processes (Palotai and Dowling, 2008. Icarus, 194, 303-326). Previously, the self-organizing clouds in our simulations successfully reproduced the vertical and horizontal ammonia cloud structure in the vicinity of Jupiter's Great Red Spot and Oval BA (Palotai et al. 2014, Icarus, 232, 141-156). In our recent work, we extended this model to include water clouds on Jupiter and Saturn, ammonia clouds on Saturn, and methane clouds on Uranus and Neptune. Details of our cloud parameterization scheme, our initial results and their comparison with observations will be shown. The latest version of the EPIC model is available as open source software from NASA's PDS Atmospheres Node.

  20. Development of optical packet and circuit integrated ring network testbed.

    PubMed

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate < 1×10⁻⁴) operation was achieved with optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. © 2011 Optical Society of America

  1. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Steincamp, James; Taylor, Jaime

    2003-01-01

    A reduced surrogate, one point crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.
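
    A toy version of such a genetic algorithm can be sketched as follows. This is an illustrative reconstruction, not the testbed code: the quadratic "misalignment" cost stands in for the image-based sharpness metric, the reduced-surrogate restriction on crossover points is omitted for brevity, and all parameter values are arbitrary.

```python
import random

def rank_select(pop, costs):
    """Random rank-based selection: lower cost -> higher selection weight."""
    order = sorted(range(len(pop)), key=lambda i: costs[i])
    weights = [len(pop) - r for r in range(len(pop))]  # linear rank weights
    return pop[random.choices(order, weights=weights, k=1)[0]]

def one_point_crossover(a, b):
    """Classic one-point crossover on two equal-length genomes."""
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]

def evolve(cost, genome_len, pop_size=30, gens=60, sigma=0.3):
    """Minimise `cost` over a population of phase vectors."""
    pop = [[random.uniform(-3.14, 3.14) for _ in range(genome_len)]
           for _ in range(pop_size)]
    for _ in range(gens):
        costs = [cost(g) for g in pop]
        nxt = [min(pop, key=cost)]  # elitism: keep the best phase estimate
        while len(nxt) < pop_size:
            child = one_point_crossover(rank_select(pop, costs),
                                        rank_select(pop, costs))
            child = [p + random.gauss(0, sigma) if random.random() < 0.1 else p
                     for p in child]  # per-gene mutation
            nxt.append(child)
        pop = nxt
    return min(pop, key=cost)

# Toy "phase retrieval": recover 7 segment phases that minimise a quadratic
# misalignment cost (a stand-in for the testbed's image-based metric).
target = [0.5, -1.0, 0.2, 0.0, 1.3, -0.7, 0.9]
cost = lambda g: sum((p - t) ** 2 for p, t in zip(g, target))
```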

  2. Low Cost, Scalable Proteomics Data Analysis Using Amazon's Cloud Computing Services and Open Source Search Algorithms

    PubMed Central

    Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.

    2009-01-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578

  3. Low cost, scalable proteomics data analysis using Amazon's cloud computing services and open source search algorithms.

    PubMed

    Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N

    2009-06-01

    One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases on the Medical College of Wisconsin Proteomics Center Web site ( http://proteomics.mcw.edu/vipdac ).

  4. Astronomy In The Cloud: Using Mapreduce For Image Coaddition

    NASA Astrophysics Data System (ADS)

    Wiley, Keith; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-01-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving object tracking. Since such studies require the highest quality data, methods such as image coaddition, i.e., registration, stacking, and mosaicing, will be critical to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, e.g., asteroids, or transient objects, e.g., supernovae, these datastreams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance. This work is funded by
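
    The coaddition step maps naturally onto MapReduce. The sketch below mimics the pattern in plain Python rather than on Hadoop: registration is assumed to have already assigned each input pixel a sky coordinate, and the shuffle between map and reduce is implicit in the driver.

```python
from collections import defaultdict

def map_image(image):
    """Map step: one registered image emits (sky coordinate, weighted flux) pairs."""
    for coord, (flux, weight) in image.items():
        yield coord, (flux * weight, weight)

def reduce_coadd(pairs):
    """Reduce step: accumulate per-coordinate sums and form the weighted mean."""
    sums = defaultdict(lambda: [0.0, 0.0])
    for coord, (wf, w) in pairs:
        sums[coord][0] += wf
        sums[coord][1] += w
    return {c: s[0] / s[1] for c, s in sums.items()}

def coadd(images):
    """Driver standing in for the Hadoop job; the shuffle is implicit here."""
    pairs = (pair for img in images for pair in map_image(img))
    return reduce_coadd(pairs)
```

    Because each sky coordinate is reduced independently, the real job can spread the reduce work across however many nodes hold the partitioned data.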

  5. Design, Development, and Testing of a UAV Hardware-in-the-Loop Testbed for Aviation and Airspace Prognostics Research

    NASA Technical Reports Server (NTRS)

    Kulkarni, Chetan; Teubert, Chris; Gorospe, George; Burgett, Drew; Quach, Cuong C.; Hogge, Edward

    2016-01-01

    The airspace is becoming more and more complicated, and will continue to do so in the future with the integration of Unmanned Aerial Vehicles (UAVs), autonomy, spacecraft, and other forms of aviation technology into the airspace. The new technology and complexity increase the importance and difficulty of safety assurance. Additionally, testing new technologies on complex aviation systems & systems of systems can be very difficult, expensive, and sometimes unsafe in real-life scenarios. Prognostic methodology provides an estimate of the health and risks of a component, vehicle, or airspace and knowledge of how that will change over time. That measure is especially useful in safety determination, mission planning, and maintenance scheduling. The developed testbed will be used to validate prediction algorithms for the real-time safety monitoring of the National Airspace System (NAS) and the prediction of unsafe events. The framework injects flight-related anomalies related to ground systems, routing, airport congestion, etc. to test and verify algorithms for NAS safety. In our research work, we develop a live, distributed, hardware-in-the-loop testbed for aviation and airspace prognostics along with exploring further research possibilities to verify and validate future algorithms for NAS safety. The testbed integrates virtual aircraft using the X-Plane simulator and X-PlaneConnect toolbox, UAVs using onboard sensors and cellular communications, and hardware-in-the-loop components. In addition, the testbed includes an additional research framework to support and simplify future research activities. It enables safe, accurate, and inexpensive experimentation and research into airspace and vehicle prognosis that would not have been possible otherwise. This paper describes the design, development, and testing of this system. Software reliability, safety and latency are some of the critical design considerations in development of the testbed. Integration of HITL elements in

  6. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    Dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO Reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high quality field measurements (e.g. ARM data).

  7. Cloud-based Jupyter Notebooks for Water Data Analysis

    NASA Astrophysics Data System (ADS)

    Castronova, A. M.; Brazil, L.; Seul, M.

    2017-12-01

    The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data-intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data-intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products.
This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative
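
    A typical munging step in such a notebook toolchain might look like the following. This is a hedged stand-in using only the standard library; the row format is hypothetical and not tied to any particular agency's API.

```python
from collections import defaultdict
from statistics import mean

def daily_means(rows):
    """Aggregate (ISO-8601 timestamp, value) readings into daily mean values.

    A self-documenting, reproducible step: the same rows always give the same
    output, so a shared notebook can be re-run to verify published results.
    """
    buckets = defaultdict(list)
    for ts, value in rows:
        buckets[ts[:10]].append(float(value))  # group by the YYYY-MM-DD prefix
    return {day: mean(vals) for day, vals in sorted(buckets.items())}
```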

  8. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Overview

    NASA Astrophysics Data System (ADS)

    Cui, C.; Yu, C.; Xiao, J.; He, B.; Li, C.; Fan, D.; Wang, C.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Cao, Z.; Wang, J.; Yin, S.; Fan, Y.; Wang, J.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Tasks such as proposal submission, proposal peer review, data archiving, data quality control, data release and open access, and cloud-based data processing and analysis will all be supported on the platform. It will act as a full-lifecycle management system for astronomical data and telescopes. Achievements from international Virtual Observatories and cloud computing are heavily adopted. In this paper, the background of the project, key features of the system, and the latest progress are introduced.

  9. Analysis, Modeling, and Simulation (AMS) Testbed Development and Evaluation to Support Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) Programs : Evaluation Report for the San Diego Testbed : Draft Report.

    DOT National Transportation Integrated Search

    2017-07-01

    The primary objective of this project is to develop multiple simulation testbeds and transportation models to evaluate the impacts of Connected Vehicle Dynamic Mobility Applications (DMA) and Active Transportation and Demand Management (ATDM) strateg...

  10. The CSM testbed software system: A development environment for structural analysis methods on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Gillian, Ronnie E.; Lotts, Christine G.

    1988-01-01

    The Computational Structural Mechanics (CSM) Activity at Langley Research Center is developing methods for structural analysis on modern computers. To facilitate that research effort, an applications development environment has been constructed to insulate the researcher from the many computer operating systems of a widely distributed computer network. The CSM Testbed development system was ported to the Numerical Aerodynamic Simulator (NAS) Cray-2, at the Ames Research Center, to provide a high end computational capability. This paper describes the implementation experiences, the resulting capability, and the future directions for the Testbed on supercomputers.

  11. Cloud radiative properties and aerosol - cloud interaction

    NASA Astrophysics Data System (ADS)

    Viviana Vladutescu, Daniela; Gross, Barry; Li, Clement; Han, Zaw

    2015-04-01

    The presented research discusses different techniques for improving the measurement and analysis of cloud properties. The need for these measurements and analyses arises from the high errors observed in existing methods currently used to retrieve cloud properties and, implicitly, cloud radiative forcing. The properties investigated are cloud fraction (cf) and cloud optical thickness (COT), measured with a suite of collocated remote sensing instruments. The novel approach makes use of a ground-based "poor man's camera" to detect cloud and sky radiation in red, green, and blue with a high spatial resolution of 30 mm at 1 km. The surface-based high-resolution photography provides a new and interesting view of clouds. The cloud fraction cannot be uniquely defined or measured; it depends on the threshold and the resolution. As resolution decreases, cloud fraction tends to increase if the threshold is below the mean radiance, and vice versa. Additionally, the cloud fractal dimension also depends on the threshold. These findings therefore raise concerns over the ability to characterize clouds by cloud fraction or fractal dimension. Our analysis indicates that principal component analysis may lead to a robust means of quantifying the cloud contribution to radiance. The cloud images are analyzed in conjunction with a collocated CIMEL sky radiometer, microwave radiometer, and lidar to determine homogeneity and heterogeneity. Additionally, MFRSR measurements are used to determine the cloud radiative properties as a validation tool for the results obtained from the other instruments and methods. The cloud properties to be studied further are aerosol-cloud interaction, cloud particle radii, and vertical homogeneity.
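
    The resolution and threshold dependence noted above is easy to demonstrate numerically. The sketch below (plain Python, synthetic data) shows the cloud fraction rising as a scene is degraded to coarser resolution while the threshold sits below the mean radiance.

```python
def cloud_fraction(img, threshold):
    """Fraction of pixels whose radiance exceeds the threshold."""
    flat = [v for row in img for v in row]
    return sum(v > threshold for v in flat) / len(flat)

def downsample(img, f):
    """Average f x f blocks, mimicking a coarser-resolution sensor."""
    h, w = len(img), len(img[0])
    return [[sum(img[r + dr][c + dc] for dr in range(f) for dc in range(f)) / f ** 2
             for c in range(0, w, f)]
            for r in range(0, h, f)]

# Synthetic 4 x 4 scene: three bright "cloud" pixels, scene mean 3/16 = 0.1875.
scene = [[1, 0, 0, 0],
         [0, 0, 0, 1],
         [0, 1, 0, 0],
         [0, 0, 0, 0]]
# With a threshold of 0.15 (below the mean), block averaging smears the bright
# pixels over their 2 x 2 blocks, so cf grows from 3/16 to 3/4.
```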

  12. Cloud Infrastructures for In Silico Drug Discovery: Economic and Practical Aspects

    PubMed Central

    Clematis, Andrea; Quarati, Alfonso; Cesini, Daniele; Milanesi, Luciano; Merelli, Ivan

    2013-01-01

    Cloud computing opens new perspectives for small-medium biotechnology laboratories that need to perform bioinformatics analysis in a flexible and effective way. This seems particularly true for hybrid clouds that couple the scalability offered by general-purpose public clouds with the greater control and ad hoc customizations supplied by the private ones. A hybrid cloud broker, acting as an intermediary between users and public providers, can support customers in the selection of the most suitable offers, optionally adding the provisioning of dedicated services with higher levels of quality. This paper analyses some economic and practical aspects of exploiting cloud computing in a real research scenario for the in silico drug discovery in terms of requirements, costs, and computational load based on the number of expected users. In particular, our work is aimed at supporting both the researchers and the cloud broker delivering an IaaS cloud infrastructure for biotechnology laboratories exposing different levels of nonfunctional requirements. PMID:24106693

  13. Design of a solar array simulator for the NASA EOS testbed

    NASA Technical Reports Server (NTRS)

    Butler, Steve J.; Sable, Dan M.; Lee, Fred C.; Cho, Bo H.

    1992-01-01

    The present spacecraft solar array simulator addresses both dc and ac characteristics as well as changes in illumination and temperature and performance degradation over the course of array service life. The computerized control system used allows simulation of a complete orbit cycle, in addition to automated diagnostics. The simulator is currently interfaced with the NASA EOS testbed.

  14. Development of performance specifications for collision avoidance systems for lane change, merging, and backing. Task 6, Interim report : testbed systems design and associated facilities

    DOT National Transportation Integrated Search

    1997-05-01

    This report represents the documentation of the design of the testbed. The purposes of the testbed are twofold: 1) establish a foundation for estimating collision avoidance effectiveness and 2) provide information pertinent to setting performance spec...

  15. Long Term Performance Metrics of the GD SDR on the SCaN Testbed: The First Year on the ISS

    NASA Technical Reports Server (NTRS)

    Nappier, Jennifer; Wilson, Molly C.

    2014-01-01

    The General Dynamics (GD) S-Band software defined radio (SDR) in the Space Communications and Navigation (SCaN) Testbed on the International Space Station (ISS) provides experimenters an opportunity to develop and demonstrate experimental waveforms in space. The SCaN Testbed was installed on the ISS in August of 2012. After installation, the initial checkout and commissioning phases were completed and experimental operations commenced. One goal of the SCaN Testbed is to collect long term performance metrics for SDRs operating in space in order to demonstrate long term reliability. These metrics include the time the SDR is powered on, the time the power amplifier (PA) is powered on, temperature trends, error detection and correction (EDAC) behavior, and waveform operational usage time. This paper describes the performance of the GD SDR over the first year of operations on the ISS.

  16. Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing

    NASA Technical Reports Server (NTRS)

    Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce

    2011-01-01

    Cloud Computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and 2010 at GSFC. Nebula is an open source Cloud platform intended to: a) Make NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs. b) Provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets. c) Allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.

  17. SparkClouds: visualizing trends in tag clouds.

    PubMed

    Lee, Bongshin; Riche, Nathalie Henry; Karlson, Amy K; Carpendale, Sheelagh

    2010-01-01

    Tag clouds have proliferated across the web over the last decade. They provide a visual summary of a collection of texts by depicting tag frequency by font size. In use, tag clouds can evolve as the associated data source changes over time. Interesting discussions around tag clouds often include a series of tag clouds and consider how they evolve over time. However, since tag clouds do not explicitly represent trends or support comparisons, the cognitive demands placed on the person for perceiving trends in multiple tag clouds are high. In this paper, we introduce SparkClouds, which integrate sparklines into a tag cloud to convey trends between multiple tag clouds. We present results from a controlled study that compares SparkClouds with two traditional trend visualizations—multiple line graphs and stacked bar charts—as well as Parallel Tag Clouds. Results show that SparkClouds' ability to show trends compares favourably to the alternative visualizations.
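
    The two encodings SparkClouds combines, font size for frequency and a sparkline for the trend, can be sketched in a few lines of Python. The sizing range and block characters below are arbitrary choices, not those of the paper.

```python
def font_sizes(counts, min_pt=10, max_pt=28):
    """Linearly map tag frequencies onto a font-size range (points)."""
    lo, hi = min(counts.values()), max(counts.values())
    span = (hi - lo) or 1
    return {tag: round(min_pt + (c - lo) / span * (max_pt - min_pt))
            for tag, c in counts.items()}

def sparkline(series, bars="▁▂▃▄▅▆▇█"):
    """Render a tag's history as a unicode sparkline to place beneath the tag."""
    lo, hi = min(series), max(series)
    span = (hi - lo) or 1
    return "".join(bars[int((v - lo) / span * (len(bars) - 1))] for v in series)

# "cloud" is the most frequent tag, so it gets the largest font; its sparkline
# shows the trend across three successive tag clouds.
sizes = font_sizes({"cloud": 30, "grid": 10, "testbed": 20})
trend = sparkline([10, 22, 30])
```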

  18. Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.

    This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrates that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.
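
    The first Raincoat iteration's OpenVPN channel can be illustrated with the classic static-key point-to-point configuration below; the hostname and tunnel addresses are placeholders, and the study's actual Raincoat/Vyatta/IPSec setup is not reproduced here.

```
# Minimal OpenVPN 2.x point-to-point tunnel (static pre-shared key), one side.
dev tun
proto udp
remote peer.example.org 1194   # placeholder peer cloud gateway
ifconfig 10.8.0.1 10.8.0.2     # local / remote tunnel endpoints (placeholders)
secret static.key              # e.g. generated with: openvpn --genkey --secret static.key
keepalive 10 60
persist-key
persist-tun
```

    The other side mirrors this file with the `ifconfig` addresses swapped and `remote` pointing back at the first gateway.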

  19. Observations and simulations of three-dimensional radiative interactions between Arctic boundary layer clouds and ice floes

    NASA Astrophysics Data System (ADS)

    Schäfer, M.; Bierwirth, E.; Ehrlich, A.; Jäkel, E.; Wendisch, M.

    2015-01-01

    Based on airborne spectral imaging observations, three-dimensional (3-D) radiative effects between Arctic boundary layer clouds and ice floes have been identified and quantified. A method is presented to discriminate between sea ice and open water beneath clouds from imaging radiance measurements. This separation also reveals that, in cloudy cases, the transition of radiance between open water and sea ice is not instantaneous but horizontally smoothed. In general, clouds reduce the nadir radiance above bright surfaces in the vicinity of sea ice - open water boundaries, while the nadir radiance above dark surfaces is enhanced compared to situations with clouds located above horizontally homogeneous surfaces. With the help of the observations and 3-D radiative transfer simulations, this effect was quantified to extend between 0 and 2200 m from the sea ice edge. This affected distance Δ L was found to depend on both cloud and sea ice properties. For a low cloud at 0-200 m altitude, increasing the cloud optical thickness from τ = 1 to τ = 10 decreases Δ L from 600 to 250 m, while increasing cloud base altitude or cloud geometrical thickness can increase Δ L; Δ L(τ = 1/10) = 2200 m/1250 m for 500-1000 m cloud altitude. To quantify the effect for different shapes and sizes of the ice floes, various albedo fields (infinite straight ice edge, circles, squares, realistic ice floe field) were modelled. Simulations show that Δ L increases with the radius of the ice floe and, for sizes larger than 6 km (500-1000 m cloud altitude), asymptotically reaches its maximum value, which corresponds to an infinite straight ice edge. Furthermore, the impact of these 3-D radiative effects on the retrieval of cloud optical properties was investigated. The enhanced brightness of a dark pixel next to an ice edge results in uncertainties of up to 90 and 30% in retrievals of cloud optical thickness and effective radius reff, respectively. With the help of the Δ L quantified here, an

  20. Enhancing data utilization through adoption of cloud-based data architectures (Invited Paper 211869)

    NASA Astrophysics Data System (ADS)

    Kearns, E. J.

    2017-12-01

    A traditional approach to data distribution and utilization of open government data involves continuously moving those data from a central government location to each potential user, who would then utilize them on their local computer systems. An alternate approach would be to bring those users to the open government data, where users would also have access to computing and analytics capabilities that would support data utilization. NOAA's Big Data Project is exploring such an alternate approach through an experimental collaboration with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium. As part of this ongoing experiment, NOAA is providing open data of interest which are freely hosted by the Big Data Project Collaborators, who provide a variety of cloud-based services and capabilities to enable utilization by data users. By the terms of the agreement, the Collaborators may charge for those value-added services and processing capacities to recover their costs to freely host the data and to generate profits if so desired. Initial results have shown sustained increases in data utilization of 2 to over 100 times the previously observed access rates under the traditional approach. Significantly increased utilization speed as compared to the traditional approach has also been observed by NOAA data users who have volunteered their experiences on these cloud-based systems. The potential for implementing and sustaining the alternate cloud-based approach as part of a change in operational data utilization strategies will be discussed.
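
    In practice, the alternate approach usually means anonymous object-store reads next to the compute. The sketch below assumes a boto3 environment and an example key layout (product/year/day-of-year/hour), which varies by dataset; the bucket and product names are illustrative, not a documented NOAA layout.

```python
def dataset_prefix(product, when):
    """Build an object-key prefix for one hour of data (assumed key layout)."""
    return f"{product}/{when:%Y}/{when:%j}/{when:%H}/"

def list_hour(bucket, product, when):
    """List that hour's objects anonymously (public open-data buckets)."""
    import boto3  # deferred so dataset_prefix is testable without boto3/AWS
    from botocore import UNSIGNED
    from botocore.config import Config
    s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
    return s3.list_objects_v2(Bucket=bucket, Prefix=dataset_prefix(product, when))
```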

  1. Accelerating Innovation that Enhances Resource Recovery in the Wastewater Sector: Advancing a National Testbed Network.

    PubMed

    Mihelcic, James R; Ren, Zhiyong Jason; Cornejo, Pablo K; Fisher, Aaron; Simon, A J; Snyder, Seth W; Zhang, Qiong; Rosso, Diego; Huggins, Tyler M; Cooper, William; Moeller, Jeff; Rose, Bob; Schottel, Brandi L; Turgeon, Jason

    2017-07-18

    This Feature examines significant challenges and opportunities to spur innovation and accelerate adoption of reliable technologies that enhance integrated resource recovery in the wastewater sector through the creation of a national testbed network. The network is a virtual entity that connects appropriate physical testing facilities, and other components needed for a testbed network, with researchers, investors, technology providers, utilities, regulators, and other stakeholders to accelerate the adoption of innovative technologies and processes that are needed for the water resource recovery facility of the future. Here we summarize and extract key issues and developments, to provide a strategy for the wastewater sector to accelerate a path forward that leads to new sustainable water infrastructures.

  2. Key Technology Research on Open Architecture for The Sharing of Heterogeneous Geographic Analysis Models

    NASA Astrophysics Data System (ADS)

    Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.

    2013-10-01

    In recent years, the rapid development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture is apt to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies; it is thus a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access and utilize those models with cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies, namely a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services, are discussed in detail, and related experiments are conducted for further verification.
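
    One way to read the "unified cloud interface strategy" is as a common contract that every packaged model implements, so the sharing platform can index models and the computing platform can invoke them uniformly. The sketch below is a hypothetical rendering in Python, not the article's actual interface.

```python
from abc import ABC, abstractmethod

class GeoAnalysisModel(ABC):
    """Hypothetical unified interface for heterogeneous geographic models."""

    @abstractmethod
    def describe(self) -> dict:
        """Metadata consumed by the sharing platform's search facility."""

    @abstractmethod
    def invoke(self, inputs: dict) -> dict:
        """Run the model on the computing platform with named inputs."""

class SlopeModel(GeoAnalysisModel):
    """Trivial example model packaged behind the common contract."""

    def describe(self):
        return {"name": "slope", "inputs": ["rise", "run"]}

    def invoke(self, inputs):
        return {"slope": inputs["rise"] / inputs["run"]}
```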

  3. Propfan test assessment testbed aircraft flutter model test report

    NASA Technical Reports Server (NTRS)

    Jenness, C. M. J.

    1987-01-01

    The PropFan Test Assessment (PTA) program includes flight tests of a propfan power plant mounted on the left wing of a modified Gulfstream II testbed aircraft. A static balance boom is mounted on the right wing tip for lateral balance. Flutter analyses indicate that these installations reduce the wing flutter stabilizing speed and that torsional stiffening and the installation of a flutter stabilizing tip boom are required on the left wing for adequate flutter safety margins. Wind tunnel tests of a 1/9th-scale high-speed flutter model of the testbed aircraft were conducted. The test program included the design, fabrication, and testing of the flutter model and the correlation of the flutter test data with analysis results. Excellent correlations with the test data were achieved in posttest flutter analysis using actual model properties. It was concluded that the flutter analysis method used was capable of accurate flutter predictions for both the (symmetric) twin propfan configuration and the (unsymmetric) single propfan configuration. The flutter analysis also revealed that the differences between the tested model configurations and the current aircraft design caused the (scaled) model flutter speed to be significantly higher than that of the aircraft, at least for the single propfan configuration without a flutter boom. Verification of the aircraft final design should, therefore, be based on flutter predictions made with the test validated analysis methods.

  4. Development of a cloud-based Bioinformatics Training Platform.

    PubMed

    Revote, Jerico; Watson-Haigh, Nathan S; Quenette, Steve; Bethwaite, Blair; McGrath, Annette; Shang, Catherine A

    2017-05-01

    The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. © The Author 2016. Published by Oxford University Press.

  6. Investigation of tropical diurnal convection biases in a climate model using TWP-ICE observations and convection-permitting simulations

    NASA Astrophysics Data System (ADS)

    Lin, W.; Xie, S.; Jackson, R. C.; Endo, S.; Vogelmann, A. M.; Collis, S. M.; Golaz, J. C.

    2017-12-01

    Climate models are known to have difficulty in simulating tropical diurnal convection, which exhibits distinct characteristics over land and open ocean. While the causes are rooted in deficiencies in convective parameterization in general, the lack of representation of mesoscale dynamics, in terms of land-sea breezes, convective organization, and the propagation of convection-induced gravity waves, also plays a critical role. In this study, the problem is investigated at the process level with the U.S. Department of Energy Accelerated Climate Modeling for Energy (ACME) model in short-term hindcast mode using the Cloud Associated Parameterization Testbed (CAPT) framework. Convective-scale radar retrievals and observation-driven convection-permitting simulations for the Tropical Warm Pool-International Cloud Experiment (TWP-ICE) cases are used to guide the analysis of the underlying processes. The emphasis will be on linking deficiencies in the representation of detailed process elements to the model biases in diurnal convective properties and their contrast among inland, coastal and open ocean conditions.

  7. Menu-driven cloud computing and resource sharing for R and Bioconductor.

    PubMed

    Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael

    2011-08-15

    We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free, public service, CRdata users can launch their own private Amazon Elastic Compute Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at: github.com/seerdata/crdata. hbolouri@fhcrc.org.
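    User-controlled access rights on S3, as described above, are typically expressed as bucket policies. As a minimal sketch (the bucket name, key prefix, and collaborator ARN below are hypothetical illustrations, not CRdata's actual configuration), a policy granting one collaborator read-only access to a private scripts prefix can be built as plain JSON:

```python
import json

def read_only_policy(bucket, prefix, collaborator_arn):
    """Build an S3 bucket policy granting one principal read-only
    access to objects under a given key prefix."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowCollaboratorRead",
                "Effect": "Allow",
                "Principal": {"AWS": collaborator_arn},
                "Action": ["s3:GetObject"],
                "Resource": f"arn:aws:s3:::{bucket}/{prefix}/*",
            }
        ],
    }

policy = read_only_policy(
    "crdata-private-scripts",              # hypothetical bucket
    "scripts/alice",                       # hypothetical prefix
    "arn:aws:iam::123456789012:user/bob",  # hypothetical collaborator
)
policy_json = json.dumps(policy, indent=2)
```

    Such a document would then be attached to the bucket (e.g. via the S3 put-bucket-policy API); granting or revoking a collaborator amounts to regenerating and re-applying the policy.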

  8. Cloud-Top Entrainment in Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Mellado, Juan Pedro

    2017-01-01

    Cloud entrainment, the mixing between cloudy and clear air at the boundary of clouds, constitutes one paradigm for the relevance of small scales in the Earth system: By regulating cloud lifetimes, meter- and submeter-scale processes at cloud boundaries can influence planetary-scale properties. Understanding cloud entrainment is difficult given the complexity and diversity of the associated phenomena, which include turbulence entrainment within a stratified medium, convective instabilities driven by radiative and evaporative cooling, shear instabilities, and cloud microphysics. Obtaining accurate data at the required small scales is also challenging, for both simulations and measurements. During the past few decades, however, high-resolution simulations and measurements have greatly advanced our understanding of the main mechanisms controlling cloud entrainment. This article reviews some of these advances, focusing on stratocumulus clouds, and indicates remaining challenges.

  9. An Approach for Smart Antenna Testbed

    NASA Astrophysics Data System (ADS)

    Kawitkar, R. S.; Wakde, D. G.

    2003-07-01

    The use of wireless, mobile, personal communications services is expanding rapidly. Adaptive or "smart" antenna arrays can increase channel capacity through spatial division. Adaptive antennas can also track mobile users, improving both signal range and quality. For these reasons, smart antenna systems have attracted widespread interest in the telecommunications industry for applications to third-generation wireless systems. This paper aims to design and develop an advanced antenna testbed to serve as a common reference for testing adaptive antenna arrays and signal-combining algorithms, as well as complete systems. A flexible suite of offline processing software should be written in MATLAB to perform system calibration, testbed initialization, data acquisition control, data storage/transfer, offline signal processing and analysis, and graph plotting. The goal of this paper is to develop low-complexity smart antenna structures for 3G systems. The emphasis will be laid on ease of implementation in a multichannel/multi-user environment. A smart antenna testbed will be developed, and various state-of-the-art DSP structures and algorithms will be investigated. Facing the soaring demand for mobile communications, the use of smart antenna arrays in mobile communications systems to exploit spatial diversity to further improve spectral efficiency has recently received considerable attention. Basically, a smart antenna array comprises a number of antenna elements combined via a beamforming network (an amplitude and phase control network). Some of the benefits that can be achieved by using a smart antenna system (SAS) include lower mobile terminal power consumption, range extension, ISI reduction, higher data rate support, and ease of integration into the existing base station system. In terms of economic benefits, adaptive antenna systems employed at the base station, though they increase the per-base-station cost, can increase the coverage area of each cell site, thereby reducing
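    The amplitude-and-phase beamforming network mentioned above can be illustrated numerically. A minimal sketch, assuming a uniform linear array with half-wavelength element spacing (the element count and steering angle are illustrative, not taken from the paper):

```python
import numpy as np

def steering_vector(n_elements, d_over_lambda, theta_deg):
    """Phase response of a uniform linear array to a plane wave
    arriving from angle theta (degrees off broadside)."""
    n = np.arange(n_elements)
    theta = np.deg2rad(theta_deg)
    return np.exp(2j * np.pi * d_over_lambda * n * np.sin(theta))

def power_gain(weights, n_elements, d_over_lambda, theta_deg):
    """Power gain of the weighted array toward a given direction."""
    a = steering_vector(n_elements, d_over_lambda, theta_deg)
    return abs(np.vdot(weights, a)) ** 2

# Delay-and-sum weights steer the main beam toward 20 degrees:
# each element's phase is matched to the desired arrival direction.
N, SPACING = 8, 0.5
w = steering_vector(N, SPACING, 20.0) / N
```

    With these weights the array has unit power gain at the steered direction and strongly suppressed gain elsewhere, which is the mechanism by which spatial division separates co-channel users.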

  10. A comparison of shock-cloud and wind-cloud interactions: effect of increased cloud density contrast on cloud evolution

    NASA Astrophysics Data System (ADS)

    Goldsmith, K. J. A.; Pittard, J. M.

    2018-05-01

    The similarities, or otherwise, of a shock or wind interacting with a cloud of density contrast χ = 10 were explored in a previous paper. Here, we investigate such interactions with clouds of higher density contrast. We compare the adiabatic hydrodynamic interaction of a Mach 10 shock with a spherical cloud of χ = 10^3 with that of a cloud embedded in a wind with identical parameters to the post-shock flow. We find that initially there are only minor morphological differences between the shock-cloud and wind-cloud interactions, compared to when χ = 10. However, once the transmitted shock exits the cloud, the development of a turbulent wake and fragmentation of the cloud differs between the two simulations. On increasing the wind Mach number, we note the development of a thin, smooth tail of cloud material, which is then disrupted by the fragmentation of the cloud core and subsequent 'mass-loading' of the flow. We find that the normalized cloud mixing time (t_mix) is shorter at higher χ. However, a strong Mach number dependence on t_mix and the normalized cloud drag time, t'_drag, is not observed. Mach-number-dependent values of t_mix and t'_drag from comparable shock-cloud interactions converge towards the Mach-number-independent time-scales of the wind-cloud simulations. We find that high-χ clouds can be accelerated up to 80-90 per cent of the wind velocity and travel large distances before being significantly mixed. However, complete mixing is not achieved in our simulations and at late times the flow remains perturbed.

  11. A Testbed to Evaluate the FIWARE-Based IoT Platform in the Domain of Precision Agriculture.

    PubMed

    Martínez, Ramón; Pastor, Juan Ángel; Álvarez, Bárbara; Iborra, Andrés

    2016-11-23

    Wireless sensor networks (WSNs) represent one of the most promising technologies for precision farming. Over the next few years, a significant increase in the use of such systems on commercial farms is expected. WSNs present a number of problems regarding scalability, interoperability, communications, connectivity with databases and data processing. Various Internet of Things middleware platforms are appearing to overcome these challenges. This paper examines whether one of these platforms, FIWARE, is suitable for the development of agricultural applications. To the authors' knowledge, there are no works that show how to use FIWARE in precision agriculture and study its appropriateness, scalability and efficiency for this kind of application. To do this, a testbed has been designed and implemented to simulate different deployments and load conditions. The testbed is a typical FIWARE application, complete, yet simple and comprehensible enough to show the main features and components of FIWARE, as well as the complexity of using this technology. Although the testbed has been deployed in a laboratory environment, its design is based on the analysis of an Internet of Things use case scenario in the domain of precision agriculture.
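    FIWARE applications exchange sensor readings with the Orion Context Broker as NGSI context entities, each attribute carrying a type and a value. As a hedged sketch (the entity id and attribute names are hypothetical, not taken from the paper's testbed), an NGSIv2-style entity for a soil-moisture node can be assembled as follows:

```python
import json

def ngsi_entity(entity_id, entity_type, attributes):
    """Build an NGSIv2-style context entity: each attribute is an
    object with a 'type' and a 'value', as the Orion Context
    Broker expects."""
    entity = {"id": entity_id, "type": entity_type}
    for name, (attr_type, value) in attributes.items():
        entity[name] = {"type": attr_type, "value": value}
    return entity

# Hypothetical soil-moisture node on a precision-agriculture plot.
entity = ngsi_entity(
    "urn:ngsi-ld:SoilSensor:plot3-node7",
    "SoilSensor",
    {
        "soilMoisture": ("Number", 0.27),  # volumetric water content
        "temperature": ("Number", 18.4),   # degrees Celsius
    },
)
payload = json.dumps(entity)
```

    In a deployment, such a payload would be POSTed to the broker's /v2/entities endpoint, after which applications can query or subscribe to changes in the entity's attributes.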

  13. Cloud-Coffee: implementation of a parallel consistency-based multiple alignment algorithm in the T-Coffee package and its benchmarking on the Amazon Elastic-Cloud.

    PubMed

    Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernando; Espinosa, Toni; Notredame, Cedric

    2010-08-01

    We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html

  14. Cloud System Evolution in the Trades—CSET

    NASA Astrophysics Data System (ADS)

    Albrecht, B. A.; Zuidema, P.; Bretherton, C. S.; Wood, R.; Ghate, V. P.

    2015-12-01

    The set of measurements made over open areas of the North Pacific along 2-day trajectories during CSET is unprecedented and will enable focused modeling studies of cloud system evolution and the role of aerosol-cloud-precipitation interactions in that evolution.

  15. Development and experimentation of an eye/brain/task testbed

    NASA Technical Reports Server (NTRS)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense, paraplegics, and operators who must monitor monotonous displays (nuclear power plants, air defense, etc.).

  16. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  17. Arctic PBL Cloud Height and Motion Retrievals from MISR and MINX

    NASA Technical Reports Server (NTRS)

    Wu, Dong L.

    2012-01-01

    How Arctic clouds respond and feed back to sea ice loss is key to understanding the rapid climate change seen in the polar region. As more open water becomes available in the Arctic Ocean, cold air outbreaks (i.e., off-ice flow from polar lows) produce vast sheets of roll clouds in the planetary boundary layer (PBL). The cold air temperature and wind velocity are the critical parameters for determining and understanding the PBL structure formed under these roll clouds. It has been challenging for nadir visible/IR sensors to detect Arctic clouds due to the lack of contrast between clouds and snowy/icy surfaces. In addition, the PBL temperature inversion creates a further problem for IR sensors in relating cloud top temperature to cloud top height. Here we explore a new method with the Multi-angle Imaging SpectroRadiometer (MISR) instrument to measure cloud height and motion over the Arctic Ocean. Employing a stereoscopic technique, MISR is able to measure cloud top height accurately and to distinguish between clouds and snowy/icy surfaces using the measured height. We will use the MISR INteractive eXplorer (MINX) to quantify roll cloud dynamics during cold-air outbreak events and characterize PBL structures over water and over sea ice.
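    The stereoscopic retrieval works from the apparent along-track displacement of a cloud feature between MISR's view angles. A simplified, textbook version of the geometry can be sketched as follows (this ignores cloud motion, which the operational MISR retrieval solves for jointly with height; the numbers are illustrative):

```python
import math

def stereo_height(disparity_m, view_zenith1_deg, view_zenith2_deg):
    """Cloud-top height from along-track parallax between two views:
    disparity = h * (tan(theta1) - tan(theta2)).
    Simplified geometry assuming a stationary cloud."""
    t1 = math.tan(math.radians(view_zenith1_deg))
    t2 = math.tan(math.radians(view_zenith2_deg))
    return disparity_m / (t1 - t2)

# A 1.4 km parallax between a 45.6-degree camera and nadir
# corresponds to a cloud top of roughly 1.4 km (illustrative).
h = stereo_height(1400.0, 45.6, 0.0)
```

    Because the disparity is purely geometric, the retrieved height does not depend on cloud-top temperature, which is what lets the stereo method sidestep the PBL temperature-inversion problem that afflicts IR height assignment.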

  18. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize the VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9 and 10^10, respectively, at inner working angles approaching 2λ/D. The optics, lab results, technologies, and null control are discussed, and evidence that these milestones have been achieved is shown.

  19. Adventures in Private Cloud: Balancing Cost and Capability at the CloudSat Data Processing Center

    NASA Astrophysics Data System (ADS)

    Partain, P.; Finley, S.; Fluke, J.; Haynes, J. M.; Cronk, H. Q.; Miller, S. D.

    2016-12-01

    Since the beginning of the CloudSat Mission in 2006, the CloudSat Data Processing Center (DPC) at the Cooperative Institute for Research in the Atmosphere (CIRA) has been ingesting data from the satellite and other A-Train sensors, producing data products, and distributing them to researchers around the world. The computing infrastructure was specifically designed to fulfill the requirements as specified at the beginning of what nominally was a two-year mission. The environment consisted of servers dedicated to specific processing tasks in a rigid workflow to generate the required products. To the benefit of science, and with credit to the mission engineers, CloudSat has lasted well beyond its planned lifetime and is still collecting data ten years later. Over that period the requirements of the data processing system have greatly expanded and opportunities for providing value-added services have presented themselves. But while demands on the system have increased, the initial design allowed for very little expansion in terms of scalability and flexibility. The design did change to include virtual machine processing nodes and distributed workflows, but infrastructure management was still a time-consuming task when system modification was required to run new tests or implement new processes. To address the scalability, flexibility, and manageability of the system, cloud computing methods and technologies are now being employed. The use of a public cloud like Amazon Elastic Compute Cloud or Google Compute Engine was considered but, among other issues, data transfer and storage cost becomes a problem, especially when demand fluctuates as a result of reprocessing and the introduction of new products and services. Instead, the existing system was converted to an on-premises private cloud using the OpenStack computing platform and Ceph software-defined storage to reap the benefits of the cloud computing paradigm. This work details the decisions that were made, the benefits that

  20. Impacts of cloud immersion on microclimate, photosynthesis and water relations of Abies fraseri (Pursh.) Poiret in a temperate mountain cloud forest.

    PubMed

    Reinhardt, Keith; Smith, William K

    2008-11-01

    The red spruce-Fraser fir ecosystem [Picea rubens Sarg.-Abies fraseri (Pursh) Poir.] of the southern Appalachian mountains, USA, is a temperate zone cloud forest immersed in clouds for 30-40% of a typical summer day, and experiencing immersion on about 65% of all days annually. We compared the microclimate, photosynthetic gas exchange, and water relations of Fraser fir trees in open areas during cloud-immersed, low-cloud, or sunny periods. In contrast to sunny periods, cloud immersion reduced instantaneous sunlight irradiance by 10-50%, and midday atmospheric vapor pressure deficit (VPD) was 85% lower. Needle surfaces were wet for up to 16 h per day during cloud-immersed days compared to <1 h for clear days. Shoot-level light-saturated photosynthesis (A_sat) on both cloud-immersed (16.0 μmol m^-2 s^-1) and low-cloud (17.9 μmol m^-2 s^-1) days was greater than A_sat on sunny days (14.4 μmol m^-2 s^-1). Daily mean A was lowest on cloud-immersed days due to reduced sunlight levels, while leaf conductance (g) was significantly higher, with a mean value of 0.30 mol m^-2 s^-1. These g values were greater than commonly reported for conifer tree species with needle-like leaves, and declined exponentially with increasing leaf-to-air VPD. Daily mean transpiration (E) on immersed days was 43 and 20% lower compared to sunny and low-cloud days, respectively. As a result, daily mean water use efficiency (A/E) was lowest on cloud-immersed days due to light limitation of A, and high humidity resulted in greater uncoupling of A from g. Thus, substantial differences in photosynthetic CO2 uptake, and corresponding water relations, were strongly associated with cloud conditions that occur over substantial periods of the summer growth season.