Science.gov

Sample records for open cloud testbed

  1. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale, and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
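
    The paper's storage benchmark itself is not reproduced here, but the shape of such an I/O measurement is easy to sketch. The harness below times one write and one read against arbitrary storage callables and reports MB/s; the in-memory dict is a placeholder standing in for an S3-compatible endpoint such as Eucalyptus Walrus or Amazon S3 (a real run would wrap client put/get calls instead):

```python
import time

def measure_throughput(write_fn, read_fn, payload: bytes):
    """Time one write and one read of `payload`; return (write, read) MB/s."""
    mb = len(payload) / 1e6
    t0 = time.perf_counter()
    write_fn(payload)
    write_s = time.perf_counter() - t0
    t0 = time.perf_counter()
    data = read_fn()
    read_s = time.perf_counter() - t0
    assert data == payload, "read-back mismatch"
    return mb / write_s, mb / read_s

# In-memory stand-in for an S3-compatible store; a real benchmark would
# wrap put-object/get-object calls against Walrus or S3 here instead.
store = {}
w, r = measure_throughput(lambda b: store.update(obj=b),
                          lambda: store["obj"],
                          b"x" * 1_000_000)
```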

  2. Southern Great Plains cloud and radiation testbed site

    SciTech Connect

    1996-09-01

    This document presents information about the Cloud and Radiation Testbed Site and the Atmospheric Radiation Measurement program. Topics include: measuring methods, general circulation methods, milestones, instrumentation, meteorological observations, and computing facilities.

  3. Use of cloud and radiation testbed measurements to evaluate cloud cover and convective parameterizations

    SciTech Connect

    Walcek, C.J.; Hu, Q.

    1995-04-01

    We have used temperature and humidity soundings and radiation measurements from the Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) site in northern Oklahoma to evaluate an improved cloud cover algorithm. We have also used a new single-column model cumulus parameterization to estimate convective heating and moistening tendencies at the CART site. Our earlier analysis of cloud cover showed that relatively dry atmospheres contain small cloud amounts. We have found numerous periods during 1993 where maximum relative humidities within any layer of the atmosphere over the CART site are well below 60-80%, yet clouds are clearly reducing shortwave irradiance measured by a rotating shadowband radiometer. These ARM measurements support our earlier findings that most current climate models probably underestimate cloud coverage when relative humidities fall below the threshold humidities where clear skies are assumed. We have applied a "detraining-plume" model of cumulus convection to the June 1993 intensive observation period (16-25 June 1993). This model was previously verified with GARP Atlantic Tropical Experiment (GATE) measurements. During the June intensive observing period (IOP), relative humidities over the CART site are typically 20% less than tropical Atlantic GATE relative humidities. Our convective model calculates that evaporation of convectively induced cloud and rainwater plays a much more important role in the heating and moistening convective tendencies at the drier CART location. In particular, we predict that considerable cooling and moistening in the lower troposphere should occur due to the evaporation of convectively initiated precipitation.
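
    The threshold-humidity behavior the authors question can be illustrated with a generic Sundqvist-type diagnostic cloud fraction; the paper's actual algorithm is not specified here, and rh_crit = 0.7 is only an illustrative value:

```python
import math

def cloud_fraction(rh, rh_crit=0.7):
    """Sundqvist-type diagnostic cloud fraction: clear sky at or below a
    critical relative humidity, rising to overcast as RH approaches 1.
    rh and rh_crit are fractions (0..1)."""
    if rh <= rh_crit:
        return 0.0   # the clear-sky threshold behavior questioned above
    if rh >= 1.0:
        return 1.0
    return 1.0 - math.sqrt((1.0 - rh) / (1.0 - rh_crit))
```

    With rh_crit = 0.7 such a scheme reports zero cloud for sub-threshold humid layers, which is exactly the kind of underestimate the ARM measurements point to.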

  4. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    SciTech Connect

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W.

    1996-04-01

    Boundary layer clouds (stratus and fair-weather cumulus) are closely coupled to the surface: the coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period and consider a case where low-level stratus were observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  5. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests. In particular, these tests resemble as closely as possible the actual job workflows used by the experiments, exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
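
    PanDA's actual interfaces are not reproduced here; the toy sketch below only illustrates the reporting idea: collect synthetic-job outcomes per site and compute a success rate, as the central collection service described above might before generating its regular reports (all names are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class JobResult:
    site: str        # integration testbed site
    workflow: str    # e.g. "stage-in", "compute", "stage-out"
    ok: bool         # did the synthetic job succeed?

def reliability_report(results):
    """Success rate per site over a batch of synthetic jobs, the kind of
    summary a central collection service could roll into regular reports."""
    per_site = {}
    for r in results:
        ok, total = per_site.get(r.site, (0, 0))
        per_site[r.site] = (ok + int(r.ok), total + 1)
    return {site: ok / total for site, (ok, total) in per_site.items()}

results = [JobResult("siteA", "compute", True),
           JobResult("siteA", "stage-out", False),
           JobResult("siteB", "compute", True)]
report = reliability_report(results)  # {'siteA': 0.5, 'siteB': 1.0}
```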

  6. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, the CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  7. Site/Systems Operations, Maintenance and Facilities Management of the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site

    SciTech Connect

    Wu, Susan

    2005-08-01

    This contract covered the site/systems operations, maintenance, and facilities management of the DOE Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site.

  8. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-12-31

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  9. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical Data Handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing data management through qualified processors and interfaces such as Leon2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity aims at extending RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which act as masters communicating with the CCIPC boards. CANopen serves as the upper application layer for systems based on CAN, as defined within the CAN-in-Automation standard, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  10. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  11. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  12. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to make better use of computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory or I/O between virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.

  13. Satellite-derived surface characterization and surface fluxes across the Southern Great Plains Cloud and Radiation Testbed Site.

    SciTech Connect

    Gao, W.; Coulter, R. L.; Lesht, B. M.; Qiu, J.; Wesely, M. L.; Environmental Research

    1996-01-01

    Atmospheric processes in the lower boundary layer are strongly modulated by energy and mass fluxes from and to the underlying surface. The atmosphere-surface interactions usually occur at small temporal (seconds to minutes) and spatial (centimeters to meters) scales, which causes difficulties with including surface processes in atmospheric models, which can only handle much larger scales (kilometers). Developing schemes to characterize spatial variabilities in surface fluxes over heterogeneous surfaces for a regionally representative surface flux that can be correctly used in atmospheric models becomes an important issue. The Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) has an outline area of 350 x 450 km, across which land surface type, topography, vegetation, and soil conditions vary widely (Gao 1994). Surface flux measurements at a limited number of surface stations, including surface latent and sensible heat fluxes, net radiation, and soil heat flux by energy balance Bowen ratio (EBBR) stations, heat and momentum fluxes by eddy correlation stations, and upwelling radiation flux by surface radiation stations, are influenced by local surface conditions surrounding the stations and thus may not be able to provide fluxes representative of the entire CART site. Use of these data to represent the entire CART site in modeling studies and in comparing with large-scale satellite observations could lead to significant uncertainties. This study uses high-resolution (~1 km) remote sensing by National Oceanic and Atmospheric Administration (NOAA) polar-orbiting environmental satellites to characterize spatial and temporal variations in land surface conditions and then to develop methods for estimating spatial variations and CART-representative values of surface fluxes.
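
    The EBBR stations mentioned above partition available energy using the standard energy-balance Bowen-ratio relations, which can be written down directly (the numbers in the example are illustrative, not site data):

```python
def ebbr_fluxes(rn, g, bowen):
    """Partition available energy (Rn - G) into sensible (H) and latent
    (LE) heat fluxes with the energy-balance Bowen-ratio relations:
    Rn - G = H + LE and bowen = H / LE. Fluxes in W m^-2; the Bowen
    ratio is dimensionless."""
    available = rn - g
    le = available / (1.0 + bowen)
    h = bowen * le
    return h, le

# illustrative midday values: Rn = 450, G = 50, Bowen ratio = 0.5
h, le = ebbr_fluxes(rn=450.0, g=50.0, bowen=0.5)
```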

  14. Research on OpenStack of open source cloud computing in colleges and universities’ computer room

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Zhang, Dandan

    2017-06-01

    In recent years, cloud computing technology has developed rapidly, especially open source cloud computing, which has attracted a large user base through the advantages of open source and low cost and has now been promoted and applied on a large scale. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.

  15. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-02

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com.

  16. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies and ii) longer-term evaluation at permanent meteorological sites (the testbed approach).

  17. Open-cell and closed-cell clouds off Peru

    NASA Image and Video Library

    2017-09-27

    2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by the MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells warm air is rising in the center, and sinking around the edges, so clouds appear in cell centers, but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations, two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  18. Kontur: Observations of cloud streets and open cellular structures

    NASA Astrophysics Data System (ADS)

    Brümmer, B.; Bakan, S.; Hinzpeter, H.

    1985-08-01

    In September and October 1981 the experiment KonTur (Convection and Turbulence) was conducted over the North Sea. Its objectives were to investigate organized convective patterns, like cloud streets (boundary layer rolls) and cellular cloud structures. Two aircraft (British Hercules C-130 and German Falcon 20) performed detailed measurements within these patterns. Several cases of cloud streets and open cells were observed. Boundary layer rolls appear to be connected with an inflection point in the cross-roll wind component. The aspect ratio of the rolls (wavelength versus depth) is between three and four, in accordance with other observations and linear stability analysis. Four scales of motion are involved: the mean flow, the roll circulation, individual clouds and turbulence. The vertical transports are dominated at lower levels by turbulence and at higher levels by roll-scale motions. Open cellular cloud structures are connected with large air-sea temperature differences due to cold air outbreaks from the northwest. The aspect ratio of the cells is of the order of 10. The bulk contribution to the total transport of heat and momentum originates from the cloudy walls of the cells. A vertical cross section through a composite open cell is presented.

  19. Open-cell cloud formation over the Bahamas

    NASA Technical Reports Server (NTRS)

    2002-01-01

    What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  20. Analytical study of the effects of the Low-Level Jet on moisture convergence and vertical motion fields at the Southern Great Plains Cloud and Radiation Testbed site

    SciTech Connect

    Bian, X.; Zhong, S.; Whiteman, C.D.; Stage, S.A.

    1996-04-01

    The Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) is located in a region that is strongly affected by a prominent meteorological phenomenon--the Great Plains Low-Level Jet (LLJ). Observations have shown that the LLJ plays a vital role in spring and summertime cloud formation and precipitation over the Great Plains. An improved understanding of the LLJ characteristics and its impact on the environment is necessary for addressing the fundamental issue of development and testing of radiational transfer and cloud parameterization schemes for the general circulation models (GCMs) using data from the SGP CART site. A climatological analysis of the summertime LLJ over the SGP has been carried out using hourly observations from the National Oceanic and Atmospheric Administration (NOAA) Wind Profiler Demonstration Network and from the ARM June 1993 Intensive Observation Period (IOP). The hourly data provide an enhanced temporal and spatial resolution relative to earlier studies which used 6- and 12-hourly rawinsonde observations at fewer stations.

  1. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
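
    The core step the service parallelizes, locating open reading frames, can be sketched in a few lines. This is a minimal forward-strand ORF finder for illustration, not the service's actual implementation:

```python
def find_orfs(seq: str, min_len: int = 6):
    """Return (start, end, frame) tuples for forward-strand open reading
    frames: an ATG start codon through the first in-frame stop codon."""
    stops = {"TAA", "TAG", "TGA"}
    orfs = []
    for frame in range(3):
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i+3] == "ATG":
                j = i + 3
                while j + 3 <= len(seq) and seq[j:j+3] not in stops:
                    j += 3
                # require a terminating stop codon and a minimum length
                if j + 3 <= len(seq) and (j + 3 - i) >= min_len:
                    orfs.append((i, j + 3, frame))
                    i = j  # resume scanning after this ORF's stop codon
            i += 3
    return orfs

orfs = find_orfs("ATGAAATGA")  # one ORF covering the whole string, frame 0
```

    A production pipeline would of course also scan the reverse complement and feed the extracted ORFs to the alignment stage.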

  2. Cloud-Based Model Calibration Using OpenStudio: Preprint

    SciTech Connect

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
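
    The calibration loop described above, varying parameters to reduce the mismatch with monthly utility bills, reduces to an error metric plus a search. A deliberately tiny sketch follows; the model, metric and parameter are hypothetical stand-ins, not OpenStudio's API:

```python
def calibration_error(actual, simulated):
    """Sum of absolute monthly differences between utility bills and
    simulated consumption, a simple stand-in for a calibration metric."""
    return sum(abs(a - s) for a, s in zip(actual, simulated))

def calibrate(actual, simulate, candidates):
    """Return the candidate parameter whose simulated bills best match
    the actual bills."""
    return min(candidates,
               key=lambda p: calibration_error(actual, simulate(p)))

# Hypothetical toy model: monthly consumption scales linearly with a
# single parameter p (three billing periods).
actual = [100.0, 120.0, 90.0]
simulate = lambda p: [p * base for base in (50.0, 60.0, 45.0)]
best = calibrate(actual, simulate, [1.5, 2.0, 2.5])  # best == 2.0
```

    Real calibration varies many measures at once and runs the simulations in parallel, but the objective being minimized has this shape.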

  3. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
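
    The impulsive, rectilinear encounters described above give each star a velocity kick; in the point-mass limit (the large impact-parameter limit of the Plummer spheres the paper uses) the standard impulse-approximation result is |dv| = 2GM/(Vb). A sketch with illustrative numbers:

```python
import math

G = 4.301e-3  # gravitational constant in pc * (km/s)^2 / Msun

def impulse_kick(m_clump, v_rel, bx, by):
    """Velocity impulse (km/s) on a cluster star from a rectilinear
    passage of a point-mass clump: |dv| = 2 G M / (V b), directed along
    the star's impact vector (bx, by) in pc; V in km/s, M in Msun.
    A point mass is the large-b limit of a Plummer sphere."""
    b = math.hypot(bx, by)
    dv = 2.0 * G * m_clump / (v_rel * b)
    return dv * bx / b, dv * by / b

# a 1000 Msun clump passing 2 pc away at 10 km/s relative velocity
dvx, dvy = impulse_kick(m_clump=1.0e3, v_rel=10.0, bx=2.0, by=0.0)
```

    Summing such kicks over the stars of a random Plummer-sphere realization, and then over many encounters, gives the mass- and energy-loss rates used in the evolutionary calculations.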

  4. Trace explosives sensor testbed (TESTbed)

    NASA Astrophysics Data System (ADS)

    Collins, Greg E.; Malito, Michael P.; Tamanaha, Cy R.; Hammond, Mark H.; Giordano, Braden C.; Lubrano, Adam L.; Field, Christopher R.; Rogers, Duane A.; Jeffries, Russell A.; Colton, Richard J.; Rose-Pehrsson, Susan L.

    2017-03-01

    A novel vapor delivery testbed, referred to as the Trace Explosives Sensor Testbed, or TESTbed, is demonstrated that is amenable to both high- and low-volatility explosives vapors including nitromethane, nitroglycerine, ethylene glycol dinitrate, triacetone triperoxide, 2,4,6-trinitrotoluene, pentaerythritol tetranitrate, and hexahydro-1,3,5-trinitro-1,3,5-triazine. The TESTbed incorporates a six-port dual-line manifold system allowing for rapid actuation between a dedicated clean air source and a trace explosives vapor source. Explosives and explosives-related vapors can be sourced through a number of means including gas cylinders, permeation tube ovens, dynamic headspace chambers, and a Pneumatically Modulated Liquid Delivery System coupled to a perfluoroalkoxy total-consumption microflow nebulizer. Key features of the TESTbed include continuous and pulseless control of trace vapor concentrations with wide dynamic range of concentration generation, six sampling ports with reproducible vapor profile outputs, limited low-volatility explosives adsorption to the manifold surface, temperature and humidity control of the vapor stream, and a graphical user interface for system operation and testing protocol implementation.

  5. A comparison of radiometric fluxes influenced by parameterization cirrus clouds with observed fluxes at the Southern Great Plains (SGP) cloud and radiation testbed (CART) site

    SciTech Connect

    Mace, G.G.; Ackerman, T.P.; George, A.T.

    1996-04-01

    The data from the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site are a valuable resource. We have developed an operational data processing and analysis methodology that allows us to examine continuously the influence of clouds on the radiation field and to test new and existing cloud and radiation parameterizations.

  6. Identity federation in OpenStack - an introduction to hybrid clouds

    NASA Astrophysics Data System (ADS)

    Denis, Marek; Castro Leon, Jose; Ormancey, Emmanuel; Tedesco, Paolo

    2015-12-01

    We are evaluating cloud identity federation available in the OpenStack ecosystem that allows on-premises bursting into remote clouds with the use of local identities (i.e. domain accounts). Further enhancements to identity federation are a clear path to hybrid cloud architectures: virtualized infrastructures layered across independent private and public clouds.

  7. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time and so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of Cloud Computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic-RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed) since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  8. Creating a Rackspace and NASA Nebula compatible cloud using the OpenStack project (Invited)

    NASA Astrophysics Data System (ADS)

    Clark, R.

    2010-12-01

    NASA and Rackspace have both provided technology to the OpenStack project that allows anyone to create a private Infrastructure as a Service (IaaS) cloud using open source software and commodity hardware. OpenStack is designed and developed completely in the open and with an open governance process. NASA donated Nova, which powers the compute portion of the NASA Nebula Cloud Computing Platform, and Rackspace donated Swift, which powers Rackspace Cloud Files. The project is now in continuous development by NASA, Rackspace, and hundreds of other participants. When you create a private cloud using OpenStack, you will have the ability to easily interact with your private cloud, a government cloud, and an ecosystem of public cloud providers, using the same API.
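
    The record predates today's OpenStack Identity service, but as a hedged sketch of what "using the same API" against any OpenStack cloud looks like now, the modern Keystone v3 password-authentication request body (POSTed to `/v3/auth/tokens`) can be built as plain data; the user, project, and domain names here are placeholders:

```python
import json

def auth_request_body(user, password, project, domain="Default"):
    """Keystone v3 password-authentication request body, scoped to a project."""
    return {
        "auth": {
            "identity": {
                "methods": ["password"],
                "password": {
                    "user": {
                        "name": user,
                        "domain": {"name": domain},
                        "password": password,
                    }
                },
            },
            "scope": {
                "project": {"name": project, "domain": {"name": domain}}
            },
        }
    }

body = auth_request_body("demo", "s3cret", "demo-project")
print(json.dumps(body, indent=2)[:60])
```

    The same request shape works against a private deployment or a public OpenStack provider, which is the portability the abstract describes.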

  9. Cooling Earth's temperature by seeding marine stratocumulus clouds for increasing cloud cover by closing open cells

    NASA Astrophysics Data System (ADS)

    Daniel, R.

    2008-12-01

    The transition from open to closed cellular convection in marine stratocumulus is very sensitive to small concentrations of cloud condensation nuclei (CCN) aerosols. Addition of small amounts of CCN (about 100 cm-3) to the marine boundary layer (MBL) can close the open cells and thereby increase the cloud cover from about 40% to nearly 100%, with negative radiative forcing exceeding 100 W m-2. We show satellite measurements that demonstrate this sensitivity by inadvertent experiments of old and diluted ship tracks. With the methodology suggested by Salter and Latham for spraying sub-micron sea water drops that serve as CCN, it is possible to close a sufficiently large area of open cells to achieve the negative radiative forcing that is necessary to balance the positive forcing of greenhouse gases. We show calculations of the feasibility of such an undertaking, and suggest that this is an economically feasible method with the least potential risks, when compared to seeding marine stratocumulus for enhancing their albedo or with seeding the stratosphere with bright or dark aerosols. Global circulation models coupled with the ocean and the ice are necessary to calculate the impact and the possible side effects.
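
    The scale of the undertaking follows from simple area-weighting: global-mean forcing is the local forcing times the seeded area fraction. A back-of-envelope check, using the abstract's ~100 W m-2 local forcing plus the commonly cited ~3.7 W m-2 forcing for doubled CO2 (an outside assumption, not from the abstract):

```python
# How much of the globe must flip from open to closed cells to offset
# greenhouse forcing?
local_forcing = 100.0         # W m^-2 where open cells are closed (abstract)
target_global_forcing = 3.7   # W m^-2, approx. forcing of doubled CO2 (assumed)
earth_area_km2 = 5.1e8        # Earth's surface area

area_fraction = target_global_forcing / local_forcing
print(f"required seeded area: {area_fraction:.1%} of Earth "
      f"(~{area_fraction * earth_area_km2:.2e} km^2)")
```

    A few percent of the Earth's surface, concentrated in open-cell stratocumulus regions, is the order of magnitude the feasibility calculations have to address.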

  10. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags are at risk of being misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), for allowing agricultural stakeholders to color code their farm fields for indicating herbicide resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on different-sized devices: smartphones, tablets, desktops, and laptops.
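
    A GIS application like this typically represents each field as a feature whose properties drive the color-coding. As a sketch only (the color table and property names below are illustrative, not FTTCloud's actual schema or the program's official flag colors), a field could be encoded in GeoJSON as:

```python
import json

# Illustrative mapping from herbicide-trait technology to flag color.
TECH_COLORS = {"Roundup Ready": "white", "LibertyLink": "green",
               "Clearfield": "teal", "conventional": "black"}

def field_feature(field_id, technology, ring):
    """Build a GeoJSON Feature for one farm field, tagged with its flag color."""
    return {
        "type": "Feature",
        "properties": {
            "field_id": field_id,
            "technology": technology,
            "flag_color": TECH_COLORS[technology],
        },
        "geometry": {"type": "Polygon", "coordinates": [ring]},
    }

# A closed ring of (lon, lat) vertices; first and last points coincide.
f = field_feature("field-12", "LibertyLink",
                  [[-92.30, 34.75], [-92.29, 34.75], [-92.29, 34.76],
                   [-92.30, 34.76], [-92.30, 34.75]])
print(json.dumps(f)[:80])
```

    Because the representation is standard GeoJSON, any web map client can render, search, and restyle the fields without a server round trip.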

  11. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick Start Guide (Revision 1)

    DTIC Science & Technology

    2017-06-21

    ARL-CR-0816 ● JUNE 2017. US Army Research Laboratory. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide (Revision 1), prepared by Scott Ososky, Oak Ridge Associated Universities, Oak Ridge, TN.

  12. Fast Physics Testbed for the FASTER Project

    SciTech Connect

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.
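
    The core of "identifying biases and skills" in the NWP-Testbed is simple forecast-versus-observation statistics. As a minimal sketch (with invented sample values standing in for ARM site observations), the two most common scores are:

```python
def bias(forecast, obs):
    """Mean forecast-minus-observation error (systematic bias)."""
    return sum(f - o for f, o in zip(forecast, obs)) / len(obs)

def rmse(forecast, obs):
    """Root-mean-square error (total error magnitude)."""
    return (sum((f - o) ** 2 for f, o in zip(forecast, obs)) / len(obs)) ** 0.5

# Hypothetical 2-m temperatures (K) at an ARM site vs. an NWP forecast.
obs = [288.1, 290.4, 293.2, 291.8, 289.5]
fc = [287.0, 290.9, 294.1, 292.5, 288.8]
print(f"bias = {bias(fc, obs):+.2f} K, RMSE = {rmse(fc, obs):.2f} K")
```

    Computed continuously and stratified by weather regime, these scores point to which physical parameterizations are responsible for systematic errors.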

  13. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle for adoption of cloud computing by healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de-facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. Through using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should obtain a security level equivalent to the traditional computing model.
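
    The flow such a DI deployment would use is the OpenID Connect authorization code flow. As a sketch (the endpoint URL, client ID, and redirect URI below are placeholders; the parameter names follow OpenID Connect Core), the client's first step is to redirect the user's browser to an authorization request like this:

```python
from urllib.parse import urlencode

# Hypothetical identity-provider endpoint for a cloud DI deployment.
AUTH_ENDPOINT = "https://idp.example.org/authorize"

def build_auth_request(client_id, redirect_uri, state, nonce):
    """Authorization code flow request; the 'openid' scope asks for an ID token."""
    params = {
        "response_type": "code",
        "scope": "openid profile",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,   # CSRF protection, echoed back by the provider
        "nonce": nonce,   # binds the eventual ID token to this request
    }
    return AUTH_ENDPOINT + "?" + urlencode(params)

url = build_auth_request("pacs-client", "https://pacs.example.org/cb",
                         "st-8f2", "n-0S6")
print(url)
```

    The provider returns a short-lived code to the redirect URI, which the PACS or DI-r backend exchanges for an ID token (authentication) and access token (authorization), so no password ever reaches the imaging system itself.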

  14. Optical interferometer testbed

    NASA Astrophysics Data System (ADS)

    Blackwood, Gary H.

    1991-07-01

    Viewgraphs on optical interferometer testbed presented at the MIT Space Research Engineering Center 3rd Annual Symposium are included. Topics covered include: space-based optical interferometer; optical metrology; sensors and actuators; real time control hardware; controlled structures technology (CST) design methodology; identification for MIMO control; FEM/ID correlation for the naked truss; disturbance modeling; disturbance source implementation; structure design: passive damping; low authority control; active isolation of lightweight mirrors on flexible structures; open loop transfer function of mirror; and global/high authority control.

  15. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles for adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the "Kerberos of cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner; deploying DI-r and PACS to private or community clouds should then provide security levels equivalent to the traditional computing model.

  16. OpenID Connect as a security service in cloud-based medical imaging systems

    PubMed Central

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-01-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles for adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the “Kerberos of cloud.” We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner; deploying DI-r and PACS to private or community clouds should then provide security levels equivalent to the traditional computing model. PMID:27340682

  17. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards using a hybrid deployment model of public - private cloud, running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is: available all the time, accessible from everywhere, it is scalable, works in a distributed computer environment, it creates a real-time multiuser collaboration platform, the programming language code and components are interoperable, and it is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that are communicating over the internet providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  18. A Business-to-Business Interoperability Testbed: An Overview

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad; Monica, Martin; Jones, Albert

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standard and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.

  19. Point Cloud Visualization in AN Open Source 3d Globe

    NASA Astrophysics Data System (ADS)

    De La Calle, M.; Gómez-Deck, D.; Koehler, O.; Pulido, F.

    2011-09-01

    During the last years the usage of 3D applications in GIS has become more popular. Since the appearance of Google Earth, users are familiar with 3D environments. On the other hand, nowadays computers with 3D acceleration are common, broadband access is widespread and the public information that can be used in GIS clients that are able to use data from the Internet is constantly increasing. There are currently several libraries suitable for this kind of application. Based on these facts, and using libraries that are already developed and connected to our own developments, we are working on the implementation of a real 3D GIS with analysis capabilities. Since a 3D GIS such as this can be very interesting for tasks like LiDAR or laser scanner point cloud rendering and analysis, special attention is given to optimal handling of very large data sets. Glob3 will be a multidimensional GIS in which 3D point clouds can be explored and analysed, even if they consist of several million points. The latest addition to our visualization libraries is the development of a point cloud server that works regardless of the cloud's size. The server receives and processes requests from a 3D client (for example glob3, but it could be any other, such as one based on WebGL) and delivers the data in the form of pre-processed tiles, depending on the required level of detail.
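
    The tile server's level-of-detail reduction can be sketched with the simplest possible scheme: keep one representative point per 3D grid cell, with coarser cells for lower levels of detail. This is only an illustration of the idea, not the actual algorithm the glob3 server uses:

```python
import random

def decimate(points, cell):
    """Keep one point per 3D grid cell of size `cell`, a crude
    level-of-detail reduction a tile server might precompute per zoom level."""
    seen = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell), int(z // cell))
        seen.setdefault(key, (x, y, z))  # first point in each cell wins
    return list(seen.values())

# Synthetic 10,000-point cloud over a 10 m x 10 m x 2 m volume.
random.seed(0)
cloud = [(random.uniform(0, 10), random.uniform(0, 10), random.uniform(0, 2))
         for _ in range(10000)]
for cell in (0.25, 0.5, 1.0):  # coarser cells, fewer points, lower LOD
    print(cell, len(decimate(cloud, cell)))
```

    Precomputing one such decimated set per zoom level is what lets the client stream a cloud of several million points without ever loading it whole.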

  20. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
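
    The polarity inference described above rests on one rule: suprathermal electrons escape outward from the Sun along open field lines. A minimal sketch of that logic (the helper names and threshold are illustrative, not the authors' analysis code):

```python
def true_polarity(strahl_pitch_angle_deg):
    """Infer the solar-connection polarity of an open field line from the
    suprathermal electron strahl.

    Electrons stream away from the Sun, so a strahl parallel to B (pitch
    angle near 0 deg) implies B points away from the Sun (+1), while an
    antiparallel strahl (near 180 deg) implies B points sunward (-1).
    """
    return +1 if strahl_pitch_angle_deg < 90.0 else -1

def is_locally_inverted(strahl_pitch_angle_deg, b_points_antisunward):
    """A field line counts as locally inverted when the in-situ field
    direction disagrees with the polarity the electrons imply."""
    return (true_polarity(strahl_pitch_angle_deg) == +1) != b_points_antisunward

print(true_polarity(15.0), is_locally_inverted(165.0, True))
```

    Applying this test field line by field line across 97 clouds is what yields the 89% single-polarity statistic and the inversion frequency quoted above.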

  1. Phenomenology tools on cloud infrastructures using OpenStack

    NASA Astrophysics Data System (ADS)

    Campos, I.; Fernández-del-Castillo, E.; Heinemeyer, S.; Lopez-Garcia, A.; Pahlen, F.; Borges, G.

    2013-04-01

    We present a new environment for computations in particle physics phenomenology employing recent developments in cloud computing. On this environment users can create and manage "virtual" machines on which the phenomenology codes/tools can be deployed easily in an automated way. We analyze the performance of this environment based on "virtual" machines versus the utilization of physical hardware. In this way we provide a qualitative result for the influence of the host operating system on the performance of a representative set of applications for phenomenology calculations.

  2. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to efficiently access to resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing to create virtual clusters on demand, including public, private and hybrid clouds. This approach has required to develop an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems being used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor the operating system of the host machine which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, doing some tests to compare the proposed approach with the existing Grid solution. 

  3. ATHLETE: Low Gravity Testbed

    NASA Technical Reports Server (NTRS)

    Qi, Jay Y.

    2011-01-01

    The All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) is a vehicle concept developed at Jet Propulsion Laboratory as a multipurpose robot for exploration. Currently, the ATHLETE team is working on creating a low gravity testbed to physically simulate ATHLETE landing on an asteroid. Several projects were worked on this summer to support the low gravity testbed.

  4. Enabling Open Cloud Markets Through WS-Agreement Extensions

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn

    Research into computing resource markets has mainly considered the question of which market mechanisms provide a fair resource allocation. However, while developing such markets, the definition of the unit of trade (i.e. the definition of resource) has not been given much attention. In this paper, we analyze the requirements for tradable resource goods. Based on the results, we suggest a detailed goods definition, which is easy to understand, can be used with many market mechanisms, and addresses the needs of a Cloud resource market. The goods definition captures the complete system resource, including hardware specifications, software specifications, the terms of use, and a pricing function. To demonstrate the usefulness of such a standardized goods definition, we demonstrate its application in the form of a WS-Agreement template for a number of market mechanisms for commodity system resources.
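
    The paper's goods definition (hardware specification, software specification, terms of use, and a pricing function) can be sketched as a plain data structure. The field names below are illustrative, not the actual WS-Agreement template vocabulary:

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class CloudGood:
    """One tradable unit on a Cloud resource market."""
    hardware: Dict[str, str]          # hardware specification
    software: Dict[str, str]          # software specification
    terms_of_use: str                 # usage terms attached to the good
    price: Callable[[float], float]   # pricing function: hours used -> cost

# An example commodity system resource with a flat hourly rate.
good = CloudGood(
    hardware={"cpu_cores": "4", "ram_gb": "8", "disk_gb": "100"},
    software={"os": "Linux", "runtime": "Java"},
    terms_of_use="99.5% availability; no resale",
    price=lambda hours: 0.10 * hours,
)
print(f"24 h of this good costs {good.price(24.0):.2f} units")
```

    Making the pricing function part of the good itself is what lets the same definition plug into different market mechanisms (auctions, posted prices) without redefining the unit of trade.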

  5. Aspects of the quality of data from the Southern Great Plains (SGP) cloud and radiation testbed (CART) site broadband radiation sensors

    SciTech Connect

    Splitt, M.E.; Wesely, M.L.

    1996-04-01

    A systematic evaluation of the performance of broadband radiometers at the Cloud and Radiation Testbed (CART) site is needed to estimate the uncertainties of the irradiance observations. Here, net radiation observed with the net radiometer in the energy balance Bowen ratio (EBBR) station at the central facility is compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. In addition, data obtained from the central facility pyranometers, pyrgeometers, and pyrheliometers are examined for April 1994, when intensive operations periods were being carried out. The data used in this study are from central facility radiometers in a solar and infrared observation station, an EBBR station, the so-called 'BSRN' set of upward pointing radiometers, and a set of radiometers pointed down at the 25-m level of a 60-m tower.
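
    The comparison itself is a closure check: net radiation from the component radiometers is the downwelling minus upwelling shortwave plus the downwelling minus upwelling longwave. A minimal sketch with invented midday values (not April 1994 data):

```python
def net_radiation(sw_down, sw_up, lw_down, lw_up):
    """Net radiation (W m^-2) as the sum of component irradiances."""
    return (sw_down - sw_up) + (lw_down - lw_up)

# Hypothetical values: pyranometers (SW), pyrgeometers (LW), and the EBBR
# net radiometer they are compared against.
rn_components = net_radiation(sw_down=750.0, sw_up=150.0, lw_down=330.0, lw_up=430.0)
rn_ebbr = 485.0
print(f"components: {rn_components:.0f} W m^-2, EBBR: {rn_ebbr:.0f} W m^-2, "
      f"difference: {rn_components - rn_ebbr:+.0f} W m^-2")
```

    Persistent differences between the two estimates, tracked over many days, are what bound the uncertainty of the site's irradiance observations.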

  6. Open-cell and closed-cell clouds off Peru [detail

    NASA Image and Video Library

    2017-09-27

    2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean. To view the full frame of this image, go to: www.flickr.com/photos/gsfc/4557497219/ Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by the MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells warm air is rising in the center, and sinking around the edges, so clouds appear in cell centers, but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations — two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.

  7. The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications

    NASA Astrophysics Data System (ADS)

    Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.

    2016-12-01

    The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day to day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects, one being the limited bandwidth and latency of the communication link due to the geographical location of the resources, the other one being the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS-enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which itself is a NASA/Esri collaboration project. The MRF raster format has some cloud-aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL and Esri software. Taken together, GDAL, MRF, and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.

  8. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  9. Clouds

    NASA Image and Video Library

    2010-09-14

    Clouds are common near the north polar caps throughout the spring and summer. The clouds typically cause a haze over the extensive dune fields. This image from NASA Mars Odyssey shows the edge of the cloud front.

  10. Building a Parallel Cloud Storage System using OpenStack’s Swift Object Store and Transformative Parallel I/O

    SciTech Connect

    Burns, Andrew J.; Lora, Kaleb D.; Martinez, Esteban; Shorter, Martel L.

    2012-07-30

    Our project consists of bleeding-edge research into replacing traditional storage archives with a parallel, cloud-based storage solution built on OpenStack's Swift Object Store. We benchmarked Swift for write speed and scalability. Our project is unique because Swift is typically used for reads, while we are mostly concerned with write speeds. Cloud storage is a viable archive solution because: (1) container management for larger parallel archives might ease the migration workload; (2) many tools written for cloud storage could be utilized for a local archive; and (3) current large-scale cloud storage practices in industry could be applied to manage a scalable archive solution.
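    The write-speed benchmarking described above can be sketched generically: a small harness that times PUTs of fixed-size objects through any storage backend and reports aggregate throughput. This is a minimal illustration, not the project's actual harness; the `put` callable and the in-memory dict backend are hypothetical stand-ins for a real Swift client.

    ```python
    import time

    def benchmark_writes(put, n_objects=100, size_bytes=1 << 20):
        """Time n_objects PUTs of size_bytes each; return aggregate stats."""
        payload = b"\0" * size_bytes
        start = time.perf_counter()
        for i in range(n_objects):
            put(f"obj-{i:06d}", payload)  # put(name, data): any storage backend
        elapsed = time.perf_counter() - start
        total_mb = n_objects * size_bytes / 1e6
        return {"seconds": elapsed, "mb_per_s": total_mb / elapsed}

    # Stand-in backend: an in-memory dict instead of a Swift container.
    store = {}
    stats = benchmark_writes(store.__setitem__, n_objects=50, size_bytes=64 * 1024)
    print(f"{stats['mb_per_s']:.1f} MB/s over {stats['seconds']:.3f} s")
    ```

    Swapping the dict for a real object-store client changes only the `put` argument, which is what makes this style of harness useful for comparing local and cloud archives.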

  11. PROOF on the Cloud for ALICE using PoD and OpenNebula

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Bagnasco, S.; Brunetti, R.; Lusso, S.

    2012-06-01

    In order to optimize the use and management of computing centres, their conversion to cloud facilities is becoming increasingly popular. In a medium to large cloud facility, many different virtual clusters may compete for the same resources: unused resources can be freed either by turning off idle virtual machines or by lowering the resources assigned to a virtual machine at runtime. PROOF, a ROOT-based parallel and interactive analysis framework, is officially endorsed in the computing model of the ALICE experiment as complementary to the Grid, and it has become very popular over the last three years. The locality of PROOF-based analysis facilities forces system administrators to scavenge resources, yet the chaotic nature of user analysis tasks leaves them unstable and inconsistently used, making PROOF a typical use case for HPC cloud computing. Currently, PoD dynamically and easily provides a PROOF-enabled cluster by submitting agents to a job scheduler. Unfortunately, a Tier-2 cannot comfortably share the same queue between interactive and batch jobs, due to the very long average time to completion of the latter: an elastic cloud approach enables interactive virtual machines to temporarily take resources from the batch ones without a noticeable impact on them. In this work we describe our setup of a dynamic PROOF-based cloud analysis facility based on PoD and OpenNebula, orchestrated by a simple and lightweight control daemon that makes virtualization transparent for the user.

  12. AutoGNC Testbed

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Vaughan, Andrew T.; Bayard, David S.; Riedel, Joseph E.; Balaram, J.

    2010-01-01

    A simulation testbed architecture was developed and implemented for the integration, test, and development of a TRL-6 flight software set called AutoGNC. The AutoGNC software will combine the TRL-9 Deep Impact AutoNAV flight software suite, the TRL-9 Virtual Machine Language (VML) executive, and the TRL-3 G-REX guidance, estimation, and control algorithms. The AutoGNC testbed was architected to provide software interface connections among the AutoNAV and VML flight code written in C, the G-REX algorithms in MATLAB and C, stand-alone image rendering algorithms in C, and other Fortran algorithms, such as the OBIRON landmark tracking suite. The testbed architecture incorporates software components for propagating a high-fidelity truth model of the environment and the spacecraft dynamics, along with the flight software components for onboard guidance, navigation, and control (GN&C). The interface allows for the rapid integration and testing of new algorithms prior to development of the C code for implementation in flight software. This testbed is designed to test autonomous spacecraft proximity operations around small celestial bodies, moons, or other spacecraft. The software is baselined for upcoming comet and asteroid sample return missions. This architecture and testbed will provide a direct improvement upon the onboard flight software utilized for missions such as Deep Impact, Stardust, and Deep Space 1.

  13. Features of Transitional Regimes for Hydrocarbon Combustion in Closed Volumes and in Opened Clouds

    NASA Astrophysics Data System (ADS)

    Lin, E. E.; Tanakov, Z. V.

    2006-08-01

    We present a brief review and analysis of experimental results concerning the simulation of processes both in power plants and in open-air surface space when burning gaseous hydrocarbon mixtures. Combustion regimes in closed volumes are considered for acetylene mixtures C2H2 + mO2 + nN2 and C2H2 + mN2O + nN2 in tubes with relative length L/d = 4 - 60. Combustion of open fuel-air clouds in the regime of cloud collisions is considered for propane-butane dispersed into the atmosphere from several closely located reservoirs of liquefied gas.

  14. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  15. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  16. Uav-Based Photogrammetric Point Clouds - Tree STEM Mapping in Open Stands in Comparison to Terrestrial Laser Scanner Point Clouds

    NASA Astrophysics Data System (ADS)

    Fritz, A.; Kattenborn, T.; Koch, B.

    2013-08-01

    In both ecology and forestry, there is a high demand for structural information on forest stands. Forest structures, due to their heterogeneity and density, are often difficult to assess. Hence, a variety of technologies are being applied to capture this hard-to-come-by information. Common techniques are aerial images or ground- and airborne lidar. In the present study we evaluate the potential use of unmanned aerial vehicles (UAVs) as a platform for tree stem detection in open stands. A flight campaign was conducted over a test site near Freiburg, Germany, covering a target area of 120 × 75 m². The dominant tree species of the site is oak (Quercus robur) with almost no understory growth. Over 1000 images with a tilt angle of 45° were shot. The flight pattern consisted of two antipodal staggered flight routes at a height of 55 m above the ground. We used a Panasonic G3 consumer camera equipped with a 14-42 mm standard lens and a 16.6 megapixel sensor. The data collection took place in leaf-off state in April 2013. The area was prepared with artificial ground control points for transformation of the structure-from-motion (SfM) point cloud into real-world coordinates. After processing, the results were compared with a terrestrial laser scanner (TLS) point cloud of the same area. In the 0.9 ha test area, 102 individual trees above 7 cm diameter at breast height were located in the TLS cloud. We chose the software CMVS/PMVS-2 since its algorithms are developed with a focus on dense reconstruction. The processing chain for the UAV-acquired images consists of six steps: (a) cleaning the data: removal of blurry, under- or overexposed, and off-site images; (b) applying the SIFT operator [Lowe, 2004]; (c) image matching; (d) bundle adjustment; (e) clustering; and (f) dense reconstruction. In total, 73 stems were considered reconstructed and located within one meter of the reference trees. In general stems were far less accurate and complete as
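    The validation step implied by the 73-of-102 figure, matching each reconstructed stem to a TLS reference tree within one meter, can be sketched as a greedy nearest-neighbor assignment. The coordinates below are invented for illustration; the study's actual matching procedure is not specified in the abstract.

    ```python
    import math

    def match_stems(detected, reference, max_dist=1.0):
        """Greedily match each detected stem to the nearest unused reference
        stem within max_dist metres; returns (detected_idx, reference_idx) pairs."""
        pairs, used = [], set()
        for i, (dx, dy) in enumerate(detected):
            best, best_d = None, max_dist
            for j, (rx, ry) in enumerate(reference):
                if j in used:
                    continue
                d = math.hypot(dx - rx, dy - ry)
                if d <= best_d:
                    best, best_d = j, d
            if best is not None:
                pairs.append((i, best))
                used.add(best)
        return pairs

    # Hypothetical stem positions in metres (not the Freiburg data).
    reference = [(0.0, 0.0), (5.0, 0.0), (10.0, 4.0)]
    detected = [(0.3, 0.2), (5.8, 0.0), (30.0, 30.0)]  # last one: false positive
    print(match_stems(detected, reference))  # → [(0, 0), (1, 1)]
    ```

    The far-off third detection finds no reference stem within the threshold and is left unmatched, which is how false reconstructions would be excluded from the 73 accepted stems.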

  17. Use of AVHRR-derived spectral reflectances to estimate surface albedo across the Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Qiu, J.; Gao, W.

    1997-03-01

    Substantial variations in surface albedo across a large area cause difficulty in estimating regional net solar radiation and atmospheric absorption of shortwave radiation when only ground point measurements of surface albedo are used to represent the whole area. Information on spatial variations and site-wide averages of surface albedo, which vary with the underlying surface type and conditions and the solar zenith angle, is important for studies of clouds and atmospheric radiation over a large surface area. In this study, a bidirectional reflectance model was used to inversely retrieve surface properties such as leaf area index and then the bidirectional reflectance distribution was calculated by using the same radiation model. The albedo was calculated by converting the narrowband reflectance to broadband reflectance and then integrating over the upper hemisphere.
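    The final step above, integrating the bidirectional reflectance over the upper hemisphere, reduces for an azimuthally symmetric reflectance r(θ) to albedo = 2 ∫₀^{π/2} r(θ) cos θ sin θ dθ. A minimal numeric sketch, with a made-up reflectance function rather than the study's retrieved one:

    ```python
    import math

    def hemispheric_albedo(r, n=10000):
        """Midpoint-rule quadrature of 2 * ∫ r(θ) cosθ sinθ dθ over θ ∈ [0, π/2],
        i.e. the albedo of an azimuthally symmetric directional reflectance r."""
        h = (math.pi / 2) / n
        total = 0.0
        for i in range(n):
            theta = (i + 0.5) * h
            total += r(theta) * math.cos(theta) * math.sin(theta) * h
        return 2.0 * total

    # Sanity check: a Lambertian (angle-independent) reflectance of 0.2
    # must integrate to an albedo of exactly 0.2.
    print(round(hemispheric_albedo(lambda theta: 0.2), 6))  # → 0.2
    ```

    The cos θ sin θ weighting is the projected solid-angle factor; it is why near-grazing reflectance contributes little to the broadband albedo.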

  18. The WFI Hα spectroscopic survey of the Magellanic Clouds: Be stars in SMC open clusters

    NASA Astrophysics Data System (ADS)

    Martayan, Christophe; Baade, Dietrich; Fabregat, Juan

    2009-03-01

    At low metallicity, B-type stars show lower loss of mass and, therefore, of angular momentum, so more Be stars are expected in the Magellanic Clouds than in the Milky Way. However, until now, searches for Be stars had been performed in only a very small number of open clusters in the Magellanic Clouds. Using the ESO/WFI in its slitless spectroscopic mode, we performed an Hα survey of the Large and Small Magellanic Clouds. Eight million low-resolution spectra centered on Hα were obtained. For their automatic analysis, we developed the ALBUM code. Here, we present the observations, the method used to exploit the data, and first results for 84 open clusters in the SMC. In particular, by cross-correlating our catalogs with OGLE positional and photometric data, we classified more than 4000 stars and were able to identify the B and Be stars among them. We show the evolution of the Be star rates as functions of area density, metallicity, spectral type, and age.

  19. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, whether from Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type for quantifying and characterizing geospatial processes. Visualization of these data, due to their overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities that once existed only in desktop software ubiquitously available. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features on a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices, from phones and tablets to high-end workstations, with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data on devices to which everyone has easy access.

  20. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  1. Network testbed creation and validation

    DOEpatents

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.; Watts, Kristopher K.; Sweeney, Andrew John

    2017-03-21

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior art solutions, which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.
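    The build-out step described in the patent, placing virtual testbed nodes onto a pool of physical infrastructure nodes so the fabric can be reconfigured without touching the hosts, can be sketched as a simple round-robin placement. The node and host names are hypothetical; the patent does not specify a placement policy.

    ```python
    from itertools import cycle

    def place_virtual_nodes(target_nodes, infra_hosts):
        """Assign each virtual testbed node to a physical infrastructure host
        round-robin; reconfiguring the testbed means recomputing this mapping,
        not rewiring the hosts."""
        hosts = cycle(infra_hosts)
        return {node: next(hosts) for node in target_nodes}

    # A 5-element target-network description spread over 2 physical hosts.
    target = [f"vrouter-{i}" for i in range(5)]
    infra = ["host-a", "host-b"]
    placement = place_virtual_nodes(target, infra)
    print(placement)
    ```

    A production placer would weight by host capacity and link topology, but the key property is the same: the target network description drives the mapping, so tens of thousands of virtual elements can be rebuilt from a description alone.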

  2. Network testbed creation and validation

    DOEpatents

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.; Watts, Kristopher K.; Sweeney, Andrew John

    2017-04-18

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior art solutions, which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.

  3. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

    The continuation of the EOSDIS testbed ('Testbed') has seen it mature from a multi-task, X-Windows-driven system into a fully functional stand-alone data archive and distribution center accessible by all types of users and computers via the World Wide Web. Throughout the past months, the Testbed has evolved into a completely new system, now accessible through Netscape, Mosaic, and all other clients that can reach the World Wide Web. On October 1, 1995 we will open to the public, and we expect that the statistics on the types of users, where they are located, and what they are looking for will change drastically. The most important change in the Testbed has been the Web interface. This interface allows more users to access the system and walks them through the data types with more ease than before. All of the callbacks are written so that icons can be used to move easily around the program's interface. The homepage offers the user the opportunity to get more information about each satellite data type, along with information on free programs. These programs are grouped into categories by the type of computer they are compiled for, together with instructions on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which takes them to the world map and allows them to select an area they would like coverage of by simply clicking that area of the map. This brings up a new map where similar choices can be made to get the latitude and longitude of the region the satellite data will cover. Once a selection has been made, the search parameters page appears and can be filled out. Afterwards, once the search is completed, the browse images can be selected for viewing. There are several other option pages

  4. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.

  5. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  6. Testbed for LISA Photodetectors

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe; Livas, Jeffrey; Silverberg, Robert

    2009-01-01

    The Laser Interferometer Space Antenna (LISA) is a gravitational wave observatory consisting of three spacecraft separated by 5 million km in an equilateral triangle whose center follows the Earth in orbit around the Sun but offset in orbital phase by 20 degrees. LISA is designed to observe sources in the frequency range of 0.1 mHz-100 mHz by measuring fluctuations of the inter-spacecraft separation with laser interferometry. Quadrant photodetectors are used to measure both separation and angular orientation. Noise level, phase and amplitude inhomogeneities of the semiconductor response, and channel cross-talk between quadrant cells need to be assessed in order to ensure the 10 pm/Square root(Hz) sensitivity required for the interferometric length measurement in LISA. To this end, we are currently developing a testbed that allows us to evaluate photodetectors to the sensitivity levels required for LISA. A detailed description of the testbed and preliminary results will be presented.

  7. Comparison of the cloud activation potential of open ocean and coastal aerosol in the Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Vidaurre, G.; Brooks, S. D.; Thornton, D. C.

    2010-12-01

    Continuous measurements of aerosol concentration, particle size distribution, and cloud activation potential between 0.15 and 1.2% supersaturation were performed for open ocean and coastal air during the Halocarbon Air Sea Transect - Pacific (HalocAST) campaign. The nearly 7000 mile transect, aboard the R/V Thomas G. Thompson, started in Punta Arenas, Chile and ended in Seattle, Washington. Air mass source regions were identified on the basis of air mass back trajectories. For air masses in the southern hemisphere, aerosols sampled over the open ocean acted as cloud condensation nuclei at supersaturations between 0.5 and 1%, while coastal aerosols required higher supersaturations. In the pristine open ocean, observed aerosol concentrations were very low, typically below 200 cm-3, with an average particle diameter of approximately 0.4 μm. On the other hand, coastal aerosol concentrations were above 1000 cm-3 with an average particle diameter of 0.7 μm. Air masses originating in the northern hemisphere had much higher aerosol loads, between 500 and 2000 cm-3 over the ocean and above 4000 cm-3 at the coast. In both cases, the average particle diameters were approximately 0.5 μm. Measurements suggest that the northern hemisphere, substantially more polluted than the southern hemisphere, is characterized by alternating regions of high and medium aerosol number concentration. In addition, measurements of microorganism and organic matter concentrations in the surface layer of the ocean water were conducted along the cruise track to test the hypothesis that biogenic aerosols containing marine organic matter contribute to cloud activation potential. There was a significant correlation between mean aerosol diameter and prokaryote concentration in surface waters (r = 0.585, p < 0.01, n = 24), and between critical supersaturation and prokaryote concentration in surface waters (r = 0.538, p < 0.01, n = 24). This correlation indicates that larger aerosols occurred over water
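    The reported correlations (e.g. r = 0.585, n = 24) are plain Pearson coefficients, which can be computed directly. The data below are synthetic, invented only to exercise the formula; they are not the cruise measurements.

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Synthetic illustration (not the HalocAST data): a noisy positive trend
    # between surface-water prokaryote concentration and aerosol diameter.
    prokaryotes = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    diameters = [0.31, 0.35, 0.33, 0.40, 0.42, 0.45]
    print(round(pearson_r(prokaryotes, diameters), 3))
    ```

    For the real study, the significance thresholds (p < 0.01 at n = 24) would come from a t-test on r with n - 2 degrees of freedom.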

  8. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  9. The Fizeau Interferometer Testbed

    DTIC Science & Technology

    2003-03-01

    The Fizeau Interferometer Testbed. Xiaolei Zhang, US Naval Research Laboratory, Remote Sensing Division, Washington, DC 20375, USA; Sigma Space Corporation, 9801 Greenbelt Rd, Suite 105, Lanham, MD 20771, USA; Gregory Solyar, GEST/UMBC, NASA's GSFC, Greenbelt, MD 20771, USA. Contents: 1. Introduction; 2. Fizeau versus Michelson beam combination; 3. Objectives and design of the FIT; 4. Optics design; 5. Radiometry; 6. Wavefront …

  10. Telescience testbed pilot program

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1988-01-01

    The Universities Space Research Association (USRA), under sponsorship from the NASA Office of Space Science and Applications, is conducting a Telescience Testbed Pilot Program. Fifteen universities, under subcontract to USRA, are conducting a variety of scientific experiments using advanced technology to determine the requirements and evaluate the tradeoffs for the information system of the Space Station era. An interim set of recommendations based on the experiences of the first six months of the pilot program is presented.

  11. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  12. Marshall Avionics Testbed System (MAST)

    NASA Technical Reports Server (NTRS)

    Smith, Wayne D.

    1989-01-01

    Work accomplished in the summer of 1989 in association with the NASA/ASEE Summer Faculty Research Fellowship Program at Marshall Space Flight Center is summarized. The project was aimed at developing detailed specifications for the Marshall Avionics System Testbed (MAST). This activity was to include the definition of the testbed requirements and the development of specifications for a set of standard network nodes for connecting the testbed to a variety of networks. The project was also to include developing a timetable for the design, implementation, programming and testing of the testbed. Specifications of both hardware and software components for the system were to be included.

  13. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with fielding an expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan and prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor, and modification of existing engine and propeller controls is adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  14. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schrenkenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  15. Single link flexible beam testbed project. Thesis

    NASA Technical Reports Server (NTRS)

    Hughes, Declan

    1992-01-01

    This thesis describes the single-link flexible beam testbed at the CLaMS laboratory in terms of its hardware, software, and linear model, and presents two controllers: both include a hub-angle proportional-derivative (PD) feedback compensator, and one is augmented by a second static-gain full-state feedback loop based upon a synthesized strictly positive real (SPR) output. The augmented controller increases the damping ratios of specific flexible-mode poles relative to the PD-only case and hence reduces unwanted residual oscillation. Restricting the full-state feedback gains so as to produce an SPR open-loop transfer function ensures that the associated compensator has an infinite gain margin and a phase margin of (-90, 90) degrees. Both experimental and simulation data are evaluated in order to compare the performance of different observers when applied to the real testbed and to the linear model when uncompensated flexible modes are included.
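The hub-angle PD law at the core of both controllers can be sketched as follows; the gains, inertia, and rigid-hub plant below are illustrative assumptions, not the thesis's actual testbed model (which includes flexible modes):

```python
def pd_control(theta_ref, theta, theta_dot, kp=4.0, kd=1.2):
    """Hub-angle PD law: torque from angle error plus rate damping."""
    return kp * (theta_ref - theta) - kd * theta_dot

def step_response(theta_ref=1.0, dt=0.001, steps=20000, inertia=0.5):
    """Euler-integrate a toy rigid-hub plant (no flexible modes) under
    PD control and return the final hub angle."""
    theta, theta_dot = 0.0, 0.0
    for _ in range(steps):
        u = pd_control(theta_ref, theta, theta_dot)
        theta_dot += (u / inertia) * dt
        theta += theta_dot * dt
    return theta

print(round(step_response(), 3))
```

With these assumed values the closed loop is I*theta'' + kd*theta' + kp*theta = kp*theta_ref, so the hub angle settles to the reference with no steady-state error; the thesis's second loop exists precisely because the real beam's flexible modes are not damped by this hub-angle law alone.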

  16. Testbed For Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Matijevic, Jacob R.

    1991-01-01

    Telerobot testbed used to evaluate technologies for remote servicing, including assembly, maintenance, and repair. Enables study of advantages and disadvantages of modes and of problems encountered in implementing them. Best technologies for implementing modes chosen. Provides delays simulating transmission delays between control stations on ground and orbiting spacecraft. Includes five major equipment subsystems, each consisting of such commercially available equipment as video cameras, computers, and robot arms. Used on Space Station, Space Shuttle, and satellites in orbit. Also used in hazardous and underwater environments on Earth.

  17. LISA Optical Bench Testbed

    NASA Astrophysics Data System (ADS)

    Lieser, M.; d'Arcio, L.; Barke, S.; Bogenstahl, J.; Diekmann, C.; Diepholz, I.; Fitzsimons, E. D.; Gerberding, O.; Henning, J.-S.; Hewitson, M.; Hey, F. G.; Hogenhuis, H.; Killow, C. J.; Lucarelli, S.; Nikolov, S.; Perreur-Lloyd, M.; Pijnenburg, J.; Robertson, D. I.; Sohmer, A.; Taylor, A.; Tröbs, M.; Ward, H.; Weise, D.; Heinzel, G.; Danzmann, K.

    2013-01-01

    The optical bench (OB) is a part of the LISA spacecraft, situated between the telescope and the test mass. For measuring the inter-spacecraft distances there are several interferometers on the OB. The elegant breadboard of the OB for LISA is developed for the European Space Agency (ESA) by EADS Astrium, TNO Science & Industry, the University of Glasgow and the Albert Einstein Institute (AEI); the performance tests will then be done at the AEI. Here we present the testbed that will be used for the performance tests, with a focus on the thermal environment and the laser infrastructure.

  18. Wireless Testbed Bonsai

    DTIC Science & Technology

    2006-02-01

    wireless sensor device network, and a higher-tier multi-hop peer-to-peer 802.11b wireless network of about 200 Stargate nodes. Leading up to the full ExScal...deployment, we conducted spatial scaling tests on our higher-tier protocols on a 7 × 7 grid of Stargate nodes with 45 m and 90 m separations respectively...on W and its scaled version W̃. III. EXPERIMENTAL SETUP Description of Kansei testbed. A Stargate is a single-board Linux-based computer [7]. It uses a

  19. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software-defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, for the first time, we report on the design, implementation, and demonstration of a novel OpenFlow-based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching.
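The paper's actual OpenFlow extensions are not reproduced in this record; the sketch below shows only the basic priority-ordered flow-table abstraction an OpenFlow controller manipulates, with a hypothetical `freq_slot` match field standing in for a flexi-grid extension:

```python
from dataclasses import dataclass

@dataclass
class FlowEntry:
    match: dict          # header fields to match, e.g. {"in_port": 1}
    actions: list        # e.g. ["output:3"]
    priority: int = 0

class FlowTable:
    """Priority-ordered flow table: first matching entry wins;
    a miss would normally be punted to the controller."""
    def __init__(self):
        self.entries = []

    def add(self, entry):
        self.entries.append(entry)
        self.entries.sort(key=lambda e: -e.priority)

    def lookup(self, packet):
        for e in self.entries:
            if all(packet.get(k) == v for k, v in e.match.items()):
                return e.actions
        return None          # table miss

table = FlowTable()
table.add(FlowEntry({"in_port": 1}, ["output:2"], priority=1))
# Hypothetical flexi-grid extension: also match a frequency-slot field.
table.add(FlowEntry({"in_port": 1, "freq_slot": 42}, ["output:7"], priority=10))
print(table.lookup({"in_port": 1, "freq_slot": 42}))
```

The point of a unified control plane is that the same table/controller abstraction covers both the packet entries and the (extended) optical entries.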

  20. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    DTIC Science & Technology

    2016-03-01

    distribution is unlimited. 13. SUPPLEMENTARY NOTES 14. ABSTRACT This document serves as the quick-start guide for GIFT Cloud, the web-based...to users with a GIFT Account at no cost. GIFT Cloud is a new implementation of GIFT. This web-based application allows learners, authors, and...distribution is unlimited. 3 3. Requirements for GIFT Cloud GIFT Cloud is accessed via a web browser. Officially, GIFT Cloud has been tested to work on

  1. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation-related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).

  2. An open, interoperable, transdisciplinary approach to a point cloud data service using OGC standards and open source software.

    NASA Astrophysics Data System (ADS)

    Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley

    2017-04-01

    High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focuses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for
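As a rough illustration of the kind of server-side derivative such a point-data service computes behind a WPS request, a bounding-box crop can be sketched in a few lines (this is not PDAL's API; the function and data are hypothetical):

```python
def crop(points, minx, miny, maxx, maxy):
    """Keep only (x, y, z) points inside a 2D bounding box - the
    simplest derivative product a point-data web service returns."""
    return [(x, y, z) for (x, y, z) in points
            if minx <= x <= maxx and miny <= y <= maxy]

# Toy point cloud: (x, y, z) tuples standing in for LiDAR returns.
pts = [(0.5, 0.5, 10.0), (2.0, 0.1, 12.0), (0.9, 0.9, 8.0)]
print(crop(pts, 0.0, 0.0, 1.0, 1.0))
```

In the architecture the abstract describes, the WPS layer would translate a client request into a call like this on the server, so the storage format behind it (LAS/LAZ or otherwise) stays invisible to the client.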

  3. High Resolution Cloud Microphysics and Radiation Studies

    DTIC Science & Technology

    2011-06-16

    characteristics of mid-level altocumulus clouds and upper-level visible and subvisual cirrus clouds. The MPL lidar provided information about the temporal...balloon, lidar, and radar study of cirrus and altocumulus clouds to further investigate the presence of multi-cloud and nearly cloud-free layers...data set of the clouds and thermodynamic structure to build a mesoscale and LES test-bed for cirrus and altocumulus cloud layers. The project was

  4. Delay Tolerant Networking on NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Johnson, Sandra; Eddy, Wesley

    2016-01-01

    This presentation covers the status of the implementation of open source software that implements specifications developed by two international working groups, through the IETF and CCSDS. That software, the Interplanetary Overlay Network (ION), was implemented on the SCaN Testbed, a testbed located on an external pallet on the ISS, by the GRC team. The presentation will cover the architecture of the system, high-level implementation details, and issues encountered in porting ION to VxWorks.
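ION's internals are not described in this record; the following toy sketch shows only the store-and-forward behaviour that distinguishes delay-tolerant networking from end-to-end IP: bundles are held at a node until a contact window to the next hop opens.

```python
from collections import deque

class DtnNode:
    """Toy delay-tolerant node: bundles are queued ('stored') while no
    contact exists, then forwarded in order when a contact opens."""
    def __init__(self):
        self.queue = deque()
        self.contact_open = False
        self.delivered = []

    def receive(self, bundle):
        self.queue.append(bundle)
        self._flush()

    def set_contact(self, is_open):
        self.contact_open = is_open
        self._flush()

    def _flush(self):
        while self.contact_open and self.queue:
            self.delivered.append(self.queue.popleft())

node = DtnNode()
node.receive("telemetry-1")   # no contact yet: stored, not forwarded
node.receive("telemetry-2")
node.set_contact(True)        # contact window opens: both go out in order
print(node.delivered)
```

Real ION adds persistent storage, contact plans, and the Bundle Protocol's header machinery on top of this basic hold-until-contact idea.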

  5. A Hybrid Cloud Computing Service for Earth Sciences

    NASA Astrophysics Data System (ADS)

    Yang, C. P.

    2016-12-01

    Cloud computing is becoming the norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service was built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper will report on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to XSEDE Jetstream and the Caltech experimental cloud computing environment for resource sharing; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, with DC2 used to bridge these and the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service will be included.

  6. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  7. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  9. Improving data workflow systems with cloud services and use of open data for bioinformatics research.

    PubMed

    Karim, Md Rezaul; Michel, Audrey; Zappa, Achille; Baranov, Pavel; Sahay, Ratnesh; Rebholz-Schuhmann, Dietrich

    2017-04-16

    Data workflow systems (DWFSs) enable bioinformatics researchers to combine components for data access and data analytics, and to share the final data analytics approach with their collaborators. Increasingly, such systems have to cope with large-scale data, such as full genomes (about 200 GB each), public fact repositories (about 100 TB of data) and 3D imaging data at even larger scales. As moving the data becomes cumbersome, the DWFS needs to embed its processes into a cloud infrastructure, where the data are already hosted. As the standardized public data play an increasingly important role, the DWFS needs to comply with Semantic Web technologies. This advancement to DWFS would reduce overhead costs and accelerate the progress in bioinformatics research based on large-scale data and public resources, as researchers would require less specialized IT knowledge for the implementation. Furthermore, the high data growth rates in bioinformatics research drive the demand for parallel and distributed computing, which then imposes a need for scalability and high-throughput capabilities onto the DWFS. As a result, requirements for data sharing and access to public knowledge bases suggest that compliance of the DWFS with Semantic Web standards is necessary. In this article, we will analyze the existing DWFS with regard to their capabilities toward public open data use as well as large-scale computational and human interface requirements. We untangle the parameters for selecting a preferable solution for bioinformatics research with particular consideration to using cloud services and Semantic Web technologies. Our analysis leads to research guidelines and recommendations toward the development of future DWFS for the bioinformatics research community. © The Author 2017. Published by Oxford University Press.

  10. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

    The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation of flight test to a ground test unit with contaminant dosing to determine applicability of ground testing.

  11. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation (see diagram) include: a) Boot Mode, which is initiated at application of power to the processor card and runs memory diagnostics; it may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode, which allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode, a predefined, minimum-power safehold configuration with power to experiments removed and carrier functionality minimized; it is used to troubleshoot problems that occur during flight. d) Operations Mode, used for normal experiment carrier operations; it may be entered only via ground command from Safe Mode.
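Mode logic like this is naturally expressed as a table-driven state machine. The sketch below encodes the one constraint the record states explicitly (Operations is entered only from Safe Mode, via ground command); the remaining transition entries are assumptions for illustration:

```python
# Allowed (mode, command) -> new mode transitions.
# Only SAFE -> OPERATIONS is stated in the text; the rest are assumed.
TRANSITIONS = {
    ("BOOT", "maintenance"): "MAINTENANCE",
    ("MAINTENANCE", "safe"): "SAFE",
    ("SAFE", "operations"): "OPERATIONS",   # the only path into Operations
    ("OPERATIONS", "safe"): "SAFE",
    ("SAFE", "maintenance"): "MAINTENANCE",
}

def command(mode, cmd):
    """Apply a ground command; illegal commands leave the mode unchanged."""
    return TRANSITIONS.get((mode, cmd), mode)

mode = "BOOT"
for cmd in ["operations", "maintenance", "safe", "operations"]:
    mode = command(mode, cmd)   # "operations" from BOOT is rejected
print(mode)
```

Keeping the legal transitions in one table makes the safehold guarantee auditable: there is simply no entry that reaches Operations except from Safe Mode.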

  12. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for these tools so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  13. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

    The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  14. Optical Network Testbeds Workshop

    SciTech Connect

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  15. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  16. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user community focused, with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently, the theme for ISGC 2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title - while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Second, the programming - ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF that highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented OGF's focus on standards development well. ISGC brought to OGF real-life use cases and needs to be

  17. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long baseline neutrino experiments like the NOvA experiment at Fermilab require large scale, compute intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.
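The embarrassingly parallel structure of such campaigns, where thousands of independently seeded jobs run on grid or cloud nodes and their outputs are merged afterwards, can be sketched with a toy Monte Carlo estimate (this stands in for the pattern only and is unrelated to the actual NOvA simulation code):

```python
import random

def mc_job(seed, n):
    """One grid job's worth of independent Monte Carlo throws:
    estimate pi by sampling the unit square. Each job gets its own
    seed so results from many jobs can be merged without correlation."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits, n

# Merge partial results, as an aggregation step would after the jobs run.
totals = [mc_job(seed, 50_000) for seed in range(20)]
hits = sum(h for h, _ in totals)
n = sum(m for _, m in totals)
print(round(4.0 * hits / n, 2))
```

On a real grid each `mc_job` call would be a separately submitted job; the only coordination needed is the seed assignment up front and the merge at the end, which is why these workloads map so cleanly onto opportunistic grid and cloud resources.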

  18. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to advance in exploring space frontiers, technology must also advance. The need for faster data recovery and data processing is crucial, and the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification is required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs test for anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high speed and high density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4-Mbit memory chips totaling 4 Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. This series of high-altitude flights, up to 70,000 feet, was effective at yielding the conditions under which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective, and orders of magnitude more memory became essential. Therefore, a project was created to test more memory within a given time. The goal of this project was not only to test more memory within a given time, but also to have a system with a faster processing speed that used fewer peripherals. This paper will describe procedures used to build an
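The write-pattern/read-compare monitoring loop such a testbed runs can be sketched as follows; the pattern, memory size, and injected upsets below are illustrative, not the EBS's actual configuration:

```python
def scrub(memory, pattern):
    """One monitoring pass: compare every word against the stored
    pattern, log the address and flipped bits of each error, and
    rewrite ('scrub') corrupted words so later passes start clean."""
    errors = []
    for addr, word in enumerate(memory):
        if word != pattern:
            errors.append((addr, word ^ pattern))  # which bits flipped
            memory[addr] = pattern
    return errors

PATTERN = 0xAA              # alternating-bit test pattern
mem = [PATTERN] * 1024      # toy stand-in for the SRAM under test
mem[17] ^= 0x04             # inject a single-event upset: bit 2 at addr 17
mem[600] ^= 0x81            # a two-bit upset at addr 600
errs = scrub(mem, PATTERN)
print(errs)
```

Logging the XOR with the pattern (rather than just the bad value) records exactly which bits flipped, which is the quantity of interest when characterizing SEE rates per bit.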

  19. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New ROBOSIM features, such as collision detection and new kinematics simulation methods, are also discussed. Based on the experience of the work on ROBOSIM, a new graphics structural modeling environment is suggested, intended to be part of a new knowledge-based multiple-aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First, a geometrical structural model of the station, developed using the ROBOSIM package, is presented. Next, the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these application areas is the modeling of the Environmental Control and Life Support System (ECLSS), one of the most complex subsystems of the station. Using the multiple-aspect modeling methodology, a fault propagation model of this system is being built and is described.

  20. Adaptive Signal Processing Testbed

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1991-09-01

    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages, including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions, including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.
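    A toy sketch of LMS-based tone excision, one of the algorithm classes such a testbed is built to develop: a narrowband tone is predictable from past samples while a wideband spread-spectrum signal is not, so subtracting an adaptive prediction notches out the interference. The filter parameters and test signal below are invented for illustration, not taken from the ASPT itself.

```python
import numpy as np

def lms_excise(x, taps=16, mu=0.01, delay=1):
    """Adaptive line enhancer: predict (and subtract) the narrowband tone."""
    w = np.zeros(taps)
    out = np.zeros_like(x)
    for n in range(taps + delay, len(x)):
        ref = x[n - delay - taps:n - delay][::-1]  # delayed tap vector
        err = x[n] - w @ ref      # prediction error = tone-suppressed output
        w += 2 * mu * err * ref   # LMS weight update
        out[n] = err
    return out

fs = 1000.0
t = np.arange(4096) / fs
rng = np.random.default_rng(0)
wideband = 0.1 * rng.standard_normal(t.size)   # stand-in for the DSSS signal
x = wideband + np.sin(2 * np.pi * 100.0 * t)   # add a strong interfering tone
y = lms_excise(x)
# after convergence the tone is largely removed, while the wideband
# component (which is unpredictable) passes through
```

    On real hardware this update loop is what would run on the DSP (here, the TMS320C30) at the sample rate.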

  1. FASR Subsystem Testbed

    NASA Astrophysics Data System (ADS)

    Liu, Zhiwei; Gary, D. E.; Nita, G. M.; White, S. M.; Hurford, G. J.

    2006-06-01

    The construction of the Frequency Agile Solar Radiotelescope (FASR) Subsystem Testbed (FST) and first results are described. Three antennas of the Owens Valley Solar Array (OVSA) are upgraded with newly designed, state-of-the-art technology. The 1-9 GHz RF signal from the feed is transmitted through a fiber optic system to the control room, where it is downconverted to a 500 MHz single-sideband signal that can be tuned across the 1-9 GHz RF band. The data are sampled with an 8-bit, 1 GHz sampling-rate digitizer and saved to hard disk. The correlated (phase and amplitude) spectra are derived through offline software. As a prototype of the FASR system, FST provides the opportunity to study the design, calibration, and interference-avoidance requirements of FASR. FST provides, for the first time, the ability to perform broadband imaging spectroscopy with high spectral and temporal and moderate spatial resolution. With this three-element interferometer, we have the ability to determine the location of simple source structures with very high time resolution (20 ms) and frequency resolution (<1 MHz), as well as the dynamic spectrum. Initial examples of geostationary satellite, GPS satellite, and solar observations are presented.
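    The offline derivation of correlated amplitude and phase spectra can be illustrated with a toy correlator for one baseline; the segment length, tone frequency, and geometric delay below are invented, and this is not the actual FST software.

```python
import numpy as np

def cross_spectrum(v1, v2, nchan=256):
    """Average the per-segment cross-power spectrum of two sampled streams."""
    segs = len(v1) // nchan
    acc = np.zeros(nchan, dtype=complex)
    for k in range(segs):
        s1 = np.fft.fft(v1[k * nchan:(k + 1) * nchan])
        s2 = np.fft.fft(v2[k * nchan:(k + 1) * nchan])
        acc += s1 * np.conj(s2)           # accumulate cross-power
    acc /= segs
    return np.abs(acc), np.angle(acc)     # amplitude and phase spectra

fs, f0, delay = 1000.0, 125.0, 2e-3       # Hz, Hz, seconds (toy values)
t = np.arange(8192) / fs
v1 = np.sin(2 * np.pi * f0 * t)           # antenna 1
v2 = np.sin(2 * np.pi * f0 * (t - delay)) # antenna 2, geometrically delayed
amp, phase = cross_spectrum(v1, v2)
ch = int(np.argmax(amp[:128]))            # strongest positive-frequency channel
# phase[ch] recovers the interferometric phase 2*pi*f0*delay
```

    The recovered per-channel phase is what a multi-baseline system would use to locate source structures across the band.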

  2. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT (Computational Tomography) or MRI (Magnetic Resonance Imaging) guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled 'Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification' is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  6. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent under a changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge that could be key to validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean in comparison with the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not previously been presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was the increased heat fluxes and induced turbulence from the warm ocean surface, as expected. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  7. RF testbed for thermoacoustic tomography.

    PubMed

    Fallon, D; Yan, L; Hanson, G W; Patch, S K

    2009-06-01

    Thermoacoustic signal excitation is a function of intrinsic tissue properties and the illuminating electric field. De-ionized (DI) water is a preferred acoustic coupling medium for thermoacoustics because acoustic and electromagnetic waves propagate in DI water with very little loss. We have designed a water-filled testbed propagating a controlled electric field with respect to pulse shape, power, and polarization. Directional coupler line sections permit measurement of incident, reflected, and transmitted powers. Both S-parameter and E(y) measurements show that the electric-field distribution is relatively uniform in the testbed. Comparing baseline power measurements to those taken with a test object in place yields the power loss in the object, which should correlate to thermoacoustic signal strength. Moreover, power loss--and therefore thermoacoustic computerized tomography signal strength--is sensitive to the orientation of the object to the polarization of the electric field. This testbed will enable quantitative characterization of the thermoacoustic contrast mechanism in ex vivo tissue specimens.
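    The baseline-versus-object comparison is simple power bookkeeping: with incident, reflected, and transmitted powers read off the directional couplers, the power absorbed in the test section follows from conservation. The wattages below are invented for illustration, not measurements from the actual testbed.

```python
# Power balance for a directional-coupler measurement: absorbed power is
# what remains of the incident power after reflection and transmission.

def absorbed_power(p_incident, p_reflected, p_transmitted):
    p_abs = p_incident - p_reflected - p_transmitted
    if p_abs < 0:
        raise ValueError("non-physical power balance")
    return p_abs

baseline = absorbed_power(100.0, 5.0, 80.0)  # water-only run: 15 W absorbed
with_obj = absorbed_power(100.0, 7.0, 58.0)  # test object in place: 35 W
object_loss = with_obj - baseline            # power deposited in the object
```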

  8. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  9. The ac power system testbed

    NASA Technical Reports Server (NTRS)

    Mildice, J.; Sundberg, R.

    1987-01-01

    The object of this program was to design, build, test, and deliver a high frequency (20 kHz) Power System Testbed which would electrically approximate a single, separable power channel of an IOC Space Station. That program is described, including the technical background, and the results are discussed showing that the major assumptions about the characteristics of this class of hardware (size, mass, efficiency, control, etc.) were substantially correct. This testbed equipment was completed and delivered and is being operated as part of the Space Station Power System Test Facility.

  10. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models such as HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than about 1% can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others.
Models for attenuation and structural damage
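    The rate-setting logic described above can be sketched as a toy Monte Carlo: simulate annual portfolio losses, then size premium income so the probability that losses exceed premiums plus reserves (the ruin probability) stays below the ~1% target. The loss model and all numbers here are invented for illustration.

```python
import random

random.seed(42)

def annual_loss(n_policies=2000, p_event=0.01, mean_claim=50_000.0):
    """One simulated year: count claiming policies, multiply by mean claim."""
    claims = sum(1 for _ in range(n_policies) if random.random() < p_event)
    return claims * mean_claim

losses = sorted(annual_loss() for _ in range(1000))
reserves = 1e6
var99 = losses[int(0.99 * len(losses))]          # 99th-percentile annual loss
premium_income = max(var99 - reserves, 0.0)      # cover up to the 99th percentile
ruin_prob = sum(L > reserves + premium_income for L in losses) / len(losses)
# by construction ruin_prob comes out at or below the 1% target
```

    A production model would replace the Bernoulli claim model with scenario-earthquake loss distributions from the attenuation and damage models.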

  11. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures, to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data and higher-order data products, as well as in the user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high-performance disk storage (SSD) for the hot areas and less expensive, slower disk for the cold ones, thereby optimizing price-to-performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high-resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
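    The hot/cold classification driving the tiered-storage decision can be sketched as follows: rank tiles of the archive by access count and pin the top fraction to fast SSD, leaving the rest on cheaper disk. The tile IDs, the access log, and the 20% cutoff are invented for illustration and are not OT's actual policy.

```python
from collections import Counter

def assign_tiers(access_log, hot_fraction=0.2):
    """Map each tile ID to 'ssd' (hot) or 'disk' (cold) by access frequency."""
    counts = Counter(access_log)
    ranked = [tile for tile, _ in counts.most_common()]  # most-accessed first
    n_hot = max(1, int(len(ranked) * hot_fraction))
    hot = set(ranked[:n_hot])
    return {tile: ("ssd" if tile in hot else "disk") for tile in ranked}

# Toy access log: one heavily requested tile, several rarely touched ones
log = ["t1"] * 50 + ["t2"] * 30 + ["t3"] * 5 + ["t4"] * 2 + ["t5"]
tiers = assign_tiers(log)
```

    In practice the "log" would be the spatiotemporal access metrics described above, recomputed periodically so tiles migrate between tiers as usage shifts.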

  12. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.

  13. Large Eddy Simulation of VOCALS RF06: The Role of Cloud Droplet Number Concentration Gradients in Pockets of Open Cells

    NASA Astrophysics Data System (ADS)

    Berner, A. H.; Bretherton, C. S.; Wood, R.; Blossey, P. N.

    2009-12-01

    The VAMOS Ocean-Land-Atmosphere Study (VOCALS) REx field campaign sampled several excellent cases of pockets of open cells (POCs) embedded in a fully cloud-covered stratocumulus layer, most notably NSF C-130 flight RF06, which sampled across the boundary of a well-defined POC between 0500 and 1000 local time on October 26th, 2008. We present the initial results of Large Eddy Simulation (LES) modeling of RF06 and examine the fidelity of the simulation in reproducing the effects of the observed gradients of cloud droplet concentration, most visibly the difference in cloud characteristics inside vs. outside the POC. The LES simulations were initialized with soundings constructed from aircraft data and NCEP reanalysis. Observations indicated a sharp transition in cloud droplet number concentration across the POC boundary. The SAM LES of Marat Khairoutdinov was run using CAM radiation and Morrison (2005) microphysics, with cloud droplet concentration Nc treated as an advected scalar without microphysical sources and sinks as a first step toward a realistic treatment of aerosols. The simulations were initialized with a step-function change in Nc from 60 within the overcast region to 10 within the POC region, and with Nc equal to 10 above the inversion. A doubly periodic ‘bowling alley’ domain with horizontal dimensions of 192 km x 24 km is used to simulate a transect across the POC. The horizontal resolution is 125 m; the vertical resolution varies from 20 m near the surface to 5 m around the inversion, then stretches to the domain top at 30 km. The runs start at 0300 local time and continue for 18 hours across the diurnal cycle of insolation. Mesoscale circulations rapidly develop within the domain, with low-level outflow from the POC to overcast regions and inflow near the top of the boundary layer from the overcast region into the POC. Drizzle cells develop within the POC and along its boundaries, consistent with observations, though actual precipitation amounts
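    The step-function initialization of the droplet-number scalar can be sketched directly: Nc is set to 10 inside the POC region and above the inversion, and 60 in the overcast region. The grid below is a tiny stand-in, not the 192 km x 24 km LES domain, and all indices are illustrative.

```python
import numpy as np

nx, nz = 192, 40                 # columns along the transect, vertical levels
z_inv = 30                       # inversion level index (illustrative)
in_poc = np.arange(nx) < 64      # POC occupies one end of the transect

# Passive advected scalar: 10 /cm^3 in the POC, 60 /cm^3 in overcast air
nc = np.where(in_poc[:, None], 10.0, 60.0) * np.ones((nx, nz))
nc[:, z_inv:] = 10.0             # free troposphere above the inversion
```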

  14. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  15. A Kenyan Cloud School. Massive Open Online & Ongoing Courses for Blended and Lifelong Learning

    ERIC Educational Resources Information Center

    Jobe, William

    2013-01-01

    This research describes the predicted outcomes of a Kenyan Cloud School (KCS), which is a MOOC that contains all courses taught at the secondary school level in Kenya. This MOOC will consist of online, ongoing subjects in both English and Kiswahili. The KCS subjects offer self-testing and peer assessment to maximize scalability, and digital badges…

  16. A Framework to Evaluate Unified Parameterizations for Seasonal Prediction: An LES/SCM Parameterization Test-Bed

    DTIC Science & Technology

    2012-09-30

    ... models. In particular we will: i) develop a Single Column Model (SCM) version of the latest operational NOGAPS that can be used to simulate GEWEX Cloud ... unlimited. A framework to evaluate unified parameterizations for seasonal prediction: an LES/SCM parameterization test-bed. Joao Teixeira, Jet Propulsion Laboratory ... iii) develop an integrated framework to use the NOGAPS SCM and the LES model as a parameterization test-bed. APPROACH: It is well accepted that sub

  17. A Framework to Evaluate Unified Parameterizations for Seasonal Prediction: An LES/SCM Parameterization Test-Bed

    DTIC Science & Technology

    2013-09-30

    Seasonal Prediction: An LES/SCM Parameterization Test-Bed. Joao Teixeira, Jet Propulsion Laboratory, California Institute of Technology, MS 169-237 ... a Single Column Model (SCM) version of the latest operational NAVGEM that can be used to simulate GEWEX Cloud Systems Study (GCSS) case-studies; ii ... use the NAVGEM SCM and the LES model as a parameterization test-bed. APPROACH: It is well accepted that sub-grid physical processes such as

  18. High-contrast imaging testbed

    SciTech Connect

    Baker, K; Silva, D; Poyneer, L; Macintosh, B; Bauman, B; Palmer, D; Remington, T; Delgadillo-Lariz, M

    2008-01-23

    Several high-contrast imaging systems are currently under construction to enable the detection of extra-solar planets. In order for these systems to achieve their objectives, however, there is considerable developmental work and testing which must take place. Given the need to perform these tests, a spatially-filtered Shack-Hartmann adaptive optics system has been assembled to evaluate new algorithms and hardware configurations which will be implemented in these future high-contrast imaging systems. In this article, construction and phase measurements of a membrane 'woofer' mirror are presented. In addition, results from closed-loop operation of the assembled testbed with static phase plates are presented. The testbed is currently being upgraded to enable operation at speeds approaching 500 Hz and to enable studies of the interactions between the woofer and tweeter deformable mirrors.

  19. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  20. Fading testbed for free-space optical communications

    NASA Astrophysics Data System (ADS)

    Shrestha, Amita; Giggenbach, Dirk; Mustafa, Ahmad; Pacheco-Labrador, Jorge; Ramirez, Julio; Rein, Fabian

    2016-10-01

    Free-space optical (FSO) communication is a very attractive technology offering very high throughput without spectral regulation constraints, yet allowing small antennas (telescopes) and tap-proof communication. However, the transmitted signal has to travel through the atmosphere, where it is influenced by atmospheric turbulence, causing scintillation of the received signal. In addition, climatic effects like fog, clouds, and rain also affect the signal significantly. Moreover, FSO, being a line-of-sight communication, requires precise pointing and tracking of the telescopes, which otherwise also causes fading. To achieve error-free transmission, various mitigation techniques like aperture averaging, adaptive optics, transmitter diversity, and sophisticated coding and modulation schemes are being investigated and implemented. Evaluating the performance of such systems under controlled conditions is very difficult in field trials, since the atmospheric situation constantly changes and the target scenario (e.g. on aircraft or satellites) is not easily accessible for test purposes. Therefore, with the motivation to be able to test and verify a system under laboratory conditions, DLR has developed a fading testbed that can emulate realistic channel conditions. The main principle of the fading testbed is to control the input current of a variable optical attenuator such that it attenuates the incoming signal according to the loaded power vector. The sampling frequency and mean power of the vector can optionally be changed according to requirements. This paper provides a brief introduction to the software and hardware development of the fading testbed and measurement results showing its accuracy and application scenarios.
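    The replay principle can be sketched numerically: a stored power vector (channel attenuation over time, in dB) is resampled to the chosen playback length and applied to the carrier. Here the attenuator is simple dB subtraction and the fade values are invented; the real testbed drives a variable optical attenuator's input current instead.

```python
import numpy as np

def apply_fading(tx_power_dbm, atten_db_vector, playback_len):
    """Resample the stored attenuation vector and apply it to a constant carrier."""
    idx = np.linspace(0, len(atten_db_vector) - 1, playback_len)
    atten = atten_db_vector[idx.round().astype(int)]  # nearest-sample resampling
    return tx_power_dbm - atten                       # received power in dBm

power_vector = np.array([3.0, 10.0, 25.0, 10.0, 3.0])  # deep fade mid-vector
rx_dbm = apply_fading(0.0, power_vector, playback_len=10)
# the replayed trace reproduces the 25 dB fade at the new playback rate
```

    Scaling the vector's mean or the playback length corresponds to the testbed's adjustable mean power and sampling frequency.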

  1. The NASA/OAST telerobot testbed architecture

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.; Zimmerman, W. F.; Dolinsky, S.

    1989-01-01

    Developed in phases as a laboratory-based research testbed, the NASA/OAST Telerobot Testbed provides an environment for system test and demonstration of technology that will usefully complement, significantly enhance, or even replace manned space activities. By integrating advanced sensing, robotic manipulation and intelligent control under human-interactive supervision, the Testbed will ultimately demonstrate execution of a variety of generic tasks suggestive of space assembly, maintenance, repair, and telescience. The Testbed system features a hierarchical layered control structure compatible with the incorporation of evolving technologies as they become available. The Testbed system is physically implemented in a computing architecture which allows for ease of integration of these technologies while preserving the flexibility for test of a variety of man-machine modes. The development currently in progress on the functional and implementation architectures of the NASA/OAST Testbed is presented, along with capabilities planned for the coming years.

  2. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, the future technologies of wave front sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate the active optical devices such as fast steering mirror, deformable mirror, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integrations and tests.
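
    As a rough illustration of the processing behind a Shack-Hartmann camera like the one in the testbed: each lenslet's focal-spot displacement from its reference (flat-wavefront) position, scaled by the detector pixel pitch and divided by the lenslet focal length, gives the local wavefront slope. A minimal sketch under an assumed thin-lens geometry; the function name, pitch, and focal length are hypothetical, not AWCT parameters.

```python
import numpy as np

def shack_hartmann_slopes(spot_xy, ref_xy, pixel_pitch, focal_length):
    """Convert Shack-Hartmann spot displacements to wavefront slopes.

    spot_xy, ref_xy : (N, 2) centroid positions in pixels.
    pixel_pitch     : detector pixel size in metres.
    focal_length    : lenslet focal length in metres.
    Returns (N, 2) local slopes dW/dx, dW/dy in radians.
    """
    disp_m = (np.asarray(spot_xy) - np.asarray(ref_xy)) * pixel_pitch
    return disp_m / focal_length

# One lenslet: spot moved 2 pixels in x on a 5 micron pitch, f = 5 mm lenslet.
slopes = shack_hartmann_slopes([[12.0, 10.0]], [[10.0, 10.0]],
                               pixel_pitch=5e-6, focal_length=5e-3)
# x-slope = 2 * 5e-6 / 5e-3 = 2e-3 rad; y-slope = 0
```

    A full reconstructor would then integrate these slope measurements (e.g. by least squares) to recover the wavefront over the pupil.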

  4. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    NASA Astrophysics Data System (ADS)

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; Qian, Yun; Zhang, Kai; Wang, Yuhang; Yang, Xiu-Qun

    2016-11-01

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol-radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m-2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m-2), while fire POM induces a small effect (-0.05 and 0.04 ± 0.01 W m-2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol-cloud interactions (REaci) of all fire aerosols is -0.70 ± 0.05 W m-2, resulting mainly from the fire POM effect (-0.59 ± 0.03 W m-2). REari (0.43 ± 0.03 W m-2) and REaci (-1.38 ± 0.23 W m-2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and -0.82 ± 0.09 W m-2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to -15 W m-2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m-2) is small and statistically insignificant and is mainly due to the fire BC-in-snow effect (0.02 W m-2) with the maximum albedo effect
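
    The radiative-effect bookkeeping above (RE defined as the flux difference relative to a no-fire case, reported as a global annual mean in W m-2) can be sketched as an area-weighted difference of two model runs. This is an illustrative sketch, not CAM5 code; the array shapes, variable names, and flux values are hypothetical.

```python
import numpy as np

def radiative_effect(flux_fire, flux_nofire, area_weights):
    """Global-mean radiative effect of fire aerosols (W m^-2).

    RE is the top-of-atmosphere flux difference between a simulation
    with fire aerosol emissions and one without, averaged over the
    globe with per-cell area weights.
    """
    diff = flux_fire - flux_nofire
    return np.sum(diff * area_weights) / np.sum(area_weights)

# Toy 4x8 lat-lon grid with cos(latitude) area weights.
lats = np.deg2rad(np.linspace(-67.5, 67.5, 4))
w = np.repeat(np.cos(lats)[:, None], 8, axis=1)
fire = np.full((4, 8), 240.3)     # hypothetical TOA flux, W m^-2
nofire = np.full((4, 8), 240.0)
re = radiative_effect(fire, nofire, w)
```

    Uncertainty estimates such as the paper's 1σ ranges would come from interannual variability of this mean across simulation years.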

  5. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis with the CSM Testbed; companion documents include the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  6. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, databases and applications. Network usage and demands are growing rapidly, and meeting current requirements calls for automatic infrastructure scaling. Traditional networks are difficult to automate because their decision-making processes for switching and routing are distributed and collocated on the same devices. Managing complex environments using traditional networks is time-consuming and expensive, especially for generating virtual machines, migration and network configuration. To mitigate these challenges, network operations require efficient, flexible, agile and scalable software-defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate network-management issues. A private cloud prototype testbed was set up to implement SDN on the OpenStack platform and to test and evaluate network performance under various configurations.

  7. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance in-flight medical care and medical education, NASA has established and operated several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence the delivery of health care both in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. 
As NASA continues to meet the challenges of space flight, the

  8. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information, are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages, and we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  9. A Variable Dynamic Testbed Vehicle

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1995-01-01

    This paper describes the concept of a potential test vehicle for the National Highway Traffic Safety Administration (NHTSA) that is designed to evaluate the dynamics, human factors, and safety aspects of advanced technologies in passenger class automobiles expected to be introduced as a result of the Intelligent Vehicle/Highway System (IVHS) Program. The Variable Dynamic Testbed Vehicle (VDTV) requirements were determined from the inputs of anticipated users and possible research needs of NHTSA. Design and implementation approaches are described, the benefits of the vehicle are discussed, and costs for several options are presented.

  10. Control design for the SERC experimental testbeds

    NASA Technical Reports Server (NTRS)

    Jacques, Robert; Blackwood, Gary; Macmartin, Douglas G.; How, Jonathan; Anderson, Eric

    1992-01-01

    Viewgraphs on control design for the Space Engineering Research Center experimental testbeds are presented. Topics covered include: SISO control design and results; sensor and actuator location; model identification; control design; experimental results; preliminary LAC experimental results; active vibration isolation problem statement; base flexibility coupling into isolation feedback loop; cantilever beam testbed; and closed loop results.

  11. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). The FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  12. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  13. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems
    • Provides an open-source application that can be used to implement a cloud computing environment on a datacenter
    • Trying to establish an...edgeplatform.html
    • Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/
    • Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/
    • Eucalyptus

  14. Experimental nowcasting and short-range forecasting of severe storms at the ESSL Testbed

    NASA Astrophysics Data System (ADS)

    Groenemeijer, Pieter; Holzer, Alois M.; Pistotnik, Georg; Riemann-Campe, Kathrin

    2013-04-01

    From 4 June to 6 July 2012, the first ESSL Testbed took place at the Research and Training Centre of the European Severe Storms Laboratory in Wiener Neustadt, Austria. During this time, researchers and forecasters worked closely together putting new forecast-support products to the test. The Testbed's main activity was to prepare experimental forecasts for severe weather, of which short-range forecasts and nowcasts for the following 2 hours formed an important part. These nowcasts were made using new tools based on NWP, radar and satellite, as well as surface and upper-air observations. Subsequently, a verification of the forecasts was performed using the European Severe Weather Database, followed by an evaluation of forecasting tools and techniques. Inspired by the annual Spring Program at NOAA's Hazardous Weather Testbed (HWT), the ESSL Testbed has a stronger focus on forecaster training than the HWT. Given the various backgrounds of the participants, an important Testbed goal is to acquaint participants with severe weather forecasting methods and techniques that work universally. Among the tools evaluated at the 2012 Testbed were visualizations of high-resolution ensemble NWP (DWD's COSMO-DE-EPS), satellite-based cloud-top cooling and overshooting-top detection algorithms, lightning detection, and satellite- and radar-based cell-tracking algorithms (DLR's Cb-TRAM and RadTRAM, and DWD's NowcastMix). In daily "Expert Lectures", which were broadcast online to remote participants, researchers provided background information on their products, and internationally renowned experts in forecasting presented their viewpoints on storm forecasting and its scientific roots. 
Organized by ESSL in close cooperation with the Austrian Central Institute for Meteorology and Geodynamics (ZAMG), the Testbed was supported - among others - by the German Weather Service (DWD), EUMETSAT, WMO, ECMWF, VAISALA, and the GOES-R programme, providing products for evaluation and

  15. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  16. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    SciTech Connect

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; Qian, Yun; Zhang, Kai; Wang, Yuhang; Yang, Xiu-Qun

    2016-11-29

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m-2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m-2), while fire POM induces a small effect (-0.05 and 0.04 ± 0.01 W m-2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is -0.70 ± 0.05 W m-2, resulting mainly from the fire POM effect (-0.59 ± 0.03 W m-2). REari (0.43 ± 0.03 W m-2) and REaci (-1.38 ± 0.23 W m-2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and -0.82 ± 0.09 W m-2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to -15 W m-2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m-2) is small

  17. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    DOE PAGES

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; ...

    2016-11-29

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m−2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m−2), while fire POM induces a small effect (−0.05 and 0.04 ± 0.01 W m−2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is −0.70 ± 0.05 W m−2, resulting mainly from the fire POM effect (−0.59 ± 0.03 W m−2). REari (0.43 ± 0.03 W m−2) and REaci (−1.38 ± 0.23 W m−2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and −0.82 ± 0.09 W m−2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to −15 W m−2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m−2) is small

  18. On galaxy structure: CO clouds, open clusters and stars between 270 and 300 deg

    NASA Astrophysics Data System (ADS)

    Giorgi, E. E.; Carraro, G.; Moitinho, A.; Perren, G. I.; Bronfman, L.; Vázquez, R. A.

    2017-10-01

    The most used open cluster databases of our Galaxy include about 240 objects located in the region to in galactic longitude and to in galactic latitude. Only 146 out of the total number of these clusters have been investigated with some detail. On this occasion we present preliminary results of a study including optical and CO radio observations sweeping the above mentioned extension of the Milky Way combined with literature data. As for optical data we have selected a total of 16 regions including potential clusters (some of them never observed before) to be surveyed in the system with the main purpose of scrutinising not only the properties of the open cluster system in that place but also to detect and characterise the properties of field hot stars that could help to reveal the far spiral structure in this place. The present study is a continuation of our sine die project aimed at describing the spiral structure in the third and fourth galactic quadrants.

  19. The HERschel Inventory of the Agents of Galaxy Evolution in the Magellanic Clouds, a HERschel Open Time Key Program

    NASA Technical Reports Server (NTRS)

    Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.; Misselt, K.; Okumura, K.; Chanial, P.; Beck, T.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Boyer, M. L.; Carlson, L. R.; Clayton, G. C.; Chen, C.-H. R.; Cormier, D.; Fukui, Y.; Galametz, M.; Galliano, F.

    2013-01-01

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  20. THE HERSCHEL INVENTORY OF THE AGENTS OF GALAXY EVOLUTION IN THE MAGELLANIC CLOUDS, A HERSCHEL OPEN TIME KEY PROGRAM

    SciTech Connect

    Meixner, M.; Roman-Duval, J.; Seale, J.; Gordon, K.; Beck, T.; Boyer, M. L.; Panuzzo, P.; Hony, S.; Sauvage, M.; Okumura, K.; Chanial, P.; Babler, B.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Carlson, L. R.; Clayton, G. C.; and others

    2013-09-15

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 μm with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 μm band is the most sensitive and the source catalogs for this band have ~25,000 objects for the LMC and ~5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  1. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    SciTech Connect

    Sun, C.H.; Thorne, L.R.

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the "local" turbulence is small compared with the advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
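
    A minimal sketch of how Taylor's hypothesis converts a zenith time series into spatial statistics: under the frozen-turbulence assumption the cloud field is advected past the instrument unchanged, so a covariance at time lag tau equals the spatial covariance at separation x = U * tau, where U is the mean wind speed. Variable names and numbers here are illustrative, not taken from the study.

```python
import numpy as np

def temporal_to_spatial_lags(time_lags_s, mean_wind_ms):
    """Map temporal lags (s) of a zenith time series to spatial lags (m)
    via Taylor's hypothesis: x = U * tau."""
    return np.asarray(time_lags_s) * mean_wind_ms

def temporal_autocovariance(series, max_lag):
    """Biased sample autocovariance of a zenith time series, lags 0..max_lag."""
    x = np.asarray(series, dtype=float) - np.mean(series)
    n = len(x)
    return np.array([np.dot(x[:n - k], x[k:]) / n for k in range(max_lag + 1)])

# Example: temporal lags 0..60 s at a 10 m/s mean wind correspond to
# spatial separations 0..600 m.
lags_m = temporal_to_spatial_lags(np.arange(0, 61, 10), mean_wind_ms=10.0)
```

    Reinterpreting `temporal_autocovariance` at the lags returned by `temporal_to_spatial_lags` gives the spatial autocovariance, valid only where the study's condition (local turbulence small compared with the mean wind) holds.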

  2. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET) to fly on the International Space Station to test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  4. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous works we have defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note, we will further develop that application and design the detail of the implementation of such a model. First we present the environment of our application by describing the thermal management problem and an abstraction, which was called TESTBED, that includes a specific function for each module in the architecture, and the nature of the interfaces between each pair of blocks.

  6. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective, a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts, and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management, and Trajectory Based Operations are developing concepts and models; the SMART-NAS Testbed, System Assurance Technologies, and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system and to accept data feeds, allowing shadowing of actual operations in real-time, fast-time, or hybrid modes in distributed environments, and will enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post-operations, will be driven towards "real-time" assessments in the 2035 time frame.

  7. Experiments Program for NASA's Space Communications Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard

    2012-01-01

    NASA developed a testbed for communications and navigation that was launched to the International Space Station in 2012. The testbed promotes new software defined radio (SDR) technologies and addresses associated operational concepts for space-based SDRs, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. The experiments program consists of a mix of in-house and external experiments from partners in industry, academia, and government. The experiments will investigate key challenges in communications, networking, and global positioning system navigation both on the ground and on orbit. This presentation will discuss some of the key opportunities and challenges for the testbed experiments program.

  8. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into the processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to the water and sediment transport models already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
    Figure 1. Isosurfaces representing the evolution of the shoreline and a z=4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
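The point-sampling-to-raster step in workflows like these (GRASS GIS modules such as r.in.xyz and r.in.lidar perform this binning natively) can be sketched in plain NumPy; the grid size, synthetic surface, and function name below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def points_to_dem(x, y, z, cell=1.0):
    """Bin scattered lidar returns into a raster of mean elevation per cell;
    the same binning idea behind GRASS r.in.xyz / r.in.lidar (illustrative)."""
    col = ((x - x.min()) / cell).astype(int)
    row = ((y - y.min()) / cell).astype(int)
    nrow, ncol = row.max() + 1, col.max() + 1
    sums = np.zeros((nrow, ncol))
    counts = np.zeros((nrow, ncol))
    np.add.at(sums, (row, col), z)      # unbuffered accumulation per cell
    np.add.at(counts, (row, col), 1)
    with np.errstate(divide="ignore", invalid="ignore"):
        dem = sums / counts             # NaN where a cell received no points
    return dem

# Example: 5000 synthetic returns over a 10 m x 10 m area, 1 m cells
rng = np.random.default_rng(1)
x, y = rng.uniform(0, 10, 5000), rng.uniform(0, 10, 5000)
z = np.sin(x) + 0.01 * rng.standard_normal(5000)  # gentle synthetic surface
dem = points_to_dem(x, y, z, cell=1.0)
```

In practice, per-cell statistics other than the mean (minimum, maximum, percentiles) are used for ground filtering and canopy surfaces.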

  9. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) at more than an order of magnitude higher resolution than those currently available, LIDAR data allow earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools that enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of in their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the

  10. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
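The stochastic search that such frameworks orchestrate can be illustrated with a minimal elitist evolutionary loop. This is a toy sketch of the general technique, not BluePyOpt's actual API (BluePyOpt builds on DEAP's algorithms); all names and parameters below are assumptions.

```python
import random

def evolve(fitness, bounds, pop_size=30, generations=60, sigma=0.1, seed=0):
    """Minimal (mu + lambda)-style evolutionary search: mutate each parent,
    then keep the best pop_size individuals. `fitness` is minimised over
    box-constrained parameters given as (low, high) pairs."""
    rng = random.Random(seed)

    def clip(v, lo, hi):
        return min(max(v, lo), hi)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        children = []
        for parent in pop:
            # Gaussian mutation scaled to each parameter's range
            child = [clip(p + rng.gauss(0.0, sigma * (hi - lo)), lo, hi)
                     for p, (lo, hi) in zip(parent, bounds)]
            children.append(child)
        pop = sorted(pop + children, key=fitness)[:pop_size]  # elitist selection
    return pop[0]

# Toy use: recover the minimum of a quadratic "model mismatch" score
target = [0.3, -1.2]
best = evolve(lambda p: sum((a - b) ** 2 for a, b in zip(p, target)),
              bounds=[(-2, 2), (-2, 2)])
```

Real model-fitting problems replace the quadratic toy fitness with simulator runs scored against experimental features, which is exactly the evaluation step such frameworks abstract away.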

  11. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  12. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  14. Testbed for Tactical Networking and Collaboration

    DTIC Science & Technology

    2010-01-01

    dodccrp.org | Focus & Convergence for Complex Endeavors | The International C2 Journal | Vol 4, No 3 | Testbed for Tactical Networking and Collaboration ... interface, self-aligning directional antennas, Hyper-Nodes with 8th Layer (Bordetsky & Hayes-Roth, 2007). Extending tactical self-forming ... the "flattened" infrastructure of committee, team, and group team working clusters, as depicted in Figure 18. BORDETSKY & NETZER | Testbed for

  15. A satellite observation test bed for cloud parameterization development

    NASA Astrophysics Data System (ADS)

    Lebsock, M. D.; Suselj, K.

    2015-12-01

    We present an observational test-bed of cloud and precipitation properties derived from CloudSat, CALIPSO, and the A-Train. The focus of the test-bed is on marine boundary layer clouds, including stratocumulus and cumulus and the transition between these cloud regimes. Test-bed properties include the cloud cover and three-dimensional cloud fraction along with the cloud water path and precipitation water content, and associated radiative fluxes. We also include the subgrid-scale distribution of cloud, precipitation, and radiative quantities, which must be diagnosed by a model parameterization. The test-bed further includes meteorological variables from the Modern Era Retrospective-analysis for Research and Applications (MERRA). MERRA variables provide the initialization and forcing datasets to run a parameterization in Single Column Model (SCM) mode. We show comparisons of an Eddy-Diffusivity/Mass-Flux (EDMF) parameterization coupled to microphysics and macrophysics packages run in SCM mode with observed clouds. Comparisons are performed regionally in areas of climatological subsidence as well as stratified by dynamical and thermodynamical variables. Comparisons demonstrate the ability of the EDMF model to capture the observed transitions between subtropical stratocumulus and cumulus cloud regimes.

  16. Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal; Rizzo, Maxime; Thompson, Patrick

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept study and a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC would provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched to a heliocentric Earth-trailing drift-away orbit, with a 5-year mission lifetime. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables starlight suppression in broadband light from 480-960 nm. To demonstrate the VNC approach and advance its technology readiness we have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  17. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  18. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  19. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Choularton, Thomas W.; Crosier, Jonathan; Bower, Keith N.; Gallagher, Martin W.; Davies, Rhiannon S.; Renfrew, Ian A.; Elvidge, Andrew D.; Darbyshire, Eoghan; Marenco, Franco; Brown, Philip R. A.; Ricketts, Hugo M. A.; Connolly, Paul J.; Lloyd, Gary; Williams, Paul I.; Allan, James D.; Taylor, Jonathan W.; Liu, Dantong; Flynn, Michael J.

    2016-11-01

    In situ airborne observations of cloud microphysics, aerosol properties, and thermodynamic structure over the transition from sea ice to ocean are presented from the Aerosol-Cloud Coupling And Climate Interactions in the Arctic (ACCACIA) campaign. A case study from 23 March 2013 provides a unique view of the cloud microphysical changes over this transition under cold-air outbreak conditions. Cloud base lifted and cloud depth increased over the transition from sea ice to ocean. Mean droplet number concentrations, Ndrop, also increased from 110 ± 36 cm-3 over the sea ice to 145 ± 54 cm-3 over the marginal ice zone (MIZ). Downstream over the ocean, Ndrop decreased to 63 ± 30 cm-3. This reduction was attributed to enhanced collision-coalescence of droplets within the deep ocean cloud layer. The liquid water content increased almost four fold over the transition and this, in conjunction with the deeper cloud layer, allowed rimed snowflakes to develop and precipitate out of cloud base downstream over the ocean. The ice properties of the cloud remained approximately constant over the transition. Observed ice crystal number concentrations averaged approximately 0.5-1.5 L-1, suggesting only primary ice nucleation was active; however, there was evidence of crystal fragmentation at cloud base over the ocean. Little variation in aerosol particle number concentrations was observed between the different surface conditions; however, some variability with altitude was observed, with notably greater concentrations measured at higher altitudes ( > 800 m) over the sea ice. Near-surface boundary layer temperatures increased by 13 °C from sea ice to ocean, with corresponding increases in surface heat fluxes and turbulent kinetic energy. These significant thermodynamic changes were concluded to be the primary driver of the microphysical evolution of the cloud. This study represents the first investigation, using in situ airborne observations, of cloud microphysical changes with

  20. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some result from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

    With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering an insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere, and the freeze-up followed as the warm surface lost heat to the atmosphere.
While the surface type had a pronounced impact on boundary

  1. Exploration Systems Health Management Facilities and Testbed Workshop

    NASA Technical Reports Server (NTRS)

    Wilson, Scott; Waterman, Robert; McCleskey, Carey

    2004-01-01

    Presentation Agenda: (1) Technology Maturation Pipeline (The Plan) (2) Cryogenic testbed (and other KSC Labs) (2a) Component / Subsystem technologies (3) Advanced Technology Development Center (ATDC) (3a) System / Vehicle technologies (4) ELV Flight Experiments (Flight Testbeds).

  2. A Manufacturing B2B Interoperability Testbed

    SciTech Connect

    Ivezic, Nenad; Kulvatunyou, Boonserm; Jones, Albert

    2003-10-01

    This paper describes the NIST Manufacturing Business-to-Business Interoperability Testbed developed at the Manufacturing Systems Integration Division of the National Institute of Standards and Technology. The testbed is geared to advance the state of practice and art in interoperable information systems for the extended manufacturing enterprise. We discuss lessons learned while developing a Web-based, distributed architecture in support of piloting and testing interoperability trials in collaboration with manufacturing organisations and software vendors. We make a case that the combination of industry-focused testing activities and bottom-up testing research and development efforts within this testbed offers unique benefits to all stakeholders in advancing interoperable systems for the manufacturing enterprise.

  3. Thermal structure analyses for CSM testbed (COMET)

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Mei, Chuh

    1994-01-01

    This document is the final report for the project entitled 'Thermal Structure Analyses for CSM Testbed (COMET),' for the period of May 16, 1992 - August 15, 1994. The project was focused on the investigation and development of finite element analysis capability of the computational structural mechanics (CSM) testbed (COMET) software system in the field of thermal structural responses. The stages of this project consisted of investigating present capabilities, developing new functions, analysis demonstrations, and research topics. The appendices of this report list the detailed documents of major accomplishments and demonstration runstreams for future references.

  4. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software based set of hierarchical tools was chosen to provide maximum flexibility, to ease in moving to new computers as technology improves and to take advantage of the inherent reliability and availability of commercially available computing systems.

  5. Towards a testbed for malicious code detection

    SciTech Connect

    Lo, R.; Kerchen, P.; Crawford, R.; Ho, W.; Crossley, J.; Fink, G.; Levitt, K.; Olsson, R.; Archer, M. . Div. of Computer Science)

    1991-01-01

    This paper proposes an environment for detecting many types of malicious code, including computer viruses, Trojan horses, and time/logic bombs. This malicious code testbed (MCT) is based upon both static and dynamic analysis tools developed at the University of California, Davis, which have been shown to be effective against certain types of malicious code. The testbed extends the usefulness of these tools by using them in a complementary fashion to detect more general cases of malicious code. Perhaps more importantly, the MCT allows administrators and security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict. 5 refs., 2 figs., 2 tabs.

  6. The design and implementation of the LLNL gigabit testbed

    SciTech Connect

    Garcia, D.

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of, and the need for, the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  7. Openings

    PubMed Central

    Selwyn, Peter A.

    2015-01-01

    Reviewing his clinic patient schedule for the day, a physician reflects on the history of a young woman he has been caring for over the past 9 years. What starts out as a routine visit then turns into a unique opening for communication and connection. A chance glimpse out the window of the exam room leads to a deeper meditation on parenthood, survival, and healing, not only for the patient but also for the physician. How many missed opportunities have we all had, without even realizing it, to allow this kind of fleeting but profound opening? PMID:26195687

  8. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    NASA Astrophysics Data System (ADS)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options for how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit - about a half double rack - establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd generation payload, that is, payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system, and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites, and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  9. Distributed computing testbed for a remote experimental environment

    SciTech Connect

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.; Greenwood, D.E.

    1995-09-18

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady-state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high-speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls, and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remote participation in the operation of a large-scale experimental facility.

  10. Contrast analysis and stability on the ExAO testbed

    SciTech Connect

    Evans, J; Thomas, S; Gavel, D; Dillon, D; Macintosh, B

    2008-06-10

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. The Laboratory for Adaptive Optics at UC Santa Cruz is investigating limits to high-contrast imaging in support of the Gemini Planet Imager. Previous contrast measurements were made with a simple single-opening prolate-spheroid-shaped pupil that produced a limited region of high contrast, particularly when wavefront errors were corrected with the 1024-actuator Boston Micromachines MEMS deformable mirror currently in use on the testbed. A more sophisticated shaped pupil is now being used that has a much larger region of interest, facilitating a better understanding of high-contrast measurements. In particular, we examine the effect of heat sources in the testbed on PSF stability. We find that rms image motion scales as 0.02 λ/D per watt when the heat source is near the pupil plane. As a result, heat sources of greater than 5 watts should be avoided near pupil planes for GPI. The safest place to introduce heat is near a focal plane. Heat can also affect the standard deviation of the high-contrast region, but in the final instrument other sources of error should be more significant.
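
    The scaling law quoted above lends itself to a quick back-of-the-envelope check. The sketch below simply applies the reported 0.02 λ/D-per-watt figure; it is an illustration of that linear scaling, not code from the testbed, and the function name is invented here:

```python
# Illustrative only: applies the rms image-motion scaling reported in the
# abstract (0.02 lambda/D per watt for heat sources near a pupil plane).
MOTION_PER_WATT = 0.02  # rms image motion, in lambda/D per watt

def rms_image_motion(heat_watts):
    """Estimated rms image motion (lambda/D) for a pupil-plane heat source."""
    return MOTION_PER_WATT * heat_watts

# A 5 W source near a pupil plane already gives ~0.1 lambda/D of motion,
# which is why the abstract recommends avoiding larger sources there.
print(rms_image_motion(5.0))
```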

  11. Observed and simulated temperature dependence of the liquid water path of low clouds

    SciTech Connect

    Del Genio, A.D.; Wolf, A.B.

    1996-04-01

    Data being acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site can be used to examine the factors determining the temperature dependence of cloud optical thickness. We focus on cloud liquid water and physical thickness variations, which can be derived from existing ARM measurements.

  12. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses…

  13. Cognitive nonlinear radar test-bed

    NASA Astrophysics Data System (ADS)

    Hedden, Abigail S.; Wikner, David A.; Martone, Anthony; McNamara, David

    2013-05-01

    Providing situational awareness to the warfighter requires radar, communications, and other electronic systems that operate in increasingly cluttered and dynamic electromagnetic environments. There is a growing need for cognitive RF systems that are capable of monitoring, adapting to, and learning from their environments in order to maintain their effectiveness and functionality. Additionally, radar systems are needed that are capable of adapting to an increased number of targets of interest. Cognitive nonlinear radar may offer critical solutions to these growing problems. This work focuses on ongoing efforts at the U.S. Army Research Laboratory (ARL) to develop a cognitive nonlinear radar test-bed. ARL is working toward developing a test-bed that uses spectrum sensing to monitor the RF environment and dynamically change the transmit waveforms to achieve detection of nonlinear targets with high confidence. This work presents the architecture of the test-bed system along with a discussion of its current capabilities and limitations. A brief outlook is presented for the project along with a discussion of a future cognitive nonlinear radar test-bed.

  14. Flight Projects Office Information Systems Testbed (FIST)

    NASA Technical Reports Server (NTRS)

    Liggett, Patricia

    1991-01-01

    Viewgraphs on the Flight Projects Office Information Systems Testbed (FIST) are presented. The goal is to perform technology evaluation and prototyping of information systems to support SFOC and JPL flight projects in order to reduce risk in the development of operational data systems for such projects.

  15. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, J.; Schmidt, G. K.

    2016-12-01

    SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic researchers. The SSERVI Analog Regolith Simulant Testbed provides opportunities for research scientists and engineers to study the effects of regolith analogs in the planetary exploration field. This capability is essential to understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area. SSERVI provides a bridge between several groups, joining together researchers from: 1) the scientific and exploration communities, 2) multiple disciplines across a wide range of planetary sciences, and 3) domestic and international communities and partnerships. This testbed provides a means of consolidating the tasks of acquisition, storage, and safety mitigation in handling large quantities of regolith simulant. Facility hardware and environment testing scenarios include, but are not limited to, the following: lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Benefits range from easy access to a controlled analog regolith simulant testbed and planetary exploration activities at NASA Research Park, to academic and expanded commercial opportunities in California's Silicon Valley, as well as public outreach and education opportunities.

  16. ARA testbed template based UHE neutrino search

    NASA Astrophysics Data System (ADS)

    Prohira, Steven

    2014-03-01

    The Askaryan Radio Array (ARA) is an in-ice Antarctic neutrino detector deployed near the South Pole. ARA is designed to detect ultra-high-energy (UHE) neutrinos in the range of 0.1-10 EeV. Data from the ARA testbed, deployed in the 2010-2011 season, is used for a template-based neutrino search.

  18. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  20. First field demonstration of cloud datacenter workflow automation employing dynamic optical transport network resources under OpenStack and OpenFlow orchestration.

    PubMed

    Szyrkowiec, Thomas; Autenrieth, Achim; Gunning, Paul; Wright, Paul; Lord, Andrew; Elbers, Jörg-Peter; Lumb, Alan

    2014-02-10

    For the first time, we demonstrate the orchestration of elastic datacenter and inter-datacenter transport network resources using a combination of OpenStack and OpenFlow. Programmatic control allows a datacenter operator to dynamically request optical lightpaths from a transport network operator to accommodate rapid changes of inter-datacenter workflows.

  1. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress-test the GNC flight algorithms under examination. The software provides facilities to perform these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards) and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality code can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.
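
    The automated-scripting capability described above amounts to sweeping a test case over a grid of parameters without human supervision. The following is a hypothetical sketch of that pattern; `run_case`, its parameters, and its pass/fail rule are invented stand-ins for launching a real simulation run:

```python
# Hypothetical sketch of scripted parameter sweeps of the kind the
# abstract describes. run_case stands in for invoking a real simulation;
# the slope/fault parameters and the pass rule are illustrative only.
import itertools

def run_case(slope_deg, actuator_fault):
    # Stand-in "simulation": flag a failure only when a steep slope is
    # combined with an injected actuator fault.
    ok = slope_deg < 30 or not actuator_fault
    return {"slope_deg": slope_deg, "fault": actuator_fault, "ok": ok}

# Sweep every combination of slope and fault injection, unattended.
results = [run_case(s, f)
           for s, f in itertools.product([10, 20, 35], [False, True])]
print(sum(r["ok"] for r in results), "of", len(results), "cases passed")
```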

  2. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those that are currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed, and to install some or all of the SPARSPAK package in the testbed. A suite of subroutines to extract from the data base the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.
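
    The comparison described above centers on solving sparse linear systems K u = f arising from structural models. The sketch below shows that kernel with a modern sparse direct solver; it is illustrative only (SPARSPAK is Fortran and is not reproduced here), and the 1-D Laplacian is a stand-in for an assembled stiffness matrix:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import spsolve

# Illustrative sketch (not SPARSPAK or CSM Testbed code): a sparse direct
# solve of K u = f, the kernel being benchmarked. K is a 1-D Laplacian
# stand-in for a stiffness matrix extracted from the testbed database.
n = 100
K = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csc")
f = np.ones(n)

u = spsolve(K, f)  # sparse direct solve

# A direct solver should reproduce f to round-off error.
residual = np.linalg.norm(K @ u - f)
print(residual < 1e-8)
```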

  3. Time-multiplexed open-path TDLAS spectrometer for dynamic, sampling-free, interstitial H2^18O and H2^16O vapor detection in ice clouds

    NASA Astrophysics Data System (ADS)

    Kühnreich, B.; Wagner, S.; Habig, J. C.; Möhler, O.; Saathoff, H.; Ebert, V.

    2015-04-01

    An advanced in situ diode laser hygrometer for simultaneous, sampling-free detection of interstitial H2^16O and H2^18O vapor was developed and tested in the Aerosol Interaction and Dynamics in the Atmosphere (AIDA) cloud chamber during dynamic cloud formation processes. The spectrometer, which measures isotope-resolved water vapor concentrations, comprises two rapidly time-multiplexed DFB lasers near 1.4 and 2.7 µm and an open-path White cell with a 227-m absorption path length and 4-m mirror separation. A dynamic water concentration range from 2.6 ppb to 87 ppm for H2^16O and 87 ppt to 3.6 ppm for H2^18O could be achieved and was used to enable fast, direct detection of dynamic isotope ratio changes during ice cloud formation in the AIDA chamber at temperatures between 190 and 230 K. Relative changes in the H2^18O/H2^16O isotope ratio of 1 % could be detected and resolved with a signal-to-noise ratio of 7. This converts to an isotope ratio resolution limit of 0.15 % at 1-s time resolution.
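
    The quoted resolution limit follows directly from the two preceding figures; the short check below (illustrative arithmetic only, not spectrometer code) divides the 1 % resolved change by the signal-to-noise ratio of 7:

```python
# Arithmetic behind the quoted figure: a 1 % relative change in the
# H2^18O/H2^16O ratio resolved at SNR 7 implies a ratio resolution of
# about 1/7 of a percent, i.e. ~0.14-0.15 % at 1-s time resolution.
relative_change_pct = 1.0
snr = 7.0

resolution_pct = relative_change_pct / snr
print(round(resolution_pct, 2))
```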

  4. Cloud Forensics Issues

    DTIC Science & Technology

    2014-07-01

    encryption is needed and the need for a comprehensive key management process for public key infrastructure, as well as session and other cryptologic keys... In a community cloud, a group of organizations with similar interests or needs share a cloud infrastructure. That infrastructure is not open to the... since it is unlikely that a large proportion of cloud consumers will simultaneously have high utilization needs. The cloud environment can

  5. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  6. DEVELOPMENT OF A FACILITY MONITORING TESTBED

    SciTech Connect

    A. M. MIELKE; C. M. BOYLE; ET AL

    2001-06-01

    The Advanced Surveillance Technology (AST) project at Los Alamos National Laboratory (LANL), funded by the Nonproliferation Research and Engineering Group (NN-20) of the National Nuclear Security Administration (NNSA), is fielding a facility monitoring application testbed at the National High Magnetic Field Laboratory-Pulsed Field Laboratory (NHMFL-PFL). This application is designed to utilize continuous remote monitoring technology to provide an additional layer of personnel safety assurance and equipment fault prediction capability in the laboratory. Various off-the-shelf surveillance sensor technologies are evaluated. In this testbed environment, several of the deployed monitoring sensors have detected transient precursor equipment-fault events. Additionally, the prototype remote monitoring system employs specialized video state recognition software to determine whether the operations occurring within the facility are acceptable, given the observed equipment status. By integrating the Guardian reasoning system developed at LANL, anomalous facility events trigger alarms alerting personnel to the likelihood of an equipment failure or unsafe operation.

  7. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  8. Overview of the Telescience Testbed Program

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad; Leiner, Barry M.

    1991-01-01

    The NASA's Telescience Testbed Program (TTP) conducted by the Ames Research Center is described with particular attention to the objectives, the approach used to achieve these objectives, and the expected benefits of the program. The goal of the TTP is to gain operational experience for the Space Station Freedom and the Earth Observing System programs, using ground testbeds, and to define the information and communication systems requirements for the development and operation of these programs. The results of TTP are expected to include the requirements for the remote coaching, command and control, monitoring and maintenance, payload design, and operations management. In addition, requirements for technologies such as workstations, software, video, automation, data management, and networking will be defined.

  10. Automatic Cloud Bursting under FermiCloud

    SciTech Connect

    Wu, Hao; Shangping, Ren; Garzoglio, Gabriele; Timm, Steven; Bernabeu, Gerard; Kim, Hyun Woo; Chadwick, Keith; Jang, Haengjin; Noh, Seo-Young

    2013-01-01

    Cloud computing is changing the infrastructure upon which scientific computing depends, from supercomputers and distributed computing clusters to a more elastic cloud-based structure. The service-oriented focus and elasticity of clouds can not only facilitate the technology needs of emerging business but also shorten response time and reduce operational costs of traditional scientific applications. Fermi National Accelerator Laboratory (Fermilab) is currently in the process of building its own private cloud, FermiCloud, which allows the existing grid infrastructure to use dynamically provisioned resources on FermiCloud to accommodate increased but dynamic computation demand from scientists in the domains of High Energy Physics (HEP) and other research areas. Cloud infrastructure also makes it possible to increase a private cloud's resource capacity through "bursting," that is, borrowing or renting resources from other community or commercial clouds when needed. This paper introduces a joint project on building a cloud federation to support HEP applications between Fermi National Accelerator Laboratory and the Korea Institute of Science and Technology Information, with technical contributions from the Illinois Institute of Technology. In particular, this paper presents two recent accomplishments of the joint project: (a) cloud bursting automation and (b) a load balancer. Automatic cloud bursting allows computer resources to be dynamically reconfigured to meet users' demands. The load-balancing algorithm on which cloud bursting depends decides when and where new resources need to be allocated. Our preliminary prototyping and experiments have shown promising success, yet they have also opened new challenges to be studied.
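
    The load-balancing decision described above - when to burst and where to send the work - can be sketched as a simple policy. The threshold rule, the greedy site choice, and the site names below are illustrative assumptions, not the paper's actual algorithm:

```python
# Hedged sketch of a bursting decision: provision locally while the
# private cloud has capacity, and burst to the partner cloud advertising
# the most free slots otherwise. Thresholds and names are invented here.

def choose_site(queued_jobs, private_free_slots, partner_free_slots):
    """Return where the next batch of jobs should be provisioned."""
    if queued_jobs <= private_free_slots:
        return "private"  # enough local capacity: no bursting needed
    # Burst: pick the external cloud with the most free slots.
    return max(partner_free_slots, key=partner_free_slots.get)

print(choose_site(10, 20, {"kisti": 50, "commercial": 30}))
print(choose_site(40, 20, {"kisti": 50, "commercial": 30}))
```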

  11. Cognitive Medical Wireless Testbed System (COMWITS)

    DTIC Science & Technology

    2016-11-01

    Appendix A... setup for the LTE vulnerability analyses of [26] [27] [29]. Conclusions and Lessons Learned: The... systems. Some of the lessons learned are described below. A. Computing power: the CPU load at the software eNodeB is proportional to the LTE bandwidth

  12. Variable Dynamic Testbed Vehicle: Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Lee, A. Y.; Le, N. T.; Marriott, A. T.

    1997-01-01

    The Variable Dynamic Testbed Vehicle (VDTV) concept has been proposed as a tool to evaluate collision avoidance systems and to perform driving-related human factors research. The goal of this study is to analytically investigate to what extent a VDTV with adjustable front and rear anti-roll bar stiffnesses, programmable damping rates, and four-wheel-steering can emulate the lateral dynamics of a broad range of passenger vehicles.

  13. Commissioning Results on the JWST Testbed Telescope

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Acton, D. Scott

    2006-01-01

    The one-meter 18 segment JWST Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate commissioning operations for the JWST Observatory. Eight different commissioning activities were tested on the TBT: telescope focus sweep, segment ID and Search, image array, global alignment, image stacking, coarse phasing, fine phasing, and multi-field phasing. This paper describes recent commissioning results from experiments performed on the TBT.

  14. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: the NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector and hardware developers; competitors in focused prize design competitions; and academic sector researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help to understand the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area, including dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profile, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  16. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  17. Dynamic federation of grid and cloud storage

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

    The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble, and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with Grid/X.509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities for smoothly evolving the current computing models and for supporting new scientific computing projects that rely on massive distribution of data and that would benefit from systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.
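
    Geography-aware replica selection of the kind mentioned above can be sketched as: given several endpoints holding a replica, redirect the client to the nearest one. The great-circle distance metric and the endpoint names below are illustrative assumptions, not Dynafed internals:

```python
# Hedged sketch of geography-aware replica selection: pick the storage
# endpoint closest to the client. Endpoint names/coordinates are invented.
import math

def _haversine_km(a, b):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1 = map(math.radians, a)
    lat2, lon2 = map(math.radians, b)
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def select_replica(client_pos, replicas):
    """Redirect to the endpoint geographically closest to the client."""
    return min(replicas, key=lambda name: _haversine_km(client_pos, replicas[name]))

replicas = {
    "cern-eos": (46.2, 6.1),   # Geneva
    "bnl-s3": (40.9, -72.9),   # Long Island
}
print(select_replica((48.9, 2.4), replicas))  # client near Paris
```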

  18. The Algae Testbed Public-Private Partnership (ATP 3 ) framework; establishment of a national network of testbed sites to support sustainable algae production

    DOE PAGES

    McGowen, John; Knoshaug, Eric P.; Laurens, Lieve M. L.; ...

    2017-07-01

    Well-controlled experiments that directly compare seasonal algal productivities across geographically distinct locations have not been reported before. To fill this gap, six cultivation testbed facilities were chosen across the United States to evaluate different climatic zones with respect to algal biomass productivity potential. The geographical locations and climates were as follows: Southwest, desert; Western, coastal; Southeast, inland; Southeast, coastal; Pacific, tropical; and Midwest, greenhouse. The testbed facilities were equipped with identical systems for inoculum production and open pond operation, and methods were standardized across all testbeds to ensure accurate measurement of physical and biological variables. The ability of the testbed sites to culture and analyze the same algal species, Nannochloropsis oceanica KA32, using identical pond operational and data collection procedures was evaluated during the same seasonal timeframe. This manuscript describes the results of a first-of-its-kind coordinated testbed validation field study while providing critical details on how geographical variations in temperature, light, and weather variables influenced algal productivity, nitrate consumption, and biomass composition. We found distinct differences in growth characteristics due to the geographic location and the resulting climatic and seasonal conditions across the sites, with the highest productivities observed at the desert Southwest and tropical Pacific regions, followed by the Western coastal region. The lowest productivities were observed at the Southeast inland and Midwest greenhouse locations. These differences in productivities among the sites correlated with the differences in pond water temperature and available solar radiation. In addition, two sites, the tropical Pacific and the Southeast inland, experienced unusual events (spontaneous flocculation, and unusually cold and wet (rainfall) conditions, respectively) that negatively affected

  19. The Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Hines, Braden; Bell, Charles; Shen, Tsae-Pyng; Bloemhof, Eric; Zhao, Feng; Regehr, Martin; Holmes, Howard; Irigoyen, Robert; Neat, Gregory

    2003-01-01

    The Micro-Arcsecond Metrology (MAM) testbed is a ground-based system of optical and electronic equipment for testing components, systems, and engineering concepts for the Space Interferometer Mission (SIM) and similar future missions, in which optical interferometers will be operated in outer space. In addition, the MAM testbed is of interest in its own right as a highly precise metrological system. The designs of the SIM interferometer and the MAM testbed reflect a requirement to measure both the position of the starlight central fringe and the change in the internal optical path of the interferometer with sufficient spatial resolution to generate astrometric data with angular resolution at the microarcsecond level. The internal path is to be measured by use of a small metrological laser beam of 1,319-nm wavelength, whereas the position of the starlight fringe is to be estimated by use of a charge-coupled-device (CCD) image detector sampling a large concentric annular beam. For the SIM to succeed, the optical path length determined from the interferometer fringes must be tracked by the metrological subsystem to within tens of picometers, through all operational motions of an interferometer delay line and siderostats. The purpose of the experiments performed on the MAM testbed is to demonstrate this agreement in a large-scale simulation that includes a substantial portion of the system in the planned configuration for operation in outer space. A major challenge in this endeavor is to align the metrological beam with the starlight beam in order to maintain consistency between the metrological and starlight subsystems at the system level. The MAM testbed includes an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrological sensors. The aforementioned subsystems are installed in a large vacuum chamber in order to suppress atmospheric and thermal disturbances. The MAM is divided into two

  20. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other factors among various modeling studies (e.g., grid configuration, meteorology, emission rates) differ as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  1. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder by using commercial routers and modems. Since many NASA assets are in low Earth orbits (LEO's), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating to the Earth via a geosynchronous orbiting (GEO) communications satellite. Research results from using this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.

  2. ISS Update: ISTAR -- International Space Station Testbed for Analog Research

    NASA Image and Video Library

    NASA Public Affairs Officer Kelly Humphries interviews Sandra Fletcher, EVA Systems Flight Controller. They discuss the International Space Station Testbed for Analog Research (ISTAR) activity that...

  3. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research in the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 kHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0 km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed at either 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in locations of the guide stars in the field of view. In total four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  4. MRMS Experimental Testbed for Operational Products (METOP)

    NASA Astrophysics Data System (ADS)

    Zhang, J.

    2016-12-01

    Accurate high-resolution quantitative precipitation estimation (QPE) at the continental scale is of critical importance to the nation's weather, water, and climate services. To address this need, a Multi-Radar Multi-Sensor (MRMS) system was developed at the National Severe Storms Laboratory of the National Oceanic and Atmospheric Administration that integrates radar, gauge, model, and satellite data and provides a suite of QPE products at 1-km and 2-min resolution. The MRMS system consists of three components: 1) an operational system; 2) a real-time research system; and 3) an archive testbed. The operational system currently provides instantaneous precipitation rate, type, and 1- to 72-hr accumulations for the conterminous United States and southern Canada. The research system has similar hardware infrastructure and data environment to the operational system, but runs newer and more advanced algorithms. The newer algorithms are tested on the research system for robustness and computational efficiency in a pseudo-operational environment before they are transitioned into operations. The archive testbed, also called the MRMS Experimental Testbed for Operational Products (METOP), consists of a large database that encompasses a wide range of hydroclimatological and geographical regimes. METOP is for the testing and refinement of the most advanced radar QPE techniques, which are often developed on specific data from limited times and locations. The archive data include quality-controlled in-situ observations for the validation of the new radar QPE across all seasons and geographic regions. A number of operational QPE products derived from different sensors/models are also included in METOP for the fusion of multiple sources of complementary precipitation information. This paper is an introduction to the METOP system.
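
One common ingredient of multi-sensor QPE fusion of the kind described above can be sketched in a few lines: a mean-field gauge bias correction of radar estimates. This is a generic textbook step, not the MRMS algorithm, and all numbers below are invented for illustration.

```python
# Illustrative only: mean-field multiplicative bias correction of radar
# QPE against collocated rain gauges, a common multi-sensor fusion step.
# Values are invented, not MRMS data.
radar = [4.0, 10.0, 2.0, 8.0]    # radar-only hourly QPE at gauge sites (mm)
gauge = [5.0, 12.0, 2.5, 10.0]   # collocated gauge accumulations (mm)

bias = sum(gauge) / sum(radar)   # single domain-wide bias factor
corrected = [bias * r for r in radar]

print(round(bias, 3))  # 1.229 -- gauges report ~23% more rain than radar
```

Real systems refine this with spatially varying weights and quality control, but the principle of anchoring remote-sensing estimates to in-situ observations is the same.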

  5. Stochastic Radiative Transfer in Polar Mixed Phase Clouds

    NASA Astrophysics Data System (ADS)

    Brodie, J.; Veron, D. E.

    2004-12-01

    According to recent research, mixed phase clouds comprise one third of the overall annual cloud cover in the Arctic region. These clouds contain distinct regions of liquid water and ice, which have a different impact on radiation than single-phase clouds. Despite the prevalence of mixed phase clouds in the polar regions, many modern atmospheric general circulation models use single-phase clouds in their radiation routines. A stochastic approach to representing the transfer of shortwave radiation through a cloud layer, where the distribution of the ice and liquid is governed by observed statistics, is being assessed. Data from the Surface Heat Budget of the Arctic (SHEBA) program and the Atmospheric Radiation Measurement (ARM) program's North Slope of Alaska Cloud and Radiation Testbed site will be used to determine the characteristic features of the cloud field and to evaluate the performance of this statistical model.
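
Why the phase distribution matters can be caricatured with a minimal Monte Carlo sketch (all parameter values are invented, not from SHEBA or ARM data): sample columns as liquid or ice according to an observed mixed-phase fraction and average the direct-beam transmittance, instead of using one mean optical depth for the whole layer.

```python
# Caricature of the stochastic treatment (invented parameters): sample
# cloud columns as liquid or ice per a mixed-phase fraction, then
# average the direct-beam transmittance exp(-tau) over the columns.
import math
import random

random.seed(0)
f_liquid = 0.4                     # fraction of columns with liquid water
tau_liquid, tau_ice = 8.0, 2.0     # phase-dependent optical depths

taus = [tau_liquid if random.random() < f_liquid else tau_ice
        for _ in range(100_000)]
t_stochastic = sum(math.exp(-t) for t in taus) / len(taus)

# Single-phase treatment: one mean optical depth for the whole layer.
t_single = math.exp(-(f_liquid * tau_liquid + (1 - f_liquid) * tau_ice))

# By Jensen's inequality, averaging transmittance over phases exceeds
# the transmittance of the averaged optical depth.
print(t_stochastic > t_single)  # True
```

The gap between the two numbers is exactly the kind of bias a single-phase radiation routine introduces when the cloud is actually mixed-phase.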

  6. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
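
A minimal sketch of timing-based prognostics in the spirit of this abstract (the trend model, failure threshold, and data are hypothetical, not from the testbed): fit a linear degradation trend to valve opening times and extrapolate to a failure threshold to predict remaining useful life.

```python
# Hypothetical sketch: remaining-useful-life (RUL) prediction from valve
# opening-time measurements via a least-squares linear degradation trend.
# Threshold and data are invented, not from the testbed.
def estimate_rul(cycles, open_times, failure_time):
    n = len(cycles)
    mx = sum(cycles) / n
    my = sum(open_times) / n
    sxx = sum((x - mx) ** 2 for x in cycles)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cycles, open_times))
    rate = sxy / sxx                   # extra opening time per cycle (s)
    intercept = my - rate * mx
    if rate <= 0:                      # no degradation trend observed yet
        return float("inf")
    failure_cycle = (failure_time - intercept) / rate
    return failure_cycle - cycles[-1]  # cycles of useful life remaining

# Opening time drifts 0.01 s per cycle from a 2.0 s baseline; declare
# failure once opening takes 3.0 s.
cycles = list(range(10))
times = [2.0 + 0.01 * c for c in cycles]
print(round(estimate_rul(cycles, times, failure_time=3.0)))  # 91
```

The actual approach in the paper is model-based; this sketch only illustrates the general idea that a timing trend plus a threshold yields an end-of-life prediction.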

  7. Development, Demonstration, and Control of a Testbed for Multiterminal HVDC System

    SciTech Connect

    Li, Yalong; Shi, Xiaojie M.; Liu, Bo; Lei, Wanjun; Wang, Fred; Tolbert, Leon M.

    2016-10-21

    This paper presents the development of a scaled four-terminal high-voltage direct current (HVDC) testbed, including hardware structure, communication architecture, and different control schemes. The developed testbed is capable of emulating typical operation scenarios including system start-up, power variation, line contingency, and converter station failure. Some unique scenarios are also developed and demonstrated, such as online control mode transition and station re-commission. In particular, a dc line current control is proposed, through the regulation of a converter station at one terminal. By controlling a dc line current to zero, the transmission line can be opened by using relatively low-cost HVDC disconnects with low current interrupting capability, instead of the more expensive dc circuit breaker. Utilizing the dc line current control, an automatic line current limiting scheme is developed. As a result, when a dc line is overloaded, the line current control will be automatically activated to regulate current within the allowable maximum value.
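
The idea of regulating a dc line current to zero before opening a disconnect can be sketched with a toy PI loop acting on a first-order RL line model. Gains, line parameters, and the plant model are invented for illustration and are not taken from the testbed.

```python
# Toy sketch (not the paper's controller): a PI loop at one terminal
# adjusts its voltage reference so the current in an RL line model
# decays to zero, allowing a low-cost disconnect to open the line.
# All parameters are invented.
def regulate_line_current(kp=2.0, ki=20.0, dt=1e-3, steps=2000):
    r, l = 1.0, 0.05           # line resistance (ohm), inductance (H)
    i_line = 100.0             # pre-existing line current (A)
    integ = 0.0
    for _ in range(steps):     # forward-Euler simulation of the loop
        err = 0.0 - i_line     # regulate toward zero current
        integ += err * dt
        v_station = kp * err + ki * integ   # PI sets terminal voltage
        i_line += (v_station - r * i_line) / l * dt
    return i_line

final_current = regulate_line_current()
print(abs(final_current) < 0.1)  # current driven essentially to zero
```

Once the regulated current is near zero, the disconnect interrupts almost no current, which is what makes the low-cost switch (rather than a dc breaker) sufficient.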

  8. Telescience testbed result for Japanese experiment module

    NASA Astrophysics Data System (ADS)

    Matsumoto, K.; Higuchi, K.; Kimura, H.; Takeda, N.; Matsubara, S.; Izumita, M.; Toyama, Y.; Kato, M.; Kato, H.

    1990-10-01

    The first telescience testbed experiments for the Japanese Experiment Module (JEM) of the Space Station Freedom, conducted after the three year studies of its system requirements, are described. Three experiment themes of the First Material Processing Test (FMPT) of the Japanese Spacelab Mission are chosen for estimating communications requirements between the JEM and a ground station. A paper folding experiment is used to examine instruction aspects of onboard manual processing and onboard coaching. More than 10 principal investigators participated in the experiments and were requested to answer a rating questionnaire for data acquisition. The results extracted from the questionnaire are summarized.

  9. Wavefront Control Testbed (WCT) Experiment Results

    NASA Technical Reports Server (NTRS)

    Burns, Laura A.; Basinger, Scott A.; Campion, Scott D.; Faust, Jessica A.; Feinberg, Lee D.; Hayden, William L.; Lowman, Andrew E.; Ohara, Catherine M.; Petrone, Peter P., III

    2004-01-01

    The Wavefront Control Testbed (WCT) was created to develop and test wavefront sensing and control algorithms and software for the segmented James Webb Space Telescope (JWST). Last year, we changed the system configuration from three sparse aperture segments to a filled aperture with three pie shaped segments. With this upgrade we have performed experiments on fine phasing with line-of-sight and segment-to-segment jitter; dispersed fringe visibility and grism angle; high dynamic range tilt sensing; coarse phasing with large aberrations; and sampled sub-aperture testing. This paper reviews the results of these experiments.

  10. ITS detector testbed system design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, Edmond C. P.

    1999-03-01

    Intelligent Transportation Systems (ITS), implemented all over the world, have become an important and practical traffic management technique. Among all ITS subsystems, the detection system is an integral element that provides the necessary environmental information to the ITS infrastructure. This paper describes the ITS Detector testbed design, currently being implemented with these potential ITS applications on State Highway 6 in College Station, Texas, to provide a multi-sensor, multi-source fusion environment that supports both multi-sensor and distributed sensor system testing.

  11. A commercial space technology testbed on ISS

    NASA Astrophysics Data System (ADS)

    Boyle, David R.

    2000-01-01

    There is a significant and growing commercial market for new, more capable communications and remote sensing satellites. Competition in this market strongly motivates satellite manufacturers and spacecraft component developers to test and demonstrate new space hardware in a realistic environment. External attach points on the International Space Station allow it to function uniquely as a space technology testbed to satisfy this market need. However, space industry officials have identified three critical barriers to their commercial use of the ISS: unpredictable access, cost risk, and schedule uncertainty. Appropriate NASA policy initiatives and business/technical assistance for industry from the Commercial Space Center for Engineering can overcome these barriers.

  12. Airborne Open Polar/Imaging Nephelometer for Ice Particles in Cirrus Clouds and Aerosols Field Campaign Report

    SciTech Connect

    Martins, JV

    2016-04-01

    The Open Imaging Nephelometer (O-I-Neph) instrument is an adaptation of a proven laboratory instrument built and tested at the University of Maryland, Baltimore County (UMBC), the Polarized Imaging Nephelometer (PI-Neph). The instrument design of both imaging nephelometers uses a narrow-beam laser source and a wide-field-of-view imaging camera to capture the entire scattering-phase function in one image, quasi-instantaneously.

  13. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  14. The Magellan Final Report on Cloud Computing

    SciTech Connect

    ,; Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, spanning performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  15. Developing the science product algorithm testbed for Chinese next-generation geostationary meteorological satellites: Fengyun-4 series

    NASA Astrophysics Data System (ADS)

    Min, Min; Wu, Chunqiang; Li, Chuan; Liu, Hui; Xu, Na; Wu, Xiao; Chen, Lin; Wang, Fu; Sun, Fenglin; Qin, Danyu; Wang, Xi; Li, Bo; Zheng, Zhaojun; Cao, Guangzhen; Dong, Lixin

    2017-08-01

    Fengyun-4A (FY-4A), the first of the Chinese next-generation geostationary meteorological satellites, launched in 2016, offers several advances over the FY-2 series: more spectral bands, faster imaging, and infrared hyperspectral measurements. To support the major objective of developing prototypes of the FY-4 science algorithms, two science product algorithm testbeds, for imagers and sounders, have been developed by the scientists in the FY-4 Algorithm Working Group (AWG). Both testbeds, written in the FORTRAN and C programming languages for Linux or UNIX systems, have been tested successfully by using Intel/g compilers. Some important FY-4 science products, including cloud mask, cloud properties, and temperature profiles, have been retrieved successfully using a proxy imager, the Himawari-8/Advanced Himawari Imager (AHI), and sounder data obtained from the Atmospheric InfraRed Sounder, thus demonstrating the testbeds' robustness. In addition, in early 2016 the FY-4 AWG developed, based on the imager testbed, a near-real-time processing system for Himawari-8/AHI data for use by Chinese weather forecasters. Consequently, these robust and flexible science product algorithm testbeds have provided essential and productive tools for popularizing FY-4 data and developing substantial improvements in FY-4 products.

  16. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    SciTech Connect

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility's critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korea Atomic Energy Research Institute of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information discusses I&C testbed lessons learned and the relevance of these experiences to KAERI.

  17. Testbed for the development of intelligent robot control

    SciTech Connect

    Harrigan, R.W.

    1986-01-01

    The Sensor Driven Robot Systems Testbed has been constructed to provide a working environment to aid in the development of intelligent robot control software. The Testbed employs vision and force as the robot's means of interrogating its environment. The Testbed, which has been operational for approximately 24 months, consists of a PUMA-560 robot manipulator coupled to a 2-dimensional vision system and force and torque sensing wrist. Recent work within the Testbed environment has led to a highly modularized control software concept with emphasis on detection and resolution of error situations. The objective of the Testbed is to develop intelligent robot control concepts incorporating planning and error recovery which are transportable to a wide variety of robot applications. This project is an ongoing, long-term development project and, as such, this paper represents a status report of the development work.

  18. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates the computational resources of all base stations into a cloud BBU pool. Interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more frequent and complex with the growth of system scale and user requirements. This drives networking demand among RRHs and BBUs and calls for elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical spectrum, and BBU processing) are interwoven with one another. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software-defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources effectively to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  19. Gemini Planet Imager Coronagraph Testbed Results

    SciTech Connect

    Sivaramakrishnan, A.; Carr, G.; Soummer, R.; Oppenheimer, B.R.; Mey, J.L.; Brenner, D.; Mandeville, C.W.; Zimmerman, N.; Macintosh, B.A.; Graham, J.R.; Saddlemyer, L.; Bauman, B.; Carlotti, A.; Pueyo, L.; Tuthill, P.G.; Dorrer, C.; Roberts, R.; Greenbaum, A.

    2010-12-08

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8m Gemini South telescope in 2011. GPI fields a 1500 channel AO system feeding an apodized pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  20. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial-off-the-shelf components including airframe, autopilot, and a small turbine engine to provide a low cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine powered platform (J-FLiC) builds on the successes using the relatively smaller, slower and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches with the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007 and those results will be included, if possible.

  1. Sparse aperture mask wavefront sensor testbed results

    NASA Astrophysics Data System (ADS)

    Subedi, Hari; Zimmerman, Neil T.; Kasdin, N. Jeremy; Riggs, A. J. E.

    2016-07-01

    Coronagraphic exoplanet detection at very high contrast requires the estimation and control of low-order wavefront aberrations. At Princeton High Contrast Imaging Lab (PHCIL), we are working on a new technique that integrates a sparse-aperture mask (SAM) with a shaped pupil coronagraph (SPC) to make precise estimates of these low-order aberrations. We collect the starlight rejected from the coronagraphic image plane and interfere it using the SAM at the relay pupil to estimate the low-order aberrations. In our previous work we numerically demonstrated the efficacy of the technique, and proposed a method to sense and control these differential aberrations in broadband light. We also presented early testbed results in which the SAM was used to sense pointing errors. In this paper, we will briefly overview the SAM wavefront sensor technique, explain the design of the completed testbed, and report the experimental estimation results of the dominant low-order aberrations such as tip/tilt, astigmatism, and focus.
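
While the SAM estimator itself is interferometric, the generic step of recovering low-order modes from a measured pupil phase can be sketched as a projection onto tip, tilt, and focus basis maps; the following is an illustration of that general idea, not the PHCIL method, and all coefficient values are invented.

```python
# Generic illustration (not the PHCIL estimator): recover tip, tilt,
# and defocus coefficients from a pupil phase map by projecting onto
# basis maps that are mutually orthogonal over a centered circular pupil.
n = 33  # odd grid so the pupil is symmetric about its center

pupil = []
for iy in range(n):
    for ix in range(n):
        x = 2 * ix / (n - 1) - 1
        y = 2 * iy / (n - 1) - 1
        if x * x + y * y <= 1.0:
            pupil.append((x, y))

def modes(x, y):
    return (x, y, 2 * (x * x + y * y) - 1)   # tip, tilt, defocus

true_coeffs = (0.3, -0.1, 0.05)              # invented, in radians
phase = [sum(c * m for c, m in zip(true_coeffs, modes(x, y)))
         for x, y in pupil]

# Orthogonal projection: c_k = <phase, m_k> / <m_k, m_k>.  By symmetry
# of the centered pupil, the three maps are mutually orthogonal, so
# each coefficient can be estimated independently.
est = []
for k in range(3):
    mk = [modes(x, y)[k] for x, y in pupil]
    est.append(sum(p * m for p, m in zip(phase, mk)) /
               sum(m * m for m in mk))

print([round(c, 6) for c in est])  # recovers the injected coefficients
```

In a real sensor the phase is not measured directly; the mask interferograms encode it, and noise makes a least-squares fit over many frames necessary.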

  2. Gemini Planet Imager coronagraph testbed results

    NASA Astrophysics Data System (ADS)

    Sivaramakrishnan, Anand; Soummer, Rémi; Oppenheimer, Ben R.; Carr, G. Lawrence; Mey, Jacob L.; Brenner, Doug; Mandeville, Charles W.; Zimmerman, Neil; Macintosh, Bruce A.; Graham, James R.; Saddlemyer, Les; Bauman, Brian; Carlotti, Alexis; Pueyo, Laurent; Tuthill, Peter G.; Dorrer, Christophe; Roberts, Robin; Greenbaum, Alexandra

    2010-07-01

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8m Gemini South telescope in 2011. GPI fields a 1500 channel AO system feeding an apodized pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  3. The Gemini Planet Imager coronagraph testbed

    NASA Astrophysics Data System (ADS)

    Soummer, Rémi; Sivaramakrishnan, Anand; Oppenheimer, Ben R.; Roberts, Robin; Brenner, Douglas; Carlotti, Alexis; Pueyo, Laurent; Macintosh, Bruce; Bauman, Brian; Saddlemyer, Les; Palmer, David; Erickson, Darren; Dorrer, Christophe; Caputa, Kris; Marois, Christian; Wallace, Kent; Griffiths, Emily; Mey, Jacob

    2009-08-01

    The Gemini Planet Imager (GPI) is a new facility instrument to be commissioned at the 8-m Gemini South telescope in early 2011. It combines several subsystems including a 1500 subaperture Extreme Adaptive Optics system, an Apodized Pupil Lyot Coronagraph, a near-infrared high-accuracy interferometric wavefront sensor, and an Integral Field Unit Spectrograph, which serves as the science instrument. GPI's main scientific goal is to detect and characterize relatively young (<2 Gyr), self luminous planets with planet-star brightness ratios of <= 10^-7 in the near infrared. Here we present an overview of the coronagraph subsystem, which includes a pupil apodization, a hard-edged focal plane mask and a Lyot stop. We discuss design optimization, mask fabrication, and testing. We describe a near infrared testbed, which achieved broadband contrast (H-band) below 10^-6 at separations > 5λ/D, without active wavefront control (no deformable mirror). We use Fresnel propagation modeling to analyze the testbed results.

  4. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud [OPEN]

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  5. Expediting Experiments across Testbeds with AnyBed: A Testbed-Independent Topology Configuration System and Its Tool Set

    NASA Astrophysics Data System (ADS)

    Suzuki, Mio; Hazeyama, Hiroaki; Miyamoto, Daisuke; Miwa, Shinsuke; Kadobayashi, Youki

    Building an experimental network within a testbed has been a tiresome process for experimenters, due to the complexity of physical resource assignment and the configuration overhead. Moreover, the process cannot be expedited across testbeds, because the syntax of a configuration file varies depending on specific hardware and software. Re-configuring an experimental topology for each testbed wastes time; at worst, an experimenter cannot complete his/her experiments within the limited lease time of a testbed. In this paper, we propose AnyBed, an experimental network-building system. The conceptual idea of AnyBed is: if experimental network topologies can be made portable across any kind of testbed, then building an experimental network on a testbed can be expedited while experiments are manipulated with each testbed's support tools. To achieve this concept, AnyBed divides an experimental network configuration into logical and physical network topologies. By mapping these two topologies, AnyBed can build the intended logical network topology on any PC cluster. We have evaluated the AnyBed implementation using two distinct clusters. The evaluation result shows a BGP topology with 150 nodes can be constructed on a large-scale testbed in less than 113 seconds.
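    AnyBed's central step, assigning the nodes of a logical topology to physical cluster hosts, can be illustrated with a minimal sketch (hypothetical names and a simple greedy policy, not AnyBed's actual algorithm):

```python
# Illustrative sketch (not AnyBed's actual code): keep the logical
# experiment topology separate from the physical cluster resources,
# then map one onto the other so the same logical topology stays
# portable across testbeds.

def map_topology(logical_nodes, physical_hosts):
    """Greedily assign each logical node to the physical host with most spare slots."""
    assignment = {}
    capacity = dict(physical_hosts)  # remaining VM slots per host
    for node in logical_nodes:
        host = max(capacity, key=capacity.get)
        if capacity[host] == 0:
            raise RuntimeError("not enough physical capacity for %s" % node)
        assignment[node] = host
        capacity[host] -= 1
    return assignment

logical = ["as1", "as2", "as3", "as4"]             # e.g. BGP routers in the logical topology
physical = {"clusterA-pc1": 2, "clusterA-pc2": 2}  # hosts and their VM slots

print(map_topology(logical, physical))
```

    A real mapper would also honor link constraints between nodes; the point here is only that the logical description never mentions physical host names.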

  6. A numerical testbed for the characterization and optimization of aerosol remote sensing

    NASA Astrophysics Data System (ADS)

    Wang, J.; Xu, X.; Ding, S.; Zeng, J.; Spurr, R. J.; Liu, X.; Chance, K.; Holben, B. N.; Dubovik, O.; Mishchenko, M. I.

    2013-12-01

    Remote sensing of aerosols from satellite and ground-based platforms provides key datasets to help understand the effect of airborne particulates on air quality, visibility, surface temperature, clouds, and precipitation. However, global measurements of aerosol parameters have only been generated in the last decade or so, with the advent of dedicated low-earth-orbit sun-synchronous satellite sensors such as those of NASA's Earth Observing System (EOS). Many EOS sensors are now past their design lifetimes. Meanwhile, a number of aerosol-related satellite missions are planned for the future, and several of these will include measurements of polarization. A common question often arises: how can a sensor be optimally configured (in terms of spectral wavelength ranges, viewing angles, and measurement quantities such as radiance and polarization) to best fulfill the scientific requirements within the mission's budget constraints? To address these kinds of questions in a cost-effective manner, a numerical testbed for aerosol remote sensing is an important requirement. This testbed is a tool that can generate an objective assessment of the aerosol information content anticipated from any (planned or real) instrument configuration. Here, we present a numerical testbed that combines inverse optimal estimation theory with a forward model containing linearized particle scattering and radiative transfer code. Specifically, the testbed comprises the following components: (1) a linearized vector radiative transfer model that computes the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering parameters at each layer and over the column; (2) linearized Mie and T-matrix electromagnetic scattering codes to compute the macroscopic aerosol single scattering optical properties and their sensitivities with respect to refractive index, size, and shape; (3) a linearized land surface model that uses the Lambertian, Ross-Thick, and Li
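    The kind of information-content assessment such a testbed produces can be sketched with the standard optimal-estimation averaging kernel; the Jacobian and covariance matrices below are made-up placeholders, not output of the actual system:

```python
import numpy as np

# Standard optimal-estimation information content: from a Jacobian K,
# prior covariance S_a and measurement-noise covariance S_e, form the
# averaging kernel A and the degrees of freedom for signal, trace(A).
def degrees_of_freedom(K, S_a, S_e):
    S_e_inv = np.linalg.inv(S_e)
    S_a_inv = np.linalg.inv(S_a)
    G = np.linalg.inv(K.T @ S_e_inv @ K + S_a_inv) @ K.T @ S_e_inv  # gain matrix
    A = G @ K                                                       # averaging kernel
    return A, np.trace(A)

K = np.array([[1.0, 0.2],
              [0.1, 0.8],
              [0.3, 0.3]])     # placeholder Jacobian: 3 measurements, 2 aerosol parameters
S_a = np.eye(2)                # placeholder prior covariance
S_e = 0.01 * np.eye(3)         # placeholder noise covariance
A, dfs = degrees_of_freedom(K, S_a, S_e)
print(round(dfs, 3))           # close to 2 when the measurements constrain both parameters
```

    Comparing trace(A) across candidate channel sets or viewing geometries is one simple way such a testbed can rank instrument configurations.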

  7. Search Cloud

    MedlinePlus

    ... this page: https://medlineplus.gov/cloud.html Search Cloud To use the sharing features on this page, ... chest pa and lateral Share the MedlinePlus search cloud with your users by embedding our search cloud ...

  8. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    NASA Astrophysics Data System (ADS)

    Maloney, Thomas M.; Prokopius, Paul R.; Voecks, Gerald E.

    1995-01-01

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate fully integrated regenerative fuel cell (RFC) systems for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage subsystem design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some subsystem components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway.

  9. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible, producing coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments.

  10. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document is the viewgraph slides of the presentation.

  11. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research into issues such as zero-g robot control, dual-arm teleoperation, simulation, and hierarchical control using a high-level programming language. The testbed will be used to investigate these high-risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual-arm teleoperation of a pair of 7-degree-of-freedom (DOF) manipulators, each with its own 6-DOF mini-master hand controller. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low-level features. High-speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real-time dynamic simulation of testbed operations, allowing a quick and safe means of testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being written in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  12. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible, producing coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments. PMID:26690175

  13. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and deliver a simulated measurement to a control layer, which could be a database or LIMS, a machine learner or a company's existing data infrastructure. Ground truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source message broker). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerization of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
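    How a synthetic sensor might wrap the ground-truth layer can be sketched as follows (pure Python for brevity; the toy geological model and the field names are illustrative, not the project's actual Flask API or JSON-LD encoding):

```python
import json
import random

# Minimal sketch of a synthetic sensor: query a ground-truth model for an
# underlying property, perturb it with measurement error, and emit a JSON
# observation of the kind a RESTful microservice could return.
def ground_truth_grade(x, y):
    """Toy implicit geological model: ore grade as a smooth function of position."""
    return 1.5 + 0.01 * x - 0.02 * y

def simulated_measurement(sensor_id, x, y, noise_sd=0.1, rng=None):
    rng = rng or random.Random(0)  # seeded for reproducibility in this sketch
    observed = ground_truth_grade(x, y) + rng.gauss(0.0, noise_sd)
    return json.dumps({
        "sensorId": sensor_id,
        "observedProperty": "ore_grade",
        "location": {"x": x, "y": y},
        "result": round(observed, 4),
    })

print(simulated_measurement("drill-07", 120.0, 45.0))
```

    A Flask route would simply return this JSON payload; keeping the ground truth behind such an interface is what lets the control layer treat simulated and real sensors identically.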

  14. Aerodynamic design of the National Rotor Testbed.

    SciTech Connect

    Kelley, Christopher Lee

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility, with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities of circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in Region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.
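    The tip-speed-ratio similitude mentioned above can be illustrated with a short sketch (the radii and wind speed are illustrative numbers, not the actual NRT or GE 1.5sle design values):

```python
# Keeping the dimensionless tip-speed ratio lambda = omega * R / U equal
# between rotor scales fixes the scaled rotor's speed for a given wind speed.
def rotor_speed_for_tsr(tip_speed_ratio, wind_speed, radius):
    """Rotor speed (rad/s) that reproduces a target tip-speed ratio."""
    return tip_speed_ratio * wind_speed / radius

lam = 8.0   # target tip-speed ratio, shared by both scales
U = 9.0     # wind speed, m/s
omega_full = rotor_speed_for_tsr(lam, U, radius=38.5)    # utility-scale rotor (illustrative)
omega_scaled = rotor_speed_for_tsr(lam, U, radius=13.2)  # subscale rotor (illustrative)

# Same lambda at both scales, so the smaller rotor simply spins faster.
print(round(omega_full, 3), round(omega_scaled, 3))
```

    The same matching logic applies to the other dimensionless quantities (induction, thrust coefficient, circulation), which together constrain the scaled blade's chord and twist distributions.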

  15. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  16. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs used for finite element structural analysis, developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions which communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and in exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from combining the NICE command language with SPAR computational modules.

  17. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that distribution of satellite imagery by means of the Internet is feasible, and that archiving of large data sets can be accomplished with low-cost storage systems allowing multiple users.

  18. Construction and Modeling of a Controls Testbed

    NASA Technical Reports Server (NTRS)

    Nagle, James C.; Homaifar, Abdollah; Nasser, Ahmed A.; Bikdash, Marwan

    1997-01-01

    This paper describes the construction and modeling of a control system testbed to be used for the comparison of various control methodologies. We specifically wish to test fuzzy logic control and compare the performance of various fuzzy controllers, including Hybrid Fuzzy-PID (HFPID) and Hierarchical Hybrid Fuzzy-PID (HHFPID), to other controllers including localized rate feedback, LQR/LTR, and H2/H(sub infinity). The control problem is that of vibration suppression in a thin plate, with inputs coming from accelerometers and outputs going to piezoelectric actuators or 'patches'. A model based on experimental modal analysis of the plate is constructed and compared with an analytical model. The analytical model uses a boundary condition which is a mix of clamped and simply supported.

  19. easyGWAS: A Cloud-Based Platform for Comparing the Results of Genome-Wide Association Studies [OPEN]

    PubMed Central

    Roqueiro, Damian; Salomé, Patrice A.; Kleeberger, Stefan; Zhu, Wangsheng; Lippert, Christoph; Stegle, Oliver; Schölkopf, Bernhard

    2017-01-01

    The ever-growing availability of high-quality genotypes for a multitude of species has enabled researchers to explore the underlying genetic architecture of complex phenotypes at an unprecedented level of detail using genome-wide association studies (GWAS). The systematic comparison of results obtained from GWAS of different traits opens up new possibilities, including the analysis of pleiotropic effects. Other advantages that result from the integration of multiple GWAS are the ability to replicate GWAS signals and to increase statistical power to detect such signals through meta-analyses. In order to facilitate the simple comparison of GWAS results, we present easyGWAS, a powerful, species-independent online resource for computing, storing, sharing, annotating, and comparing GWAS. The easyGWAS tool supports multiple species, the uploading of private genotype data and summary statistics of existing GWAS, as well as advanced methods for comparing GWAS results across different experiments and data sets in an interactive and user-friendly interface. easyGWAS is also a public data repository for GWAS data and summary statistics and already includes published data and results from several major GWAS. We demonstrate the potential of easyGWAS with a case study of the model organism Arabidopsis thaliana, using flowering and growth-related traits. PMID:27986896

  20. Development of a space-systems network testbed

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the network testbed are programmable circuit-switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the network.

  1. Development of Hardware-in-the-loop Microgrid Testbed

    SciTech Connect

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R; Liu, Guodong; Dowling, Kevin; Ollis, T Ben; Irminger, Philip; Xu, Yan; Dimitrovski, Aleksandar D

    2015-01-01

    A hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems is presented in this paper. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of function testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  2. Performance and characterization results of a lasercom testbed for the pointing, acquisition, and tracking subsystem of a satellite-to-satellite laser communications link

    NASA Astrophysics Data System (ADS)

    Cardema, Jason C.; Tanzillo, Jennifer N.; Lee, Shinhak; Dunbar, Christopher B.

    2008-08-01

    The Aerospace Corporation has developed a testbed for studying pointing, acquisition, and tracking systems for lasercom terminals. The testbed consists of two configurable terminals that are currently set up to represent a GEO-to- GEO link. Each terminal has the ability to point open-loop, execute scan patterns, and track a received beam. The system operates in small-beam space and consists of a far-field space simulator and two lasercom terminals operating at 473 nm and 633 nm with representative hardware (fast steering mirrors, optical detectors, etc.). This paper discusses the software developed for the testbed and the characterization of its performance, which includes open-loop pointing accuracy and residual tracking error in the presence of applied disturbances. Analytical predictions are compared to experimental results. Each terminal has the ability to progress from acquisition to tracking mode and the two terminals together demonstrate the cooperative acquisition process.

  3. Event metadata records as a testbed for scalable data mining

    NASA Astrophysics Data System (ADS)

    van Gemmeren, P.; Malon, D.

    2010-04-01

    At a data rate of 200 hertz, event metadata records ("TAGs," in ATLAS parlance) provide fertile ground for the development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise "data mining," but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for the development and evaluation of scalable data mining tools.
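    The "fixed and relatively simple schema" idea can be sketched as a flat structured array, which maps naturally onto columnar formats such as HDF5 (the fields below are illustrative placeholders, not the actual ATLAS TAG schema):

```python
import numpy as np

# Event metadata records with a fixed, flat schema. Each record is one
# event; selections become simple vectorized (one-pass) operations.
tag_dtype = np.dtype([
    ("run_number", np.int64),
    ("event_number", np.int64),
    ("n_muons", np.int32),
    ("missing_et", np.float64),   # GeV
])

tags = np.array([
    (167776, 101, 2, 54.2),
    (167776, 102, 0, 12.9),
    (167777, 103, 1, 88.0),
], dtype=tag_dtype)

# A stream-style selection over the records: events with at least one
# muon and large missing transverse energy.
selected = tags[(tags["n_muons"] >= 1) & (tags["missing_et"] > 50.0)]
print(selected["event_number"].tolist())  # -> [101, 103]
```

    Because every record has the same width and types, such an array can be written to an HDF5 dataset chunk by chunk and scanned in parallel across nodes.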

  4. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

    To fulfil JINR's commitments in national and international projects that rely on modern information technologies such as cloud and grid computing, and to provide JINR users with a modern tool for their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover increasing user needs in capacity, availability and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.

  5. The telerobot testbed: An architecture for remote servicing

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.

    1990-01-01

    The NASA/OAST Telerobot Testbed will reach its next increment in development by the end of FY-89. The testbed will have the capability for: force reflection in teleoperation, shared control, traded control, operator designate and relative update. These five capabilities will be shown in a module release and exchange operation using mockups of Orbital Replacement Units (ORU). This development of the testbed shows examples of the technologies needed for remote servicing, particularly under conditions of delay in transmissions to the servicing site. Here, the following topics are presented: the system architecture of the testbed which incorporates these telerobotic technologies for servicing, the implementation of the five capabilities and the operation of the ORU mockups.

  6. The Living With a Star Space Environment Testbed Payload

    NASA Technical Reports Server (NTRS)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload consisting of a space weather monitor and carrier containing 4 board experiments.

  7. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  8. CT-directed robotic biopsy testbed: motivation and concept

    NASA Astrophysics Data System (ADS)

    Cleary, Kevin R.; Stoianovici, Dan S.; Glossop, Neil D.; Gary, Kevin A.; Onda, Sumiyo; Cody, Richard; Lindisch, David; Stanimir, Alexandru; Mazilu, Dumitru; Patriciu, Alexandru; Watson, Vance; Levy, Elliot

    2001-05-01

    As a demonstration platform, we are developing a robotic biopsy testbed incorporating a mobile CT scanner, a small needle-driver robot, and an optical localizer. This testbed will be used to compare robotically assisted biopsy to the current manual technique, and allow us to investigate software architectures for integrating multiple medical devices. This is a collaboration between engineers and physicians from three universities and a commercial vendor. In this paper we describe the CT-directed biopsy technique, review some other biopsy systems including passive and semi-autonomous devices, describe our testbed components, and present our software architecture. This testbed is a first step in developing the image-guided, robotically assisted, physician-directed biopsy systems of the future.

  9. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  10. Situational descriptions of behavioral procedures: the in situ testbed.

    PubMed Central

    Kemp, S M; Eckerman, D A

    2001-01-01

    We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's (1959) notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement and extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. PMID:11394484
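    The schedule-plus-cumulative-record loop described above can be sketched in a few lines. This is a hedged illustration only: the Bernoulli response model, parameter values, and function names are invented for the example and are not taken from the In Situ testbed itself.

```python
import random

def simulate_fixed_ratio(fr=5, n_responses=50, p_respond=0.3, seed=1):
    """Simulate responding on a fixed-ratio (FR) schedule and return a
    cumulative record: a list of (time, cumulative responses) points
    plus the times at which reinforcers were delivered.  The simple
    constant-probability response model is a stand-in, not a model of
    learning."""
    random.seed(seed)
    t = 0
    since_reinforcer = 0
    total = 0
    record = [(0, 0)]
    reinforcer_times = []
    while total < n_responses:
        t += 1
        if random.random() < p_respond:      # a response is emitted
            total += 1
            since_reinforcer += 1
            record.append((t, total))
            if since_reinforcer == fr:       # every fr-th response pays off
                reinforcer_times.append(t)
                since_reinforcer = 0
    return record, reinforcer_times

record, reinf = simulate_fixed_ratio()
```

    Plotting `record` as a step function, with tick marks at `reinf`, yields the familiar cumulative-record display against which a model's output would be compared.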

  11. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field of view case. However, the accuracy of the clear air radiance signal depends upon the cloud height and optical property uniformity across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud top levels, the accuracy near the cloud top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used.
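    Way (1), cloud clearing from adjacent fields of view, is commonly formulated with the classic N* approach. The sketch below is a minimal illustration under idealized assumptions (a single cloud layer, two FOVs differing only in cloud fraction, and a known clear radiance in one window channel); the function name and the synthetic numbers are invented for the example.

```python
import numpy as np

def cloud_clear(r1, r2, r1_win, r2_win, rclr_win):
    """Adjacent-FOV cloud clearing sketch: estimate the clear-column
    radiance spectrum from two partly cloudy fields of view.  N* is
    derived from a window channel whose clear radiance rclr_win is
    assumed known (e.g., from a surface estimate)."""
    n_star = (rclr_win - r1_win) / (r1_win - r2_win)
    return r1 + n_star * (r1 - r2)

# Synthetic check: a 3-channel "spectrum" with cloud fractions 0.3 / 0.6
clear = np.array([100.0, 90.0, 80.0])           # true clear radiances
cloud_signal = np.array([-20.0, -15.0, -10.0])  # effect of full cloud cover
r1 = clear + 0.3 * cloud_signal
r2 = clear + 0.6 * cloud_signal
recovered = cloud_clear(r1, r2, r1[0], r2[0], clear[0])
```

    Under these idealized assumptions the clear spectrum is recovered exactly; in practice, as the abstract notes, accuracy degrades when cloud height or optical properties differ between the two fields of view.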

  12. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  13. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network, and processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile Internet users, this traditional architecture cannot jointly optimize and schedule resources to guarantee high-level services, owing to the communication barrier among the three domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network, and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  14. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-28

    Cloud radio access network (C-RAN) has become a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network, and processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile Internet users, this traditional architecture cannot jointly optimize and schedule resources to guarantee high-level services, owing to the communication barrier among the three domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network, and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes.

  15. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    Cloud radio access network (C-RAN) has become a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network, and processing unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile Internet users, this traditional architecture cannot jointly optimize and schedule resources to guarantee high-level services, owing to the communication barrier among the three domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network, and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes.
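    The general idea behind an auxiliary-graph provisioning scheme, collapsing the radio, optical, and processing domains into one graph so a single shortest-path query selects resources across all three, can be illustrated with a toy computation. The topology, node names, and edge costs below are hypothetical stand-ins, not the paper's actual graph construction.

```python
import heapq

# Hypothetical auxiliary graph: one remote radio head (RRH), two
# baseband pools, two data centers; edge weights stand in for the
# integrated radio/optical/processing costs (all values invented).
graph = {
    "RRH": [("BBU-pool-A", 2.0), ("BBU-pool-B", 3.0)],
    "BBU-pool-A": [("DC-1", 4.0)],
    "BBU-pool-B": [("DC-1", 1.0), ("DC-2", 2.0)],
    "DC-1": [],
    "DC-2": [],
}

def cheapest_path(src, dst):
    """Dijkstra over the auxiliary graph: one shortest-path query
    jointly selects the radio attachment, the optical route, and the
    processing site."""
    pq = [(0.0, src, [src])]
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = cheapest_path("RRH", "DC-1")
```

    Note that the jointly cheapest route goes through BBU-pool-B even though its radio attachment is the more expensive one; optimizing each domain separately would miss this.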

  16. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    PubMed Central

    Mackey, Sean

    2016-01-01

    Background: We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective: Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods: Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results: The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions: The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155

  17. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps.

    PubMed

    O'Reilly-Shah, Vikas; Mackey, Sean

    2016-06-03

    We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.

  18. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  19. Micro-Pixel Image Position Sensing Testbed

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; Shao, Michael; Zhai, Chengxing; Erlig, Hernan; Wang, Xu; Goullioud, Renaud

    2011-01-01

    The search for Earth-mass planets in the habitable zones of nearby Sun-like stars is an important goal of astrophysics. This search is not feasible with the current slate of astronomical instruments. We propose a new concept for microarcsecond astrometry which uses a simplified instrument and hence promises to be low cost. The concept employs a telescope with only a primary, laser metrology applied to the focal plane array, and new algorithms for measuring image position and displacement on the focal plane. The required level of accuracy in both the metrology and image position sensing is at a few micro-pixels. We have begun a detailed investigation of the feasibility of our approach using simulations and a micro-pixel image position sensing testbed called MCT. So far we have been able to demonstrate that the pixel-to-pixel distances in a focal plane can be measured with a precision of 20 micro-pixels and image-to-image distances with a precision of 30 micro-pixels. We have also shown using simulations that our image position algorithm can achieve accuracy of 4 micro-pixels in the presence of lambda/20 wavefront errors.
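    As a rough illustration of image position sensing, even a plain intensity-weighted centroid on a synthetic Gaussian spot recovers sub-pixel positions; the testbed's metrology-calibrated algorithms go far beyond this hedged sketch, and all values here are synthetic.

```python
import numpy as np

def centroid(image):
    """Plain intensity-weighted centroid, returning (x, y) in pixels.
    A deliberately simple estimator, not the testbed's algorithm."""
    total = image.sum()
    ys, xs = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    return (xs * image).sum() / total, (ys * image).sum() / total

# Synthetic Gaussian spot at a sub-pixel position on a 32 x 32 detector
x0, y0, sigma = 15.3, 14.7, 2.0
ys, xs = np.mgrid[0:32, 0:32]
spot = np.exp(-((xs - x0) ** 2 + (ys - y0) ** 2) / (2.0 * sigma**2))
cx, cy = centroid(spot)
```

    On noiseless data the centroid lands within a small fraction of a pixel of the true position; pushing toward the micro-pixel regime is what requires the pixel-to-pixel metrology and displacement algorithms described in the abstract.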

  20. Optical testbed for the LISA phasemeter

    NASA Astrophysics Data System (ADS)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system, also known as the phasemeter, was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. In particular, it provides the electronic readout of the main interferometer phases, besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three-signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.
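    The zero phase combination exploited by the three-signal test can be seen in a toy numeric check: the three pairwise beatnote phases of three lasers sum to zero by construction, so any residual in the measured sum exposes readout non-linearity. The frequencies and drift terms below are arbitrary illustrative values, not LISA parameters.

```python
import numpy as np

# Simulated phase evolution of three lasers (frequencies and quadratic
# drift coefficients are invented for the example)
fs = 1000.0                         # sampling rate, Hz
t = np.arange(0.0, 1.0, 1.0 / fs)
phi_a = 2 * np.pi * 10.0 * t + 0.3 * t**2
phi_b = 2 * np.pi * 17.0 * t - 0.2 * t**2
phi_c = 2 * np.pi * 23.5 * t + 0.1 * t**2

# The three pairwise heterodyne beatnote phases
phi_ab = phi_a - phi_b
phi_bc = phi_b - phi_c
phi_ca = phi_c - phi_a

# Zero combination: for an ideal phase readout the sum vanishes, so
# any measured residual is attributable to the readout chain.
closure = phi_ab + phi_bc + phi_ca
```

    In the testbed the three beatnote phases come from real phasemeter channels rather than arithmetic, which is exactly why a non-zero closure is diagnostic.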

  1. MEMS deformable mirror CubeSat testbed

    NASA Astrophysics Data System (ADS)

    Cahoy, Kerri L.; Marinan, Anne D.; Novak, Benjamin; Kerr, Caitlin; Nguyen, Tam; Webber, Matthew; Falkenburg, Grant; Barg, Andrew; Berry, Kristen; Carlton, Ashley; Belikov, Ruslan; Bendek, Eduardo A.

    2013-09-01

    To meet the high contrast requirement of 1 × 10^-10 to image an Earth-like planet around a Sun-like star, space telescopes equipped with coronagraphs require wavefront control systems. Deformable mirrors are a key element of these systems that correct for optical imperfections, thermal distortions, and diffraction that would otherwise corrupt the wavefront and ruin the contrast. However, high-actuator-count MEMS deformable mirrors have yet to fly in space long enough to characterize their on-orbit performance and reduce risk by developing and operating their supporting systems. The goal of the MEMS Deformable Mirror CubeSat Testbed is to develop a CubeSat-scale demonstration of MEMS deformable mirror and wavefront sensing technology. In this paper, we consider two approaches for a MEMS deformable mirror technology demonstration payload that will fit within the mass, power, and volume constraints of a CubeSat: 1) a Michelson interferometer and 2) a Shack-Hartmann wavefront sensor. We clarify the constraints on the payload based on the resources required for supporting CubeSat subsystems drawn from subsystems that we have developed for a different CubeSat flight project. We discuss results from payload lab prototypes and their utility in defining mission requirements.

  2. Ames life science telescience testbed evaluation

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  3. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    The establishment of propfan technology readiness was determined and candidate drive systems for propfan application were identified. Candidate testbed aircraft were investigated for testbed suitability, and four aircraft were selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed, and the Boeing KC-135A and the Gulfstream American Gulfstream II were recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed, and cost and schedule data for the entire testbed program were generated. The total program cost was estimated, and a wind tunnel program cost and schedule were generated in support of the testbed program.

  4. X.509 Authentication/Authorization in FermiCloud

    SciTech Connect

    Kim, Hyunwoo; Timm, Steven

    2014-11-11

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.

  5. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel N.; Hottovy, Scott

    2016-06-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes -- open versus closed cells -- fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. With this new conceptual viewpoint, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions.
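    A hedged caricature of such an idealized stochastic diffusion model is an Ornstein-Uhlenbeck process for a column-water-like variable, with a threshold separating "cloudy" from "clear"; all parameters below are invented for illustration and this is not the authors' actual model.

```python
import numpy as np

def simulate_water(n_steps=20000, dt=0.01, tau=1.0, mean=1.0,
                   noise=0.6, seed=0):
    """Euler-Maruyama simulation of an Ornstein-Uhlenbeck process for a
    column-water-like variable q: mean-reverting drift plus white-noise
    forcing.  'Cloudy' is declared wherever q exceeds a fixed
    threshold."""
    rng = np.random.default_rng(seed)
    q = np.empty(n_steps)
    q[0] = mean
    for i in range(1, n_steps):
        drift = (mean - q[i - 1]) / tau
        q[i] = q[i - 1] + dt * drift \
               + noise * np.sqrt(dt) * rng.standard_normal()
    return q

q = simulate_water()
cloud_fraction = (q > 1.0).mean()   # threshold placed at the mean
```

    Moving the threshold relative to the mean shifts the system toward a predominantly open-cell or closed-cell regime, while sitting near the threshold produces the high-variability, interface-like behavior the abstract describes.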

  6. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.; Patnaik, Surya N.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software. It was originally developed for deterministic calculation. It has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponds to unity for reliability. Weight can be reduced to a small value for the most failure-prone design with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool, the fast probabilistic integrator (FPI) module of the NESSUS software was the probabilistic calculator, and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.
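    The weight-versus-reliability trade described above (50 percent reliability at the center, an inverted-S overall) can be reproduced with a textbook capacity-versus-demand reliability model; this sketch uses hypothetical numbers and is not the CometBoards/NESSUS formulation.

```python
import math

def reliability(weight, demand=100.0, strength_per_kg=1.0, sigma=15.0):
    """Probability that load capacity (assumed proportional to weight)
    exceeds a normally distributed demand.  All parameter values are
    invented for illustration."""
    z = (strength_per_kg * weight - demand) / sigma
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Tracing weight against reliability yields the inverted-S shape:
# near-zero reliability for very light designs, 50 percent when
# capacity equals the mean demand, and asymptotically unity as
# weight grows toward a failure-proof (and infinitely heavy) design.
curve = [(w, reliability(w)) for w in range(40, 161, 10)]
```

    At weight 100 the capacity equals the mean demand, giving exactly the "one failure in two samples" midpoint the abstract mentions.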

  7. Ensemble forecasting for a hydrological testbed

    NASA Astrophysics Data System (ADS)

    Jankov, Isidora; Albers, Steve; Wharton, Linda; Tollerud, Ed; Yuan, Huiling; Toth, Zoltan

    2010-05-01

    Significant precipitation events in California during the winter season are often caused by land-falling "atmospheric rivers" associated with extratropical cyclones from the Pacific Ocean. Atmospheric rivers are narrow, elongated plumes of enhanced water vapor transport over the Pacific and Atlantic oceans that can extend from the tropics and subtropics into the extratropics. Large values of integrated water vapor are advected within the warm sector of extratropical cyclones immediately ahead of polar cold fronts, although the source of these vapor plumes can originate in the tropics beyond the cyclone warm sector. When an atmospheric river makes landfall on the coast of California, the northwest-to-southeast orientation of the Sierra Mountain chain exerts orographic forcing on the southwesterly low-level flow in the warm sector of approaching extratropical cyclones. As a result, sustained precipitation is typically enhanced and modified by the complex terrain. This has major hydrological consequences. The National Oceanic and Atmospheric Administration (NOAA) has established the Hydrometeorological Testbed (HMT) to design and support a series of field and numerical modeling experiments to better understand and forecast precipitation in the Central Valley. The main role of the Forecast Applications Branch (NOAA/ESRL/GSD) in HMT has been to support real-time numerical forecasts as well as research activities targeting better understanding and improvement of Quantitative Precipitation Forecasts (QPF). For this purpose, an ensemble modeling system has been developed. The ensemble system consists of mixed dynamic cores, mixed physics and mixed lateral boundary conditions. Performance evaluation results for this system will be presented at the conference.

  8. NASA's Coastal and Ocean Airborne Science Testbed

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Hooker, S.; Myers, J.; Kudela, R. M.; Dunagan, S.; Soulage, M.; Ellis, T.; Clinton, N. E.; Lobitz, B.; Martin, K.; Zell, P.; Berthold, R. W.; Smith, C.; Andrew, D.; Gore, W.; Torres, J.

    2011-12-01

    The Coastal and Ocean Airborne Science Testbed (COAST) Project is a NASA Earth-science flight mission that will advance coastal ecosystems research by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. Teaming NASA Ames scientists and engineers with Biospherical Instruments, Inc. (San Diego) and UC Santa Cruz, the airborne COAST instrument suite combines a customized imaging spectrometer, sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data will be accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Based on optical detectors called microradiometers, the NASA Ocean Biology and Biogeochemistry Calibration and Validation (cal/val) Office team has deployed advanced commercial off-the-shelf instrumentation that provides in situ measurements of the apparent optical properties at the land/ocean boundary including optically shallow aquatic ecosystems (e.g., lakes, estuaries, coral reefs). A complementary microradiometer instrument package (Biospherical Instruments, Inc.), optimized for use above water, will be flown for the first time with the airborne instrument suite. Details of the October 2011 COAST airborne mission over Monterey Bay demonstrating this new airborne instrument suite capability will be presented, with associated preliminary data on coastal ocean color products, coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  9. Cloud Computing for DoD

    DTIC Science & Technology

    2012-05-01

    •  NASA Nebula Platform: cloud computing pilot program at NASA Ames; integrates open-source components into seamless, self...
    •  Mission support; education and public outreach (NASA Nebula, 2010)
    •  NSF Supported Cloud Research: support for cloud computing in...
    •  Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145
    •  NASA Nebula (2010). Retrieved from

  10. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  11. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology system of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal deformations. The Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system in which 6 lengths are measured, but only 5 are needed to determine the system. The agreement in the over-constrained length needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have only reached the goal for on-axis observations. We describe the testbed improvements that have been made since our initial results, and outline future changes that will add further effects that SIM faces in order to make the testbed more SIM-like.
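    The over-constrained geometry can be illustrated with a toy linear model (hypothetical numbers, not Kite's actual gauge layout): six length readings of five underlying degrees of freedom are fit by least squares, and the residual disagreement of the redundant measurement serves as the consistency metric.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy measurement model: 6 gauge readings of 5 underlying degrees of
# freedom (hypothetical geometry matrix, not Kite's actual layout).
A = rng.standard_normal((6, 5))
x_true = rng.standard_normal(5)

meas = A @ x_true                                   # self-consistent readings
meas_noisy = meas + 1e-3 * rng.standard_normal(6)   # add gauge noise

def closure_residual(A, lengths):
    """Fit the 5 free parameters to the 6 measurements and return the
    RMS disagreement of the redundant measurement set."""
    x_hat, *_ = np.linalg.lstsq(A, lengths, rcond=None)
    return np.sqrt(np.mean((lengths - A @ x_hat) ** 2))

# A self-consistent system closes to numerical precision; gauge noise
# shows up directly in the closure residual.
assert closure_residual(A, meas) < 1e-12
```

    In the same spirit, Kite's sixth measured length is redundant, so its disagreement with the other five is a direct figure of merit for the gauges plus the optical model.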

  13. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
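    A minimal sketch of the fault-injection-and-replay idea (all names and thresholds hypothetical): a scripted scenario drives a toy habitat simulation, a fault is injected at a chosen step, an autonomy rule responds, and every event is logged so the run can be replayed deterministically.

```python
from dataclasses import dataclass, field

@dataclass
class HabitatSim:
    """Toy habitat: one monitored quantity (cabin O2 fraction) and a log."""
    o2: float = 0.21
    log: list = field(default_factory=list)
    fault: bool = False

    def step(self, t):
        if self.fault:
            self.o2 -= 0.005              # injected leak drains O2
        if self.o2 < 0.195:               # autonomy rule: restore O2
            self.log.append((t, "fault_response: o2_generator_on"))
            self.o2 = 0.21
            self.fault = False

def run(scenario, steps=10):
    """scenario maps step index -> event name; returns the event log."""
    sim = HabitatSim()
    for t in range(steps):
        if scenario.get(t) == "inject_o2_leak":
            sim.fault = True
            sim.log.append((t, "inject_o2_leak"))
        sim.step(t)
    return sim.log

log = run({2: "inject_o2_leak"})
# Replaying the same scripted scenario reproduces the same log.
assert run({2: "inject_o2_leak"}) == log
```

    The logging/replay loop is the point: identical scripted scenarios yield identical event logs, which is what makes regression testing of different autonomy architectures possible.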

  15. In-Space Networking on NASA's SCAN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David E.; Eddy, Wesley M.; Clark, Gilbert J.; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios and a flight computer for supporting in-space communication research. New technologies being studied using the SCaN Testbed include advanced networking, coding, and modulation protocols designed to support the transition of NASA's mission systems from primarily point-to-point data links and preplanned routes towards adaptive, autonomous internetworked operations needed to meet future mission objectives. Networking protocols implemented on the SCaN Testbed include the Advanced Orbiting Systems (AOS) link-layer protocol, Consultative Committee for Space Data Systems (CCSDS) Encapsulation Packets, Internet Protocol (IP), Space Link Extension (SLE), CCSDS File Delivery Protocol (CFDP), and Delay-Tolerant Networking (DTN) protocols including the Bundle Protocol (BP) and Licklider Transmission Protocol (LTP). The SCaN Testbed end-to-end system provides three S-band data links and one Ka-band data link to exchange space and ground data through NASA's Tracking and Data Relay Satellite System or a direct-to-ground link to ground stations. The multiple data links and nodes provide several upgradable elements on both the space and ground systems. This paper will provide a general description of the testbed's system design and capabilities, discuss in detail the design and lessons learned in the implementation of the network protocols, and describe future plans for continuing research to meet the communication needs of evolving global space systems.
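    The delay-tolerant behavior at the heart of DTN can be sketched with a toy store-and-forward node (hypothetical and far simpler than the real Bundle Protocol): bundles are held in custody while the link is down and forwarded when a contact window opens, rather than being dropped as on a conventional IP path.

```python
from collections import deque

class DtnNode:
    """Toy DTN node: stores bundles until the outbound link is available."""
    def __init__(self):
        self.custody = deque()   # bundles held in custody
        self.link_up = False
        self.delivered = []

    def send(self, bundle):
        self.custody.append(bundle)
        self.flush()

    def flush(self):
        # Forward queued bundles, in order, while a contact window is open.
        while self.link_up and self.custody:
            self.delivered.append(self.custody.popleft())

node = DtnNode()
node.send("telemetry-1")         # link down: bundle is stored, not lost
node.send("telemetry-2")
assert node.delivered == []
node.link_up = True              # contact window opens
node.flush()
assert node.delivered == ["telemetry-1", "telemetry-2"]
```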

  16. High Contrast Imaging Testbed for the Terrestrial Planet Finder Coronagraph

    NASA Technical Reports Server (NTRS)

    Lowman, Andrew E.; Trauger, John T.; Gordon, Brian; Green, Joseph J.; Moody, Dwight; Niessner, Albert F.; Shi, Fang

    2004-01-01

    The Terrestrial Planet Finder (TPF) mission is planning to launch a visible coronagraphic space telescope in 2014. To achieve TPF science goals, the coronagraph must have extreme levels of wavefront correction (less than 1 Angstrom rms over controllable spatial frequencies) and stability to get the necessary suppression of diffracted starlight (approximately 10(exp -10) contrast at an angular separation of approximately 4 lambda/D). TPF Coronagraph's primary platform for experimentation is the High Contrast Imaging Testbed, which will provide laboratory validation of key technologies as well as demonstration of a flight-traceable approach to implementation. Precision wavefront control in the testbed is provided by a high-actuator-density deformable mirror. Diffracted light control is achieved through use of occulting or apodizing masks and stops. Contrast measurements will establish the technical feasibility of TPF requirements, while model and error budget validation will demonstrate implementation viability. This paper describes the current testbed design, development approach, and recent experimental results.

  17. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  18. Laser Metrology in the Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    An, Xin; Marx, D.; Goullioud, Renaud; Zhao, Feng

    2004-01-01

    The Space Interferometer Mission (SIM), scheduled for launch in 2009, is a space-borne visible light stellar interferometer capable of micro-arcsecond-level astrometry. The Micro-Arcsecond Metrology testbed (MAM) is the ground-based testbed that incorporates all the functionalities of SIM minus the telescope, for mission-enabling technology development and verification. MAM employs a laser heterodyne metrology system using the Sub-Aperture Vertex-to-Vertex (SAVV) concept. In this paper, we describe the development and modification of the SAVV metrology launchers and the metrology instrument electronics, precision alignments and pointing control, locating cyclic error sources in the MAM testbed and methods to mitigate the cyclic errors, as well as the performance under the MAM performance metrics.
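    The core operation of a heterodyne metrology gauge, converting the phase of a beat signal into displacement, can be sketched as follows (toy sample rate and beat frequency; the real SAVV launchers and electronics are far more involved). Phase is recovered by I/Q demodulation at the known heterodyne frequency.

```python
import numpy as np

fs = 1.0e6            # sample rate, Hz (toy value)
f_het = 50.0e3        # heterodyne beat frequency, Hz (toy value)
lam = 1.319e-6        # metrology laser wavelength, m (typical Nd:YAG line)
phi_true = 0.7        # phase shift to recover, rad

t = np.arange(4000) / fs                      # integer number of beat periods
beat = np.cos(2 * np.pi * f_het * t + phi_true)

# I/Q demodulate: mix with a complex tone at f_het and average away the
# double-frequency term; the angle of the result is the beat phase.
iq = beat * np.exp(-2j * np.pi * f_het * t)
phi_est = np.angle(np.mean(iq))

# In a double-pass gauge, 2*pi of phase corresponds to lambda/2 of motion.
displacement = phi_est / (2 * np.pi) * lam / 2
assert abs(phi_est - phi_true) < 1e-6
```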

  19. Experimental Test-Bed for Intelligent Passive Array Research

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Torres, Miguel; David, Sunil; Isom, Adam; Cotto, Jose; Sharaiha, Samer

    2004-01-01

    This document describes the test-bed designed for the investigation of passive direction finding, recognition, and classification of speech and sound sources using sensor arrays. The test-bed forms the experimental basis of the Intelligent Small-Scale Spatial Direction Finder (ISS-SDF) project, aimed at furthering digital signal processing and intelligent sensor capabilities of sensor array technology in applications such as rocket engine diagnostics, sensor health prognostics, and structural anomaly detection. This form of intelligent sensor technology has potential for significant impact on NASA exploration, earth science and propulsion test capabilities. The test-bed consists of microphone arrays, power and signal distribution modules, web-based data acquisition, wireless Ethernet, modeling, simulation and visualization software tools. The Acoustic Sensor Array Modeler I (ASAM I) is used for studying steering capabilities of acoustic arrays and testing DSP techniques. Spatial sound distribution visualization is modeled using the Acoustic Sphere Analysis and Visualization (ASAV-I) tool.
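    The basic direction-finding operation such arrays perform, estimating a time difference of arrival (TDOA) between two microphones, can be sketched with a cross-correlation (toy broadband signal; the array spacing and geometry below are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 48_000                      # sample rate, Hz
n = 2048
delay = 7                        # true inter-microphone delay, samples

s = rng.standard_normal(n)       # broadband source (toy)
mic1 = s
mic2 = np.concatenate([np.zeros(delay), s[:-delay]])   # delayed copy

# Cross-correlate and convert the peak index to a signed lag in samples.
xcorr = np.correlate(mic2, mic1, mode="full")
lag = int(np.argmax(xcorr)) - (n - 1)
assert lag == delay

# With an assumed 0.5 m spacing, the lag maps to a bearing angle.
d, c = 0.5, 343.0                # spacing (m), speed of sound (m/s)
tau = lag / fs
theta = np.degrees(np.arcsin(np.clip(c * tau / d, -1.0, 1.0)))
```

    A 7-sample lag at 48 kHz and 0.5 m spacing corresponds to a bearing of roughly 6 degrees off broadside; real implementations interpolate the correlation peak for sub-sample resolution.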

  20. Spherical Air Bearing testbed for nanosatellite attitude control development

    NASA Astrophysics Data System (ADS)

    Ustrzycki, Tyler

    Spherical Air Bearing systems have been used as a test bed for attitude control systems for many decades. With the advancements of nanosatellite technologies as a platform for scientific missions, there is an increased demand for comprehensive, pre-launch testing of nanosatellites. Several spherical air bearing systems have been developed for larger satellite applications and add too much parasitic mass to be applicable for nanosatellite applications. This thesis details the design and validation of a Nanosatellite Three Axis Attitude Control Testbed. The testbed consists of the physical design of the system, a complete electronics system, and validation of the testbed using low-cost reaction wheels as actuators. The design of the air bearing platform includes a manual balancing system to align the centre of gravity with the centre of rotation. The electronics system is intended to measure the attitude of the platform and control the actuator system. Validation is achieved through a controlled slew maneuver of the air bearing platform.
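    The validation slew maneuver can be illustrated with a single-axis toy model (hypothetical inertia, gains, and torque limit): a PD controller commands reaction-wheel torque, which reacts on the platform, to rotate it to a target angle.

```python
import numpy as np

I = 0.05          # platform inertia about the bearing axis, kg*m^2 (toy)
kp, kd = 0.02, 0.05
tau_max = 0.01    # reaction-wheel torque limit, N*m (toy)
dt, t_end = 0.01, 60.0

theta, omega = 0.0, 0.0
target = np.radians(30.0)        # commanded slew angle

for _ in range(int(t_end / dt)):
    # PD law on angle error and body rate, saturated at the wheel limit.
    tau = np.clip(kp * (target - theta) - kd * omega, -tau_max, tau_max)
    omega += (tau / I) * dt      # semi-implicit Euler integration
    theta += omega * dt

# Platform settles on the commanded attitude.
assert abs(theta - target) < np.radians(0.5)
```

    With these toy gains the closed loop is well damped (damping ratio near 0.8) and settles in roughly ten seconds; a real validation would compare the logged attitude profile against the commanded slew.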

  1. A Testbed for Deploying Distributed State Estimation in Power Grid

    SciTech Connect

    Jin, Shuangshuang; Chen, Yousu; Rice, Mark J.; Liu, Yan; Gorton, Ian

    2012-07-22

    With the increasing demand, scale, and data volume of power systems, fast distributed applications are becoming more important in power system operation and control. This paper proposes a testbed for evaluating power system distributed applications, considering data exchange among distributed areas. A high-performance computing (HPC) version of distributed state estimation is implemented and used as a distributed application example. The IEEE 118-bus system is used to deploy the parallel distributed state estimation, and the MeDICi middleware is used for data communication. The performance of the testbed demonstrates its capability to evaluate parallel distributed state estimation by leveraging the HPC paradigm. This testbed can also be applied to evaluate other distributed applications.
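    The decomposition idea can be sketched with a toy two-area weighted least squares state estimator (hypothetical measurement matrices; a real distributed estimator must also reconcile boundary buses): each area solves its local estimate in parallel, and when the areas decouple the stacked local solutions match the centralized estimate.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two decoupled areas, each with redundant measurements of its own states.
H1 = rng.standard_normal((6, 3))   # area 1: 6 measurements, 3 states
H2 = rng.standard_normal((8, 4))   # area 2: 8 measurements, 4 states
x = rng.standard_normal(7)
z1, z2 = H1 @ x[:3], H2 @ x[3:]

def wls(H, z):
    # Normal equations; unweighted for brevity. With a weight matrix W
    # this becomes solve(H.T @ W @ H, H.T @ W @ z).
    return np.linalg.solve(H.T @ H, H.T @ z)

# "Distributed": solve each area independently (these calls could run on
# separate HPC nodes), then stack the local estimates.
x_dist = np.concatenate([wls(H1, z1), wls(H2, z2)])

# Centralized reference: one block-diagonal solve over all measurements.
H = np.block([[H1, np.zeros((6, 4))], [np.zeros((8, 3)), H2]])
x_cent = wls(H, np.concatenate([z1, z2]))
assert np.allclose(x_dist, x_cent)
```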

  2. Design optimization of the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Salama, M.; Wette, M.; Chu, Cheng-Chih

    1993-01-01

    Increasingly complex spacecraft will benefit from integrated design and optimization of structural, optical, and control subsystems. Integrated design optimization will allow designers to make tradeoffs in objectives and constraints across these subsystems. The location, number, and types of passive and active devices distributed along the structure can have a dramatic impact on overall system performance. In addition, the manner in which structural mass is distributed can also serve as an effective mechanism for attenuating disturbance transmission between source and sensitive system components. This paper presents recent experience using optimization tools that have been developed for addressing some of these issues on a challenging testbed design problem. This particular testbed is one of a series of testbeds at the Jet Propulsion Laboratory under the sponsorship of the NASA Control Structure Interaction (CSI) Program to demonstrate nanometer level optical pathlength control on a flexible truss structure that emulates a spaceborne interferometer.

  3. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was initiated, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set reporting the results of the TTPP contains the integrated results. Background on the program and highlights of its results are provided, and the various testbed experiments and the programmatic approach are summarized. The results are first summarized discipline by discipline, highlighting the lessons learned for each, and then integrated across disciplines, summarizing the lessons learned overall.

  4. ooi: OpenStack OCCI interface

    NASA Astrophysics Data System (ADS)

    López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo

    In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
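    An OCCI create request is rendered as HTTP headers. A minimal sketch of the text rendering for a compute resource follows (the endpoint and attribute values are hypothetical; the kind scheme and `occi.compute.*` attribute names follow the OCCI infrastructure specification):

```python
def occi_compute_headers(cores, memory_gb, hostname):
    """Render OCCI HTTP headers for creating a compute resource."""
    return {
        "Category": ('compute; '
                     'scheme="http://schemas.ogf.org/occi/infrastructure#"; '
                     'class="kind"'),
        "X-OCCI-Attribute": (f"occi.compute.cores={cores}, "
                             f"occi.compute.memory={memory_gb}, "
                             f'occi.compute.hostname="{hostname}"'),
    }

hdrs = occi_compute_headers(2, 4.0, "vm-test")
# These headers would be POSTed to the OCCI endpoint of an ooi-enabled
# OpenStack deployment, e.g. https://cloud.example.org/occi1.1/compute/
assert 'class="kind"' in hdrs["Category"]
```

    Because the rendering is a standard, the same request works against any OCCI-enabled middleware, which is exactly the interoperability ooi targets.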

  5. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a near completely automated, end-to-end, satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a very difficult and time-consuming one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration), it is important to reflect back upon experiences and to collect the knowledge that has been gained so that improvements can be made to the existing system. It is also believed that these experiences are of value to others in the robotics community. Therefore, the primary objective here will be to use the Telerobot Testbed as a case study to identify real problems and technological gaps which exist in the areas of robotics and in particular systems integration. Such problems have surely hindered the development of what could be reasonably called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  6. The Living With a Star Space Environment Testbed Program

    NASA Technical Reports Server (NTRS)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding to address the aspects of the connected Sun-Earth system that affect life and society. The Program Architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET Program will infuse new technologies into the space programs through collection of data in space and subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  7. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  8. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was initiated, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set reporting the results of the TTPP is the executive summary.

  9. UltraSciencenet: High- Performance Network Research Test-Bed

    SciTech Connect

    Rao, Nageswara S; Wing, William R; Poole, Stephen W; Hicks, Susan Elaine; DeNap, Frank A; Carter, Steven M; Wu, Qishi

    2009-04-01

    The high-performance networking requirements for next-generation large-scale applications belong to two broad classes: (a) high bandwidths, typically multiples of 10 Gbps, to support bulk data transfers, and (b) stable bandwidths, typically at much lower rates, to support computational steering, remote visualization, and remote control of instrumentation. Current Internet technologies, however, are severely limited in meeting these demands because such bulk bandwidths are available only in the backbone, and stable control channels are hard to realize over shared connections. The UltraScience Net (USN) facilitates the development of such technologies by providing dynamic, cross-country dedicated 10 Gbps channels for large data transfers, and 150 Mbps channels for interactive and control operations. Contributions of the USN project are twofold: (a) Infrastructure Technologies for Network Experimental Facility: USN developed and/or demonstrated a number of infrastructure technologies needed for a national-scale network experimental facility. Compared to the Internet, USN's data-plane is different in that it can be partitioned into isolated layer-1 or layer-2 connections, and its control-plane is different in the ability of users and applications to set up and tear down channels as needed. Its design required several new components, including a Virtual Private Network infrastructure, a bandwidth and channel scheduler, and a dynamic signaling daemon. The control-plane employs a centralized scheduler to compute the channel allocations and a signaling daemon to generate configuration signals to switches. In a nutshell, USN demonstrated the ability to build and operate a stable national-scale switched network. (b) Structured Network Research Experiments: A number of network research experiments have been conducted on USN that cannot be easily supported over existing network facilities, including test-beds and production networks. It settled an open matter by demonstrating
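    The centralized scheduler's admission decision can be sketched as a toy bandwidth reservation check (hypothetical capacities and units; the real USN scheduler also computes paths through the switch topology): a request for a dedicated channel is accepted only if link capacity is never exceeded anywhere in its time window.

```python
def admissible(reservations, start, end, bw, capacity=10.0):
    """Check whether a (start, end, bw) channel request fits under the
    link capacity given existing reservations (Gbps, toy units)."""
    # Capacity can only change at reservation boundaries, so it is
    # sufficient to check each boundary event inside the window.
    events = sorted({start, end} | {t for s, e, _ in reservations for t in (s, e)})
    for t in events:
        if start <= t < end:
            in_use = sum(b for s, e, b in reservations if s <= t < e)
            if in_use + bw > capacity:
                return False
    return True

booked = [(0, 4, 10.0)]                      # a full 10 Gbps bulk transfer
assert not admissible(booked, 2, 6, 0.15)    # even a control channel is refused
assert admissible(booked, 4, 8, 10.0)        # back-to-back channels are fine
booked.append((4, 8, 10.0))
```

    This is why USN dedicates separate 150 Mbps channels for control traffic: on a fully booked dedicated link, even a small stable-bandwidth request cannot be admitted mid-transfer.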

  10. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources to accommodate these services in optical networks. This study extends that work to consider multi-dimensional resource optimization across radio, optical, and BBU processing in the 5G age. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. MSRO can enhance responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical, and BBU resources to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated based on the MSRO architecture in terms of resource occupation rate and path provisioning latency, compared with another provisioning scheme.
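    The flavor of a global evaluation across strata can be sketched with a toy joint selection (entirely hypothetical occupation costs and weights): rather than picking radio, optical, and BBU resources independently, the scheme scores complete combinations and takes the global optimum.

```python
from itertools import product

# Toy occupation costs for candidate resources in each stratum (0..1).
radio = {"rrh1": 0.6, "rrh2": 0.3}
optical = {"pathA": 0.5, "pathB": 0.8}
bbu = {"pool1": 0.7, "pool2": 0.2}

def global_score(r, o, b, w=(0.4, 0.3, 0.3)):
    """Weighted occupation across the three strata: lower is better."""
    return w[0] * radio[r] + w[1] * optical[o] + w[2] * bbu[b]

# Jointly evaluate every (radio, optical, BBU) combination.
best = min(product(radio, optical, bbu), key=lambda c: global_score(*c))
assert best == ("rrh2", "pathA", "pool2")
```

    A per-stratum greedy choice happens to agree here, but with coupled constraints (for example, a radio head reachable only via the costlier path) only the joint evaluation finds the global optimum, which is the argument for multi-stratum optimization.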

  11. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    NASA Astrophysics Data System (ADS)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained from technology such as LiDAR, SoNAR, SAR, SfM and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to (x,y,z) data to calculate a suite of (spatial and spectral domain) statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification, on very large point clouds with great computational efficiency. Using tools such as these, it may be possible to detect geomorphic change in surfaces which have undergone minimal elevation difference, for example deflation surfaces which have coarsened but undergone no net elevation change, or surfaces which have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with multibeam echosounder, and topographic data collected with LiDAR.
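    The moving-window calculation at the core of both tools can be sketched as follows (toy synthetic cloud; the real toolkits compute a much larger suite of spatial- and spectral-domain statistics): the detrended standard deviation of elevation per window, referenced to a regular grid.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic point cloud: a tilted plane, with rough texture on one half.
n = 20_000
x, y = rng.uniform(0, 10, n), rng.uniform(0, 10, n)
z = 0.05 * x + np.where(x > 5, rng.normal(0, 0.1, n), 0.0)

def window_roughness(x, y, z, cell=1.0):
    """Detrended std of z per (cell x cell) window on a regular grid."""
    nx, ny = int(x.max() / cell) + 1, int(y.max() / cell) + 1
    rough = np.full((nx, ny), np.nan)
    ix, iy = (x / cell).astype(int), (y / cell).astype(int)
    for i in range(nx):
        for j in range(ny):
            m = (ix == i) & (iy == j)
            if m.sum() > 10:
                # Remove the best-fit plane, then take the residual std.
                A = np.column_stack([x[m], y[m], np.ones(m.sum())])
                resid = z[m] - A @ np.linalg.lstsq(A, z[m], rcond=None)[0]
                rough[i, j] = resid.std()
    return rough

r = window_roughness(x, y, z)
# Detrending removes the tilt: the smooth half has ~zero roughness while
# the textured half retains its ~0.1 residual std.
assert np.nanmean(r[:5]) < 0.01 < np.nanmean(r[6:])
```

    Detrending is what separates roughness from slope: without the plane removal, the tilt itself would register as texture in every window.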

  12. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting from totally distributed to centralized approaches. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  14. Interferometric adaptive optics testbed for laser pointing, wave-front control and phasing.

    PubMed

    Baker, K L; Homoelle, D; Utternback, E; Stappaerts, E A; Siders, C W; Barty, C P J

    2009-09-14

    Implementing the capability to perform fast ignition experiments, as well as radiography experiments, on the National Ignition Facility (NIF) places stringent requirements on the control of each beam's pointing, intra-beam phasing and overall wave-front quality. In this article, experimental results are presented that were taken on an interferometric adaptive optics testbed designed and built to test the capabilities of such a system to control phasing, pointing and higher-order beam aberrations. These measurements included quantification of the reduction in Strehl ratio incurred when using the MEMS device to correct for pointing errors in the system. The interferometric adaptive optics system achieved a Strehl ratio of 0.83 when correcting for a piston, tip/tilt error between two adjacent rectangular apertures, the geometry expected for the National Ignition Facility. The interferometric adaptive optics system also achieved a Strehl ratio of 0.66 when used to correct for a phase plate aberration of similar magnitude to that expected from simulations of the ARC beam line. All of these corrections included measuring both the upstream and downstream aberrations in the testbed and applying the sum of these two measurements in open loop to the MEMS deformable mirror.
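The Strehl ratios quoted above can be related to residual wavefront error through the extended Maréchal approximation, S ≈ exp(-(2πσ/λ)²). A minimal sketch of that conversion (the 1053 nm wavelength and the error values are illustrative assumptions, not figures from the paper):

```python
import math

def strehl_marechal(sigma_wfe_nm: float, wavelength_nm: float) -> float:
    """Extended Marechal approximation: S = exp(-(2*pi*sigma/lambda)^2)."""
    phase_rms = 2.0 * math.pi * sigma_wfe_nm / wavelength_nm
    return math.exp(-phase_rms ** 2)

def wfe_from_strehl(strehl: float, wavelength_nm: float) -> float:
    """Invert the approximation to estimate rms wavefront error in nm."""
    return wavelength_nm * math.sqrt(-math.log(strehl)) / (2.0 * math.pi)

# Illustrative: a Strehl of 0.83 at an assumed 1053 nm corresponds to
# roughly 72 nm (about lambda/15) of rms wavefront error.
print(round(wfe_from_strehl(0.83, 1053.0), 1))   # → 72.3
```

The approximation is only reliable for small aberrations (Strehl above roughly 0.1), which covers the 0.83 and 0.66 figures reported here.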

  15. Development, Demonstration, and Control of a Testbed for Multiterminal HVDC System

    DOE PAGES

    Li, Yalong; Shi, Xiaojie M.; Liu, Bo; ...

    2016-10-21

    This paper presents the development of a scaled four-terminal high-voltage direct current (HVDC) testbed, including hardware structure, communication architecture, and different control schemes. The developed testbed is capable of emulating typical operation scenarios including system start-up, power variation, line contingency, and converter station failure. Some unique scenarios are also developed and demonstrated, such as online control mode transition and station re-commission. In particular, a dc line current control is proposed, through the regulation of a converter station at one terminal. By controlling a dc line current to zero, the transmission line can be opened by using relatively low-cost HVDC disconnects with low current interrupting capability, instead of the more expensive dc circuit breaker. Utilizing the dc line current control, an automatic line current limiting scheme is developed. As a result, when a dc line is overloaded, the line current control will be automatically activated to regulate current within the allowable maximum value.
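The dc line current control described above can be illustrated, very schematically, as a proportional-integral regulator that drives the measured line current to a reference of zero before the disconnect is opened. This is a hedged toy sketch, not the authors' controller: the first-order line model, gains, and parameter values are all invented for illustration.

```python
class LineCurrentPI:
    """Minimal PI regulator sketch: adjusts a converter terminal's voltage
    offset to steer a dc line current toward a reference (e.g. 0 A, so the
    line can be opened with a low-cost disconnect)."""

    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, i_ref: float, i_meas: float) -> float:
        err = i_ref - i_meas
        self.integral += err * self.dt
        return self.kp * err + self.ki * self.integral

# Toy first-order line model: di/dt = (v_offset - R*i) / L  (R, L invented).
R, L, dt = 0.5, 0.01, 1e-3
ctrl = LineCurrentPI(kp=2.0, ki=50.0, dt=dt)
i = 100.0                      # line initially carrying 100 A
for _ in range(2000):          # drive the current to zero before opening
    v = ctrl.step(0.0, i)
    i += (v - R * i) / L * dt
print(abs(i) < 1.0)            # current regulated near zero → True
```

The same loop, with the reference clamped at the line's rated current instead of zero, illustrates the automatic current limiting scheme the abstract mentions.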

  16. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

    An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near infrared spectroscopy (NIRS) and electroencephalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measures and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that supports multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures, both directly and indirectly observable. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography.

  17. Operation Duties on the F-15B Research Testbed

    NASA Technical Reports Server (NTRS)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  18. Asynchronous Message Passing in the JPL Flight System Testbed

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    1996-01-01

    The flight mission simulation software in the Jet Propulsion Laboratory's Flight System Testbed (FST) is a heterogeneous, distributed system that is built on an interprocess communication model of asynchronous message passing rather than remote procedure calls (RPCs). The reasoning behind this design decision is discussed, as is the mechanism used to implement it.
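The contrast the abstract draws between asynchronous message passing and RPC can be shown in a few lines: the sender enqueues a message and continues immediately, instead of blocking for a reply. A minimal Python sketch (the queue-and-thread mechanism and all names are illustrative, not the FST implementation):

```python
import queue
import threading

def telemetry_consumer(inbox: queue.Queue, results: list):
    """Consumer drains its inbox at its own pace; senders never block on it."""
    while True:
        msg = inbox.get()
        if msg is None:          # sentinel: shut down
            break
        results.append(msg.upper())

inbox: queue.Queue = queue.Queue()
results: list = []
worker = threading.Thread(target=telemetry_consumer, args=(inbox, results))
worker.start()

# Asynchronous send: enqueue and move on, no waiting for a return value.
for msg in ("power on", "set mode safe", "downlink"):
    inbox.put(msg)
inbox.put(None)
worker.join()
print(results)   # ['POWER ON', 'SET MODE SAFE', 'DOWNLINK']
```

With RPC, each send would instead block the caller until the callee returned, coupling the two processes' timing; the queue decouples them, which is the property the FST design favors.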

  19. In-Space Networking On NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David; Eddy, Wesley M.; Clark, Gilbert J., III; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios (SDRs) and a programmable flight computer. The purpose of the Testbed is to conduct in-space research in the areas of communication, navigation, and networking in support of NASA missions and communication infrastructure. Multiple reprogrammable elements in the end-to-end system, along with several communication paths and a semi-operational environment, provide a unique opportunity to explore networking concepts and protocols envisioned for the future Solar System Internet (SSI). This paper will provide a general description of the system's design and the networking protocols implemented and characterized on the testbed, including Encapsulation, IP over CCSDS, and Delay-Tolerant Networking (DTN). Due to the research nature of the implementation, flexibility and robustness are considered in the design to enable expansion for future adaptive and cognitive techniques. Following a detailed design discussion, lessons learned and suggestions for future missions and communication infrastructure elements will be provided. Plans for the evolving research on SCaN Testbed as it moves towards a more adaptive, autonomous system will be discussed.

  20. A Portable MIMO Testbed and Selected Channel Measurements

    NASA Astrophysics Data System (ADS)

    Goud, Paul, Jr.; Hang, Robert; Truhachev, Dmitri; Schlegel, Christian

    2006-12-01

    A portable multiple-input multiple-output (MIMO) testbed that is based on field programmable gate arrays (FPGAs) and which operates in the 902-928 MHz industrial, scientific, and medical (ISM) band has been developed by the High Capacity Digital Communications (HCDC) Laboratory at the University of Alberta. We present a description of the HCDC testbed along with MIMO channel capacities that were derived from measurements taken with the HCDC testbed for three special locations: a narrow corridor, an athletics field that is surrounded by a metal fence, and a parkade. These locations are special because the channel capacities are different from what is expected for a typical indoor or outdoor channel. For two of the cases, a ray-tracing analysis has been performed and the simulated channel capacity values closely match the values calculated from the measured data. A ray-tracing analysis, however, requires accurate geometrical measurements and sophisticated modeling for each specific location. A MIMO testbed is ideal for quickly obtaining accurate channel capacity information.
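The channel capacities derived from such measurements come from the standard MIMO capacity formula C = log2 det(I + (ρ/Nt) H Hᴴ) bits/s/Hz, with transmit power split equally over the Nt antennas. A sketch for the 2×2 case (the formula is standard; the example channel and SNR are illustrative):

```python
import math

def capacity_2x2(h11, h12, h21, h22, snr):
    """Capacity of a 2x2 MIMO channel with equal power per antenna:
    C = log2 det(I + (snr/2) * H * H^H), in bits/s/Hz."""
    # Gram matrix G = H * H^H (2x2 Hermitian):
    g11 = abs(h11) ** 2 + abs(h12) ** 2
    g22 = abs(h21) ** 2 + abs(h22) ** 2
    g12 = h11 * h21.conjugate() + h12 * h22.conjugate()
    a = snr / 2.0
    # det(I + a*G) for a 2x2 matrix: (1 + a*g11)(1 + a*g22) - a^2*|g12|^2
    det = (1 + a * g11) * (1 + a * g22) - (a ** 2) * abs(g12) ** 2
    return math.log2(det)

# Identity channel at 10 dB (linear SNR = 10): two parallel sub-channels,
# C = 2 * log2(1 + 10/2) ≈ 5.17 bits/s/Hz.
print(round(capacity_2x2(1 + 0j, 0j, 0j, 1 + 0j, 10.0), 2))   # → 5.17
```

Feeding measured channel matrices from a testbed campaign through this formula, one H realization at a time, is how the per-location capacity figures above are typically produced.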

  1. Integrated microfluidic test-bed for energy conversion devices.

    PubMed

    Modestino, Miguel A; Diaz-Botia, Camilo A; Haussener, Sophia; Gomez-Sjoberg, Rafael; Ager, Joel W; Segalman, Rachel A

    2013-05-21

    Energy conversion devices require the parallel functionality of a variety of components for efficient operation. We present a versatile microfluidic test-bed for facile testing of integrated catalysis and mass transport components for energy conversion via water electrolysis. This system can be readily extended to solar-fuels generators and fuel-cell devices.

  2. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  3. Optical Design of the Developmental Cryogenic Active Telescope Testbed (DCATT)

    NASA Technical Reports Server (NTRS)

    Davila, Pam; Wilson, Mark; Young, Eric W.; Lowman, Andrew E.; Redding, David C.

    1997-01-01

    In the summer of 1996, three study teams developed conceptual designs and mission architectures for the Next Generation Space Telescope (NGST). Each group highlighted areas of technology development that need to be further advanced to meet the goals of the NGST mission. The most important areas for future study included: deployable structures, lightweight optics, cryogenic optics and mechanisms, passive cooling, and on-orbit closed-loop wavefront sensing and control. NASA and industry are currently planning to develop a series of ground testbeds and validation flights to demonstrate many of these technologies. The Developmental Cryogenic Active Telescope Testbed (DCATT) is a system-level testbed to be developed at Goddard Space Flight Center in three phases over an extended period of time. This testbed will combine an actively controlled telescope with the hardware and software elements of a closed-loop wavefront sensing and control system to achieve diffraction-limited imaging at 2 microns. We will present an overview of the system-level requirements, a discussion of the optical design, and results of performance analyses for the Phase 1 ambient concept for DCATT.

  5. Extending the Information Commons: From Instructional Testbed to Internet2

    ERIC Educational Resources Information Center

    Beagle, Donald

    2002-01-01

    The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

  7. Survey of Two-Way Cable Television Testbeds.

    ERIC Educational Resources Information Center

    Cable Television Information Center, Washington, DC.

    Surveys of 10 two-way interactive cable experiments indicate that little is happening in this field. The location of the testbeds, the names and addresses of both the parent company and its local subsidiary are included in the survey together with a description of each project. A brief note on the subscriber's right to privacy concludes this short…

  8. Neptune Clouds

    NASA Image and Video Library

    1999-10-14

    The bright cirrus-like clouds of Neptune change rapidly, often forming and dissipating over periods of several to tens of hours. In this sequence, NASA's Voyager 2 observed cloud evolution in the region around the Great Dark Spot (GDS).

  9. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  10. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the next generation space telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing final obtained wavefronts from different tests. The Developmental Cryogenic Active Telescope Testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryocapable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments, cryogenic low-hold-power actuators, or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design will be discussed.
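Closed-loop wavefront sensing and control of the kind DCATT exercises is commonly implemented as an integrator: each cycle, the measured residual wavefront is scaled by a gain and subtracted from the mirror commands. A toy per-actuator sketch, assuming a perfectly linear mirror and sensor (the gain and aberration values are invented):

```python
def run_closed_loop(aberration, gain=0.5, iterations=20):
    """Integrator control law: command -= gain * measured_residual.
    'aberration' is a list of per-actuator wavefront errors (arbitrary units)."""
    commands = [0.0] * len(aberration)
    for _ in range(iterations):
        residual = [a + c for a, c in zip(aberration, commands)]  # sensor reading
        commands = [c - gain * r for c, r in zip(commands, residual)]
    return [a + c for a, c in zip(aberration, commands)]          # final residual

final = run_closed_loop([50.0, -30.0, 12.0])   # nm, illustrative
print(all(abs(r) < 1e-3 for r in final))       # residual driven to ~zero → True
```

With gain g, the residual shrinks by a factor (1 - g) per iteration, so a static aberration converges geometrically; real systems must also contend with sensor noise, mirror dynamics, and (in later DCATT phases) cryogenic effects.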

  11. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high latency communications links to experiment how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. Tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet requirements.

  12. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are key technologies for addressing the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are three ways used to evaluate and analyze the proposed model. PMID:24701149

  13. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are key technologies for addressing the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are three ways used to evaluate and analyze the proposed model.

  14. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concept of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable continuous streaming experience for a mobile user across the heterogeneous wireless network.
Real-time video stream packets

  15. Using Cloud Computing infrastructure with CloudBioLinux, CloudMan and Galaxy

    PubMed Central

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-01-01

    Cloud computing has revolutionized availability and access to computing and storage resources; making it possible to provision a large computational infrastructure with only a few clicks in a web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this protocol, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatics analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command line interface, and the web-based Galaxy interface. PMID:22700313

  16. Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.

    PubMed

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-06-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.

  17. Reusable, extendible flight software for a planetary spacecraft prototype testbed

    NASA Technical Reports Server (NTRS)

    Krasner, Sanford M.

    1995-01-01

    As part of the 'Faster, Better, Cheaper' paradigm for NASA missions, the Jet Propulsion Laboratory (JPL) is developing a Flight System Testbed for prototyping and early integration of future planetary missions. This paper describes the development of a set of reusable, extendible spacecraft flight software to be used as a basis for prototypes in the testbed. This effort has focused on identification and implementation of functions which are common across multiple missions. This effort has also developed an intertask messaging system which supports modification of existing functions, additions of new functions, and porting to various computation and input/output (I/O) architectures. This paper also identifies a number of other JPL activities which support standardization and reusability of planetary spacecraft designs.

  18. Collaboration in a Wireless Grid Innovation Testbed by Virtual Consortium

    NASA Astrophysics Data System (ADS)

    Treglia, Joseph; Ramnarine-Rieks, Angela; McKnight, Lee

    This paper describes the formation of the Wireless Grid Innovation Testbed (WGiT) coordinated by a virtual consortium involving academic and non-academic entities. Syracuse University and Virginia Tech are primary university partners with several other academic, government, and corporate partners. Objectives include: 1) coordinating knowledge sharing, 2) defining key parameters for wireless grids network applications, 3) dynamically connecting wired and wireless devices, content and users, 4) linking to VT-CORNET, Virginia Tech Cognitive Radio Network Testbed, 5) forming ad hoc networks or grids of mobile and fixed devices without a dedicated server, 6) deepening understanding of wireless grid application, device, network, user and market behavior through academic, trade and popular publications including online media, 7) identifying policy that may enable evaluated innovations to enter US and international markets and 8) implementation and evaluation of the international virtual collaborative process.

  19. Amplitude variations on the Extreme Adaptive Optics testbed

    SciTech Connect

    Evans, J; Thomas, S; Dillon, D; Gavel, D; Phillion, D; Macintosh, B

    2007-08-14

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. At the Laboratory for Adaptive Optics on the Extreme Adaptive Optics (ExAO) testbed, we have already demonstrated wavefront control of better than 1 nm rms within controllable spatial frequencies. Corresponding contrast measurements, however, are limited by amplitude variations, including those introduced by the micro-electro-mechanical-systems (MEMS) deformable mirror. Results from experimental measurements and wave-optic simulations of amplitude variations on the ExAO testbed are presented. We find systematic intensity variations of about 2% rms, and intensity variations of 6% with the MEMS. Some errors are introduced by phase and amplitude mixing because the MEMS is not conjugate to the pupil, but independent measurements of MEMS reflectivity suggest that some error is introduced by small non-uniformities in the reflectivity.
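The quoted percent-rms intensity variations are the normalized standard deviation of the measured pupil intensity. A minimal sketch of that statistic (the sample values are invented, chosen to give roughly 2% scatter):

```python
import math

def rms_variation(intensities):
    """Fractional rms variation: std(I) / mean(I)."""
    n = len(intensities)
    mean = sum(intensities) / n
    var = sum((x - mean) ** 2 for x in intensities) / n
    return math.sqrt(var) / mean

# Illustrative pupil-intensity samples with ~2% scatter around unity.
samples = [1.00, 1.02, 0.98, 1.01, 0.99, 1.03, 0.97, 1.00]
print(round(100 * rms_variation(samples), 1), "% rms")   # → 1.9 % rms
```

In practice this statistic is computed over a calibrated pupil image rather than a short list of samples, but the normalization by the mean is what makes the 2% and 6% figures comparable across measurements.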

  20. Satellite Testbed for Evaluating Cryogenic-Liquid Behavior in Microgravity

    NASA Technical Reports Server (NTRS)

    Putman, Philip Travis (Inventor)

    2017-01-01

    Provided is a testbed for conducting an experiment on a substance in a cryogenic liquid state in a microgravity environment. The testbed includes a frame with rectangular nominal dimensions, and a source section including a supply of the substance to be evaluated in the cryogenic liquid state. An experiment section includes an experiment vessel in fluid communication with the storage section to receive the substance from the storage section and condense the substance into the cryogenic liquid state. A sensor is adapted to sense a property of the substance in the cryogenic liquid state in the experiment vessel as part of the experiment. A bus section includes a controller configured to control delivery of the substance from the storage section to the experiment vessel, and receive property data indicative of the property sensed by the sensor for subsequent evaluation on Earth.

  1. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was undertaken, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, the third of a three-volume set containing the results of the TTPP, presents summaries of the experiments. This experiment involves the evaluation of the current Internet for file and image transfer between SIRTF instrument teams. The main issue addressed was current network response times.

  2. A MIMO-OFDM Testbed for Wireless Local Area Networks

    NASA Astrophysics Data System (ADS)

    Fàbregas, Albert Guilléni; Guillaud, Maxime; Slock, Dirk TM; Caire, Giuseppe; Gosse, Karine; Rouquette, Stéphanie; Dias, Alexandre Ribeiro; Bernardin, Philippe; Miet, Xavier; Conrat, Jean-Marc; Toutain, Yann; Peden, Alain; Li, Zaiqing

    2006-12-01

    We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.

  3. The Gemini Planet Imager Coronagraph Testbed Preliminary Performance Results

    NASA Astrophysics Data System (ADS)

    Roberts, Robin

    2010-01-01

    The Gemini Planet Imager (GPI) is a new science instrument being developed and slated for first light in early 2011 on the twin 8 m Gemini telescopes. Operating in the near infrared, this ground-based, extreme adaptive optics (ExAO) coronagraphic instrument will provide the ability to detect, characterize and analyze young (< 2 Gyr), self-luminous, extrasolar planets with brightness contrast ratios ≤ 10^-7 relative to their parent star. The coronagraph subsystem includes a pupil apodization, a hard-edged focal plane mask, and a Lyot stop. Preliminary results indicate that the testbed is performing at very high contrast, having achieved broadband (H-band) contrasts below 10^-6 at separations > 5λ/D. Fraunhofer and Fresnel propagation modeling were used to analyze the testbed results.
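The separation figure of 5λ/D can be converted to an on-sky angle for an 8 m Gemini aperture at H band (~1.65 μm); a quick sketch (the wavelength and aperture values are nominal assumptions, not parameters from the paper):

```python
import math

def lambda_over_d_mas(wavelength_m: float, diameter_m: float, n: float) -> float:
    """Angular separation of n*(lambda/D), converted to milliarcseconds."""
    rad = n * wavelength_m / diameter_m
    return rad * (180.0 / math.pi) * 3600.0 * 1000.0

# H band (~1.65 um) on an 8 m telescope: 5 lambda/D is roughly 0.21 arcsec.
print(round(lambda_over_d_mas(1.65e-6, 8.0, 5.0)))   # → 213
```

So the "below 10^-6 contrast at > 5λ/D" result corresponds, on an 8 m telescope, to working angles of roughly a fifth of an arcsecond from the star.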

  4. Optical modeling of the wide-field imaging interferometry testbed

    NASA Astrophysics Data System (ADS)

    Thompson, Anita K.; Martino, Anthony J.; Rinehart, Stephen A.; Leisawitz, David T.; Leviton, Douglas B.; Frey, Bradley J.

    2006-06-01

    The technique of wide-field imaging for optical/IR interferometers for missions like the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Terrestrial Planet Finder (TPF-I)/DARWIN has been demonstrated through the Wide-field Imaging Interferometry Testbed (WIIT). In this paper, we present an optical model of the WIIT testbed built with the commercially available optical modeling and analysis software FRED. Interferometric results for some simple source targets are presented for a model with ideal surfaces and compared with theoretical closed-form solutions. Measured surface deformation data for all mirror surfaces, in the form of Zernike coefficients, are then added to the optical model, and results for the same simple source targets are compared with laboratory test data. We discuss the sources of error and approximations in the current FRED optical model. Future plans to refine the optical model are also discussed.

  5. FDIR Validation Test-Bed Development and Results

    NASA Astrophysics Data System (ADS)

    Karlsson, Alexander; Sakthivel, Anandhavel; Aberg, Martin; Andersson, Jan; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2015-09-01

    This paper describes work being performed by Cobham Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow testing of concepts, strategies, mechanisms, and tools related to FDIR. The resulting facility will have the capability to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing, the activity is in the validation phase.

  6. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  7. Cloud-Based NoSQL Open Database of Pulmonary Nodules for Computer-Aided Lung Cancer Diagnosis and Reproducible Research.

    PubMed

    Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini

    2016-12-01

    Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is pulmonary nodules. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. In order to aid radiologists in those hard tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. In order to minimize this problem, this paper presents a public nonrelational, document-oriented, cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified in nine different subjective characteristics by the same specialists. Our goal with the development of this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database as a Service framework. Pulmonary nodule data was provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by a volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database currently holds 379 exams, 838 nodules, and 8237 images, of which 4029 are CT scan images and 4208 are manually segmented nodule images, and it is hosted in a MongoDB instance on a cloud infrastructure.
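
The document-oriented schema idea above can be sketched as follows; the field names and helper function are illustrative assumptions of mine, not the authors' actual MongoDB schema (the nine subjective characteristics follow the LIDC-IDRI convention):

```python
# Hypothetical sketch of a document-oriented (NoSQL) record for one
# pulmonary nodule, in the spirit of the database described above.
# Field names ("exam_id", "texture", "subjective") are illustrative
# assumptions, not the authors' published schema.

def make_nodule_document(exam_id, nodule_id, texture_attributes, ratings):
    """Bundle one nodule's 3D texture attributes and the nine subjective
    radiologist ratings into a single JSON-like document."""
    assert len(ratings) == 9, "LIDC-IDRI defines nine subjective characteristics"
    return {
        "exam_id": exam_id,
        "nodule_id": nodule_id,
        "texture": texture_attributes,  # volumetric texture descriptors
        "subjective": ratings,          # radiologist-assigned ratings
    }

doc = make_nodule_document(
    "LIDC-0001", 1,
    {"energy": 0.12, "entropy": 4.7, "contrast": 33.0},
    {"subtlety": 4, "internalStructure": 1, "calcification": 6,
     "sphericity": 4, "margin": 4, "lobulation": 2,
     "spiculation": 1, "texture": 5, "malignancy": 3},
)

# A MongoDB deployment would store this with, e.g.,
# collection.insert_one(doc) via pymongo; the record itself is just a
# nested dict/JSON object, so no fixed relational schema is needed.
print(doc["subjective"]["malignancy"])  # 3
```

Because each nodule is a self-contained document, new texture descriptors can be added to later records without migrating a relational schema, which is the main draw of the NoSQL approach here.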

  8. Planning and reasoning in the JPL telerobot testbed

    NASA Technical Reports Server (NTRS)

    Peters, Stephen; Mittman, David; Collins, Carol; Omeara, Jacquie; Rokey, Mark

    1990-01-01

    The Telerobot Interactive Planning System was developed to serve as the highest autonomous-control level of the Telerobot Testbed. A recent prototype is described that integrates an operator interface for supervisory control, a task planner supporting disassembly and re-assembly operations, and a spatial planner for collision-free manipulator motion through the workspace. Each of these components is described in detail. Descriptions of the technical problem, approach, and lessons learned are included.

  9. Testbeds for Logic Programming and Very Large Databases

    DTIC Science & Technology

    1987-09-30

    ...included two Sun workstations, one Motorola C31 workstation testbed, and three Xenologic Prolog Accelerator boards. This equipment has led to completion of..., including a 68020 (12.5 MHz) processor, 160 MB disk, and 4 MB RAM; three (3) Xenologic Prolog Accelerator boards for installation in each of the above workstations. Reasons for revision of equipment request: inability to utilize the Xenologic Prolog Accelerator boards for planned...

  10. (DURIP) MIMO Radar Testbed for Waveform Adaptive Sensing Research

    DTIC Science & Technology

    2015-06-17

    ...and includes an onboard Phase Locked Loop (PLL) and Voltage Controlled Oscillator (VCO) to generate the Local Oscillator (LO) signal from a local GPS... The board's PLL requires a 50 MHz reference signal for locking; however, the GPS-conditioned reference is designed to generate a 10 MHz signal. Hence a... Final Report, (DURIP) MIMO Radar Testbed for Waveform Adaptive Sensing Research: the output of the PLL is divided equally by a 3 dB splitter and further...

  11. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios.

  12. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.
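
The gateway arrangement described above (instruments on a private internal Ethernet, reachable only through the two PCs) can be sketched as a tiny remote-operations service; the command set, port handling, and reply format below are hypothetical illustrations, not the actual JPL interface:

```python
import socket
import threading

# Minimal sketch of the gateway pattern: test equipment sits on a
# private network, and a PC exposes basic remote operations (power
# on/off, reset) over TCP so the developer need not be physically
# present. Command names and protocol are hypothetical illustrations.

COMMANDS = {"power_on", "power_off", "reset"}

def handle(conn):
    with conn:
        cmd = conn.recv(64).decode().strip()
        if cmd in COMMANDS:
            # A real gateway would forward the command to a GPIB- or
            # USB-controlled power switch on the private instrument LAN.
            conn.sendall(b"OK " + cmd.encode())
        else:
            conn.sendall(b"ERR unknown command")

def serve_once(srv):
    conn, _ = srv.accept()
    handle(conn)
    srv.close()

srv = socket.create_server(("127.0.0.1", 0))  # ephemeral local port
port = srv.getsockname()[1]
t = threading.Thread(target=serve_once, args=(srv,))
t.start()

cli = socket.create_connection(("127.0.0.1", port))
cli.sendall(b"reset")
reply = cli.recv(64).decode()
cli.close()
t.join()
print(reply)  # OK reset
```

Keeping the protocol on a plain TCP socket mirrors the testbed's design choice of stable, long-lived industry standards: any replacement PC with an Ethernet port can take over the gateway role.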

  13. Advanced Unmanned Search System (AUSS) Testbed. Search Demonstration Testing

    DTIC Science & Technology

    1992-11-01

    AUSS) Testbed Search Demonstration Testing J. Walton NflS (r15 CA&I u1•C IA ,_ D•’,ltr Ib~u tion I - rC;1Availabiity -udes Dit A ,1 w () r NAVAL COMMAND...CONTROL AND OCEAN SURVEILLANCE CENTER RDT&E DIVISION San Diego, California 92152-5000 J. D. FONTANA, CAPT, USN R . T. SHEARER Commanding Officer...1 OBJECTIVES TE ST A R E A .................................................. ....... VEHICLE CONFIGURATION

  14. Helix Project Testbed: Towards the Self-Regenerative Incorruptible Enterprise

    DTIC Science & Technology

    2011-09-01

    browsers and other clients, web and application servers, will become prevalent throughout the enterprise. We have set up the testbed so that researchers...content at the server side. However, different browsers may parse the same Web content differently, partly in an attempt to tolerate or auto-correct...environment. The mobile Internet devices was used to develop Web page sanitization and other policy enforcement in Web browsers and to evaluate their

  15. QDES User’s Guide. National PDES Testbed Report Series

    DTIC Science & Technology

    1990-06-29

    related software [Smith88]. A National PDES Testbed has been established at the National I.istitute of Standards and Technol- ogy to provide testing ...the Office of the Secretary of Defense. As part of the testing effort, NIST is charged with providing a software toolkit for manipulating PDES data...text, mechanism for creating simple test cases for PDES/STEP tools; hence the name Quick-and-Dirty. Speed was not a concern at any point during the

  16. Development and experimentation of an eye/brain/task testbed

    NASA Technical Reports Server (NTRS)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense applications, paraplegics, and operators monitoring monotonous displays (nuclear power plants, air defense, etc.).

  17. Software Testbed for Developing and Evaluating Integrated Autonomous Systems

    DTIC Science & Technology

    2015-03-01

    Software Testbed for Developing and Evaluating Integrated Autonomous Systems. James Ong, Emilio Remolina, Axel Prompt. Stottler Henke Associates, Inc., 1670 S. Amphlett Blvd., Suite 310, San Mateo, CA 94402, 650-931-2700... www.stottlerhenke.com/datamontage/ [13] Ong, J., E. Remolina, D. E. Smith, M. S. Boddy (2013) A Visual Integrated Development Environment for Automated Planning

  18. Sensing and Navigation System for a Multiple-AUV Testbed

    DTIC Science & Technology

    2002-09-30

    on a part-time basis. WORK COMPLETED: As part of the larger testbed development, we have designed and constructed three "grouper" vehicles. The... technology for relative position/heading measurements of neighboring vehicles. We are currently implementing two vision systems into a grouper vehicle to... tested on a grouper vehicle and is expected to considerably improve the positioning system. In an effort to facilitate controller development and...

  19. Summary of parallel session I: grid testbeds and applications

    NASA Astrophysics Data System (ADS)

    Olson, D. L.

    2003-04-01

    This paper is a summary of talks presented at ACAT 2002 in parallel session I on grid testbeds and applications. There were 12 presentations on this topic that show a lot of enthusiasm and hard work by many people in bringing physics applications onto the grid. There are encouraging success stories and also a clear view that the middleware has a way to go until it is as robust, reliable and complete as we would like it to be.

  20. Summary of Parallel Session I: Grid testbeds and applications

    SciTech Connect

    Olson, Douglas L.

    2002-10-10

    This paper is a summary of talks presented at ACAT 2002 in parallel session I on grid testbeds and applications. There were 12 presentations on this topic that show a lot of enthusiasm and hard work by many people in bringing physics applications onto the grid. There are encouraging success stories and also a clear view that the middleware has a way to go until it is as robust, reliable and complete as we would like it to be.

  1. The Northrop Grumman External Occulter Testbed: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Lo, Amy; Glassman, T.; Lillie, C.

    2007-05-01

    We have built a subscale testbed to demonstrate and validate the performance of the New Worlds Observer (NWO), a terrestrial-planet-finder external-occulter mission concept. The external occulter concept allows observations of nearby exo-Earths using two spacecraft: one carrying an occulter that is tens of meters in diameter and the other carrying a generic space telescope. The occulter is completely opaque, resembling a flower, with petals having a hypergaussian profile that enables 10^-10 intensity suppression of stars that potentially harbor terrestrial planets. The baseline flight NWO system has a 30 meter occulter flying 30,000 km in front of a 4 meter class telescope. Testing the flight configuration on the ground is not feasible, so we have matched the Fresnel number of the flight configuration (approximately 10) using a subscale occulter. Our testbed consists of an 80 meter long evacuated tube, with a high precision occulter in the center of the tube. The occulter is 4 cm in diameter, manufactured with ¼ micron metrological accuracy and less than 2 micron tip truncation. This mimics a 30 meter occulter with millimeter figure accuracy and less than a centimeter of tip truncation. Our testbed is an evolving experiment, and we report here the first, preliminary results using a single-wavelength laser (532 nm) as the source.
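
As a rough sanity check on the scaling argument above, the Fresnel number N = r^2/(lambda*z) can be computed for both configurations; the 40 m occulter-to-detector distance (occulter at the center of the 80 m tube) is my assumption:

```python
# Quick check (my sketch, not from the paper) that the subscale occulter
# preserves the flight Fresnel number N = r**2 / (wavelength * z), where
# r is the occulter radius and z the occulter-to-telescope separation.
# The 40 m lab separation (occulter at the center of the 80 m tube) is
# an assumption on my part.

def fresnel_number(radius_m, distance_m, wavelength_m=532e-9):
    return radius_m ** 2 / (wavelength_m * distance_m)

flight = fresnel_number(radius_m=15.0, distance_m=30_000e3)  # 30 m occulter, 30,000 km
lab = fresnel_number(radius_m=0.02, distance_m=40.0)         # 4 cm occulter, 40 m

print(round(flight, 1), round(lab, 1))  # both of order 10
```

Both numbers come out in the neighborhood of 10, consistent with the testbed matching the flight configuration's diffraction regime despite the enormous difference in physical scale.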

  2. The Mini-Mast CSI testbed: Lessons learned

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-01-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.

  3. The Mini-Mast CSI testbed: Lessons learned

    NASA Astrophysics Data System (ADS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-02-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.

  4. Cyber security analysis testbed : combining real, emulation, and simulation.

    SciTech Connect

    Villamarin, Charles H.; Eldridge, John M.; Van Leeuwen, Brian P.; Urias, Vincent E.

    2010-07-01

    Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed novel methods to combine these evaluation platforms into a hybrid testbed that combines real, emulated, and simulated components. The combination of real, emulated, and simulated components enables the analysis of security features and components of a networked information system. When performing cyber security analysis on a system of interest, it is critical to realistically represent the subject security components in high fidelity. In some experiments, the security component may be the actual hardware and software with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single, unified computing platform. This provides an 'experiment-in-a-box' capability. The result is rapidly-produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.

  5. Autonomous docking algorithm development and experimentation using the SPHERES testbed

    NASA Astrophysics Data System (ADS)

    Nolet, Simon; Kong, Edmund; Miller, David W.

    2004-08-01

    The MIT Space Systems Laboratory (SSL) has developed a testbed for the testing of formation flight and autonomous docking algorithms in both 1-g and microgravity environments. The SPHERES testbed consists of multiple micro-satellites, or Spheres, which can autonomously control their position and attitude. The testbed can be operated on an air table in a 1-g laboratory environment, in NASA's KC-135 reduced gravity research aircraft, and inside the International Space Station (ISS). SPHERES launch to the ISS is currently manifested for May 19, 2004 on Progress 14P. Various types of docking maneuvers, ranging from docking with a cooperative target to docking with a tumbling target, have been developed. The ultimate objective of this research is to integrate the different algorithms into one program that can assess the health status of the target vehicle, plan an optimal docking maneuver while accounting for the existing constraints and, finally, execute that maneuver even in the presence of simulated failures. In this paper, results obtained to date on the ground-based air table using the initial version of the program are presented, as well as results obtained from microgravity experiments onboard the KC-135.

  6. Visible Nulling Coronagraphy Testbed Development for Exoplanet Detection

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-01-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few that works with filled, segmented, and sparse or diluted aperture telescope systems, and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones at sequentially higher contrasts of 10^8, 10^9, and 10^10 at an inner working angle of 2*lambda/D, ultimately culminating in spectrally broadband (>20%) high contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We discuss the optical configuration, laboratory results, critical technologies, and the null sensing and control approach.

  7. Development of Liquid Propulsion Systems Testbed at MSFC

    NASA Technical Reports Server (NTRS)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense, and the aerospace industry in general strive to develop capabilities to explore near-Earth, cis-lunar, and deep space, the need to create more cost-effective techniques of propulsion system design, manufacturing, and test is imperative in the current budget-constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the farther reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA-developed elements, but it can be used to test articles developed by other government agencies, industry, or academia. A joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for a Mars lander or in-space transportation. The value of this BBE testbed is that components may be easily integrated and tested as they are developed. MSFC is striving to enhance its liquid propulsion system development capability; rapid design, analysis, build, and test will be critical to fielding the next high thrust rocket engine.

  8. The computational structural mechanics testbed generic structural-element processor manual

    NASA Technical Reports Server (NTRS)

    Stanley, Gary M.; Nour-Omid, Shahram

    1990-01-01

    The usage and development of structural finite element processors based on the CSM Testbed's Generic Element Processor (GEP) template is documented. By convention, such processors have names of the form ESi, where i is an integer. This manual is therefore intended for both Testbed users who wish to invoke ES processors during the course of a structural analysis, and Testbed developers who wish to construct new element processors (or modify existing ones).

  9. Sensor Networking Testbed with IEEE 1451 Compatibility and Network Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Gurkan, Deniz; Yuan, X.; Benhaddou, D.; Figueroa, F.; Morris, Jonathan

    2007-01-01

    Design and implementation of a testbed for testing and verifying IEEE 1451-compatible sensor systems with network performance monitoring is of significant importance. Measurement of performance parameters, as well as implementation of decision support systems, will enhance the understanding of sensor systems with plug-and-play capabilities. The paper presents the design aspects of such a testbed environment under development at the University of Houston in collaboration with NASA Stennis Space Center - SSST (Smart Sensor System Testbed).
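
The plug-and-play capability of IEEE 1451 rests on Transducer Electronic Data Sheets (TEDS), by which a sensor describes itself to the network. A minimal, illustrative TEDS-like record might look like the following; the field names are a simplification of mine, not the standard's packed binary format:

```python
# Illustrative sketch of the IEEE 1451 idea: a sensor carries a
# Transducer Electronic Data Sheet (TEDS) describing itself, so the
# network can discover and configure it without manual setup. Field
# names and values here are simplified assumptions; the real standard
# defines packed binary TEDS blocks.

TEDS = {
    "manufacturer_id": 0x1234,  # hypothetical identifiers
    "model_number": 42,
    "serial_number": 1001,
    "channel": {
        "physical_units": "kelvin",
        "lower_range_limit": 73.0,
        "upper_range_limit": 473.0,
        "update_rate_hz": 10.0,
    },
}

def in_range(teds, reading):
    """Validate a raw reading against the sensor's self-declared range."""
    ch = teds["channel"]
    return ch["lower_range_limit"] <= reading <= ch["upper_range_limit"]

print(in_range(TEDS, 300.0), in_range(TEDS, 500.0))  # True False
```

A testbed like the one described can exercise exactly this path: attach a sensor, read its TEDS, and verify that the network layer configures itself and flags out-of-range data correctly.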

  10. Intersatellite optical crosslink testbed for 300/650 Mbps

    NASA Astrophysics Data System (ADS)

    Carlson, R. T.

    This paper describes the design, fabrication, and preliminary testing of a high data-rate intersatellite laser crosslink testbed built at The MITRE Corporation in 1993 on IR&D funding. Wideband laser drivers and optical receivers for signals at data rates of 300 Mbps and 650 Mbps have been designed and built, and are being integrated into a laboratory testbed for system-level testing. This testbed includes laser wavelength division multiplexing of three colors, representative of that required for a flight implementation. The optical crosslink testbed has input/output interfaces with RF signals from uplink/downlink hardware simulators, described in a companion paper. The design of this optical crosslink is unique in that it is based on direct analog modulation of the crosslink lasers with the wideband signal waveform from an incoming RF link (i.e., an uplink or space crosslink). Nearly all of the optical crosslink designs proposed to date have been for digitally modulated laser links, typically for applications where a wideband digital data stream originates at a sensing or imaging satellite and needs to be crosslinked to a destination satellite. The motivation for the analog modulation scheme in this paper is driven by end-to-end applications that include satellite uplinks, crosslinks, and downlinks, to avoid the requirement for processing satellites to detect, demodulate, regenerate, and remodulate multiple FDM channels of data at each of the two (or more) satellite crosslink nodes. This is particularly important for wideband channels that represent a composite of many different signals and modulation formats. The data rates, frequencies, interfaces, and requirements chosen for this testbed were based on the NASA TDRSS satellite and an upgrade being considered for a TDRSS-to-TDRSS crosslink capability. More generally, these data rates and the resulting testbed implementation are representative of an arbitrary RF-modulated wideband optical crosslink.

  11. Development and Evaluation of a Stochastic Cloud-radiation Parameterization

    NASA Astrophysics Data System (ADS)

    Veron, D. E.; Secora, J.; Foster, M.

    2004-12-01

    Previous studies have shown that a stochastic cloud-radiation model accurately represents the domain-averaged shortwave fluxes when compared to observations. Using continuously sampled cloud property observations from the three Atmospheric Radiation Measurement (ARM) Program Clouds and Radiation Testbed (CART) sites, we run a multiple-layer stochastic model and compare the results to those of the single-layer version of the model used in previous studies. In addition, we compare both to plane-parallel model output and independent observations. We will use these results to develop a shortwave cloud-radiation parameterization that incorporates the influence of the stochastic approach on the calculated radiative fluxes. Initial results from using this parameterization in a single-column model will be shown.
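
A toy example (my illustration, not the authors' model) shows why a stochastic cloud representation and a plane-parallel one give different domain-averaged shortwave fluxes: transmittance is nonlinear in optical depth, so averaging the cloud field before the radiative calculation is not the same as averaging fluxes over random cloudy and clear columns:

```python
import random

# Toy sketch of the stochastic-vs-plane-parallel distinction. The
# transmittance function below is a simple illustrative nonlinearity,
# not a real radiative transfer scheme.

def transmittance(tau):
    # Monotonically decreasing, nonlinear in optical depth tau
    return 1.0 / (1.0 + 0.75 * tau)

def plane_parallel(cloud_fraction, tau_cloud):
    # Average the cloud field first, then compute one flux
    return transmittance(cloud_fraction * tau_cloud)

def stochastic(cloud_fraction, tau_cloud, n=100_000, seed=0):
    # Sample random cloudy/clear columns, then average the fluxes
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        tau = tau_cloud if rng.random() < cloud_fraction else 0.0
        total += transmittance(tau)
    return total / n

pp = plane_parallel(0.5, 10.0)
st = stochastic(0.5, 10.0)
print(round(pp, 3), round(st, 3))  # stochastic transmits more
```

By Jensen's inequality the stochastic average exceeds the plane-parallel value for a convex transmittance, which is one reason domain-averaged fluxes from a stochastic model can match observations where a plane-parallel model is biased.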

  12. Wavefront Amplitude Variation of TPF's High Contrast Imaging Testbed: Modeling and Experiment

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Lowman, Andrew E.; Moody, Dwight C.; Niessner, Albert F.; Trauger, John T.

    2005-01-01

    Knowledge of wavefront amplitude is as important as knowledge of phase for a coronagraphic high contrast imaging system. Efforts have been made to understand the various contributions to the amplitude variation in the Terrestrial Planet Finder's (TPF) High Contrast Imaging Testbed (HCIT). Modeling of the HCIT with as-built mirror surfaces has shown an amplitude variation of 1.3% due to phase-amplitude mixing in the testbed's front-end optics. Experimental measurements on the testbed have shown an amplitude variation of about 2.5%, with the testbed's illumination pattern making the major contribution to the low-order amplitude variation.

  13. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Technical Reports Server (NTRS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-01-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  14. Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test S

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test Sets and Networks Integration Management Office Testing for the Tracking and Data Relay Satellite System

  15. Comparisons of cloud cover estimates and cloud fraction profiles from ARM's cloud-detecting instruments and GOES-8 data

    SciTech Connect

    Krueger, S K; Rodriguez, D

    1999-05-07

    The DOE's Atmospheric Radiation Measurement (ARM) Program employs both upward- and downward-looking remote-sensing instruments to measure the horizontal and vertical distributions of clouds across its Southern Great Plains (SGP) site. No single instrument is capable of completely determining these distributions over the scales of interest to ARM's Single Column Modeling (SCM) and Instantaneous Radiative Flux (IRF) groups; these groups embody the primary strategies through which ARM expects to achieve its objectives of developing and testing cloud formation parameterizations (USDOE, 1996). Collectively, however, the data from ARM's cloud-detecting instruments offer the potential for such a three-dimensional characterization. Data intercomparisons, like the ones illustrated in this paper, are steps in this direction. Examples of some initial comparisons, involving satellite, millimeter cloud radar, whole sky imager and ceilometer data, are provided herein, in the expectation that many of the lessons learned can later be adapted to cloud data at the Boundary and Extended Facilities. Principally, we are concerned about: (1) the accuracy of various estimates of cloud properties at a single point, or within a thin vertical column, above the Central Facility (CF) over time, and (2) the accuracy of various estimates of cloud properties over the Cloud and Radiation Testbed (CART) site, which can then be reduced to single, representative profiles over time. In the former case, the results are usable in the IRF and SCM strategies; in the latter case, they satisfy SCM needs specifically. The Whole Sky Imager (WSI) and ceilometer data used in one study were collected at the SGP CF between October 1 and December 31, 1996 (Shields et al., 1990). This three-month period, corresponding to the first set of WSI data released by ARM's Experiment Center, was sufficiently long to reveal important trends (Rodriguez, 1998).
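
    The reduction of time-resolved, column-style cloud observations to single representative fractions can be sketched as follows (an illustrative calculation only, not the ARM processing algorithms): a point instrument's time series yields a temporal cloud fraction, and a stack of vertical detection profiles reduces to one cloud-fraction profile.

```python
def cloud_fraction(detections):
    """Temporal cloud fraction at a point: share of samples with cloud detected."""
    return sum(1 for d in detections if d) / len(detections)

def layer_fractions(profiles):
    """Reduce a time series of vertical hit profiles (one 0/1 flag per layer)
    to a single representative cloud-fraction profile."""
    nlev = len(profiles[0])
    nt = len(profiles)
    return [sum(p[k] for p in profiles) / nt for k in range(nlev)]

# ceilometer-style time series: cloud overhead in 2 of 4 samples
print(cloud_fraction([True, False, False, True]))   # 0.5
# two-layer radar-style profiles at two times
print(layer_fractions([[1, 0], [1, 1]]))            # [1.0, 0.5]
```

    Comparing such single-point statistics against areal estimates (e.g., from satellite pixels) is exactly the kind of intercomparison the abstract describes.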

  16. Cloud Control

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    Your learning curriculum needs a new technological platform, but you don't have the expertise or IT equipment to pull it off in-house. The answer is a learning system that exists online, "in the cloud," where learners can access it anywhere, anytime. For trainers, cloud-based coursework often means greater ease of instruction resulting in greater…

  17. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    Stratus clouds are common in the Arctic during the summer months, and are important modulators of ... from MISR's two most obliquely forward-viewing cameras. The cold, stable air causes the clouds to persist in stratified layers, and this ...

  18. Cloud Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell; Einaud, Franco (Technical Monitor)

    2001-01-01

    Numerical cloud models have been developed and applied extensively to study cloud-scale and mesoscale processes during the past four decades. The distinctive aspect of these cloud models is their ability to treat explicitly (or resolve) cloud-scale dynamics. This requires the cloud models to be formulated from the non-hydrostatic equations of motion that explicitly include the vertical acceleration terms, since the vertical and horizontal scales of convection are similar. Such models are also necessary in order to allow gravity waves, such as those triggered by clouds, to be resolved explicitly. In contrast, the hydrostatic approximation, usually applied in global or regional models, does not allow such gravity waves to be resolved explicitly. In addition, the availability of exponentially increasing computer capabilities has resulted in time integrations increasing from hours to days, domain grid boxes (points) increasing from fewer than 2,000 to more than 2,500,000 grid points with 500 to 1000 m resolution, and 3-D models becoming increasingly prevalent. The cloud-resolving model is now at a stage where it can provide reasonably accurate statistical information on the sub-grid, cloud-scale processes that are poorly parameterized in climate models and numerical prediction models.
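
    The distinction the abstract draws can be written out explicitly (a standard textbook form, not taken from this paper): a non-hydrostatic cloud model retains the vertical acceleration in the vertical momentum equation, while the hydrostatic approximation discards it, reducing the equation to a diagnostic balance:

```latex
\underbrace{\frac{Dw}{Dt}}_{\substack{\text{retained in}\\ \text{cloud models}}}
  = -\frac{1}{\rho}\frac{\partial p}{\partial z} - g
\qquad\text{vs.}\qquad
\frac{\partial p}{\partial z} = -\rho g \quad\text{(hydrostatic)}
```

    Dropping $Dw/Dt$ is justified only when vertical scales are much smaller than horizontal ones, which fails for convection, where the two scales are comparable.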

  19. Thin Clouds

    Atmospheric Science Data Center

    2013-04-18

    ... their delicate appearance, thin, feathery clouds of ice crystals called cirrus may contribute to global warming. Some scientists ... minutes after MISR imaged the cloud from space. At the same time, another NASA high-altitude jet, the WB-57, flew right through the ...

  1. Cloud Control

    ERIC Educational Resources Information Center

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  2. Cloud Cover

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  5. Screaming Clouds

    NASA Astrophysics Data System (ADS)

    Fikke, Svein; Egill Kristjánsson, Jón; Nordli, Øyvind

    2017-04-01

    "Mother-of-pearl clouds" appear irregularly in the winter stratosphere at high northern latitudes, about 20-30 km above the surface of the Earth. The size range of the cloud particles is near that of visible light, which explains their extraordinary beautiful colours. We argue that the Norwegian painter Edvard Munch could well have been terrified when the sky all of a sudden turned "bloodish red" after sunset, when darkness was expected. Hence, there is a high probability that it was an event of mother-of-pearl clouds which was the background for Munch's experience in nature, and for his iconic Scream. Currently, the leading hypothesis for explaining the dramatic colours of the sky in Munch's famous painting is that the artist was captivated by colourful sunsets following the enormous Krakatoa eruption in 1883. After carefully considering the historical accounts of some of Munch's contemporaries, especially the physicist Carl Störmer, we suggest an alternative hypothesis, namely that Munch was inspired by spectacular occurrences of mother-of-pearl clouds. Such clouds, which have a wave-like structure akin to that seen in the Scream were first observed and described only a few years before the first version of this motive was released in 1892. Unlike clouds related to conventional weather systems in the troposphere, mother-of-pearl clouds appear in the stratosphere, where significantly different physical conditions prevail. This result in droplet sizes within the range of visible light, creating the spectacular colour patterns these clouds are famous for. Carl Störmer observed such clouds, and described them in minute details at the age of 16, but already with a profound interest in science. He later noted that "..these mother-of-pearl clouds was a vision of indescribable beauty!" The authors find it logical that the same vision could appear scaring in the sensible mind of a young artist unknown to such phenomena.

  6. COMPARISON OF MILLIMETER-WAVE CLOUD RADAR MEASUREMENTS FOR THE FALL 1997 CLOUD IOP

    SciTech Connect

    SEKELSKY,S.M.; LI,L.; GALLOWAY,J.; MCINTOSH,R.E.; MILLER,M.A.; CLOTHIAUX,E.E.; HAIMOV,S.; MACE,G.; SASSEN,K.

    1998-03-23

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time operational 35 GHz CART Millimeter-wave Cloud Radar (MMCR) (Moran, 1997), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS) (Sekelsky, 1996), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air (Galloway, 1996), the University of Utah 95 GHz radar, and the dual-antenna Pennsylvania State University 94 GHz radar (Clothiaux, 1995). In this paper the authors discuss several issues relevant to the comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.

  7. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a Docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running Docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

  8. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    Ahronovitz 2010, Harding 2010, Badger 2011, Kundra 2011]. Risks of vendor lock-in include reduced negotiation power in reaction to price increases and...use cases classified into three groups: cloud management, cloud interoperability, and cloud security [ Badger 2010]. These use cases are listed below... Badger 2010]: • Cloud Management Use Cases − Open an Account − Close an Account − Terminate an Account − Copy Data Objects into a Cloud − Copy

  9. Space construction: an experimental testbed to develop enabling technologies

    NASA Astrophysics Data System (ADS)

    Schubert, Heidi C.; How, Jonathan P.

    1997-12-01

    This paper discusses a new testbed developed at the Stanford Aerospace Robotics Laboratory (ARL) to address some of the key issues associated with semi-autonomous construction in a hazardous environment like space. The new testbed consists of a large two-link manipulator carrying two smaller two-link arms. This macro/mini combination was developed to be representative of actual space manipulators, such as the SSRMS/SPDM planned for the Space Station. This new testbed will allow us to investigate several key issues associated with space construction, including teleoperation versus supervised autonomy, dexterous control of a robot with flexibility, and construction with multiple robots. A supervised autonomy approach has several advantages over the traditional teleoperation mode, including operation with time delay, smart control of a redundant manipulator, and improved contact control. To mimic the dynamics found in space manipulators, the main arm was designed to include joint flexibility. The arm operates in 2-D, with the end-point floating on air bearings. This setup allows cooperation with existing free-flying robots in the ARL. This paper reports the first experiments with the arm, which explore the advantages of moving from teleoperation, or human-in-the-loop control, to human supervisory, or task-level, control. A simple task, such as capturing a satellite-like object floating on the table, is attempted first with the human directly driving the end-point and second with the human directing the robot at a task level. Initial experimental results of these two control approaches are presented and compared.

  10. The Advanced Orbiting Systems Testbed Program: Results to date

    NASA Technical Reports Server (NTRS)

    Otranto, John F.; Newsome, Penny A.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Recommendations for Packet Telemetry (PT) and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's (GSFC's) AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations.

  11. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  12. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview is provided of research into a low-cost hypersonic research flight test capability intended to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major flight demonstrator X-planes. The major objectives included: developing an air-launched missile-booster research testbed; accurately delivering research payloads to hypersonic test conditions through programmable guidance; low cost; a high flight rate (a minimum of two flights per year); and utilizing surplus air-launched missiles and NASA aircraft.

  13. A clock for the manufacturing systems integration testbed

    NASA Astrophysics Data System (ADS)

    Libes, Don

    1991-09-01

    Described here is a software module that provides timing services to the Manufacturing Systems Integration (MSI) testbed in the automated factory. The software "alarm clock" provides services to other MSI software, including synchrony; real time or non-real time, adjusted in a variety of ways; and alarms at relative or absolute intervals. By providing a central time service, these services are provided more reliably, efficiently, and flexibly than any client could provide on its own. Also described are the implementation, the interfaces, and how to design and write programs that use MSI.
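
    A central alarm service of the kind described can be sketched in a few lines (all names here are invented for illustration; the MSI module's actual interfaces are not reproduced). The sketch supports absolute and relative alarms plus a rate factor so that simulated time can run faster or slower than wall-clock time:

```python
import heapq
import itertools
import time

class AlarmClock:
    """Minimal central time service: clients register callbacks to fire at
    absolute or relative (simulated) times; `rate` scales simulated time
    relative to wall-clock time for non-real-time operation."""

    def __init__(self, rate=1.0):
        self.rate = rate
        self._origin = time.time()
        self._seq = itertools.count()   # tie-breaker so the heap never compares callbacks
        self._alarms = []               # heap of (due_time, seq, callback)

    def now(self):
        """Current simulated time in seconds since the clock was created."""
        return (time.time() - self._origin) * self.rate

    def at(self, abs_time, callback):
        """Register an alarm at an absolute simulated time."""
        heapq.heappush(self._alarms, (abs_time, next(self._seq), callback))

    def after(self, delay, callback):
        """Register an alarm at a relative interval from now."""
        self.at(self.now() + delay, callback)

    def tick(self):
        """Fire, in due order, every alarm whose due time has passed."""
        fired = 0
        while self._alarms and self._alarms[0][0] <= self.now():
            _, _, cb = heapq.heappop(self._alarms)
            cb()
            fired += 1
        return fired

clock = AlarmClock(rate=1.0)
events = []
clock.at(0.0, lambda: events.append("abs"))   # due immediately: now() >= 0
clock.tick()
```

    Centralizing the heap in one service is what lets all clients share one consistent notion of (possibly scaled) time, as the abstract argues.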

  14. The CSM testbed matrix processors internal logic and dataflow descriptions

    NASA Technical Reports Server (NTRS)

    Regelbrugge, Marc E.; Wright, Mary A.

    1988-01-01

    This report constitutes the final report for subtask 1 of Task 5 of NASA Contract NAS1-18444, Computational Structural Mechanics (CSM) Research. This report contains a detailed description of the coded workings of selected CSM Testbed matrix processors (i.e., TOPO, K, INV, SSOL) and of the arithmetic utility processor AUS. These processors and the current sparse matrix data structures are studied and documented. Items examined include: details of the data structures, interdependence of data structures, data-blocking logic in the data structures, processor data flow and architecture, and processor algorithmic logic flow.

  15. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    NASA Technical Reports Server (NTRS)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug-and-play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development.

  16. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    Dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO Reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high quality field measurements (e.g. ARM data).

  17. Performance of the PARCS Testbed Cesium Fountain Frequency Standard

    NASA Technical Reports Server (NTRS)

    Enzer, Daphna G.; Klipstein, William M.

    2004-01-01

    A cesium fountain frequency standard has been developed as a ground testbed for the PARCS (Primary Atomic Reference Clock in Space) experiment, an experiment intended to fly on the International Space Station. We report on the performance of the fountain and describe some of the implementations motivated in large part by flight considerations, but of relevance for ground fountains. In particular, we report on a new technique for delivering cooling and trapping laser beams to the atom collection region, in which a given beam is recirculated three times, effectively providing much more optical power than traditional configurations. Allan deviations down to 10 have been achieved with this method.
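
    For readers unfamiliar with the stability metric quoted for fountain standards, the (non-overlapping) Allan deviation can be estimated from fractional-frequency samples as follows (a generic textbook estimator, not the PARCS analysis code):

```python
import math

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency samples y,
    at an averaging time of m basic sample intervals."""
    n = len(y) // m
    if n < 2:
        raise ValueError("need at least two averaging blocks")
    # average the data into n contiguous blocks of m samples each
    blocks = [sum(y[i * m:(i + 1) * m]) / m for i in range(n)]
    # Allan variance: half the mean squared first difference of the blocks
    avar = sum((blocks[i + 1] - blocks[i]) ** 2 for i in range(n - 1)) / (2 * (n - 1))
    return math.sqrt(avar)

# a perfectly stable oscillator has zero Allan deviation
print(allan_deviation([5.0] * 8))   # 0.0
```

    Evaluating the estimator at several values of m traces out the familiar sigma-versus-tau stability curve.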

  18. The Living With a Star Space Environment Testbed Experiments

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  19. The Living With a Star Program Space Environment Testbed

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  20. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data, and extensive software must be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicates the effort. The effort also includes ground software, which is a key element both for commanding the payload and for displaying data created by the payload.

  1. Vehicle-network development on a communications-network testbed

    NASA Astrophysics Data System (ADS)

    Rapanotti, John L.

    2006-05-01

    Light armoured vehicles will rely on sensors, on-board computing, and digital wireless communications to achieve improved performance and survivability. Constrained by low-latency response to threats, individual vehicles will share sensory information with other platoon vehicles, benefiting from a flexible, dynamic, self-adapting network environment. As sensor and computing capability increases, network communications will become saturated. To understand the operational requirements for these future vehicle networks, the High Capacity Technical Communications Network (HCTCN) Low Bandwidth Testbed (LBTB) has been developed to provide a simulated environment for the selected radios and candidate database and transmission protocols. These concepts and this approach to network communications are discussed in the paper.

  2. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John

    1991-01-01

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  3. Gas dispersion with induced airflow in mobile olfaction testbed

    NASA Astrophysics Data System (ADS)

    Mamduh, S. M.; Kamarudin, K.; Visvanathan, R.; Yeon, A. S. A.; Shakaff, A. Y. M.; Zakaria, A.; Kamarudin, L. M.; Abdullah, A. H.

    2017-03-01

    The unpredictable nature of gas dispersion is a well-known issue in mobile olfaction. Because roboticists tend to depend on simulations that try to recreate environmental conditions, an accurate representation of the gas plume is needed. Current model-based simulations may not be able to capture the time-varying and unpredictable nature of gas distribution accurately. This paper presents a real-time gas distribution dataset collected in a mobile olfaction testbed, which captures the time-varying nature of a gas plume.

  4. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
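
    The "simple common interface" over heterogeneous resources can be sketched abstractly (class and method names below are invented for illustration; NeuroManager's real API differs): each backend exposes the same submit operation, so a batch of simulations can be spread across local, HPC, and cloud resources at once.

```python
from abc import ABC, abstractmethod

class ComputeResource(ABC):
    """One interface over local servers, HPC queues, and cloud instances."""

    @abstractmethod
    def submit(self, sim: str) -> str:
        """Launch one simulation and return a job handle."""

class LocalServer(ComputeResource):
    def submit(self, sim):
        return f"local:{sim}"

class OpenStackCloud(ComputeResource):
    """Stand-in for an OpenStack-backed cloud such as Chameleon or Rackspace."""
    def __init__(self, name):
        self.name = name
    def submit(self, sim):
        # a real backend would provision an instance, stage NEURON, and run the job
        return f"{self.name}:{sim}"

def run_batch(sims, resources):
    """Recruit all resources simultaneously via round-robin assignment."""
    return [resources[i % len(resources)].submit(s) for i, s in enumerate(sims)]

jobs = run_batch(["purkinje_a", "purkinje_b", "purkinje_c"],
                 [LocalServer(), OpenStackCloud("chameleon")])
print(jobs)   # ['local:purkinje_a', 'chameleon:purkinje_b', 'local:purkinje_c']
```

    Hiding provisioning details behind one interface is what makes the speedup/cost studies described above possible without changing user-facing scripts per backend.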

  5. A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry

    DTIC Science & Technology

    2014-05-29

    development, implementation, and testing/verification of algorithms for airborne telemetry applications. This testbed utilizes both SOQPSK and OFDM for... Shaped Offset Quadrature Phase Shift Keying (SOQPSK), Orthogonal Frequency Division Multiplexing (OFDM), Bit Error Rate (BER)... implementation, and testing/verification of algorithms for airborne telemetry applications. This testbed utilizes both SOQPSK and OFDM for its modulation

  6. Development of a flexible test-bed for robotics, telemanipulation and servicing research

    NASA Technical Reports Server (NTRS)

    Davies, Barry F.

    1989-01-01

    The development of a flexible operation test-bed, based around a commercially available ASEA industrial robot is described. The test-bed was designed to investigate fundamental human factors issues concerned with the unique problems of robotic manipulation in the hostile environment of Space.

  7. PORT: A Testbed Paradigm for On-line Digital Archive Development.

    ERIC Educational Resources Information Center

    Keeler, Mary; Kloesel, Christian

    1997-01-01

    Discusses the Peirce On-line Resource Testbed (PORT), a digital archive of primary data. Highlights include knowledge processing testbeds for digital resource development; Peirce's pragmatism in operation; PORT and knowledge processing; obstacles to archive access; and PORT as a paradigm for critical control in knowledge processing. (AEF)

  8. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... National Telecommunications and Information Administration Spectrum Sharing Innovation Test-Bed Pilot... conduct in Phase II/III of the Spectrum Sharing Innovation Test-Bed pilot program to assess whether devices employing Dynamic Spectrum Access techniques can share the frequency spectrum with land...

  9. Complex Clouds

    Atmospheric Science Data Center

    2013-04-16

...The complex structure and beauty of polar clouds are highlighted by these images acquired ... MD. The MISR data were obtained from the NASA Langley Research Center Atmospheric Science Data Center in Hampton, VA. Image ...

  10. Polar Clouds

    NASA Image and Video Library

    2012-02-27

With the changing of seasons come changes in weather. This image from NASA's 2001 Mars Odyssey spacecraft shows clouds in the north polar region. The surface is just barely visible in part of the image.

  11. Deep Clouds

    NASA Image and Video Library

    2008-05-27

    Bright puffs and ribbons of cloud drift lazily through Saturn's murky skies. In contrast to the bold red, orange and white clouds of Jupiter, Saturn's clouds are overlain by a thick layer of haze. The visible cloud tops on Saturn are deeper in its atmosphere due to the planet's cooler temperatures. This view looks toward the unilluminated side of the rings from about 18 degrees above the ringplane. Images taken using red, green and blue spectral filters were combined to create this natural color view. The images were acquired with the Cassini spacecraft wide-angle camera on April 15, 2008 at a distance of approximately 1.5 million kilometers (906,000 miles) from Saturn. Image scale is 84 kilometers (52 miles) per pixel. http://photojournal.jpl.nasa.gov/catalog/PIA09910

  12. Curious Clouds

    NASA Image and Video Library

    2006-06-13

Saturn's atmosphere produces beautiful and sometimes perplexing features. Is the bright feature below center a rare crossing of a feature from a zone to a belt, or is it an illusion created by different cloud layers at different levels?

  13. Cloud Formation

    NASA Astrophysics Data System (ADS)

    Graham, Mark Talmage

    2004-05-01

    Cloud formation is crucial to the heritage of modern physics, and there is a rich literature on this important topic. In 1927, Charles T.R. Wilson was awarded the Nobel Prize in physics for applications of the cloud chamber.2 Wilson was inspired to study cloud formation after working at a meteorological observatory on top of the highest mountain in Scotland, Ben Nevis, and testified near the end of his life, "The whole of my scientific work undoubtedly developed from the experiments I was led to make by what I saw during my fortnight on Ben Nevis in September 1894."3 To form clouds, Wilson used the sudden expansion of humid air.4 Any structure the cloud may have is spoiled by turbulence in the sudden expansion, but in 1912 Wilson got ion tracks to show up by using strobe photography of the chamber immediately upon expansion.5 In the interim, Millikan's study in 1909 of the formation of cloud droplets around individual ions was the first in which the electron charge was isolated. This study led to his famous oil drop experiment.6 To Millikan, as to Wilson, meteorology and physics were professionally indistinct. With his meteorological physics expertise, in WWI Millikan commanded perhaps the first meteorological observation and forecasting team essential to military operation in history.7 But even during peacetime meteorology is so much of a concern to everyone that a regular news segment is dedicated to it. Weather is the universal conversation topic, and life on land could not exist as we know it without clouds. One wonders then, why cloud formation is never covered in physics texts.

  14. Observational evidence linking precipitation and mesoscale cloud fraction in the southeast Pacific

    NASA Astrophysics Data System (ADS)

    Rapp, Anita D.

    2016-07-01

Precipitation has been hypothesized to play an important role in the transition of low clouds from closed to open cell cumulus in regions of large-scale subsidence. A synthesis of A-Train satellite measurements is used to examine the relationship between precipitation and mesoscale cloud fraction across a transition region in the southeastern Pacific. Low cloud pixels are identified in 4 years of CloudSat/CALIPSO observations, and the along-track mean cloud fraction within 2.5-500 km surrounding the clouds is calculated. Results show that cloud fraction decreases more rapidly in areas surrounding precipitating clouds than around nonprecipitating clouds. The closed to open cell transition region appears especially sensitive, with the surrounding mesoscale cloud fraction decreasing 30% faster in the presence of precipitation compared to nonprecipitating clouds. There is also dependence on precipitation rate and cloud liquid water path (LWP), with higher rain rates or lower LWP showing larger decreases in surrounding cloud fraction.

  15. High performance testbed for four-beam infrared interferometric nulling and exoplanet detection.

    PubMed

    Martin, Stefan; Booth, Andrew; Liewer, Kurt; Raouf, Nasrat; Loya, Frank; Tang, Hong

    2012-06-10

    Technology development for a space-based infrared nulling interferometer capable of earthlike exoplanet detection and characterization started in earnest in the last 10 years. At the Jet Propulsion Laboratory, the planet detection testbed was developed to demonstrate the principal components of the beam combiner train for a high performance four-beam nulling interferometer. Early in the development of the testbed, the importance of "instability noise" for nulling interferometer sensitivity was recognized, and the four-beam testbed would produce this noise, allowing investigation of methods for mitigating this noise source. The testbed contains the required features of a four-beam combiner for a space interferometer and performs at a level matching that needed for the space mission. This paper describes in detail the design, functions, and controls of the testbed.

  16. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; et al.

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field of view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future spacebased interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development; automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  17. Development of optical packet and circuit integrated ring network testbed.

    PubMed

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate < 1×10^(-4)) operation was achieved with optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated.

  18. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

This paper describes an integration strategy for plug-and-play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall system. They define the critical dependencies that affect the individual modules and thus affect the behavior of the system. The specified roles, dependencies and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. I describe this strategy in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional description of the software modules. A network of distributed objects interact with one another over the Internet and comprise the plug-compatible software nodes that execute these functions. This paper will explore the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.
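The compatibility check implied by the abstract above (a replacement module may be swapped in so long as it meets the interface and functional requirements) can be sketched as a small predicate. The schema and field names are invented for illustration; the paper does not publish a concrete format.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class FunctionalDescription:
    """Explicit role and critical dependencies of one software module."""
    role: str            # e.g. "process-planning"
    consumes: frozenset  # inputs this module depends on
    produces: frozenset  # outputs other modules depend on

def is_plug_compatible(current: FunctionalDescription,
                       replacement: FunctionalDescription) -> bool:
    """A replacement is safe if it fills the same role, needs no inputs
    beyond those the current module already receives, and still produces
    every output that downstream modules rely on."""
    return (replacement.role == current.role
            and replacement.consumes <= current.consumes
            and replacement.produces >= current.produces)
```

A module that produces extra outputs remains compatible, while one that introduces a new dependency is rejected, which matches the paper's notion of interchanging objects "without disrupting the system."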

  19. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight and power constraints suitable for small scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAV) and other mobile platforms. Our approach provides the rapid deployment and low cost acquisition of high aerial imagery for applications requiring high spatial resolution and revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote controlled plane. It includes an onboard computer with wireless radio, GPS, inertial measurement unit, 3-axis electronic compass and digital cameras. The onboard camera is either a RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces; as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.

  20. A Battery Certification Testbed for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  1. The Fizeau Interferometer Testbed (FIT) for Stellar Imager

    NASA Technical Reports Server (NTRS)

    Carpenter, Kenneth G.; Lyon, Richard G.; Mazzuca, Lisa M.; Solyar, Gregory; Mundy, Lee G.; Armstrong, J. T.; Zhang, Xiaolei; Marzouk, Joe

    2003-01-01

Goddard Space Flight Center is pursuing the development of space-based, long-baseline (less than 0.5 km) UV-optical Fizeau imaging interferometers to enable the next major stride toward very high angular resolution astronomical observations. This effort includes the development and operation of the Fizeau Interferometer Testbed (FIT), in collaboration with the Naval Research Lab/NPOI, Univ. of MD, and Sigma Space Corporation. The FIT will be used to explore the principles of and requirements for the Stellar Imager (SI) mission concept (http://hires.gsfc.nasa.gov/-si) and other such Fizeau Interferometers/Sparse Aperture Telescope missions. The primary FIT goal is to demonstrate closed-loop control of a many-element (7 - 30) system which keeps the optical beams in phase and thus enables high quality imaging. The FIT will also be used to assess various wavefront reconstruction, wavefront sensing, and image reconstruction algorithms for utility and accuracy by application to real data generated by the Testbed. In this paper, we describe the design and goals of the system, provide a status report on its construction, and note our future plans. The FIT development is supported by NASA-ROSS/SARA grants to GSFC, UMD, and NRL and by internal GSFC R&D funds.

  2. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  3. Airborne Subscale Transport Aircraft Research Testbed: Aircraft Model Development

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Langford, William M.; Hill, Jeffrey S.

    2005-01-01

    The Airborne Subscale Transport Aircraft Research (AirSTAR) testbed being developed at NASA Langley Research Center is an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. An integral part of that testbed is a 5.5% dynamically scaled, generic transport aircraft. This remotely piloted vehicle (RPV) is powered by twin turbine engines and includes a collection of sensors, actuators, navigation, and telemetry systems. The downlink for the plane includes over 70 data channels, plus video, at rates up to 250 Hz. Uplink commands for aircraft control include over 30 data channels. The dynamic scaling requirement, which includes dimensional, weight, inertial, actuator, and data rate scaling, presents distinctive challenges in both the mechanical and electrical design of the aircraft. Discussion of these requirements and their implications on the development of the aircraft along with risk mitigation strategies and training exercises are included here. Also described are the first training (non-research) flights of the airframe. Additional papers address the development of a mobile operations station and an emulation and integration laboratory.

4. The Hyperion Project: Partnership for an Advanced Technology Cluster Testbed

    SciTech Connect

    Seager, M; Leininger, M

    2008-04-28

The Hyperion project offers a unique opportunity to participate in a community-driven testing and development resource at a scale beyond what can be accomplished by one entity alone. Hyperion is a new strategic technology partnership intended to support member-driven development and testing at scale. This partnership will allow commodity clusters to scale up to meet the growing demands of customers' multi-core petascale simulation environments. Hyperion will tightly couple the outstanding research and development capabilities of Lawrence Livermore National Laboratory with leading technology companies, including Cisco, Data Direct Networks, Dell, Intel, LSI, Mellanox, Qlogic, RedHat, SuperMicro and Sun. The end goal of this project is to revolutionize cluster computing in fundamental ways by providing the critical software and hardware components for a highly scalable simulation environment. This environment will include support for high performance networking, parallel file systems, operating system, and cluster management. This goal will be achieved by building a scalable technology cluster testbed that will be fully dedicated to the partners and provide: (1) A scalable development testing and benchmarking environment for critical enabling Linux cluster technologies; (2) An evaluation testbed for new hardware and software technologies; and (3) A vehicle for forming long term collaborations.

  5. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  6. Off-road perception testbed vehicle design and evaluation

    NASA Astrophysics Data System (ADS)

    Spofford, John R.; Herron, Jennifer B.; Anhalt, David J.; Morgenthaler, Matthew K.; DeHerrera, Clinton

    2003-09-01

Off-road robotics efforts such as DARPA's PerceptOR program have motivated the development of testbed vehicles capable of sustained operation in a variety of terrain and environments. This paper describes the retrofitting of a minimally-modified ATV chassis into such a testbed, which has been used by multiple programs for autonomous mobility development and sensor characterization. Modular mechanical interfaces for sensors and equipment enclosures enabled integration of multiple payload configurations. The electric power subsystem was capable of short-term operation on batteries with refueled generation for continuous operation. Processing subsystems were mounted in sealed, shock-dampened enclosures with heat exchangers for internal cooling to protect against external dust and moisture. The computational architecture was divided into a real-time vehicle control layer and an expandable high level processing and perception layer. The navigation subsystem integrated real time kinematic GPS with a three-axis IMU for accurate vehicle localization and sensor registration. The vehicle software system was based on the MarsScape architecture developed under DARPA's MARS program. Vehicle mobility software capabilities included route planning, waypoint navigation, teleoperation, and obstacle detection and avoidance. The paper describes the vehicle design in detail and summarizes its performance during field testing.

  7. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  8. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
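The kind of scoring such a benchmarking framework performs can be sketched as follows. The metric set here (detection rate, false alarm count, mean detection latency) is a representative subset chosen for illustration, not the exact metric suite of the ADAPT framework.

```python
def benchmark_da(injected, reported, horizon=10.0):
    """Score one diagnostic algorithm (DA) run against ground truth.

    injected: {fault_id: injection_time} for faults actually injected.
    reported: {fault_id: detection_time} as claimed by the DA.
    A fault counts as detected only if reported at or after injection
    and within `horizon` seconds of it.
    """
    detected = {f: t for f, t in reported.items()
                if f in injected
                and injected[f] <= t <= injected[f] + horizon}
    false_alarms = [f for f in reported if f not in injected]
    latencies = [reported[f] - injected[f] for f in detected]
    return {
        "detection_rate": len(detected) / len(injected) if injected else 1.0,
        "false_alarm_count": len(false_alarms),
        "mean_latency": sum(latencies) / len(latencies) if latencies else None,
    }
```

Running the same scorer over every DA and every fault scenario is what makes the comparison across the ten algorithms systematic rather than anecdotal.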

  9. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

Cloud computing is one of the most popular topics in the IT industry and is recently being adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, a private cloud can be implemented within a private network and delivers some of the benefits of cloud computing without its pitfalls. This paper makes a comparison of typical open source platforms through which we can implement a private cloud. After this comparison, we choose Eucalyptus and Wavemaker to do a case study on the private cloud. We also do some performance estimation of cloud platform services and development of prototype software as cloud services.

  10. Astronomy In The Cloud: Using Mapreduce For Image Coaddition

    NASA Astrophysics Data System (ADS)

    Wiley, Keith; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-01-01

In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving object tracking. Since such studies require the highest quality data, methods such as image coaddition, i.e., registration, stacking, and mosaicing, will be critical to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, e.g., asteroids, or transient objects, e.g., supernovae, these datastreams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance. This work is funded by
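The adaptation of coaddition to MapReduce described above can be illustrated with a single-process sketch: the map phase emits (sky-pixel, flux) pairs from each registered image, and the reduce phase averages all contributions to the same pixel. In Hadoop these phases would run as distributed jobs over data on worker-node storage; the dict-based image representation here is a stand-in for real FITS mosaics.

```python
from collections import defaultdict

def map_phase(images):
    """Map: each registered input image emits (sky-pixel, flux) pairs.
    Images are dicts {pixel_coord: flux} standing in for FITS data."""
    for image in images:
        for pixel, flux in image.items():
            yield pixel, flux

def reduce_phase(pairs):
    """Reduce: group by sky pixel and average overlapping exposures,
    which is the core of stacking/coaddition."""
    sums = defaultdict(lambda: [0.0, 0])
    for pixel, flux in pairs:
        sums[pixel][0] += flux
        sums[pixel][1] += 1
    return {pixel: total / n for pixel, (total, n) in sums.items()}
```

Because the reduce step is a per-pixel aggregation with no cross-pixel dependencies, it parallelizes cleanly across reducers, which is exactly why coaddition maps well onto this framework.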

  11. Aerosol-cloud interactions in ship tracks using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chun; Christensen, Matthew W.; Diner, David J.; Garay, Michael J.

    2015-04-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds for different cloud types and environmental conditions. By taking advantage of the high spatial resolution multiangle observations available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of the cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were utilized. Under open cell cloud structure the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+38%), cloud top height (+13%), and cloud albedo (+49%) for open cell clouds, whereas for closed cell clouds, little change in cloud properties was observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed was used to derive cloud top divergence. Statistically averaging the results from the identified plume segments to reduce random noise, we found evidence of cloud top divergence in the ship-polluted clouds, whereas the nearby unpolluted clouds showed cloud top convergence, providing observational evidence of a change in local mesoscale circulation associated with enhanced aerosols. Furthermore, open cell polluted clouds revealed stronger cloud top divergence as compared to closed cell clouds, consistent with different dynamical mechanisms driving their responses. 
These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling.

  12. Uncertainty Quantification and the Development of Ocean Model Testbeds

    NASA Astrophysics Data System (ADS)

    Hecht, M. W.; Bhat, K. S.; Gattiker, J.

    2012-12-01

In ocean modeling for climate science, the feasibility of testbeds is strongly constrained by the availability of observational data from which to form sufficiently diagnostic metrics. The use of high resolution simulations as targets, akin to observations, for simulations at lower resolution with parameterized physics, can be an effective means of determining optimal parameter settings. The construction of this sort of idealized testbed can also be used to guide the design of a more realistic ocean climate model testbed based on the use of observations. This presentation focuses on our work with the POP ocean model, studying the Gent-McWilliams (GM) mesoscale eddy parameterization in our "channel model" that approximates some features of the Southern Ocean and Antarctic Circumpolar Current. This test case makes use of a simple reentrant channel with a single ridge as the only bathymetry, forced by an eastward wind and surface buoyancy fluxes. Our target (pseudo-observation) data is from a simulation at 5.5 km resolution, where the effects of mesoscale eddies are directly resolved. Our study compares this to coarsened, non-eddying runs in which the effects of mesoscale eddies are parameterized through GM, as is done in most ocean components of coupled climate models. We consider both a single-parameter version of GM and the more recent implementation, as used in the Community Earth System Model, in which we vary two parameters. A primary metric for comparison is the potential temperature vs. depth profile, horizontally averaged over the domain. We also integrate results from additional metrics related to poleward transports and domain-averaged vertical transports. This work presents specific results in two areas, using techniques of model calibration and uncertainty quantification. First, we examine the calibration of the multi-parameter GM model, and discuss how this methodology gives a grounded and quantitative way to determine parameter values and the
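The parameter-calibration idea above can be sketched as a one-dimensional sweep that minimizes the misfit of the horizontally averaged temperature profile. The function `run_model`, the diffusivity grid, and the target profile below are hypothetical stand-ins, not the actual POP/GM configuration:

```python
import numpy as np

def calibrate(kappa_grid, run_model, target_profile):
    """Return the GM diffusivity whose coarse-model T(z) profile best matches
    the eddy-resolving target, by root-mean-square misfit (illustrative only)."""
    errors = [np.sqrt(np.mean((run_model(k) - target_profile) ** 2))
              for k in kappa_grid]
    return kappa_grid[int(np.argmin(errors))]
```

A real calibration would treat both GM parameters jointly and quantify uncertainty (for example, through a statistical emulator), but the search structure is the same.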

  13. Space Station technology testbed: 2010 deep space transport

    NASA Technical Reports Server (NTRS)

    Holt, Alan C.

    1993-01-01

A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major, new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of microengineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  14. SPHERES: Design of a Formation Flying Testbed for ISS

    NASA Astrophysics Data System (ADS)

    Sell, S. W.; Chen, S. E.

    2002-01-01

The SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) payload is an innovative formation-flying spacecraft testbed currently being developed for use internally aboard the International Space Station (ISS). The purpose of the testbed is to provide a cost-effective, long duration, replenishable, and easily reconfigurable platform with representative dynamics for the development and validation of metrology, formation flying, and autonomy algorithms. The testbed components consist of three 8-inch diameter free-flying "satellites," five ultrasound beacons, and an ISS laptop workstation. Each satellite is self-contained with on-board battery power, cold-gas propulsion (CO2), and processing systems. Satellites use two packs of eight standard AA batteries for approximately 90 minutes of lifetime, while beacons, powered by a single AA battery, last the duration of the mission. The propulsion system uses pressurized carbon dioxide gas, stored in replaceable tanks, distributed through an adjustable regulator and associated tubing to twelve thrusters located on the faces of the satellites. A Texas Instruments C6701 DSP handles control algorithm data while an FPGA manages all sensor data, timing, and communication processes on the satellite. All three satellites communicate with each other and with the controlling laptop via a wireless RF link. Five ultrasound beacons, located around a predetermined work area, transmit ultrasound signals that are received by each satellite. The system effectively acts as a pseudo-GPS system, allowing the satellites to determine position and attitude and to navigate within the test arena. The payload hardware consists predominantly of Commercial Off-The-Shelf (COTS) products, with the exception of custom electronics boards, selected propulsion system adaptors, and beacon and satellite structural elements. Operationally, SPHERES will run in short duration test sessions with approximately two weeks between each session.
During
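The ultrasound pseudo-GPS scheme described above amounts to multilateration from known beacon positions. A minimal least-squares sketch follows; the beacon layout and noise-free ranges are invented for illustration and are not SPHERES flight data:

```python
import numpy as np

# Hypothetical beacon positions (meters) and a known test position.
beacons = np.array([[0, 0, 0], [2, 0, 0], [0, 2, 0], [0, 0, 2], [2, 2, 2]], float)
true_pos = np.array([0.7, 1.1, 0.4])
ranges = np.linalg.norm(beacons - true_pos, axis=1)  # from ultrasound time of flight

def trilaterate(beacons, ranges):
    """Linearize ||x - b_i||^2 = r_i^2 against beacon 0 and solve by least squares."""
    A = 2.0 * (beacons[1:] - beacons[0])
    b = (ranges[0] ** 2 - ranges[1:] ** 2
         + np.sum(beacons[1:] ** 2, axis=1) - np.sum(beacons[0] ** 2))
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x
```

Attitude determination additionally requires ranges to multiple receivers on each satellite body; this sketch recovers position only.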

  15. Development of a hybrid cloud parameterization for general circulation models

    SciTech Connect

    Kao, C.Y.J.; Kristjansson, J.E.; Langley, D.L.

    1995-04-01

We have developed a cloud package with state-of-the-art physical schemes that can parameterize low-level stratus or stratocumulus, penetrative cumulus, and high-level cirrus. Such parameterizations will improve cloud simulations in general circulation models (GCMs). The principal tool in this development comprises the physically based Arakawa-Schubert scheme for convective clouds and the Sundqvist scheme for layered, nonconvective clouds. The term "hybrid" addresses the fact that the generation of high-altitude layered clouds can be associated with preexisting convective clouds. Overall, the cloud parameterization package developed should better determine cloud heating and drying effects in the thermodynamic budget, realistic precipitation patterns, cloud coverage and liquid/ice water content for radiation purposes, and the cloud-induced transport and turbulent diffusion of atmospheric trace gases.

  16. Neptune's clouds

    NASA Technical Reports Server (NTRS)

    1999-01-01

The bright cirrus-like clouds of Neptune change rapidly, often forming and dissipating over periods of several to tens of hours. In this sequence Voyager 2 observed cloud evolution in the region around the Great Dark Spot (GDS). The surprisingly rapid changes that occur in the time separating each panel show that in this region Neptune's weather is perhaps as dynamic and variable as that of the Earth. However, the scale is immense by our standards -- the Earth and the GDS are of similar size -- and in Neptune's frigid atmosphere, where temperatures are as low as 55 degrees Kelvin (-360 F), the cirrus clouds are composed of frozen methane rather than crystals of water ice as on Earth. The Voyager Mission is conducted by JPL for NASA's Office of Space Science and Applications.

  17. CLOUD CHEMISTRY.

    SciTech Connect

    SCHWARTZ,S.E.

    2001-03-01

Clouds present substantial concentrations of liquid-phase water, which can potentially serve as a medium for dissolution and reaction of atmospheric gases. The important precursors of acid deposition, SO{sub 2} and the nitrogen oxides NO and NO{sub 2}, are only sparingly soluble in clouds without further oxidation to sulfuric and nitric acids. In the case of SO{sub 2}, aqueous-phase reactions with hydrogen peroxide, and to a lesser extent ozone, have been identified as important processes leading to this oxidation, and methods have been described by which to evaluate the rates of these reactions. The limited solubility of the nitrogen oxides precludes significant aqueous-phase reaction of these species, but gas-phase reactions in clouds can be important, especially at night.
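The "sparingly soluble" claim can be made quantitative with the standard gas/aqueous partitioning expression f = HRTL / (1 + HRTL), where H is the Henry's law constant, R the gas constant, T temperature, and L the liquid water volume fraction. The constants below are illustrative placeholders, not values from this report:

```python
R = 0.08206  # gas constant, L·atm/(mol·K)

def aqueous_fraction(H, T, L):
    """Fraction of a trace gas residing in cloud water at Henry's-law equilibrium.
    H: Henry's law constant (M/atm); T: temperature (K);
    L: liquid water volume fraction (dimensionless, typically ~1e-7 to 1e-6)."""
    x = H * R * T * L
    return x / (1.0 + x)
```

With an SO{sub 2}-like H of order 1 M/atm, only about 10^-5 of the gas partitions into cloud water, which is why oxidation to a far more soluble acid is required for significant uptake.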

  18. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Mike; Munson, Mike; Teate, George

    2006-01-01

    A new testbed for hypersonic flight research is proposed. Known as the Phoenix air-launched small missile (ALSM) flight testbed, it was conceived to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of two unique and very capable flight assets: the United States Navy Phoenix AIM-54 long-range, guided air-to-air missile and the NASA Dryden F-15B testbed airplane. The U.S. Navy retirement of the Phoenix AIM-54 missiles from fleet operation has presented an excellent opportunity for converting this valuable flight asset into a new flight testbed. This cost-effective new platform will fill an existing gap in the test and evaluation of current and future hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform. When launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will be valuable for the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite small-payload air-launched space boosters.

  19. Our World: Cool Clouds

    NASA Image and Video Library

    Learn how clouds are formed and watch an experiment to make a cloud using liquid nitrogen. Find out how scientists classify clouds according to their altitude and how clouds reflect and absorb ligh...

  20. mdtmFTP and its evaluation on ESNET SDN testbed

    DOE PAGES

    Zhang, Liang; Wu, Wenji; DeMar, Phil; ...

    2017-04-21

To address the challenges of high-performance data transfer in the big data era, we are developing and implementing mdtmFTP, a high-performance data transfer tool for big data. mdtmFTP has four salient features. First, it adopts an I/O-centric architecture to execute data transfer tasks. Second, it more efficiently utilizes the underlying multicore platform through optimized thread scheduling. Third, it implements a large virtual file mechanism to address the lots-of-small-files (LOSF) problem. Fourth, it integrates multiple optimization mechanisms, including zero-copy, asynchronous I/O, pipelining, batch processing, and pre-allocated buffer pools, to enhance performance. mdtmFTP has been extensively tested and evaluated within the ESNET 100G testbed. Evaluations show that mdtmFTP can achieve higher performance than existing data transfer tools such as GridFTP, FDT, and BBCP.
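The LOSF mitigation can be illustrated by coalescing many small files into one contiguous payload plus an offset index, so the network sees a single large transfer. This sketch shows the idea only and is not mdtmFTP's actual virtual-file format:

```python
import io

def pack(files):
    """Coalesce {name: bytes} into one contiguous payload and an offset index."""
    payload, index, offset = io.BytesIO(), {}, 0
    for name, data in files.items():
        index[name] = (offset, len(data))
        payload.write(data)
        offset += len(data)
    return payload.getvalue(), index

def unpack(blob, index, name):
    """Recover one file's bytes from the packed payload via its index entry."""
    offset, size = index[name]
    return blob[offset:offset + size]
```

The per-file transfer setup cost is paid once for the whole payload instead of once per small file, which is the essence of the virtual-file approach.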

  1. Intelligent Elements for the ISHM Testbed and Prototypes (ITP) Project

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Park, Han; Schwabacher, Mark; Watson, Michael; Mackey, Ryan; Fijany, Amir; Trevino, Luis; Weir, John

    2005-01-01

    Deep-space manned missions will require advanced automated health assessment capabilities. Requirements such as in-space assembly, long dormant periods and limited accessibility during flight, present significant challenges that should be addressed through Integrated System Health Management (ISHM). The ISHM approach will provide safety and reliability coverage for a complete system over its entire life cycle by determining and integrating health status and performance information from the subsystem and component levels. This paper will focus on the potential advanced diagnostic elements that will provide intelligent assessment of the subsystem health and the planned implementation of these elements in the ISHM Testbed and Prototypes (ITP) Project under the NASA Exploration Systems Research and Technology program.

  2. The computational structural mechanics testbed architecture. Volume 1: The language

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

This is the first of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP, and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 1 presents the basic elements of the CLAMP language and is intended for all users.

  3. The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER)

    SciTech Connect

    Siranosian, Antranik Antonio; Schembri, Philip Edward; Miller, Nathan Andrew

    2016-06-02

    The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER) is proposed as a family of modular test bodies that are intended to support engineering capability development by helping to identify weaknesses and needs. Weapon systems, subassemblies, and components are often complex and difficult to test and analyze, resulting in low confidence and high uncertainties in experimental and simulated results. The complexities make it difficult to distinguish between inherent uncertainties and errors due to insufficient capabilities. BETTER test bodies will first use simplified geometries and materials such that testing, data collection, modeling and simulation can be accomplished with high confidence and low uncertainty. Modifications and combinations of simple and well-characterized BETTER test bodies can then be used to increase complexity in order to reproduce relevant mechanics and identify weaknesses. BETTER can provide both immediate and long-term improvements in testing and simulation capabilities. This document presents the motivation, concept, benefits and examples for BETTER.

  4. Photovoltaic Engineering Testbed Designed for Calibrating Photovoltaic Devices in Space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2002-01-01

    Accurate prediction of the performance of solar arrays in space requires that the cells be tested in comparison with a space-flown standard. Recognizing that improvements in future solar cell technology will require an ever-increasing fidelity of standards, the Photovoltaics and Space Environment Branch at the NASA Glenn Research Center, in collaboration with the Ohio Aerospace Institute, designed a prototype facility to allow routine calibration, measurement, and qualification of solar cells on the International Space Station, and then the return of the cells to Earth for laboratory use. For solar cell testing, the Photovoltaic Engineering Testbed (PET) site provides a true air-mass-zero (AM0) solar spectrum. This allows solar cells to be accurately calibrated using the full spectrum of the Sun.

  5. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  6. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John; Schmitz, Eric; Hoff, William

    1991-01-01

This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulations of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to numerically optimize a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to symbolically derive dynamic and kinematic equations.

  7. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; hide

    2015-01-01

The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  8. Simulation to Flight Test for a UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  9. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on concepts related to airports with Super Dense Operations, where new airport runway configurations (e.g. parallel runways), sequencing, merging, and spacing are some of the concepts considered. We focus on modeling and simulating a complementary airborne and ground system for AM&S to increase efficiency and capacity of these high density terminal areas. From a ground systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation System (ACES) software to model and simulate AM&S concepts and algorithms.

  10. Test applications for heterogeneous real-time network testbed

    SciTech Connect

    Mines, R.F.; Knightly, E.W.

    1994-07-01

    This paper investigates several applications for a heterogeneous real-time network testbed. The network is heterogeneous in terms of network devices, technologies, protocols, and algorithms. The network is real-time in that its services can provide per-connection end-to-end performance guarantees. Although different parts of the network use different algorithms, all components have the necessary mechanisms to provide performance guarantees: admission control and priority scheduling. Three applications for this network are described in this paper: a video conferencing tool, a tool for combustion modeling using distributed computing, and an MPEG video archival system. Each has minimum performance requirements that must be provided by the network. By analyzing these applications, we provide insights to the traffic characteristics and performance requirements of practical real-time loads.

  11. SIM Interferometer Testbed (SCDU) Status and Recent Results

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; An, Xin; Goullioud, Renaud; Shao, Michael; Shen, Tsae-Pyng; Wehmeier, Udo J.; Weilert, Mark A.; Wang, Xu; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

SIM Lite is a space-borne stellar interferometer capable of searching for Earth-size planets in the habitable zones of nearby stars. This search will require measurement of astrometric angles with sub-microarcsecond accuracy and optical pathlength differences to 1 picometer by the end of the five-year mission. One of the most significant technical risks in achieving this level of accuracy is from systematic errors that arise from spectral differences between candidate stars and nearby reference stars. The Spectral Calibration Development Unit (SCDU), in operation since 2007, has been used to explore this effect and demonstrate performance meeting SIM goals. In this paper we present the status of this testbed and recent results.

  12. Easy and hard testbeds for real-time search algorithms

    SciTech Connect

    Koenig, S.; Simmons, R.G.

    1996-12-31

Although researchers have studied which factors influence the behavior of traditional search algorithms, currently not much is known about how domain properties influence the performance of real-time search algorithms. In this paper we demonstrate, both theoretically and experimentally, that Eulerian state spaces (a superset of undirected state spaces) are very easy for some existing real-time search algorithms to solve: even real-time search algorithms that can be intractable, in general, are efficient for Eulerian state spaces. Because traditional real-time search testbeds (such as the eight puzzle and gridworlds) are Eulerian, they cannot be used to distinguish between efficient and inefficient real-time search algorithms. It follows that one has to use non-Eulerian domains to demonstrate the general superiority of a given algorithm. To this end, we present two classes of hard-to-search state spaces and demonstrate the performance of various real-time search algorithms on them.
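The Eulerian property the paper relies on is cheap to verify: a connected directed state space is Eulerian exactly when every vertex has equal in-degree and out-degree (undirected spaces, with each edge traversable in both directions, always qualify). A minimal check, with connectivity assumed:

```python
from collections import defaultdict

def is_eulerian(edges):
    """True iff every vertex of the (assumed connected) directed graph,
    given as (u, v) edge pairs, has in-degree equal to out-degree."""
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for u, v in edges:
        outdeg[u] += 1
        indeg[v] += 1
    return all(indeg[n] == outdeg[n] for n in set(indeg) | set(outdeg))
```

A gridworld encoded with both directions of each edge passes this test, which is why such traditional testbeds cannot expose inefficient real-time search algorithms.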

  13. Experimental Testbed for the Study of Hydrodynamic Issues in Supernovae

    SciTech Connect

    Robey, H F; Kane, J O; Remington, B A; Drake, R P; Hurricane, O A; Louis, H; Wallace, R J; Knauer, J; Keiter, P; Arnett, D

    2000-10-09

More than a decade after the explosion of SN 1987A, unresolved discrepancies still remain in attempts to numerically simulate the mixing processes initiated by the passage of a very strong shock through the layered structure of the progenitor star. Numerically computed velocities of the radioactive {sup 56}Ni and {sup 56}Co, produced by shock-induced explosive burning within the silicon layer for example, are still more than 50% too low as compared with the measured velocities. In order to resolve such discrepancies between observation and simulation, an experimental testbed has been designed on the Omega Laser for the study of hydrodynamic issues of importance to supernovae (SNe). In this paper, we present results from a series of scaled laboratory experiments designed to isolate and explore several issues in the hydrodynamics of SN explosions. The results of the experiments are compared with numerical simulations and are generally found to be in reasonable agreement.

  14. Active structural subsystem of the OISI interferometry testbed

    NASA Astrophysics Data System (ADS)

    Döngi, Frank; Johann, Ulrich; Szerdahelyi, Laszlo

    1999-12-01

    An adaptive truss structure has been realized for active vibration damping within a laboratory testbed for future spaceborne optical and infra-red interferometers. The active elements are based on piezoelectric sensors and actuators. The paper first surveys configuration scenarios for space interferometers that aim at nanometre accuracy of optical pathlengths. It then focuses on the function of active structural control. For the laboratory truss, practical design considerations as well as analytical approaches for modelling and system identification, placement of active elements and design of active damping control are discussed in detail. Experimental results of the active damping performance achieved with integral feedback of strut force signals are compared with analytical predictions. The combined effects of active damping and passive vibration isolation are presented, and conclusions are drawn regarding further activities towards nanometre stabilization of optical pathlengths.

  15. Modular, Rapid Propellant Loading System/Cryogenic Testbed

    NASA Technical Reports Server (NTRS)

    Hatfield, Walter, Sr.; Jumper, Kevin

    2012-01-01

The Cryogenic Test Laboratory (CTL) at Kennedy Space Center (KSC) has designed, fabricated, and installed a modular, rapid propellant-loading system to simulate rapid loading of a launch-vehicle composite or standard cryogenic tank. The system will also function as a cryogenic testbed for testing and validating cryogenic innovations and ground support equipment (GSE) components. The modular skid-mounted system is capable of flow rates of liquid nitrogen from 1 to 900 gpm (approx equals 3.8 to 3,400 L/min), of pressures from ambient to 225 psig (approx equals 1.5 MPa), and of temperatures to -320 F (approx equals -195 C). The system can be easily validated to flow liquid oxygen at a different location, and could be easily scaled to any particular vehicle interface requirements.

  16. Phase retrieval algorithm for JWST Flight and Testbed Telescope

    NASA Astrophysics Data System (ADS)

    Dean, Bruce H.; Aronstein, David L.; Smith, J. Scott; Shiri, Ron; Acton, D. Scott

    2006-06-01

    An image-based wavefront sensing and control algorithm for the James Webb Space Telescope (JWST) is presented. The algorithm heritage is discussed in addition to implications for algorithm performance dictated by NASA's Technology Readiness Level (TRL) 6. The algorithm uses feedback through an adaptive diversity function to avoid the need for phase-unwrapping post-processing steps. Algorithm results are demonstrated using JWST Testbed Telescope (TBT) commissioning data and the accuracy is assessed by comparison with interferometer results on a multi-wave phase aberration. Strategies for minimizing aliasing artifacts in the recovered phase are presented and orthogonal basis functions are implemented for representing wavefronts in irregular hexagonal apertures. Algorithm implementation on a parallel cluster of high-speed digital signal processors (DSPs) is also discussed.

  17. Telescience testbed: Operational support functions for biomedical experiments

    NASA Astrophysics Data System (ADS)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed was conducted to study the methodology of space biomedicine with simulated constraints imposed on space experiments. An experimental subject selected for this testbedding was an elaborate surgery of animals and electrophysiological measurements conducted by an operator onboard. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively by telecommunication links. Reliability analysis was applied to the whole layers of experimentation, including design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure quality of science. Feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  18. The computational structural mechanics testbed architecture. Volume 3: The interface

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

This is the third of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 3 describes the CLIP-Processor interface and related topics. It is intended only for processor developers.

  19. An Overview of Research Activity at the Launch Systems Testbed

    NASA Technical Reports Server (NTRS)

    Vu, Bruce; Kandula, Max

    2003-01-01

This paper summarizes the acoustic testing and analysis activities at the Launch Systems Testbed (LST) of Kennedy Space Center (KSC). A major goal is to develop passive methods of mitigating sound from rocket exhaust jets with ducted systems devoid of traditional water injection. Current testing efforts are concerned with the launch-induced vibroacoustic behavior of scaled exhaust jets. Numerical simulations are also being developed to study sound propagation from supersonic jets in free air and through enclosed ducts. Scaling laws accounting for the effects of important parameters such as jet Mach number, jet velocity, and jet temperature on the far-field noise are investigated in order to deduce the full-scale environment from small-scale tests.

  20. The magic crayon: an object definition and volume calculation testbed

    NASA Astrophysics Data System (ADS)

    Beard, David V.; Faith, R. E.; Eberly, David H.; Pizer, Stephen M.; Kurak, Charles; Johnston, Richard E.

    1993-09-01

Rapid, accurate definition and volume calculation of anatomical objects is essential for effective CT and MR diagnosis. Absolute volumes often signal abnormalities, while relative volumes--such as a change in tumor size--can provide critical information on the effectiveness of radiation therapy. To this end, we have developed the 'magic crayon' (MC) anatomical object visualization, object definition, and volume calculation tool as a follow-on to UNC's Image Hierarchy Editor (IHE) and Image Hierarchy Visualizer (IHV). This paper presents the magic crayon system, detailing interaction, implementation, and preliminary observer studies. MC has several features: (1) it uses a number of 3D visualization methods to rapidly visualize an anatomical object; (2) it can serve as a testbed for various object definition algorithms; and (3) it allows the comparative evaluation of various volume calculation methods, including pixel counting and Dr. David Eberly's divergence method.
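The pixel-counting method named in the abstract reduces to counting segmented voxels and scaling by the voxel volume. A minimal sketch, assuming a hypothetical binary segmentation mask and voxel spacing (the MC tool's actual interfaces are not described here):

```python
import numpy as np

def volume_from_mask(mask: np.ndarray, spacing=(1.0, 1.0, 1.0)) -> float:
    """Estimate object volume by pixel counting: count the voxels
    inside the segmented object and multiply by the voxel volume."""
    voxel_volume = float(np.prod(spacing))  # e.g. mm^3 per voxel
    return float(np.count_nonzero(mask)) * voxel_volume

# Toy example: a 10x10x10 solid cube segmented in a 32^3 volume,
# with hypothetical 0.5 mm isotropic voxels.
mask = np.zeros((32, 32, 32), dtype=bool)
mask[5:15, 5:15, 5:15] = True
print(volume_from_mask(mask, spacing=(0.5, 0.5, 0.5)))  # 1000 voxels * 0.125 mm^3 = 125.0
```

Pixel counting is the simplest of the volume estimators the testbed compares; boundary-sensitive methods such as the divergence approach differ mainly in how partial voxels at the object surface are handled.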

  1. Demo III: Department of Defense testbed for unmanned ground mobility

    NASA Astrophysics Data System (ADS)

    Shoemaker, Chuck M.; Bornstein, Jonathan A.; Myers, Scott D.; Brendle, Bruce E., Jr.

    1999-07-01

Robotics has been identified by numerous recent Department of Defense (DOD) studies as a key enabling technology for future military operational concepts. The Demo III program is a multiyear effort encompassing technology development and demonstration on testbed platforms, together with modeling, simulation, and experimentation directed toward optimization of operational concepts to employ this technology. The primary program focus is the advancement of capabilities for autonomous mobility through unstructured environments, concentrating on both perception and intelligent control technology. The scout mission will provide the military operational context for demonstration of this technology, although significant emphasis is being placed on both hardware and software modularity to permit rapid extension to other military missions. The Experimental Unmanned Vehicle (XUV) is a small (approximately 1150 kg, V-22 transportable) technology testbed vehicle designed for experimentation with multiple military operational concepts. Currently under development, the XUV is scheduled for roll-out in summer 1999, with initial troop experimentation to be conducted in September 1999. Though the chassis is small and relatively lightweight, modeling has shown it capable of automotive mobility comparable to the current Army lightweight high-mobility multipurpose wheeled vehicle (HMMWV). The XUV design couples multisensor perception with intelligent control to permit autonomous cross-country navigation at speeds of up to 32 kph during daylight and 16 kph during hours of darkness. A small, lightweight, highly capable user interface will permit intuitive control of the XUV by troops from current-generation tactical vehicles. When it concludes in 2002, Demo III will provide the military with both the technology and the initial experience required to develop and field the first generation of semi-autonomous tactical ground vehicles for combat, combat support, and logistics applications.

  2. Modular algorithm concept evaluation tool (MACET) sensor fusion algorithm testbed

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Williams, Bradford D.; Talele, Sunjay E.; Amphay, Sengvieng A.

    1995-07-01

Target acquisition in a high-clutter environment, in all weather, at any time of day represents a much-needed capability for the air-to-surface strike mission. A considerable amount of the research at the Armament Directorate at Wright Laboratory, Advanced Guidance Division (WL/MNG), has been devoted to exploring various seeker technologies, including multi-spectral sensor fusion, that may yield a cost-efficient system with these capabilities. Critical elements of any such seeker are the autonomous target acquisition and tracking algorithms. These algorithms allow the weapon system to operate independently and accurately in realistic battlefield scenarios. In order to assess the performance of the multi-spectral sensor fusion algorithms being produced as part of the seeker technology development programs, the Munition Processing Technology Branch of WL/MN is developing an algorithm testbed. This testbed consists of the Irma signature prediction model; data analysis workstations, such as the TABILS Analysis and Management System (TAMS); and the Modular Algorithm Concept Evaluation Tool (MACET) algorithm workstation. All three of these components are being enhanced to accommodate multi-spectral sensor fusion systems. MACET is being developed to provide a graphical-interface-driven simulation in which algorithm components can be quickly configured and performance evaluations conducted. MACET is being developed incrementally, with each release providing an additional channel of operation. To date, MACET 1.0, a passive IR algorithm environment, has been delivered. The second release, MACET 1.1, is presented in this paper using the MMW/IR data from the Advanced Autonomous Dual Mode Seeker (AADMS) captive flight demonstration. Once conversion is completed, the delivered software from past algorithm development efforts will be available in the MACET library format, thereby providing an on-line database of the algorithm research conducted to date.

  3. Testbeds for Wind Resource Characterization: Needs and Potential Facilities

    NASA Astrophysics Data System (ADS)

    Shaw, W. J.; Berg, L. K.; Rishel, J. P.; Flaherty, J. E.

    2008-12-01

With the emergence of wind as a significant source of alternative energy, it is becoming increasingly clear that some problems associated with the installation and operation of wind plants arise because of continuing gaps in our knowledge of fundamental physical processes in the lower atmospheric boundary layer. Over the years, a number of well-designed intensive field campaigns have yielded significant insight into boundary layer structure and turbulence under targeted conditions. However, to be able to usefully simulate the atmosphere for applications of wind power, it is important to evaluate the resulting parameterizations under a realistic spectrum of atmospheric conditions. To do this, facilities - testbeds - are required that operate continually over long periods. Such facilities could also be used, among other things, to establish long-term statistics of mean wind and low-level shear, to explore the representativeness of shorter-period (e.g., one-year) statistics, to explore techniques for extrapolating wind statistics in space, and to serve as host infrastructure for boundary layer campaigns targeted at wind energy applications. During the last half of the 20th century, a number of tall instrumented towers were installed at locations around the United States for studies of atmospheric dispersion and other purposes. Many of these are no longer in service, but some have operated continuously for decades and continue to collect calibrated wind and temperature information from multiple heights extending to hub height or higher for many current operational wind turbines. This talk will review the status of tall towers in the U.S. that could anchor testbeds for research related to wind power production and will use data from the 120-m meteorological tower on the Hanford Site in southeastern Washington State to illustrate the kind of information that is available.

  4. The Earth Data System and The National Information Infrastructure Testbed

    NASA Astrophysics Data System (ADS)

    Christian, C. A.; Murray, S. S.

Earth science data are presently stored in many different archives with many different catalog and access methods, ranging from large government archives of satellite imagery with full data management and access services to small field data sets under the control of an individual scientist. The prototype Earth Data System (EDS) is designed to provide access to such heterogeneous data for researchers interested in global weather changes, ocean current movements, and coastal drainage. In particular, the EDS includes a collaborative data browser and analysis capability as an example of a truly distributed computing capability that allows people at geographically separate institutions to search, view, and analyze data together. The EDS is a testbed application being developed by the National Information Infrastructure Testbed organization, a consortium of industry, government, and academic institutions. The application is constructed using an implementation of OSF's Distributed Computing Environment (DCE) software, wrapped in a software toolkit reminiscent of the Astrophysics Data System toolkit. It uses a communications network of high-speed cross-country DS3 (45 Mbps) lines coupled with ATM switches and routers with SMDS, FDDI, and Ethernet interfaces. Initial tests in May 1993 showed outstanding performance with network nodes distributed between California and New Mexico locations linked by a T3 network. The functional system will be demonstrated in November 1993 at several forums, including Supercomputing 93. Future developments will incorporate distributed real-time device control and data retrieval as well as high-speed computing, data retrieval, and network management. In this paper we discuss the EDS reference application as well as several technical issues regarding the full deployment of a large-scale distributed computing infrastructure.

  5. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented, and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle, and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT toward achieving an incremental sequence of contrast milestones of 10(exp -8), 10(exp -9), and 10(exp -10), respectively, at inner working angles approaching 2 lambda/D. The optics, lab results, technologies, and null control are discussed, and evidence that the milestones have been achieved is shown.
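The conservation-of-energy bookkeeping described above, in which starlight removed from the dark (nulled) output reappears in the bright output, can be illustrated with a toy calculation; the photon counts and null depth below are made-up illustrative values, not VNT measurements:

```python
# In an ideal lossless nuller, the bright and dark channel counts sum
# to the source flux regardless of the null state, so the dark-channel
# fraction directly reports the instantaneous null depth.
source_counts = 1.0e6   # hypothetical photons/frame from the star
null_depth = 1.0e-8     # hypothetical fraction leaking into the dark channel

dark = source_counts * null_depth
bright = source_counts - dark   # conservation of energy

print(dark)                     # 0.01 photons/frame in the dark channel
print(bright + dark)            # recovers the source flux
```

The practical consequence stated in the abstract follows from this: at deep nulls almost all photons land in the bright channel, so sensing bandwidth is set by the star's total throughput rather than by the starved dark channel.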

  6. A multispectral testbed for cardiovascular sensing using imaging photoplethysmography

    NASA Astrophysics Data System (ADS)

    Blackford, Ethan B.; Estepp, Justin R.

    2017-02-01

Imaging photoplethysmography uses image sensors to measure changes in light absorption resulting from skin microvascular blood volume pulsations throughout the cardiac cycle. It has been demonstrated as an effective, non-contact means of assessing pulse rate, pulse rate variability, and respiration rate; other potential uses include measuring spatial blood perfusion, oxygenation, and flow dynamics. Herein we demonstrate the development of a multispectral testbed for imaging photoplethysmography consisting of 12 monochromatic 120 fps imagers with 50 nm bandpass filters distributed from 400-750 nm, contained in a 3D-printed 4x3 grid housing mounted on a tripod positioned orthogonal to the subject. A co-located dual-CCD RGB/near-infrared imager records conventional RGB and NIR images, expanding the recorded spectral window. After image registration, a multispectral image cube of the 13 partially overlapping bands is created. A spectrometer records high-spectral-resolution data from the participant's right cheek using a collimating lens attached to the measurement fiber. In addition, a spatial array of 5 RGB imagers placed at 0°, +/-20°, and +/-40° positions with respect to the subject is employed for motion and spatial robustness. All imagers are synchronized by a hardware trigger source synchronized with a reference physiological measurement device recording the subject's electrocardiography, bilateral fingertip and/or ear lobe photoplethysmography, bilateral galvanic skin response, and respiration signals. The development of the testbed and pilot data are presented. A full-scale evaluation of the spectral components of the imaging photoplethysmographic signal, optimization of iPPG SNR, and spatial perfusion and blood flow dynamics is currently underway.

  7. Cloud Arcs

    Atmospheric Science Data Center

    2013-04-19

    ... a sinking motion elsewhere, are very common, the degree of organization exhibited here is relatively rare, as the wind field at different altitudes usually disrupts such patterns. The degree of self organization of this cloud image, whereby three or four such circular events ...

  8. Cloud Front

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA02171 Cloud Front

    These clouds formed in the south polar region. The faintness of the cloud system likely indicates that these are mainly ice clouds, with relatively little dust content.

Image information: VIS instrument. Latitude -86.7, Longitude 212.3E. 17 meter/pixel resolution.

Note: this THEMIS visual image has been neither radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track directions to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  9. Cloud vortices

    NASA Image and Video Library

    2015-11-02

    Cloud vortices off Heard Island, south Indian Ocean. The Moderate Resolution Imaging Spectroradiometer (MODIS) aboard NASA’s Aqua satellite captured this true-color image of sea ice off Heard Island on Nov 2, 2015 at 5:02 AM EST (09:20 UTC). Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team

  10. Design and construction of a testbed for the application of real volcanic ash from the Eyjafjallajökull and Grimsvötn eruptions to microgas turbines

    NASA Astrophysics Data System (ADS)

    Weber, Konradin; Fischer, Christian; Lange, Martin; Schulz, Uwe; Naraparaju, Ravisankar; Kramer, Dietmar

    2017-04-01

It is well known that volcanic ash clouds emitted from erupting volcanoes pose a considerable threat to aviation: volcanic ash particles can damage turbine blades and their thermal barrier coatings as well as the bearings of the turbine. For a detailed investigation of this damaging effect, a testbed was designed and constructed that allowed the damaging effects of real volcanic ash on a microgas turbine, modified especially for these investigations, to be studied. The use of a microgas turbine had the advantage of delivering near-reality conditions, using kerosene and operating at temperatures similar to those of large turbines, but at a very cost-effective level. The testbed consisted of a disperser for the real volcanic ash and all the equipment needed to control the microgas turbine. Moreover, the concentration and size distribution of the volcanic ash were measured online in front of and behind the microgas turbine by optical particle counters (OPCs). The particle concentration and size distribution in the intake in front of the microgas turbine were measured by an OPC combined with an isokinetic intake. Behind the microgas turbine, in addition to the measurement with a second OPC, ash particles were caught from the exhaust gas with an impactor, enabling later analysis with an electron microscope of their morphology to verify possible melting of the ash particles. This testbed is of high importance, as it allows detailed investigation of the impact of volcanic ash on jet turbines and of appropriate countermeasures.

  11. Independent Technology Assessment within the Federation of Earth Science Information Partners (ESIP) Testbed

    NASA Astrophysics Data System (ADS)

    Burgess, A. B.; Robinson, E.; Graybeal, J.

    2015-12-01

The Federation of Earth Science Information Partners (ESIP) is a community of science, data, and information technology practitioners. ESIP's mission is to support the networking and data dissemination needs of our members and the global community, linking the functional sectors of education, observation, research, and application with the ultimate use of Earth science. Among the services provided to ESIP members is the Testbed, a collaborative forum for the development of technology standards, services, protocols, and best practices. ESIP has partnered with the NASA Advanced Information Systems Technology (AIST) program to integrate independent assessment of Technology Readiness Level (TRL) into the ESIP Testbed. In this presentation we will (1) demonstrate TRL assessment in the ESIP Testbed using three AIST projects, (2) discuss challenges and insights into creating an independent validation/verification framework, and (3) outline the versatility of the ESIP Testbed as applied to other technology projects.

  12. TPF Planet Detection Testbed: demonstrating deep, stable nulling and planet detection

    NASA Technical Reports Server (NTRS)

    Martin, Stefan

    2005-01-01

The design of a testbed being built at the Jet Propulsion Laboratory is described. Simulating a dual chopped Bracewell interferometer, the testbed comprises a four-beam star and planet source and a nulling beam combiner. Since achieving a stable null is of great concern, the testbed has many control systems designed to maintain stability of alignment and optical path difference over long periods of time. Comparisons between the testbed and the flight system are drawn and key performance parameters are discussed. Also discussed is the interaction between designs for phase-plate systems, which achromatically invert the electric field of one of each pair of incoming beams to achieve the null, and the choice of fringe tracking schemes.

  13. Preliminary Design of a Galactic Cosmic Ray Shielding Materials Testbed for the International Space Station

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen; Sechkar, Edward A.; Panko, Scott R.

    2012-01-01

The preliminary design of a testbed to evaluate the effectiveness of galactic cosmic ray (GCR) shielding materials, the MISSE Radiation Shielding Materials Testbed (MRSMAT), is presented. The intent is to mount the testbed on the Materials International Space Station Experiment-X (MISSE-X), which is to be mounted on the International Space Station (ISS) in 2016. A key feature is the ability to simultaneously test nine samples, including standards, each 5.25 cm thick. This thickness will enable most samples to have an areal density greater than 5 g/sq cm. The testbed features a novel and compact GCR telescope that will be able to distinguish which cosmic rays have penetrated which shielding material and to evaluate the dose transmitted through the shield. The testbed could play a pivotal role in the development and qualification of new cosmic ray shielding technologies.
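The link between the 5.25 cm sample thickness and the 5 g/sq cm target is simple arithmetic: areal density equals volumetric density times thickness. A minimal sketch using nominal handbook densities for illustration, not values from the MRSMAT design:

```python
def areal_density(density_g_cm3: float, thickness_cm: float) -> float:
    """Areal density (g/cm^2) = volumetric density (g/cm^3) x thickness (cm)."""
    return density_g_cm3 * thickness_cm

# With 5.25 cm deep sample slots, any material denser than about
# 5 / 5.25 ~ 0.95 g/cm^3 meets the 5 g/cm^2 target. The densities
# below are nominal handbook values, used only for illustration.
materials = {"polyethylene": 0.95, "water": 1.00, "aluminum": 2.70}
for name, rho in materials.items():
    print(name, areal_density(rho, 5.25))
```

This is why the abstract says "most" samples clear the target: low-density polymers sit right at the threshold, while denser materials exceed it comfortably.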

  14. Precision truss structures from concept to hardware reality: application to the Micro-Precision Interferometer Testbed

    NASA Astrophysics Data System (ADS)

    Sword, Lee F.; Carne, Thomas G.

    1993-09-01

This paper describes the development of the truss structure that forms the backbone of JPL's Micro-Precision Interferometer (MPI) Testbed, the third generation of Control Structure Interaction (CSI) testbeds constructed at the Jet Propulsion Laboratory to develop and validate control concepts. The MPI testbed is essentially a space-based Michelson interferometer suspended in a ground-based laboratory. This instrument, mounted on the flexible truss, requires nanometer-level precision alignment and positioning of its optical elements to achieve its science objectives. A layered control architecture, utilizing isolation, structural control, and active optical control technologies, allows the system to meet its vibration attenuation goals. Success of the structural control design, which involves replacement of truss struts with active and/or passive elements, depends heavily on high-fidelity models of the structure to evaluate strut placement locations. The first step in obtaining an accurate structural model is to build a structure that behaves linearly.

  15. Benefits of Model Updating: A Case Study Using the Micro-Precision Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Neat, Gregory W.; Kissil, Andrew; Joshi, Sanjay S.

    1997-01-01

    This paper presents a case study on the benefits of model updating using the Micro-Precision Interferometer (MPI) testbed, a full-scale model of a future spaceborne optical interferometer located at JPL.

  16. X-34 Technology Testbed Demonstrator being mated with the L-1011 mothership

    NASA Image and Video Library

    1999-03-11

This is the X-34 Technology Testbed Demonstrator being mated with the L-1011 mothership. The X-34 will demonstrate key vehicle and operational technologies applicable to future low-cost reusable launch vehicles.

  17. Description of the control system design for the SSF PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Kimnach, Greg L.

    1991-01-01

    The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.
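The operating concept described above, a unique set of operating states with defined transitions and per-state constraints, can be sketched as a small state machine. The state names and transition rules below are hypothetical illustrations; the abstract does not enumerate the testbed's actual states:

```python
from enum import Enum, auto

# Hypothetical operating states in the spirit of the PMAD DC testbed
# operating concept; each state would carry its own attributes and
# constraints, and only listed transitions are legal.
class State(Enum):
    STARTUP = auto()
    NORMAL = auto()
    ALERT = auto()
    EMERGENCY = auto()
    SAFE_SHUTDOWN = auto()

TRANSITIONS = {
    State.STARTUP: {State.NORMAL},
    State.NORMAL: {State.ALERT, State.SAFE_SHUTDOWN},
    State.ALERT: {State.NORMAL, State.EMERGENCY},
    State.EMERGENCY: {State.SAFE_SHUTDOWN},
    State.SAFE_SHUTDOWN: set(),
}

def step(current: State, target: State) -> State:
    """Allow only the transitions defined in the operating concept."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"illegal transition {current.name} -> {target.name}")
    return target

s = step(State.STARTUP, State.NORMAL)
print(s.name)  # NORMAL
```

Encoding the state description this way makes the "degree of system security" explicit: the controller can check any commanded action against the constraints of the current state before acting.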

  18. A Real-Time Testbed for Satellite and Terrestrial Communications Experimentation and Development

    NASA Technical Reports Server (NTRS)

    Angkasa, K.; Hamkins, J.; Jao, J.; Lay, N.; Satorius, E.; Zevallos, A.

    1997-01-01

This paper describes a programmable DSP-based testbed that is employed in the development and evaluation of blind demodulation algorithms to be used in wireless satellite or terrestrial communications systems. The testbed employs a graphical user interface (GUI) to provide independent, real-time control of modulator, channel, and demodulator parameters and also affords real-time observation of various diagnostic signals such as carrier and timing recovery and decoder metrics. This interactive flexibility enables an operator to tailor the testbed parameters and environment to investigate the performance of any arbitrary communications system and channel model. Furthermore, a variety of digital and analog interfaces allow the testbed to be used either as a stand-alone digital modulator or receiver, thereby extending its experimental utility from the laboratory to the field.

  19. Optical Network Testbed: Key Enabler in Developing Current and Future Network Solutions

    NASA Astrophysics Data System (ADS)

    Vukovic, Alex; Wu, Jing; Savoie, Michel; Hua, Heng; Campbell, Scott; Zhang, Hanxi

    2005-10-01

    The all-optical network (AON) demonstrator is a trial system-level testbed for the validation and verification of key network building blocks, scalable architectures, as well as control and management solutions for next-generation wavelength division multiplexing (WDM) networks. Developed at the Communications Research Centre (CRC) in Ottawa, ON, Canada, the AON testbed has already validated certain system-level concepts at the physical and upper layers. The paper describes the crucial role of the AON testbed in research, development, and "proof of concept" for both emerging optical technologies at the physical layer (performance characterization) and customer-managed networks at the upper layer (network management). Moreover, it is expected that the AON testbed will continue to be a valuable playground for future developments of emerging technologies, solutions, and applications.

  20. Testbed for extended-scene Shack-Hartmann and phase retrieval wavefront sensing

    NASA Technical Reports Server (NTRS)

    Morgan, Rhonda M.; Ohara, Catherine M.; Green, Joseph J.; Roberts, Jennifer; Sidick, Erkin; Shcheglov, Kirill

    2005-01-01

    We have implemented a testbed to demonstrate wavefront sensing and control on an extended scene using Shack-Hartmann and MGS phase retrieval simultaneously. This dual approach allows for both high sensitivity and high dynamic range wavefront sensing.

  1. Carrier Plus: A Sensor Payload for Living With a Star Space Environment Testbed (LWS/SET)

    NASA Technical Reports Server (NTRS)

    Marshall, Cheryl; Moss, Steven; Howard, Regan; LaBel, Kenneth; Grycewicz, Tom; Barth, Janet; Brewer, Dana

    2003-01-01

The paper discusses the following: 1. Living with a Star (LWS) program: space environment testbed (SET); natural space environment. 2. Carrier Plus: goals and benefits. 3. On-orbit sensor measurements. 4. Carrier Plus architecture. 5. Participation in Carrier Plus.

  4. Gulfstream's Quiet Spike sonic boom mitigator being installed on NASA DFRC's F-15B testbed aircraft

    NASA Image and Video Library

    2006-04-17

    Gulfstream's Quiet Spike sonic boom mitigator being installed on NASA DFRC's F-15B testbed aircraft. The project seeks to verify the structural integrity of the multi-segmented, articulating spike attachment designed to reduce and control a sonic boom.

  5. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

A full-scale, two-story, two-by-one-bay steel braced frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings, and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter-story drifts, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively characterized, and the seismic performance of nonstructural systems installed in them was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and the installed nonstructural systems were investigated.

  6. Validation of the CERTS Microgrid Concept The CEC/CERTS Microgrid Testbed

    SciTech Connect

    Nichols, David K.; Stevens, John; Lasseter, Robert H.; Eto, Joseph H.

    2006-06-01

    The development of test plans to validate the CERTS Microgrid concept is discussed, including the status of a testbed. Increased application of Distributed Energy Resources on the distribution system has the potential to improve performance, lower operational costs and create value. Microgrids have the potential to deliver these high-value benefits. This presentation will focus on operational characteristics of the CERTS microgrid, the partners in the project and the status of the CEC/CERTS microgrid testbed. Index Terms: Distributed Generation, Distributed Resource, Islanding, Microgrid, Microturbine

  7. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1x10-8 contrast over a broad bandwidth (10%) on both the shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: deformable mirror (DM) misregistration, testbed optical wavefront error (WFE), and the DM settings for correcting this WFE. These analyses suggested that a high contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
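    The central point here, that an error in the control Jacobian raises the achievable contrast floor, can be illustrated with a toy linear wavefront-control step. This is a hedged sketch: the matrix sizes, perturbation level, and random data are arbitrary assumptions, not testbed values.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy linear model: focal-plane field e = e0 + J u, where the Jacobian J
    # maps DM actuator commands u to field changes. Sizes are illustrative.
    n_field, n_act = 30, 10
    J_true = rng.standard_normal((n_field, n_act))                   # the system's "real" Jacobian
    J_model = J_true + 0.3 * rng.standard_normal((n_field, n_act))   # imperfect model of it
    e0 = rng.standard_normal(n_field)                                # initial aberrated field

    def residual_after_control(J_ctrl):
        # One least-squares control step computed with the (possibly wrong)
        # Jacobian, then evaluated against the true system response.
        u, *_ = np.linalg.lstsq(J_ctrl, -e0, rcond=None)
        return np.linalg.norm(e0 + J_true @ u)

    matched = residual_after_control(J_true)
    mismatched = residual_after_control(J_model)
    print(matched < mismatched)  # Jacobian error leaves a higher residual "contrast floor"
    ```

    The matched case attains the least-squares minimum by construction, so any Jacobian error can only increase the residual; iterating the step changes the convergence speed as well, which mirrors the two symptoms reported above.
    
    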

  8. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  9. Experimental Studies in a Reconfigurable C4 Test-bed for Network Enabled Capability

    DTIC Science & Technology

    2006-06-01

    Cross, Dr R. Houghton, and Mr R. McMaster, Defence Technology Centre for Human Factors Integration (DTC HFI), BITlab, School of Engineering and Design... studies into NEC by the Human Factors Integration Defence Technology Centre (HFI-DTC). DEVELOPMENT OF THE TESTBED: In brief, the C4 test-bed

  10. Crew-integration and Automation Testbed (CAT) Program Overview and RUX06 Introduction

    DTIC Science & Technology

    2006-09-20

    Crew-integration and Automation Testbed (CAT) Program Overview and RUX06 Introduction, 26-27 July 2006. Patrick Nunez, Terry Tierney, Brian Novak... Capstone CAT experiment: evaluate effectiveness of the CAT program in improving the performance and/or reducing the workload for a mounted

  11. Versatile simulation testbed for rotorcraft speech I/O system design

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.

    1986-01-01

    A versatile simulation testbed for the design of a rotorcraft speech I/O system is described in detail. The testbed will be used to evaluate alternative implementations of synthesized speech displays and speech recognition controls for the next generation of Army helicopters including the LHX. The message delivery logic is discussed as well as the message structure, the speech recognizer command structure and features, feedback from the recognizer, and random access to controls via speech command.

  12. A high-resolution, four-band SAR testbed with real-time image formation

    SciTech Connect

    Walker, B.; Sander, G.; Thompson, M.; Burns, B.; Fellerhoff, R.; Dubbert, D.

    1996-03-01

    This paper describes the Twin-Otter SAR Testbed developed at Sandia National Laboratories. This SAR is a flexible, adaptable testbed capable of operation on four frequency bands: Ka, Ku, X, and VHF/UHF. The SAR features real-time image formation at fine resolution in spotlight and stripmap modes. High-quality images are formed in real time using the overlapped subaperture (OSA) image-formation and phase gradient autofocus (PGA) algorithms.
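    As a rough illustration of phase gradient autofocus, one of the algorithms named above, here is a simplified single-iteration NumPy sketch. The windowing scheme, estimator form, and all sizes are textbook simplifications chosen for clarity, not Sandia's real-time implementation.

    ```python
    import numpy as np

    def pga_iteration(img, win_frac=8):
        """One simplified phase gradient autofocus (PGA) pass.
        img: complex SAR image, range bins along axis 0, azimuth along axis 1."""
        n_rng, n_az = img.shape
        # 1) Circularly shift the brightest scatterer in each range bin to sample 0,
        #    removing each scatterer's own position-dependent linear phase.
        shifted = np.empty_like(img)
        for i in range(n_rng):
            shifted[i] = np.roll(img[i], -int(np.argmax(np.abs(img[i]))))
        # 2) Window around the centered scatterers (circular window about sample 0)
        #    to suppress clutter while keeping the defocus blur.
        w = n_az // win_frac
        win = np.zeros(n_az)
        win[:w] = win[-w:] = 1.0
        G = np.fft.fft(shifted * win, axis=1)        # azimuth phase-history domain
        # 3) Estimate the phase gradient across all range bins, then integrate.
        num = np.sum(np.imag(np.conj(G[:, :-1]) * np.diff(G, axis=1)), axis=0)
        den = np.sum(np.abs(G[:, :-1]) ** 2, axis=0)
        phi = np.concatenate(([0.0], np.cumsum(num / den)))
        k = np.arange(n_az)
        phi -= np.polyval(np.polyfit(k, phi, 1), k)  # drop irrelevant linear trend
        # 4) Apply the conjugate phase correction in the phase-history domain.
        corrected = np.fft.ifft(np.fft.fft(img, axis=1) * np.exp(-1j * phi), axis=1)
        return corrected, phi
    ```

    In practice PGA is run for several such iterations with a shrinking window; on clean synthetic data with a smooth phase error, even one pass recovers most of the error and visibly sharpens the image.
    
    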

  13. Modeling of clouds and radiation for developing parameterizations for general circulation models. Annual report, 1995

    SciTech Connect

    Toon, O.B.; Westphal, D.L.

    1996-07-01

    We have used a hierarchy of numerical models for cirrus and stratus clouds and for radiative transfer to improve the reliability of general circulation models. Our detailed cloud microphysical model includes all of the physical processes believed to control the lifecycles of liquid and ice clouds in the troposphere. We have worked on specific GCM parameterizations for the radiative properties of cirrus clouds, making use of a mesoscale model as the test-bed for the parameterizations. We have also modeled cirrus cloud properties with a detailed cloud physics model to better understand how the radiatively important properties of cirrus are controlled by their environment. We have used another cloud microphysics model to investigate the interactions between aerosols and clouds. This work is some of the first to follow the details of interactions between aerosols and cloud droplets and has shown some unexpected relations between clouds and aerosols. We have also used line-by-line radiative transfer results, verified with ARM data, to derive a GCMS.

  14. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Michael T.; Munson, Michael J.; Teate, George F.

    2007-01-01

    The Phoenix Air-Launched Small Missile (ALSM) flight testbed was conceived and is proposed to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of the United States Navy Phoenix AIM-54 (Hughes Aircraft Company, now Raytheon Company, Waltham, Massachusetts) long-range, guided air-to-air missile and the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (Edwards, California) F-15B (McDonnell Douglas, now the Boeing Company, Chicago, Illinois) testbed airplane. The retirement of the Phoenix AIM-54 missiles from fleet operation has presented an opportunity for converting this flight asset into a new flight testbed. This cost-effective new platform will fill the gap in the test and evaluation of hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform; when launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will assist the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite-small-payload air-launched space boosters.

  15. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464
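    As an illustration of the kind of feedback loop such a smartphone platform might close around a motor test-bed, here is a minimal discrete PID position controller driving a simulated first-order motor model. The plant constants, gains, and 100 Hz loop rate are made-up values for illustration, not parameters from the paper.

    ```python
    # Minimal discrete PID loop around a simulated first-order motor model.
    # All constants are illustrative, not values from the study.
    dt = 0.01            # control period, s (a 100 Hz sensing/control loop)
    kp, ki, kd = 5.0, 10.0, 0.05
    setpoint = 1.0       # desired motor position, rad

    pos, integral, prev_err = 0.0, 0.0, 0.0
    for _ in range(2000):                              # simulate 20 s
        err = setpoint - pos
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv      # control effort
        prev_err = err
        # First-order plant model: d(pos)/dt = -a*pos + b*u, with a = b = 1
        pos += dt * (-1.0 * pos + 1.0 * u)

    print(round(pos, 3))  # settles near the 1.0 setpoint
    ```

    The integral term is what removes the steady-state offset a pure proportional controller would leave; on a real test-bed the same loop would read `pos` from the phone's sensors and write `u` to the motor driver instead of updating the plant model.
    
    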

  16. Micro-Precision Interferometer Testbed: end-to-end system integration of control structure interaction technologies

    NASA Astrophysics Data System (ADS)

    Neat, Gregory W.; Sword, Lee F.; Hines, Braden E.; Calvet, Robert J.

    1993-09-01

    This paper describes the overall design and planned phased delivery of the ground-based Micro-Precision Interferometer (MPI) Testbed. The testbed is a half-scale replica of a future space-based interferometer containing all the spacecraft subsystems necessary to perform an astrometric measurement. Appropriately sized reaction wheels will regulate the testbed attitude as well as provide a flight-like disturbance source. The optical system will consist of two complete Michelson interferometers. Successful interferometric measurements require controlling the positional stabilities of these optical elements to the nanometer level. The primary objective of the testbed is to perform a system integration of Control Structure Interaction (CSI) technologies necessary to demonstrate the end-to-end operation of a space-based interferometer, ultimately proving to flight mission planners that the necessary control technology exists to meet the challenging requirements of future space-based interferometry missions. These technologies form a multi-layered vibration attenuation architecture to achieve the necessary quiet environment. This three-layered methodology blends disturbance isolation, structural quieting and active optical control techniques. The paper describes all the testbed subsystems in this end-to-end ground-based system as well as the present capabilities of the evolving testbed.

  17. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  18. Cloud streets in Davis Strait

    NASA Image and Video Library

    2017-09-27

    The late winter sun shone brightly on a stunning scene of clouds and ice in the Davis Strait in late February, 2013. The Moderate Resolution Imaging Spectroradiometer aboard NASA’s Aqua satellite captured this true-color image on February 22 at 1625 UTC. The Davis Strait connects the Labrador Sea (part of the Atlantic Ocean) in the south with Baffin Bay to the north, and separates Canada, to the west, from Greenland to the east. Strong, steady winds frequently blow southward from the colder Baffin Bay to the warmer waters of the Labrador Sea. Over ice, the air is dry and no clouds form. However, as the Arctic air moves over the warmer, open water, the rising moist air and the temperature differential give rise to lines of clouds. In this image, the clouds are aligned in a beautiful, parallel pattern. Known as “cloud streets”, this pattern is formed in a low-level wind, with the clouds aligning in the direction of the wind. Credit: NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team

  19. Southern Clouds

    NASA Technical Reports Server (NTRS)

    2005-01-01

    [figure removed for brevity, see original site] Context image for PIA03026 Southern Clouds

    This image shows a system of clouds just off the margin of the South Polar cap. Taken during the summer season, these clouds contain both water-ice and dust.

    Image information: VIS instrument. Latitude 80.2S, Longitude 57.6E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  20. Linear Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA03667 Linear Clouds

    These clouds are located near the edge of the south polar region. The cloud tops are the puffy white features in the bottom half of the image.

    Image information: VIS instrument. Latitude -80.1N, Longitude 52.1E. 17 meter/pixel resolution.


  1. Designing an autonomous helicopter testbed: From conception through implementation

    NASA Astrophysics Data System (ADS)

    Garcia, Richard D.

    Miniature Unmanned Aerial Vehicles (UAVs) are currently being researched for a wide range of tasks, including search and rescue, surveillance, reconnaissance, traffic monitoring, fire detection, pipe and electrical line inspection, and border patrol to name only a few of the application domains. Although small/miniature UAVs, including both Vertical Takeoff and Landing (VTOL) vehicles and small helicopters, have shown great potential in both civilian and military domains, including research and development, integration, prototyping, and field testing, these unmanned systems/vehicles are limited to only a handful of university labs. For VTOL type aircraft the number is less than fifteen worldwide! This lack of development is due to both the extensive time and cost required to design, integrate and test a fully operational prototype as well as the shortcomings of published materials to fully describe how to design and build a "complete" and "operational" prototype system. This dissertation overcomes existing barriers and limitations by describing and presenting in great detail every technical aspect of designing and integrating a small UAV helicopter including the on-board navigation controller, capable of fully autonomous takeoff, waypoint navigation, and landing. The presented research goes beyond previous works by designing the system as a testbed vehicle. This design aims to provide a general framework that will not only allow researchers the ability to supplement the system with new technologies but will also allow researchers to add innovation to the vehicle itself. Examples include modification or replacement of controllers, updated filtering and fusion techniques, addition or replacement of sensors, vision algorithms, Operating Systems (OS) changes or replacements, and platform modification or replacement. 
This is supported by the testbed's design to not only adhere to the technology it currently utilizes but to be general enough to adhere to a multitude of

  2. Cloud Interactions

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site]

    Released 1 July 2004 The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during mid-spring near the North Pole. The linear water-ice clouds are now regional in extent and often interact with neighboring cloud systems, as seen in this image. The bottom of the image shows how the interaction can destroy the linear nature. While the surface is still visible through most of the clouds, there is evidence that dust is also starting to enter the atmosphere.

    Image information: VIS instrument. Latitude 68.4, Longitude 258.8 East (101.2 West). 38 meter/pixel resolution.


  3. Clouds and Open Ocean near the Bahamas

    NASA Image and Video Library

    1982-07-04

    STS004-41-1206 (27 June - 4 July 1982) --- Sunglint reflects off the water of the North Atlantic Ocean in an area to the east of the Bahamas Islands sometimes called the Sargasso Sea. The area has also been referred to as the “Bermuda Triangle.” Astronauts Thomas K. Mattingly II, STS-4 commander, and Henry W. Hartsfield Jr., pilot, spent seven days and one hour aboard the Earth-orbiting space shuttle Columbia and performed a variety of duties in addition to those of recording 70mm and 35mm imagery. Photo credit: NASA

  4. Assessing the Performance of Computationally Simple and Complex Representations of Aerosol Processes using a Testbed Methodology

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Ma, P.; Easter, R. C.; Liu, X.; Zaveri, R. A.; Rasch, P.

    2012-12-01

    Predictions of aerosol radiative forcing in climate models still contain large uncertainties, resulting from a poor understanding of certain aerosol processes, the level of complexity of aerosol processes represented in models, and the ability of models to account for sub-grid scale variability of aerosols and processes affecting them. In addition, comparing the performance and computational efficiency of new aerosol process modules used in various studies is problematic because different studies often employ different grid configurations, meteorology, trace gas chemistry, and emissions that affect the temporal and spatial evolution of aerosols. To address this issue, we have developed an Aerosol Modeling Testbed (AMT) to systematically and objectively evaluate aerosol process modules. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from a global climate model, Community Atmosphere Model version 5 (CAM5), has also been ported to WRF so that these parameterizations can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. In this study, we evaluate simple and complex treatments of the aerosol size distribution and secondary organic aerosols using the AMT and measurements collected during three field campaigns: the Megacities Initiative Local and Global Observations (MILAGRO) campaign conducted in the vicinity of Mexico City during March 2006, the

  5. A price-and-time-slot-negotiation mechanism for Cloud service reservations.

    PubMed

    Son, Seokho; Sim, Kwang Mong

    2012-06-01

    When making reservations for Cloud services, consumers and providers need to establish service-level agreements through negotiation. Whereas it is essential for both a consumer and a provider to reach an agreement on the price of a service and when to use the service, to date, there is little or no negotiation support for both price and time-slot negotiations (PTNs) for Cloud service reservations. This paper presents a multi-issue negotiation mechanism to facilitate the following: 1) PTNs between Cloud agents and 2) tradeoff between price and time-slot utilities. Unlike many existing negotiation mechanisms in which a negotiation agent can only make one proposal at a time, agents in this work are designed to concurrently make multiple proposals in a negotiation round that generate the same aggregated utility, differing only in terms of individual price and time-slot utilities. Another novelty of this work is formulating a novel time-slot utility function that characterizes preferences for different time slots. These ideas are implemented in an agent-based Cloud testbed. Using the testbed, experiments were carried out to compare this work with related approaches. Empirical results show that PTN agents reach faster agreements and achieve higher utilities than other related approaches. A case study was carried out to demonstrate the application of the PTN mechanism for pricing Cloud resources.
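    The two ideas in this abstract, a time-slot utility function expressing slot preferences and multiple concurrent proposals that share one aggregated utility while trading price utility against time-slot utility, can be sketched as follows. The weights, the triangular preference shape, and all names are illustrative assumptions, not the paper's formulation.

    ```python
    def timeslot_utility(slot, preferred, n_slots):
        # Triangular preference: highest at the preferred slot, falling off
        # linearly with distance from it (an assumed, illustrative shape).
        return max(0.0, 1.0 - abs(slot - preferred) / n_slots)

    def equal_utility_proposals(target_u, w_price, w_time, n=5):
        # Every returned (price-utility, time-slot-utility) pair satisfies
        # w_price * u_p + w_time * u_t == target_u, so the proposer is
        # indifferent among them while the counterparty gains room to choose.
        proposals = []
        for i in range(n):
            u_t = i / (n - 1)                       # sweep time-slot utility over [0, 1]
            u_p = (target_u - w_time * u_t) / w_price
            if 0.0 <= u_p <= 1.0:                   # keep only feasible price utilities
                proposals.append((round(u_p, 3), round(u_t, 3)))
        return proposals

    print(equal_utility_proposals(0.6, w_price=0.5, w_time=0.5))
    # → [(0.95, 0.25), (0.7, 0.5), (0.45, 0.75), (0.2, 1.0)]
    ```

    Making several such iso-utility offers per round is what lets both sides find price/time-slot tradeoffs faster than single-proposal protocols.
    
    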

  6. Estimating Cloud Cover

    ERIC Educational Resources Information Center

    Moseley, Christine

    2007-01-01

    The purpose of this activity was to help students understand the percentage of cloud cover and make more accurate cloud cover observations. Students estimated the percentage of cloud cover represented by simulated clouds and assigned a cloud cover classification to those simulations. (Contains 2 notes and 3 tables.)
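    Cloud-cover estimates like the ones in this activity map naturally onto the okta scale (eighths of sky covered). A small sketch, with category names following common meteorological usage:

    ```python
    def cloud_cover_okta(fraction):
        # Convert a fractional sky-cover estimate (0.0-1.0) to oktas and a
        # descriptive category (clear / few / scattered / broken / overcast).
        okta = round(fraction * 8)
        if okta == 0:
            label = "clear"
        elif okta <= 2:
            label = "few"
        elif okta <= 4:
            label = "scattered"
        elif okta <= 7:
            label = "broken"
        else:
            label = "overcast"
        return okta, label

    print(cloud_cover_okta(0.55))  # → (4, 'scattered')
    ```
    
    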

  7. Estimating Cloud Cover

    ERIC Educational Resources Information Center

    Moseley, Christine

    2007-01-01

    The purpose of this activity was to help students understand the percentage of cloud cover and make more accurate cloud cover observations. Students estimated the percentage of cloud cover represented by simulated clouds and assigned a cloud cover classification to those simulations. (Contains 2 notes and 3 tables.)

  8. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
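    The modularization idea described above, computing total system reliability by combining independently modeled components, can be illustrated with the textbook series/parallel combination rules. This is a generic sketch, not RML syntax; the example system is invented.

    ```python
    from math import prod

    def series(*r):
        # Series combination: every module must survive, so reliabilities multiply.
        return prod(r)

    def parallel(*r):
        # Redundant (parallel) modules: the system fails only if all modules fail.
        return 1.0 - prod(1.0 - x for x in r)

    # A hypothetical system: two redundant processors feeding one bus.
    r_system = series(parallel(0.95, 0.95), 0.99)
    print(round(r_system, 4))  # → 0.9875
    ```

    In a modular tool like REST, each leaf reliability would itself come from a separately analyzed module rather than a constant.
    
    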

  9. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Technical Reports Server (NTRS)

    Elges, Michael R.; Ashworth, Barry R.

    1991-01-01

    Whenever mixing priorities of manually controlled resources with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and their cooperation toward common goals is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information which may have an effect on the desired outcome. Second, the fault manager element, FRAMES, must be informed of the change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules adhering to the user-modified resource base.
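    A toy version of the priority manager's core job, deciding which loads stay energized within an available power budget, can be sketched as a greedy selection by priority. The load names and numbers are invented for illustration; the actual LPLMS logic is considerably more elaborate.

    ```python
    def select_loads(loads, available_power):
        """Greedily energize loads in priority order (1 = highest priority)
        until the power budget is exhausted. Each load is (name, priority, watts)."""
        energized, used = [], 0.0
        for name, _prio, watts in sorted(loads, key=lambda l: l[1]):
            if used + watts <= available_power:
                energized.append(name)
                used += watts
        return energized

    # Hypothetical loads, not SSM/PMAD data.
    loads = [("life-support", 1, 400.0), ("experiment-A", 3, 300.0),
             ("lighting", 2, 150.0), ("experiment-B", 4, 250.0)]
    print(select_loads(loads, 700.0))  # → ['life-support', 'lighting']
    ```

    In the testbed, a user request to add or remove a load would change this list, and the mediation described above ensures the planner, fault manager, and scheduler all see the updated selection.
    
    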

  10. An integrated dexterous robotic testbed for space applications

    NASA Technical Reports Server (NTRS)

    Li, Larry C.; Nguyen, Hai; Sauer, Edward

    1992-01-01

    An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview of the system hardware and software configurations is presented, and the implementation of subsystem functions is discussed.

  11. Finite Element Modeling of the NASA Langley Aluminum Testbed Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Pritchard, Joselyn I.; Buehrle, Ralph D.; Pappa, Richard S.

    2002-01-01

The NASA Langley Aluminum Testbed Cylinder (ATC) was designed to serve as a universal structure for evaluating structural acoustic codes, modeling techniques and optimization methods used in the prediction of aircraft interior noise. Finite element models were developed for the components of the ATC based on the geometric, structural and material properties of the physical test structure. Numerically predicted modal frequencies for the longitudinal stringer, ring frame and dome component models, and for six assembled ATC configurations, were compared with experimental modal survey data. The finite element models were updated and refined, using physical parameters, to increase correlation with the measured modal data. Excellent agreement, within averages of 1.5% to 2.9%, was obtained between the predicted and measured modal frequencies of the stringer, frame and dome components. The predictions for the modal frequencies of the assembled component Configurations I through V were within averages of 2.9% to 9.1%. Finite element modal analyses were performed for comparison with the 3 psi and 6 psi internal pressurization conditions in Configuration VI. The modal frequencies were predicted by applying differential stiffness to the elements with pressure loading and creating reduced matrices for beam elements with offsets inside external superelements. The average disagreement between the measured and predicted differences for the 0 psi and 6 psi internal pressure conditions was less than 0.5%. Comparably good agreement was obtained for the differences between the 0 psi and 3 psi measured and predicted internal pressure conditions.
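The agreement metric quoted in the abstract (average percent difference between predicted and measured modal frequencies) can be reproduced with a short calculation; the frequency values below are invented for illustration and are not ATC data:

```python
# Hypothetical illustration of the agreement metric: average percent
# difference between measured and predicted modal frequencies.
measured  = [112.4, 148.9, 201.3, 257.0]   # Hz (made-up values)
predicted = [114.1, 151.2, 204.8, 263.1]   # Hz (made-up values)

# per-mode percent difference relative to the measured frequency
diffs = [abs(p - m) / m * 100.0 for m, p in zip(measured, predicted)]
avg_pct = sum(diffs) / len(diffs)
print(round(avg_pct, 2))
```

A result below a few percent would correspond to the level of agreement reported for the component models.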

  12. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management have been augmented by: 1) physics-based analysis tools for filling the design space database; 2) distributed computational resources to reduce response time and cost; 3) web-based technologies to relieve machine dependence; and 4) artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  13. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Technical Reports Server (NTRS)

    Elges, Michael R.; Ashworth, Barry R.

    1991-01-01

When priorities of manually controlled resources are mixed with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and their cooperation toward common goals is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information which may affect the desired outcome. Second, the fault manager element, FRAMES, must be informed of the change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for the selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules that adhere to the user-modified resource base.
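As an illustration only (not the actual LPLMS algorithm), priority-mediated selection of loads against a fixed power budget can be sketched as a greedy pass over loads in descending priority; all names, priorities, and wattages below are invented:

```python
# Illustrative sketch of priority-based load mediation: keep the
# highest-priority loads that fit within the available power budget.
def mediate(loads, budget):
    """loads: list of (name, priority, watts); higher priority is kept first."""
    active, used = [], 0.0
    for name, pri, watts in sorted(loads, key=lambda l: -l[1]):
        if used + watts <= budget:   # admit the load if it still fits
            active.append(name)
            used += watts
    return active

loads = [("life_support", 10, 400.0), ("experiment_A", 5, 300.0),
         ("user_heater", 7, 250.0), ("experiment_B", 3, 200.0)]
print(mediate(loads, 900.0))
```

With a 900 W budget the 300 W mid-priority load is shed because the two higher-priority loads leave too little headroom, while the smaller low-priority load still fits.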

  14. NN-SITE: A remote monitoring testbed facility

    SciTech Connect

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-08-01

DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity, including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate the efforts of equipment providers and system integrators in deploying systems that meet nuclear non-proliferation and safeguards objectives. The following discusses the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) illuminates the role that this testbed facility can play in meeting the objectives of remote monitoring efforts, and its potential contribution to promoting eventual acceptance of remote monitoring systems in facilities worldwide.

  15. Development of a Testbed for Distributed Satellite Command and Control

    NASA Astrophysics Data System (ADS)

    Zetocha, Paul; Brito, Margarita

    2002-01-01

At the Air Force Research Laboratory's Space Vehicles Directorate we are investigating and developing architectures for commanding and controlling a cluster of cooperating satellites through prototype development for the TechSat-21 program. The objective of this paper is to describe a distributed satellite testbed that is currently under development and to summarize near-term prototypes being implemented for cluster command and control. To design, develop, and test our architecture we are using eight PowerPC 750 VME-based single board computers, representing eight satellites. Each of these computers is hosting the OSE(TM) real-time operating system from Enea Systems. At the core of our on-board cluster manager is ObjectAgent. ObjectAgent is an agent-based object-oriented framework for flight systems, which is particularly suitable for distributed applications. In order to handle communication with the ground as well as to assist with the cluster management we are using the Spacecraft Command Language (SCL). SCL is also the centerpiece of our ground control station and handles cluster commanding, telemetry decommutation, state-of-health monitoring, and Fault Detection, Isolation, and Resolution (FDIR). For planning and scheduling activities we are currently using ASPEN from NASA/JPL. This paper describes each of the above components in detail and then presents the prototypes being implemented.

  16. Wavefront Control Toolbox for James Webb Space Telescope Testbed

    NASA Technical Reports Server (NTRS)

    Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin

    2007-01-01

We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to the optical models of the James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments of the segmented primary mirror or the monolithic secondary mirror. The optical models are implemented in the Zemax optical design program, and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical model. Iterative and non-iterative algorithms have been developed that converge to a local minimum of the root-mean-square (rms) wavefront error using a singular value decomposition of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration, as well as the wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between the frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
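The SVD-based correction step described above can be sketched in a few lines: actuator commands that minimize the rms wavefront error are obtained from a (truncated) pseudoinverse of the influence-function matrix. The influence matrix and wavefront below are random stand-ins, not JWST or TBT data:

```python
import numpy as np

rng = np.random.default_rng(0)

n_pix, n_act = 200, 8                  # wavefront samples, adjustable degrees of freedom
A = rng.normal(size=(n_pix, n_act))    # influence functions (one column per actuator)
w = A @ rng.normal(size=n_act)         # a wavefront error lying in the correctable space

# Truncated SVD pseudoinverse of the control matrix of influence functions
U, s, Vt = np.linalg.svd(A, full_matrices=False)
s_inv = np.where(s > 1e-10 * s[0], 1.0 / s, 0.0)   # discard ill-conditioned modes
cmd = -Vt.T @ (s_inv * (U.T @ w))                  # least-squares actuator commands

residual = w + A @ cmd
rms_before = np.sqrt(np.mean(w**2))
rms_after = np.sqrt(np.mean(residual**2))
print(rms_before, rms_after)   # residual rms should be near machine precision
```

Because the synthetic wavefront lies entirely in the column space of the influence matrix, one correction drives the residual essentially to zero; a real system would leave an uncorrectable residual and iterate as the influence functions are re-linearized.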

  17. A Test-Bed Configuration: Toward an Autonomous System

    NASA Astrophysics Data System (ADS)

    Ocaña, F.; Castillo, M.; Uranga, E.; Ponz, J. D.; TBT Consortium

    2015-09-01

In the context of the Space Situational Awareness (SSA) program of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. In order to fulfill all the security requirements for the TBT project, the use of an autonomous emergency system (AES) is foreseen to monitor the control system. The AES will monitor remotely the health of the observing system and the internal and external environment. It will incorporate both autonomous and interactive actuators to force the protection of the system (e.g., emergency dome closeout).

  18. Atmospheric Fluctuation Measurements with the Palomar Testbed Interferometer

    NASA Astrophysics Data System (ADS)

    Linfield, R. P.; Lane, B. F.; Colavita, M. M.; PTI Collaboration

Observations of bright stars with the Palomar Testbed Interferometer, at a wavelength of 2.2 microns, have been used to measure atmospheric delay fluctuations. The delay structure function Dτ(Δt) was calculated for 66 scans (each >= 120 s in length) on seven nights in 1997 and one in 1998. For all except one scan, Dτ exhibited a clean power-law shape over the time interval 50-500 msec. Over shorter time intervals, the effect of the delay line servo loop corrupts Dτ. Over longer time intervals (usually starting at > 1 s), the slope of Dτ decreases, presumably due to some combination of saturation (e.g., finite turbulent layer thickness) and the effect of the finite wind-speed crossing time on our 110 m baseline. The mean power-law slopes for the eight nights ranged from 1.16 to 1.36, substantially flatter than the value of 1.67 for three-dimensional Kolmogorov turbulence. Such sub-Kolmogorov slopes will result in atmospheric seeing (θ) that improves rapidly with increasing wavelength: θ ∝ λ^(1-2/β), where β is the observed power-law slope of Dτ. The atmospheric errors in astrometric measurements with an interferometer will average down more quickly than in the Kolmogorov case.
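A structure-function slope estimate of the kind described can be sketched as follows. The sampling rate, record length, and input slope are assumptions chosen for illustration (a synthetic delay series with a known power-law structure function is generated, then the log-log slope is fitted over the 50-500 ms range):

```python
import numpy as np

rng = np.random.default_rng(1)

dt_samp = 0.01                 # 10 ms sampling interval (assumed)
n = 1 << 14                    # record length (assumed)
beta = 1.3                     # target structure-function slope, within the 1.16-1.36 range

# Synthesize a series whose PSD ~ f^-(beta+1), which for 0 < beta < 2
# yields a structure function D_tau(dt) ~ dt^beta.
f = np.fft.rfftfreq(n, dt_samp)
f[0] = f[1]                    # avoid divide-by-zero at DC
psd = f ** (-(beta + 1.0))
phase = rng.uniform(0.0, 2.0 * np.pi, len(f))
tau = np.fft.irfft(np.sqrt(psd) * np.exp(1j * phase), n)

# Estimate D_tau(dt) = <[tau(t+dt) - tau(t)]^2> over lags of 50-500 ms
lags = np.arange(5, 51)
D = np.array([np.mean((tau[k:] - tau[:-k]) ** 2) for k in lags])
slope = np.polyfit(np.log(lags * dt_samp), np.log(D), 1)[0]
print(slope)                   # should recover roughly the input beta
```

A slope flatter than the Kolmogorov 5/3, as found here, is what makes the seeing exponent 1 - 2/β more strongly negative than the Kolmogorov -1/5.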

  19. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    SciTech Connect

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
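The separation between a kernel definition and an algorithmically expressed verification test might look like the following sketch. The 1-D DFT is chosen purely for illustration (it is not claimed to be one of the TORCH kernels): any implementation, in any language or on any hardware, could be substituted for the kernel, and the verifier checks the answer without caring how it was produced:

```python
import numpy as np

def kernel_dft(x):
    # Candidate implementation under test; here we simply use a library FFT.
    return np.fft.fft(x)

def verify(x, y, tol=1e-9):
    # Algorithmically expressed verification: evaluate the O(n^2) DFT
    # definition directly and compare against the candidate answer.
    n = len(x)
    k = np.arange(n)
    ref = np.exp(-2j * np.pi * np.outer(k, k) / n) @ x
    return float(np.max(np.abs(y - ref)) / np.max(np.abs(ref))) < tol

x = np.random.default_rng(2).normal(size=64)
print(verify(x, kernel_dft(x)))   # True for an acceptable answer
```

The verifier deliberately uses a slow but unambiguous evaluation of the problem definition, so it remains valid for any co-designed solution that changes the algorithm or data layout.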

  20. Adaptive Signal Processing Testbed application software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

The Adaptive Signal Processing Testbed (ASPT) application software is a set of programs that provide general data acquisition and minimal processing functions on live digital data. The data are obtained from a digital input interface whose data source is the DAR4000 digital quadrature receiver, which receives a phase-shift-keying signal at a 21.4 MHz intermediate frequency. The data acquisition software is used to acquire raw unprocessed data from the DAR4000 and store it on disk in the Sun-workstation-based ASPT. File processing utilities are available to convert the stored files for analysis. The data evaluation software is used for the following functions: acquisition of data from the DAR4000, conversion to IEEE format, and storage to disk; acquisition of data from the DAR4000, power spectrum estimation, and on-line plotting on the graphics screen; and processing of disk file data, power spectrum estimation, and display and/or storage to disk in the new format. A user's guide is provided that describes the acquisition and evaluation programs along with how to acquire, evaluate, and use the data.
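A minimal averaged-periodogram power spectrum estimate, analogous in spirit to the ASPT data-evaluation step, can be sketched as below. The sample rate, segment length, and 1 kHz test tone are invented for illustration and are unrelated to the DAR4000's actual data format:

```python
import numpy as np

def power_spectrum(x, fs, seg_len=256):
    """Welch-style averaged periodogram: split x into segments, window,
    FFT, and average the squared magnitudes into a PSD estimate."""
    segs = len(x) // seg_len
    win = np.hanning(seg_len)
    scale = fs * np.sum(win**2)            # density normalization
    psd = np.zeros(seg_len // 2 + 1)
    for i in range(segs):
        seg = x[i * seg_len:(i + 1) * seg_len] * win
        psd += np.abs(np.fft.rfft(seg)) ** 2 / scale
    return np.fft.rfftfreq(seg_len, 1.0 / fs), psd / segs

fs = 8000.0                                 # assumed sample rate
t = np.arange(16384) / fs
x = np.sin(2 * np.pi * 1000.0 * t) + 0.1 * np.random.default_rng(3).normal(size=t.size)
freqs, psd = power_spectrum(x, fs)
print(freqs[np.argmax(psd)])                # peak near the 1000 Hz tone
```

Averaging over segments trades frequency resolution for a lower-variance estimate, which is the usual choice for on-line spectral monitoring of live data.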