Science.gov

Sample records for open cloud testbed

  1. The Open Cloud Testbed: Supporting Open Source Cloud Computing Systems Based on Large Scale High Performance, Dynamic Network Services

    NASA Astrophysics Data System (ADS)

    Grossman, Robert; Gu, Yunhong; Sabala, Michal; Bennett, Collin; Seidman, Jonathan; Mambretti, Joe

    Recently, a number of cloud platforms and services have been developed for data intensive computing, including Hadoop, Sector, CloudStore (formerly KFS), HBase, and Thrift. In order to benchmark the performance of these systems, to investigate their interoperability, and to experiment with new services based on flexible compute node and network provisioning capabilities, we have designed and implemented a large scale testbed called the Open Cloud Testbed (OCT). Currently OCT has 120 nodes in 4 data centers: Baltimore, Chicago (two locations), and San Diego. In contrast to other cloud testbeds, which are in small geographic areas and which are based on commodity Internet services, the OCT is a wide area testbed and the 4 data centers are connected with a high performance 10Gb/s network, based on a foundation of dedicated lightpaths. This testbed can address the requirements of extremely large data streams that challenge other types of distributed infrastructure. We have also developed several utilities to support the development of cloud computing systems and services, including novel node and network provisioning services, a monitoring system, and an RPC system. In this paper, we describe the OCT concepts, architecture, infrastructure, a few benchmarks that were developed for this platform, interoperability studies, and results.

  2. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate the research on data center management and cloud services, the OpenCirrus(TM) research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.

  3. Southern Great Plains cloud and radiation testbed site

    SciTech Connect

    1996-09-01

    This document presents information about the Cloud and Radiation Testbed site and the Atmospheric Radiation Measurement program. Topics include: measuring methods, general circulation models, milestones, instrumentation, meteorological observations, and computing facilities.

  4. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    SciTech Connect

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W.

    1996-04-01

    Boundary layer clouds - stratus and fair-weather cumulus - are closely coupled to the boundary layer: the coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period and consider a case where low-level stratus was observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  5. Clouds over Open Ocean

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The heavy concentration of these cirrocumulus and nimbostratus clouds over open ocean - location unknown - indicates that a heavy downpour of rain is occurring on the Earth's surface below. Towering anvils, seen rising high above the base cloud cover and casting long shadows, also indicate high winds and possible tornado activity.

  6. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests - in particular, tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
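The validation loop this abstract describes - inject synthetic jobs of varying kinds, record per-service success and failure, and aggregate a report - can be sketched as follows. This is a toy illustration under stated assumptions: the job kinds, failure rates, and the `run` stub are invented stand-ins, not the actual PanDA toolset or its API.

```python
import random
from collections import Counter
from dataclasses import dataclass

@dataclass
class SyntheticJob:
    """A 'VO-like' test job exercising one class of site services."""
    kind: str          # e.g. "ce" (scheduling), "se" (storage), "wn" (worker node)
    payload_size: int  # arbitrary complexity knob

def run(job, failure_rates):
    """Stub executor: in the real system, PanDA pilots run the job at the site."""
    return "ok" if random.random() >= failure_rates.get(job.kind, 0.0) else "failed"

def validate_site(jobs, failure_rates):
    """Submit all jobs and archive per-(kind, outcome) counts."""
    stats = Counter()
    for job in jobs:
        stats[(job.kind, run(job, failure_rates))] += 1
    return stats

# Nine synthetic jobs; with these extreme rates the outcome is deterministic:
# every "se" job fails, every "ce" and "wn" job succeeds.
jobs = [SyntheticJob(kind, n) for kind in ("ce", "se", "wn") for n in (1, 10, 100)]
report = validate_site(jobs, {"ce": 0.0, "se": 1.0, "wn": 0.0})
```

A real deployment would replace `run` with actual pilot submission and feed `report` into the central reporting service the abstract mentions.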

  7. Site/Systems Operations, Maintenance and Facilities Management of the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site

    SciTech Connect

    Wu, Susan

    2005-08-01

    This contract covered the site/systems operations, maintenance, and facilities management of the DOE Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site.

  8. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-12-31

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  9. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-01-01

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  10. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  11. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA Estec Data Systems Division, targeting the integration of the CANopen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical Data Handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing the data management through qualified processors and interfaces such as Leon2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity aims at the extension of RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which act as masters communicating with the CCIPC boards. CANopen serves as the upper application layer for systems based on CAN, is defined within the CAN-in-Automation (CiA) standard, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  12. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data intensive services that builds upon existing public funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described. Also described are a governance and financial model, together with the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  13. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  14. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations have started to adopt cloud computing to better utilize computing resources, taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O between virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
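The provisioning-performance comparison described in this abstract boils down to timing the same operation (VM launch, snapshot, volume attach, ...) against each solution's API and comparing the statistics. A minimal timing harness might look like the following; the `launch_vm_stub` is a placeholder, since the real calls would go through each platform's own interface (e.g. OpenNebula's OCA, Eucalyptus's EC2-compatible API, CloudStack's REST API):

```python
import time
from statistics import mean

def benchmark(operation, runs=5):
    """Time a cloud operation over several runs and summarize in seconds."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        operation()
        timings.append(time.perf_counter() - start)
    return {"mean_s": mean(timings), "min_s": min(timings), "max_s": max(timings)}

def launch_vm_stub():
    """Stand-in for a real provisioning call (e.g. CloudStack deployVirtualMachine)."""
    time.sleep(0.01)  # simulated launch latency

result = benchmark(launch_vm_stub, runs=3)
```

Running the same harness against each solution, with identical images and instance sizes, yields directly comparable numbers for the kind of ranking the study reports.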

  15. Analytical study of the effects of the Low-Level Jet on moisture convergence and vertical motion fields at the Southern Great Plains Cloud and Radiation Testbed site

    SciTech Connect

    Bian, X.; Zhong, S.; Whiteman, C.D.; Stage, S.A.

    1996-04-01

    The Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) is located in a region that is strongly affected by a prominent meteorological phenomenon--the Great Plains Low-Level Jet (LLJ). Observations have shown that the LLJ plays a vital role in spring and summertime cloud formation and precipitation over the Great Plains. An improved understanding of the LLJ characteristics and its impact on the environment is necessary for addressing the fundamental issue of development and testing of radiational transfer and cloud parameterization schemes for the general circulation models (GCMs) using data from the SGP CART site. A climatological analysis of the summertime LLJ over the SGP has been carried out using hourly observations from the National Oceanic and Atmospheric Administration (NOAA) Wind Profiler Demonstration Network and from the ARM June 1993 Intensive Observation Period (IOP). The hourly data provide an enhanced temporal and spatial resolution relative to earlier studies which used 6- and 12-hourly rawinsonde observations at fewer stations.

  16. A comparison of radiometric fluxes influenced by parameterization cirrus clouds with observed fluxes at the Southern Great Plains (SGP) cloud and radiation testbed (CART) site

    SciTech Connect

    Mace, G.G.; Ackerman, T.P.; George, A.T.

    1996-04-01

    The data from the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site is a valuable resource. We have developed an operational data processing and analysis methodology that allows us to examine continuously the influence of clouds on the radiation field and to test new and existing cloud and radiation parameterizations.

  17. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-01

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com. PMID:23305951

  18. Kontur: Observations of cloud streets and open cellular structures

    NASA Astrophysics Data System (ADS)

    Brümmer, B.; Bakan, S.; Hinzpeter, H.

    1985-08-01

    In September and October 1981 the experiment KonTur (Convection and Turbulence) was conducted over the North Sea. Its objectives were to investigate organized convective patterns, like cloud streets (boundary layer rolls) and cellular cloud structures. Two aircraft (British Hercules C-130 and German Falcon 20) performed detailed measurements within these patterns. Several cases of cloud streets and open cells were observed. Boundary layer rolls appear to be connected with an inflection point in the cross-roll wind component. The aspect ratio of the rolls (wavelength versus depth) is between three and four, in accordance with other observations and linear stability analysis. Four scales of motion are involved: the mean flow, the roll circulation, individual clouds and turbulence. The vertical transports are dominated at lower levels by turbulence and at higher levels by roll-scale motions. Open cellular cloud structures are connected with large air-sea temperature differences due to cold air outbreaks from the northwest. The aspect ratio of the cells is of the order of 10. The bulk contribution to the total transport of heat and momentum originates from the cloudy walls of the cells. A vertical cross section through a composite open cell is presented.

  19. Open-cell cloud formation over the Bahamas

    NASA Technical Reports Server (NTRS)

    2002-01-01

    What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  20. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843

  1. Open reading frame phylogenetic analysis on the cloud.

    PubMed

    Hung, Che-Lun; Lin, Chun-Yuan

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
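The first step both records describe - identifying open reading frames rather than using complete sequences - can be sketched in plain Python. This is a simplified illustration, not the service's actual code: it scans the forward strand only and uses the standard start codon ATG with the three standard stop codons.

```python
START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_codons=2):
    """Return (start, end, orf) tuples for forward-strand ORFs: an ATG
    followed, in the same reading frame, by the first stop codon."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):          # the three forward reading frames
        i = frame
        while i + 3 <= len(seq):
            if seq[i:i+3] == START:
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j+3] in STOPS:
                        if (j - i) // 3 >= min_codons:
                            orfs.append((i, j + 3, seq[i:j+3]))
                        i = j       # resume scanning after this ORF
                        break
            i += 3
    return orfs

orfs = find_orfs("ATGAAATAGGATGCCCTGA")
# Two ORFs: one in frame 0 and one in frame 1 of this short example.
```

In the cloud service, a step like this would run per sequence inside the Hadoop map phase, with the extracted ORFs then aligned and fed to tree construction.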

  2. Cloud-Based Model Calibration Using OpenStudio: Preprint

    SciTech Connect

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
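The calibration loop the abstract describes - vary parameters, simulate, and minimize the difference to the utility bills per billing period - reduces to a search over a parameter grid. The following toy sketch makes that concrete; the linear `simulate` model and the parameter names are invented for illustration and are not OpenStudio's API (where parameter variations are expressed as "measures" and simulations run in parallel in the cloud):

```python
def simulate(infiltration, cop, baseline_kwh):
    """Toy energy model: monthly use scales with an infiltration factor
    and inversely with an equipment efficiency factor (COP multiplier)."""
    return [m * infiltration / cop for m in baseline_kwh]

def billing_error(simulated, actual):
    """Sum of squared per-billing-period differences (kWh^2)."""
    return sum((s - a) ** 2 for s, a in zip(simulated, actual))

def calibrate(actual, baseline_kwh, grid):
    """Pick the (infiltration, cop) pair minimizing the billing error."""
    return min(grid, key=lambda p: billing_error(simulate(*p, baseline_kwh), actual))

baseline = [900, 850, 700, 500]   # uncalibrated monthly kWh
actual   = [990, 935, 770, 550]   # utility-bill kWh (10% above baseline)
grid = [(i / 10, c / 10) for i in range(8, 13) for c in range(8, 13)]
best = calibrate(actual, baseline, grid)
```

Each grid point is independent, which is exactly why this kind of calibration parallelizes well across cloud nodes.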

  3. Optical interferometer testbed

    NASA Technical Reports Server (NTRS)

    Blackwood, Gary H.

    1991-01-01

    Viewgraphs on optical interferometer testbed presented at the MIT Space Research Engineering Center 3rd Annual Symposium are included. Topics covered include: space-based optical interferometer; optical metrology; sensors and actuators; real time control hardware; controlled structures technology (CST) design methodology; identification for MIMO control; FEM/ID correlation for the naked truss; disturbance modeling; disturbance source implementation; structure design: passive damping; low authority control; active isolation of lightweight mirrors on flexible structures; open loop transfer function of mirror; and global/high authority control.

  4. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
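The impulsive-encounter machinery summarized above can be illustrated with the standard impulse approximation: a clump of mass M passing at impact parameter b with relative speed V delivers a velocity kick to each star, and the differential (tidal) part of that kick heats the cluster. The numbers below are made up for illustration and this sketch ignores the Plummer softening of both bodies that the study itself includes:

```python
G = 4.30e-3  # gravitational constant in pc (km/s)^2 / Msun

def impulse_kick(M, V, b):
    """Velocity impulse (km/s) on a single star from a clump of mass M (Msun)
    passing with speed V (km/s) at impact parameter b (pc):
    dv = 2 G M / (V b), the straight-line impulsive-encounter result."""
    return 2.0 * G * M / (V * b)

def tidal_heating(M, V, b, r_star):
    """Energy gain per unit mass, dE = 0.5 dv_tidal^2, using the differential
    kick dv_tidal ~ 2 G M r / (V b^2) for a star at cluster radius r (pc)."""
    dv_tidal = 2.0 * G * M * r_star / (V * b ** 2)
    return 0.5 * dv_tidal ** 2

# A Rosette-like clump of 1e3 Msun passing at 5 km/s, 5 pc from the cluster,
# heating stars at a 2.5 pc cluster radius.
dv = impulse_kick(1e3, 5.0, 5.0)
de = tidal_heating(1e3, 5.0, 5.0, 2.5)
```

Because the tidal kick grows linearly with r, the heating rate scales steeply with cluster radius, which is consistent with the abstract's finding that disruption is most sensitive to cluster radius and relatively insensitive to cluster mass.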

  5. Fast Physics Testbed for the FASTER Project

    SciTech Connect

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  6. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, and so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic-RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed) since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  7. A Business-to-Business Interoperability Testbed: An Overview

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad; Monica, Martin; Jones, Albert

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture with which to drive the testbed developments. We outline the future plans for the testbed development.

  8. Open Source Cloud Computing for Transiting Planet Discovery

    NASA Astrophysics Data System (ADS)

    McCullough, Peter R.; Fleming, Scott W.; Zonca, Andrea; Flowers, Jack; Nguyen, Duy Cuong; Sinkovits, Robert; Machalek, Pavel

    2014-06-01

    We provide an update on the development of the open-source software suite designed to detect exoplanet transits using high-performance and cloud computing resources (https://github.com/openEXO). Our collaboration continues to grow as we are developing algorithms and codes related to the detection of transit-like events, especially in Kepler data, Kepler 2.0 and TESS data when available. Extending the work of Berriman et al. (2010, 2012), we describe our use of the XSEDE-Gordon supercomputer and Amazon EMR cloud to search for aperiodic transit-like events in Kepler light curves. Such events may be caused by circumbinary planets or transiting bodies, either planets or stars, with orbital periods comparable to or longer than the observing baseline such that only one transit is observed. As a bonus, we use the same code to find stellar flares too; whereas transits reduce the flux in a box-shaped profile, flares increase the flux in a fast-rise, exponential-decay (FRED) profile that nevertheless can be detected reliably with a square-wave finder.
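The single-transit search described above amounts to sliding a box-shaped template across the light curve and flagging the strongest dip. A minimal sketch of such a square-wave finder (toy arrays and depths, not the collaboration's actual code from the repository above):

```python
def box_search(flux, width):
    """Slide a box of `width` samples along the light curve and return the start
    index and depth of the strongest box-shaped dip (out-of-box mean minus
    in-box mean).  Running the same function on the negated series flags
    brightenings, e.g. flares, instead of transits."""
    n = len(flux)
    total = sum(flux)
    best_i, best_depth = 0, 0.0
    for i in range(n - width + 1):
        boxed = sum(flux[i:i + width])
        inside = boxed / width
        outside = (total - boxed) / (n - width)
        depth = outside - inside
        if depth > best_depth:
            best_i, best_depth = i, depth
    return best_i, best_depth

# A flat light curve with a single 1% transit, five samples wide
flux = [1.0] * 100
for j in range(40, 45):
    flux[j] = 0.99
```

Here `box_search(flux, 5)` recovers the dip at index 40 with depth of about 0.01; in practice the search runs over a grid of box widths, and the detection threshold is set from the measured noise level.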

  9. Open Source Software Reuse in the Airborne Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Khudikyan, S. E.; Hart, A. F.; Hardman, S.; Freeborn, D.; Davoodi, F.; Resneck, G.; Mattmann, C. A.; Crichton, D. J.

    2012-12-01

    Earth science airborne missions play an important role in helping humans understand our climate. A challenge for airborne campaigns in contrast to larger NASA missions is that their relatively modest budgets do not permit the ground-up development of data management tools. These smaller missions generally consist of scientists whose primary focus is on the algorithmic and scientific aspects of the mission, which often leaves data management software and systems to be addressed as an afterthought. The Airborne Cloud Computing Environment (ACCE), developed by the Jet Propulsion Laboratory (JPL) to support the Earth Science Airborne Program, is a reusable, multi-mission data system environment for NASA airborne missions. ACCE provides missions with a cloud-enabled platform for managing their data. The platform consists of a comprehensive set of robust data management capabilities that cover everything from data ingestion and archiving, to algorithmic processing, and to data delivery. Missions interact with this system programmatically as well as via browser-based user interfaces. The core components of ACCE are largely based on Apache Object Oriented Data Technology (OODT), an open source information integration framework at the Apache Software Foundation (ASF). Apache OODT is designed around a component-based architecture that allows for selective combination of components to create highly configurable data management systems. The diverse and growing community that currently contributes to Apache OODT fosters on-going growth and maturation of the software. ACCE's key objective is to reduce the costs and risks associated with developing data management systems for airborne missions. Software reuse plays a prominent role in mitigating these problems. By providing a reusable platform based on open source software, ACCE enables airborne missions to allocate more resources to their scientific goals, thereby opening the doors to increased scientific discovery.

  10. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags can be misplaced through mischief or lost in severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), for allowing agricultural stakeholders to color code their farm fields for indicating herbicide resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.

  11. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained by their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle to adoption of cloud computing by healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should attain a security level equivalent to that of the traditional computing model.
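For concreteness, the first step of the OpenID Connect authorization-code flow, building the authentication request to which a DI client redirects the user, can be sketched as below. The endpoint, client ID, and redirect URI are hypothetical placeholders; the parameter names follow OpenID Connect Core 1.0.

```python
from urllib.parse import urlencode
import secrets

def build_auth_request(authorize_endpoint, client_id, redirect_uri):
    """Construct an OpenID Connect authorization-code request URL
    (parameters per OpenID Connect Core 1.0, section 3.1.2.1).
    All endpoint/client values passed in are illustrative placeholders."""
    state = secrets.token_urlsafe(16)   # CSRF protection, echoed back by the provider
    nonce = secrets.token_urlsafe(16)   # replay protection, bound into the ID token
    params = {
        "response_type": "code",
        "scope": "openid profile",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "state": state,
        "nonce": nonce,
    }
    return authorize_endpoint + "?" + urlencode(params), state, nonce
```

After authentication, the provider redirects back to `redirect_uri` with a code and the echoed `state`; the client then exchanges the code at the token endpoint for an ID token whose `nonce` claim must match the one sent here.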

  12. Aspects of the quality of data from the Southern Great Plains (SGP) cloud and radiation testbed (CART) site broadband radiation sensors

    SciTech Connect

    Splitt, M.E.; Wesely, M.L.

    1996-04-01

    A systematic evaluation of the performance of broadband radiometers at the Cloud and Radiation Testbed (CART) site is needed to estimate the uncertainties of the irradiance observations. Here, net radiation observed with the net radiometer in the energy balance Bowen ratio (EBBR) station at the central facility is compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. In addition, data obtained from the central facility pyranometers, pyrgeometers, and pyrheliometers are examined for April 1994, when intensive operations periods were being carried out. The data used in this study are from central facility radiometers in a solar and infrared observation station, an EBBR station, the so-called 'BSRN' set of upward-pointing radiometers, and a set of radiometers pointed down at the 25-m level of a 60-m tower.

  13. Modeling Aerosol-Cloud Interactions in Marine Open- and Closed-Cell Stratocumulus

    NASA Astrophysics Data System (ADS)

    Wang, H.; Feingold, G.

    2008-12-01

    Satellite imagery shows the recurrence of striking images of cellular structures exhibiting both closed- and open-cell patterns in marine stratocumulus fields. The open-cell region has much lower cloud albedo than closed cells. Aside from that, previous observational and modeling studies have suggested that open- and closed-cell regions differ in many other aspects, such as concentration of cloud condensation nuclei (CCN), cloud droplet number and size, precipitation efficiency, and cloud dynamics. In this work, aerosol-cloud interactions and dynamical feedbacks are investigated within a large eddy simulation (LES) modeling framework to study the activation, cloud scavenging, mixing and transport of CCN in the open- and closed-cell boundary layer and near the open/closed-cell boundaries. The model domain size of 120 km by 60 km is large enough to represent mesoscale organizations that are associated with different cellular structures and that are promoted by CCN perturbation from ship emissions. Simulation results show that depletion of CCN by collision and coalescence in clouds is critical to the formation of precipitation and open-cell structure in a stratocumulus deck. Once the open cellular structure has formed in the clean environment, a substantial increase of CCN transported from a neighboring polluted environment or from ship emissions does not close it during the 12-hour simulation due to the lack of dynamical and moisture support in the open-cell cloud-free region. However, the contaminated open cells are not able to self-sustain as a result of the shutoff of precipitation. This points to the critical role of precipitation-triggered circulations in maintaining an open-cellular structure.

  14. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, poised to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that DI-r and PACS deployed to private or community clouds provide security levels equivalent to those of the traditional computing model. PMID:27340682

  15. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  16. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most of the existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon web services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  17. AutoGNC Testbed

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Vaughan, Andrew T.; Bayard, David S.; Riedel, Joseph E.; Balaram, J.

    2010-01-01

    A simulation testbed architecture was developed and implemented for the integration, test, and development of a TRL-6 flight software set called AutoGNC. The AutoGNC software will combine the TRL-9 Deep Impact AutoNAV flight software suite, the TRL-9 Virtual Machine Language (VML) executive, and the TRL-3 G-REX guidance, estimation, and control algorithms. The AutoGNC testbed was architected to provide software interface connections among the AutoNAV and VML flight code written in C, the G-REX algorithms in MATLAB and C, stand-alone image rendering algorithms in C, and other Fortran algorithms, such as the OBIRON landmark tracking suite. The testbed architecture incorporates software components for propagating a high-fidelity truth model of the environment and the spacecraft dynamics, along with the flight software components for onboard guidance, navigation, and control (GN&C). The interface allows for the rapid integration and testing of new algorithms prior to development of the C code for implementation in flight software. This testbed is designed to test autonomous spacecraft proximity operations around small celestial bodies, moons, or other spacecraft. The software is baselined for upcoming comet and asteroid sample return missions. This architecture and testbed will provide a direct improvement upon the onboard flight software utilized for missions such as Deep Impact, Stardust, and Deep Space 1.

  18. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
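The polarity determination described above reduces to the sign of the alignment between the suprathermal electron strahl and the local field: since the strahl streams away from the Sun along open field lines, a parallel strahl implies the field points anti-sunward. A minimal sketch with hypothetical vectors (not Wind data):

```python
def dot(a, b):
    """Plain 3-vector dot product."""
    return sum(x * y for x, y in zip(a, b))

def field_polarity(b_vec, strahl_vec):
    """Infer the true polarity of an open field line: +1 (anti-sunward) if the
    suprathermal electron strahl flows parallel to B, -1 (sunward) if antiparallel."""
    return 1 if dot(b_vec, strahl_vec) > 0 else -1

def cloud_polarity_uniform(samples):
    """True if every (B, strahl) sample across a magnetic-cloud crossing yields
    the same polarity -- the condition the abstract reports in 89% of clouds."""
    votes = {field_polarity(b, s) for b, s in samples}
    return len(votes) == 1
```

Applied sample-by-sample through a cloud crossing, this also separates locally inverted field intervals (where the measured field direction is folded back) from the underlying polarity, which is the comparison behind the inversion statistics quoted above.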

  19. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  20. Use of AVHRR-derived spectral reflectances to estimate surface albedo across the Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Qiu, J.; Gao, W.

    1997-03-01

    Substantial variations in surface albedo across a large area cause difficulty in estimating regional net solar radiation and atmospheric absorption of shortwave radiation when only ground point measurements of surface albedo are used to represent the whole area. Information on spatial variations and site-wide averages of surface albedo, which vary with the underlying surface type and conditions and the solar zenith angle, is important for studies of clouds and atmospheric radiation over a large surface area. In this study, a bidirectional reflectance model was used to inversely retrieve surface properties such as leaf area index and then the bidirectional reflectance distribution was calculated by using the same radiation model. The albedo was calculated by converting the narrowband reflectance to broadband reflectance and then integrating over the upper hemisphere.
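The narrowband-to-broadband step mentioned above is commonly a linear regression on the two AVHRR reflective channels, followed by spatial averaging for the site-wide value. The coefficients below are illustrative placeholders, not the fitted values from the study:

```python
def broadband_albedo(ch1, ch2, coeffs=(0.035, 0.545, 0.320)):
    """Broadband shortwave albedo from AVHRR channel-1 (visible) and channel-2
    (near-IR) narrowband reflectances via a linear regression.  The default
    coefficients are illustrative placeholders only."""
    c0, c1, c2 = coeffs
    return c0 + c1 * ch1 + c2 * ch2

def site_mean_albedo(pixels):
    """Site-wide average albedo over (ch1, ch2) reflectance pairs -- the
    quantity needed for regional net-solar-radiation estimates."""
    values = [broadband_albedo(c1, c2) for c1, c2 in pixels]
    return sum(values) / len(values)
```

In the study itself, the solar-zenith-angle dependence enters through the bidirectional reflectance model before this conversion; the sketch covers only the final narrowband-to-broadband and averaging steps.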

  1. Point Cloud Visualization in AN Open Source 3d Globe

    NASA Astrophysics Data System (ADS)

    De La Calle, M.; Gómez-Deck, D.; Koehler, O.; Pulido, F.

    2011-09-01

    During the last years the usage of 3D applications in GIS has become more popular. Since the appearance of Google Earth, users are familiar with 3D environments. On the other hand, nowadays computers with 3D acceleration are common, broadband access is widespread and the public information that can be used in GIS clients that are able to use data from the Internet is constantly increasing. There are currently several libraries suitable for this kind of application. Based on these facts, and using libraries that are already developed and connected to our own developments, we are working on the implementation of a real 3D GIS with analysis capabilities. Since a 3D GIS such as this can be very interesting for tasks like LiDAR or Laser Scanner point cloud rendering and analysis, special attention is given to optimal handling of very large data sets. Glob3 will be a multidimensional GIS in which 3D point clouds can be explored and analysed, even if they consist of several million points. The latest addition to our visualization libraries is the development of a point cloud server that works regardless of the cloud's size. The server receives and processes requests from a 3D client (for example glob3, but it could be any other, such as one based on WebGL) and delivers the data in the form of pre-processed tiles, depending on the required level of detail.
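The level-of-detail scheme sketched above, where the server returns pre-processed tiles whose refinement depends on the client's view, can be reduced to choosing a tile level from viewer distance. A toy version (the halving rule and parameters are assumptions, not Glob3's actual policy):

```python
import math

def level_of_detail(distance, base_distance, max_level):
    """Pick a tile refinement level for a point cloud tile server: each halving
    of the viewer distance adds one level, clamped to [0, max_level].
    `base_distance` is the range at which the coarsest tiles suffice."""
    if distance >= base_distance:
        return 0
    level = int(math.log2(base_distance / distance))
    return min(level, max_level)
```

A client would request the tiles covering its view frustum at the returned level; the server answers from its pre-processed pyramid, so response cost stays bounded no matter how many points the full cloud holds.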

  2. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  3. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand, including public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.

  4. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

    The continuation of the EOSDIS testbed ('Testbed') has evolved from a multi-task, X-Windows-driven system into a fully functional stand-alone data archive and distribution center that is accessible by all types of users and computers via the World Wide Web. Throughout the past months, the Testbed has evolved into a completely new system. The current system is now accessible through Netscape, Mosaic, and all other browsers that can reach the World Wide Web. On October 1, 1995 we will open to the public, and we expect that the statistics of the type of user, where they are located, and what they are looking for will drastically change. The most important change to the Testbed has been the Web interface. This interface will allow more users access to the system and walk them through the data types with more ease than before. All of the callbacks are written in such a way that icons can be used to easily move around in the program's interface. The homepage offers the user the opportunity to get more information about each satellite data type and also information on free programs. These programs are grouped into categories for the types of computers that the programs are compiled for, along with information on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which takes them to the world map and allows them to select an area that they would like coverage of by simply clicking that area of the map. This creates a new map where other similar choices can be made to get the latitude and longitude of the region the satellite data will cover. Once a selection has been made, the search parameters page appears to be filled out. Afterwards, the browse image is called for once the search is completed, and the images for viewing can be selected. 
There are several other option pages

  5. Establishment of an NWP testbed using ARM data

    SciTech Connect

    O'Connor, E.; Liu, Y.; Hogan, R.

    2010-03-15

    The aim of the FAst-physics System TEstbed and Research (FASTER) project is to evaluate and improve the parameterizations of fast physics (involving clouds, precipitation, aerosol) in numerical models using ARM measurements. One objective within FASTER is to evaluate model representations of fast physics with long-term continuous cloud observations by use of an 'NWP testbed'. This approach was successful in the European Cloudnet project. NWP model data (NCEP, ECMWF, etc.) is routinely output at ARM sites, and model evaluation can potentially be achieved in quasi-real time. In this poster, we will outline our progress in the development of the NWP testbed and discuss the successful integration of ARM algorithms, such as ARSCL, with algorithms and lessons learned from Cloudnet. Preliminary results will be presented of the evaluation of the ECMWF, NCEP, and UK Met Office models over the SGP site using this approach.
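The core of such an NWP testbed is a routine comparison of model output against the continuous ARM observations, variable by variable. A minimal sketch of the per-variable statistics (made-up series, not the FASTER evaluation code):

```python
import math

def bias_and_rmse(model, obs):
    """Mean bias and root-mean-square error of a model time series against
    observations, skipping missing observations (None)."""
    pairs = [(m, o) for m, o in zip(model, obs) if o is not None]
    n = len(pairs)
    bias = sum(m - o for m, o in pairs) / n
    rmse = math.sqrt(sum((m - o) ** 2 for m, o in pairs) / n)
    return bias, rmse
```

In practice the observations are first averaged onto the model grid and time step before the comparison (the approach taken in Cloudnet), and the resulting statistics are stratified by weather regime to expose parameterization biases under specific conditions.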

  6. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS.

  7. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.
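
    The abstract above notes that PTI's three fixed 40 cm apertures can be combined pairwise to give baselines up to 110 m. A minimal sketch of that pairwise-baseline geometry (the station coordinates below are illustrative, not the actual PTI layout):

```python
from itertools import combinations
from math import hypot

# Illustrative aperture positions in metres on a local ground plane;
# these are NOT the actual PTI station coordinates.
apertures = {"N": (0.0, 110.0), "S": (0.0, 0.0), "W": (-87.0, 22.0)}

def baselines(stations):
    """Return the length of every pairwise baseline between stations."""
    return {
        (a, b): hypot(xa - xb, ya - yb)
        for (a, (xa, ya)), (b, (xb, yb)) in combinations(stations.items(), 2)
    }

for pair, length in baselines(apertures).items():
    print(pair, round(length, 1))  # three apertures -> three baselines
```

With n fixed apertures there are n(n-1)/2 pairwise baselines, which is why three apertures suffice for three distinct baseline lengths.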

  8. Testbed for LISA Photodetectors

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe; Livas, Jeffrey; Silverberg, Robert

    2009-01-01

    The Laser Interferometer Space Antenna (LISA) is a gravitational wave observatory consisting of three spacecraft separated by 5 million km in an equilateral triangle whose center follows the Earth in orbit around the Sun but offset in orbital phase by 20 degrees. LISA is designed to observe sources in the frequency range of 0.1 mHz-100 mHz by measuring fluctuations of the inter-spacecraft separation with laser interferometry. Quadrant photodetectors are used to measure both separation and angular orientation. Noise level, phase and amplitude inhomogeneities of the semiconductor response, and channel cross-talk between quadrant cells need to be assessed in order to ensure the 10 pm/√Hz sensitivity required for the interferometric length measurement in LISA. To this end, we are currently developing a testbed that allows us to evaluate photodetectors to the sensitivity levels required for LISA. A detailed description of the testbed and preliminary results will be presented.

  9. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  10. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  11. Telescience testbed pilot program

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1988-01-01

    The Universities Space Research Association (USRA), under sponsorship from the NASA Office of Space Science and Applications, is conducting a Telescience Testbed Pilot Program. Fifteen universities, under subcontract to USRA, are conducting a variety of scientific experiments using advanced technology to determine the requirements and evaluate the tradeoffs for the information system of the Space Station era. An interim set of recommendations based on the experiences of the first six months of the pilot program is presented.

  12. Enabling Open Cloud Markets Through WS-Agreement Extensions

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn

    Research into computing resource markets has mainly considered the question of which market mechanisms provide a fair resource allocation. However, while developing such markets, the definition of the unit of trade (i.e. the definition of resource) has not been given much attention. In this paper, we analyze the requirements for tradable resource goods. Based on the results, we suggest a detailed goods definition, which is easy to understand, can be used with many market mechanisms, and addresses the needs of a Cloud resource market. The goods definition captures the complete system resource, including hardware specifications, software specifications, the terms of use, and a pricing function. To demonstrate the usefulness of such a standardized goods definition, we show its application in the form of a WS-Agreement template for a number of market mechanisms for commodity system resources.
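
    The goods definition described above bundles hardware specifications, software specifications, terms of use, and a pricing function into one tradable unit. A hedged sketch of that idea (the field names and the flat hourly pricing model are illustrative assumptions, not the paper's WS-Agreement schema):

```python
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class ResourceGood:
    """Illustrative goods definition: one complete, tradable system resource."""
    hardware: Dict[str, str]         # e.g. CPU cores, RAM, disk
    software: Dict[str, str]         # e.g. OS image, middleware stack
    terms_of_use: Dict[str, str]     # e.g. availability guarantee, penalties
    price: Callable[[float], float]  # maps usage (hours) to cost

good = ResourceGood(
    hardware={"cpu_cores": "4", "ram_gb": "8", "disk_gb": "100"},
    software={"os": "Linux", "middleware": "Globus"},
    terms_of_use={"availability": "99.5%", "penalty": "refund"},
    price=lambda hours: 0.10 * hours,  # flat hourly rate, purely as an example
)
print(good.price(24))  # cost of a 24-hour lease
```

Packaging the pricing function with the specification is what lets the same goods definition plug into different market mechanisms unchanged.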

  13. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor and modification of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  14. Building a Parallel Cloud Storage System using OpenStack’s Swift Object Store and Transformative Parallel I/O

    SciTech Connect

    Burns, Andrew J.; Lora, Kaleb D.; Martinez, Esteban; Shorter, Martel L.

    2012-07-30

    Our project consists of bleeding-edge research into replacing traditional storage archives with a parallel, cloud-based storage solution. It uses OpenStack's Swift Object Store cloud software. We benchmarked Swift for write speed and scalability. Our project is unique because Swift is typically used for reads and we are mostly concerned with write speeds. Cloud storage is a viable archive solution because: (1) container management for larger parallel archives might ease the migration workload; (2) many tools that are written for cloud storage could be utilized for a local archive; and (3) current large-scale cloud storage practices in industry could be utilized to manage a scalable archive solution.
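
    A write-throughput benchmark in the spirit of the project's Swift tests can be sketched as below. `put_object` is a stand-in: a real run would call the Swift API (e.g. via python-swiftclient), whereas here an in-memory dict substitutes so the timing harness itself is runnable.

```python
import time

store = {}  # stand-in for a Swift container

def put_object(name: str, data: bytes) -> None:
    """Placeholder for an object-store PUT; a real test would call Swift."""
    store[name] = data

def benchmark_writes(n_objects: int, size: int) -> float:
    """Write n_objects of `size` bytes each and return achieved MB/s."""
    payload = b"x" * size
    start = time.perf_counter()
    for i in range(n_objects):
        put_object(f"obj-{i}", payload)
    elapsed = time.perf_counter() - start
    return (n_objects * size) / (1024 * 1024) / max(elapsed, 1e-9)

print(f"{benchmark_writes(1000, 64 * 1024):.1f} MB/s")
```

Scaling `n_objects` while adding storage nodes is the natural way to probe the write scalability the abstract is concerned with.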

  15. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  16. LISA Optical Bench Testbed

    NASA Astrophysics Data System (ADS)

    Lieser, M.; d'Arcio, L.; Barke, S.; Bogenstahl, J.; Diekmann, C.; Diepholz, I.; Fitzsimons, E. D.; Gerberding, O.; Henning, J.-S.; Hewitson, M.; Hey, F. G.; Hogenhuis, H.; Killow, C. J.; Lucarelli, S.; Nikolov, S.; Perreur-Lloyd, M.; Pijnenburg, J.; Robertson, D. I.; Sohmer, A.; Taylor, A.; Tröbs, M.; Ward, H.; Weise, D.; Heinzel, G.; Danzmann, K.

    2013-01-01

    The optical bench (OB) is a part of the LISA spacecraft, situated between the telescope and the test mass. For measuring the inter-spacecraft distances there are several interferometers on the OB. The elegant breadboard of the OB for LISA is being developed for the European Space Agency (ESA) by EADS Astrium, TNO Science & Industry, the University of Glasgow and the Albert Einstein Institute (AEI); the performance tests will then be done at the AEI. Here we present the testbed that will be used for the performance tests, with a focus on the thermal environment and the laser infrastructure.

  17. Single link flexible beam testbed project. Thesis

    NASA Technical Reports Server (NTRS)

    Hughes, Declan

    1992-01-01

    This thesis describes the single link flexible beam testbed at the CLaMS laboratory in terms of its hardware, software, and linear model. It presents two controllers, each including a hub angle proportional-derivative (PD) feedback compensator; one is augmented by a second static-gain full state feedback loop, based upon a synthesized strictly positive real (SPR) output, that increases specific flexible-mode pole damping ratios with respect to the PD-only case and hence reduces unwanted residual oscillation effects. Restricting the full state feedback gains so as to produce an SPR open-loop transfer function ensures that the associated compensator has an infinite gain margin and a phase margin of at least (-90, 90) degrees. Both experimental and simulation data are evaluated in order to compare the performance of different observers when applied to the real testbed and to the linear model when uncompensated flexible modes are included.

  18. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.

  19. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The Surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).

  20. PROOF on the Cloud for ALICE using PoD and OpenNebula

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Bagnasco, S.; Brunetti, R.; Lusso, S.

    2012-06-01

    In order to optimize the use and management of computing centres, their conversion to cloud facilities is becoming increasingly popular. In a medium to large cloud facility, many different virtual clusters may compete for the same resources: unused resources can be freed either by turning off idle virtual machines, or by lowering the resources assigned to a virtual machine at runtime. PROOF, a ROOT-based parallel and interactive analysis framework, is officially endorsed in the computing model of the ALICE experiment as complementary to the Grid, and it has become very popular over the last three years. The locality of PROOF-based analysis facilities forces system administrators to scavenge resources, yet the chaotic nature of user analysis tasks leaves them unstable and inconstantly used, making PROOF a typical use case for HPC cloud computing. Currently, PoD dynamically and easily provides a PROOF-enabled cluster by submitting agents to a job scheduler. Unfortunately, a Tier-2 cannot comfortably share the same queue between interactive and batch jobs, due to the very large average time to completion of the latter: an elastic cloud approach would enable interactive virtual machines to temporarily take resources from the batch ones, without a noticeable impact on them. In this work we describe our setup of a dynamic PROOF-based cloud analysis facility based on PoD and OpenNebula, orchestrated by a simple and lightweight control daemon that makes virtualization transparent for the user.

  1. Uav-Based Photogrammetric Point Clouds - Tree STEM Mapping in Open Stands in Comparison to Terrestrial Laser Scanner Point Clouds

    NASA Astrophysics Data System (ADS)

    Fritz, A.; Kattenborn, T.; Koch, B.

    2013-08-01

    In both ecology and forestry, there is a high demand for structural information of forest stands. Forest structures, due to their heterogeneity and density, are often difficult to assess. Hence, a variety of technologies are being applied to account for this "difficult to come by" information. Common techniques are aerial images or ground- and airborne-Lidar. In the present study we evaluate the potential use of unmanned aerial vehicles (UAVs) as a platform for tree stem detection in open stands. A flight campaign over a test site near Freiburg, Germany covering a target area of 120 × 75 [m2] was conducted. The dominant tree species of the site is oak (Quercus robur) with almost no understory growth. Over 1000 images with a tilt angle of 45° were shot. The flight pattern applied consisted of two antipodal staggered flight routes at a height of 55 [m] above the ground. We used a Panasonic G3 consumer camera equipped with a 14-42 [mm] standard lens and a 16.6 megapixel sensor. The data collection took place in leaf-off state in April 2013. The area was prepared with artificial ground control points for transformation of the structure-from-motion (SFM) point cloud into real world coordinates. After processing, the results were compared with a terrestrial laser scanner (TLS) point cloud of the same area. In the 0.9 [ha] test area, 102 individual trees above 7 [cm] diameter at breast height were located in the TLS point cloud. We chose the software CMVS/PMVS-2 since its algorithms are developed with a focus on dense reconstruction. The processing chain for the UAV-acquired images consists of six steps: a. cleaning the data: removing blurry, under- or overexposed, and off-site images; b. applying the SIFT operator [Lowe, 2004]; c. image matching; d. bundle adjustment; e. clustering; and f. dense reconstruction. In total, 73 stems were considered as reconstructed and located within one meter of the reference trees. In general stems were far less accurate and complete as
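
    The accuracy criterion reported above (a reconstructed stem counts as detected if it lies within one meter of a TLS reference stem) can be sketched as a simple greedy matching. The coordinates below are made-up illustrations, not data from the study:

```python
from math import hypot

def matched_stems(reconstructed, reference, tol=1.0):
    """Count reconstructed stems lying within `tol` metres of an unused reference stem."""
    free = list(reference)  # each reference stem may be matched at most once
    hits = 0
    for x, y in reconstructed:
        best = min(free, key=lambda r: hypot(r[0] - x, r[1] - y), default=None)
        if best is not None and hypot(best[0] - x, best[1] - y) <= tol:
            free.remove(best)
            hits += 1
    return hits

# Illustrative stem positions (metres, local coordinates)
recon = [(0.2, 0.1), (5.4, 3.0), (9.9, 9.9)]
ref = [(0.0, 0.0), (5.0, 3.0), (20.0, 20.0)]
print(matched_stems(recon, ref))  # → 2: two stems fall within the 1 m tolerance
```

In the study's terms, this is how 73 of the SFM stems would be scored against the 102 TLS reference stems.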

  2. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation (see diagram) include: a) Boot Mode, which is initiated at application of power to the processor card and runs memory diagnostics; it may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode, which allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode, a predefined, minimum power safehold configuration with power to experiments removed and carrier functionality minimized; it is used to troubleshoot problems that occur during flight. d) Operations Mode, used for normal experiment carrier operations; it may be entered only via ground command from Safe Mode.

  3. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for these tools so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  4. Optical Network Testbeds Workshop

    SciTech Connect

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services.
Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  5. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

    The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  6. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

    The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation of flight test to a ground test unit with contaminant dosing to determine applicability of ground testing.

  7. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  8. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made ubiquitously available the interactivity and visualization capabilities that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  9. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to advance in exploring space frontiers, technology must also advance. The need for faster data recovery and data processing is crucial. In this, the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification is required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs test anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high speed and high density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4Mbit memory chips totaling 4Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. This series of high-altitude flights, up to 70,000 feet, was effective at yielding the conditions under which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective. Orders of magnitude more memory became essential. Therefore, a project which could test more memory within a given time was created. The goal of this project was not only to test more memory within a given time, but also to have a system with a faster processing speed, and which used fewer peripherals. This paper will describe procedures used to build an
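
    The monitoring idea described above (write a known pattern to memory, then rescan and count flipped bits) can be sketched as follows. The bytearray stands in for the SRAM under test, and the injected bit flip is an illustrative stand-in for a single-event upset:

```python
PATTERN = 0x55  # alternating-bit test pattern written to every byte

def count_bit_errors(memory: bytearray, pattern: int = PATTERN) -> int:
    """Return the number of bits that differ from the stored pattern."""
    return sum(bin(byte ^ pattern).count("1") for byte in memory)

# 4 MB of memory, matching the eight 4Mbit devices in the flight tests
sram = bytearray([PATTERN] * 4 * 1024 * 1024)
sram[1234] ^= 0x08  # simulate a single-event upset flipping one bit
print(count_bit_errors(sram))  # → 1
```

A flight system would rescan in a loop and log the address, time, and bit position of each error rather than just a count.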

  10. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  11. Near infrared testbed sensor

    NASA Astrophysics Data System (ADS)

    Sanderson, R. B.; McCalmont, J. F.; Montgomery, J. B.; Johnson, R. S.; McDermott, D. J.

    2007-04-01

    A new tactical airborne multicolor missile warning testbed was developed and fielded as part of an Air Force Research Laboratory (AFRL) initiative focusing on clutter and missile signature measurements for algorithm development. Multicolor discrimination is one of the most effective ways of improving the performance of infrared missile warning sensors, particularly in heavy clutter situations, and its utility has been demonstrated in multiple fielded sensors. Traditionally, multicolor discrimination has been performed in the mid-infrared, 3-5 μm band, where the molecular emission of CO and CO2 characteristic of a combustion process is readily distinguished from the continuum of a black body radiator. Current infrared warning sensor development is focused on near infrared (NIR) staring mosaic detector arrays that provide similar spectral discrimination in different bands, yielding a cost effective and mechanically simpler system. This, in turn, has required that multicolor clutter data be collected for both analysis and algorithm development. The developed sensor testbed is a multi-camera system with 1004x1004 FPAs coupled with optimized filters integrated into the optics. The collection portion includes a ruggedized field-programmable gate array processor coupled with an integrated controller/tracker and a fast disk array capable of real-time processing and collection of up to 60 full frames per second. This configuration allowed the collection and real-time processing of temporally correlated, radiometrically calibrated data in multiple spectral bands that was then compared to background and target imagery taken previously.
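
    The two-color discrimination principle the testbed supports can be illustrated with a simple band-ratio test; the threshold value and radiances below are hypothetical, not taken from the AFRL sensor or its algorithms.

    ```python
    # Illustrative two-color discrimination: a combustion plume concentrates
    # energy in the molecular emission band, while a grey-body clutter source
    # radiates comparably in both bands (ratio near unity).

    def band_ratio(in_band: float, reference_band: float) -> float:
        """Ratio of radiance in the emission band to the continuum reference band."""
        return in_band / reference_band

    def is_combustion_like(in_band: float, reference_band: float,
                           threshold: float = 2.0) -> bool:
        """Flag pixels whose emission-band excess exceeds a (hypothetical) threshold."""
        return band_ratio(in_band, reference_band) > threshold

    # Plume-like pixel: strong emission-band excess (ratio 4.5).
    plume = is_combustion_like(9.0, 2.0)
    # Clutter pixel: similar radiance in both bands (ratio ~1.04).
    clutter = is_combustion_like(5.0, 4.8)
    ```

    A fielded algorithm would of course operate on calibrated, temporally correlated imagery rather than single scalar radiances, but the per-pixel decision has this general shape.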

  12. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, for the first time, we report on the design, implementation and demonstration of a novel OpenFlow based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching. PMID:23482120

  13. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  15. Comparison of the cloud activation potential of open ocean and coastal aerosol in the Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Vidaurre, G.; Brooks, S. D.; Thornton, D. C.

    2010-12-01

    Continuous measurements of aerosol concentration, particle size distribution, and cloud activation potential between 0.15 and 1.2% supersaturation were performed for open ocean and coastal air during the Halocarbon Air Sea Transect - Pacific (HalocAST) campaign. The nearly 7000 mile transect, aboard the R/V Thomas G. Thompson, started in Punta Arenas, Chile and ended in Seattle, Washington. Air mass source regions were identified on the basis of air mass back trajectories. For air masses in the southern hemisphere, aerosols sampled over the open ocean acted as cloud condensation nuclei at supersaturations between 0.5 and 1%, while coastal aerosols required higher supersaturations. In the pristine open ocean, observed aerosol concentrations were very low, typically below 200 cm-3, with an average particle diameter of approximately 0.4 μm. On the other hand, coastal aerosol concentrations were above 1000 cm-3 with an average particle diameter of 0.7 μm. Air masses originating in the northern hemisphere had much higher aerosol loads, between 500 and 2000 cm-3 over the ocean and above 4000 cm-3 at the coast. In both cases, the average particle diameters were approximately 0.5 μm. Measurements suggest that the northern hemisphere, substantially more polluted than the southern hemisphere, is characterized by alternating regions of high and medium aerosol number concentration. In addition, measurements of microorganism and organic matter concentration in the surface layer of the ocean water were conducted along the cruise track, to test the hypothesis that biogenic aerosol containing marine organic matter contribute to cloud activation potential. There was a significant correlation between mean aerosol diameter and prokaryote concentration in surface waters (r = 0.585, p < 0.01, n = 24), and between critical supersaturation and prokaryote concentration in surface waters (r = 0.538, p < 0.01, n = 24). This correlation indicates that larger aerosols occurred over water
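
    The reported correlations (e.g., r = 0.585, p < 0.01, n = 24) are standard Pearson coefficients, which can be computed as in this sketch; the data below are synthetic illustrations, not the HalocAST measurements.

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(x)
        mx = sum(x) / n
        my = sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Synthetic stand-ins for mean aerosol diameter (um) and prokaryote
    # concentration in surface waters -- illustrative values only.
    diameter = [0.30, 0.42, 0.38, 0.55, 0.61, 0.47]
    prokaryotes = [1.1, 1.9, 1.6, 2.8, 3.1, 2.2]
    r = pearson_r(diameter, prokaryotes)
    ```

    Significance (the quoted p-values) would additionally require a t-test on r with n - 2 degrees of freedom.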

  16. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
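
    The dynamic allocation of VM instances to match batch demand can be sketched as a simple provisioning rule. The jobs-per-worker figure and the stub launcher below are assumptions; the real system drives the OpenStack API, with Puppet configuration and integration of the new instances into the Torque cluster.

    ```python
    # Minimal sketch of elastic provisioning: inspect the batch queue and launch
    # just enough worker VMs to cover pending jobs. launch_workers() is a stub;
    # in the real facility this step would call the cloud API.

    def workers_needed(pending_jobs: int, idle_workers: int,
                       jobs_per_worker: int = 8) -> int:
        """Number of additional VMs to launch for the current queue depth."""
        uncovered = pending_jobs - idle_workers * jobs_per_worker
        if uncovered <= 0:
            return 0
        # Ceiling division: one VM per jobs_per_worker uncovered jobs.
        return -(-uncovered // jobs_per_worker)

    def launch_workers(count: int) -> list:
        """Stub for VM provisioning (hypothetical names, no real API call)."""
        return [f"worker-{i}" for i in range(count)]

    new_vms = launch_workers(workers_needed(pending_jobs=50, idle_workers=2))
    ```

    A symmetric rule would retire idle VMs once the queue drains, which is what makes the allocation dynamic rather than statically sized.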

  17. The ac power system testbed

    NASA Technical Reports Server (NTRS)

    Mildice, J.; Sundberg, R.

    1987-01-01

    The objective of this program was to design, build, test, and deliver a high frequency (20 kHz) Power System Testbed which would electrically approximate a single, separable power channel of an IOC Space Station. That program is described, including the technical background, and the results are discussed, showing that the major assumptions about the characteristics of this class of hardware (size, mass, efficiency, control, etc.) were substantially correct. This testbed equipment was completed and delivered and is being operated as part of the Space Station Power System Test Facility.

  18. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  19. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
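
    A typical biogeography portal request reduces to a spatial filter over observation records, as sketched below. The record fields, species names, and coordinates are hypothetical; in the deployed stack the equivalent filter runs inside PostGIS (conceptually, `WHERE geom && ST_MakeEnvelope(lon_min, lat_min, lon_max, lat_max, 4326)`).

    ```python
    # In-memory stand-in for a PostGIS bounding-box query over observation
    # records. All field names and values here are illustrative.

    def in_bbox(record, lon_min, lat_min, lon_max, lat_max):
        """True if the observation falls inside the bounding box."""
        return (lon_min <= record["lon"] <= lon_max
                and lat_min <= record["lat"] <= lat_max)

    observations = [
        {"species": "Thunnus alalunga", "lon": -30.0, "lat": 40.0},
        {"species": "Gadus morhua",     "lon":   5.0, "lat": 60.0},
        {"species": "Thunnus alalunga", "lon": -75.0, "lat": 35.0},
    ]

    # North Atlantic box (illustrative coordinates).
    hits = [r for r in observations if in_bbox(r, -80.0, 30.0, 0.0, 65.0)]
    ```

    At the scale of tens of millions of records, the spatial index maintained by PostGIS is what makes this filter fast enough for an interactive portal.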

  20. High-contrast imaging testbed

    SciTech Connect

    Baker, K; Silva, D; Poyneer, L; Macintosh, B; Bauman, B; Palmer, D; Remington, T; Delgadillo-Lariz, M

    2008-01-23

    Several high-contrast imaging systems are currently under construction to enable the detection of extra-solar planets. In order for these systems to achieve their objectives, however, there is considerable developmental work and testing which must take place. Given the need to perform these tests, a spatially-filtered Shack-Hartmann adaptive optics system has been assembled to evaluate new algorithms and hardware configurations which will be implemented in these future high-contrast imaging systems. In this article, construction and phase measurements of a membrane 'woofer' mirror are presented. In addition, results from closed-loop operation of the assembled testbed with static phase plates are presented. The testbed is currently being upgraded to enable operation at speeds approaching 500 Hz and to enable studies of the interactions between the woofer and tweeter deformable mirrors.
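
    The division of labor between the woofer and tweeter mirrors can be illustrated by splitting a measured phase profile into low-order and high-order parts. The moving-average split below is an illustrative stand-in for the modal projection a real controller would use; the phase values are arbitrary.

    ```python
    # Woofer/tweeter sketch: the large-stroke woofer takes the slowly varying
    # low-order component of the measured phase, the tweeter takes the residual.

    def split_wavefront(phase, window=3):
        """Return (low_order, high_order) so that low + high == phase elementwise."""
        n = len(phase)
        low = []
        for i in range(n):
            lo = max(0, i - window // 2)
            hi = min(n, i + window // 2 + 1)
            low.append(sum(phase[lo:hi]) / (hi - lo))  # local moving average
        high = [p - l for p, l in zip(phase, low)]
        return low, high

    measured = [0.0, 0.2, 1.0, 0.3, -0.4, 0.1]  # illustrative 1-D phase samples
    woofer_cmd, tweeter_cmd = split_wavefront(measured)
    ```

    Because the two components sum exactly to the measured phase, the split only redistributes the correction, which is the property that lets the two mirrors run in one closed loop.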

  1. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  2. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  3. Optical Coating Thermal Noise Testbed

    NASA Astrophysics Data System (ADS)

    Hartman, Michael T.; Eichholz, Johannes; Tanner, David B.; Mueller, Guido

    2015-04-01

    Interferometric gravitational-wave detectors measure the length strain of a passing gravitational wave as differential arm length changes in kilometer-long Michelson interferometers. The second-generation detectors, such as Advanced LIGO (aLIGO), will achieve strain sensitivities which are limited by Brownian thermal noise in the optical coatings of the interferometers' arm-cavity mirror test masses. Brownian coating thermal noise (CTN) is the apparent motion on the mirror surface, on the order of 10^-17 to 10^-20 m, resulting from thermal fluctuations in the coating and the coating's internal friction. The result is a source of length noise in optical resonators that is a function of the coating temperature and the coating material's mechanical loss. At the University of Florida we are constructing the THermal noise Optical Resonator (THOR), a testbed for the direct measurement of CTN in the aLIGO test mass coating as well as future coating candidates. The material properties of the coating (namely mechanical loss) are temperature dependent, making cryogenic mirrors a prospect for future gravitational-wave detectors. To explore this option we are simultaneously building a cryogenic CTN testbed, CryoTHOR. This is a presentation on the status of these testbeds. This work is supported by NSF Grants PHY-0969935 and PHY-1306594.

  4. Quantum well earth science testbed

    NASA Astrophysics Data System (ADS)

    Johnson, William R.; Hook, Simon J.; Mouroulis, Pantazis; Wilson, Daniel W.; Gunapala, Sarath D.; Hill, Cory J.; Mumolo, Jason M.; Eng, Bjorn T.

    2009-11-01

    A thermal hyperspectral imager is under development which utilizes the compact Dyson optical configuration and the broadband (8-12 μm) quantum well infrared photodetector (QWIP) focal plane array technology. The Dyson configuration uses a single monolithic prism-like grating design which allows for a high throughput instrument (F/1.6) with minimal ghosting, stray light and large swath width. The configuration has the potential to be the optimal high resolution imaging spectroscopy solution for aerial and space remote sensing applications due to its small form factor and relatively low power requirements. The planned instrument specifications are discussed as well as thermal design trade-offs. The current design uses a single high power cryocooler which allows operation of the QWIP at 40 K with adequate temperature stability. Calibration testing results (noise equivalent temperature difference, spectral linearity and spectral bandwidth) and laboratory emissivity plots from samples are shown using an operational testbed unit which has similar specifications to the final airborne system. Field testing of the testbed unit was performed to acquire plots of emissivity for various known standard minerals (quartz, opal, alunite). A comparison is made using data from the ASTER spectral library. The current single band (8-9 μm) testbed utilizes the high uniformity and operability of the QWIP array and shows excellent laboratory and field spectroscopic results.
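
    Comparing a retrieved emissivity spectrum against an ASTER spectral library entry amounts to a pointwise difference metric, as sketched here; the spectra below are synthetic placeholders, not ASTER library values.

    ```python
    import math

    # Illustrative comparison of a field-retrieved emissivity spectrum against
    # a reference library spectrum sampled at the same band centers.

    def rms_difference(measured, library):
        """Root-mean-square difference between two sampled emissivity spectra."""
        assert len(measured) == len(library)
        n = len(measured)
        return math.sqrt(sum((m - l) ** 2 for m, l in zip(measured, library)) / n)

    # Placeholder 5-band spectra (emissivity is dimensionless, 0-1).
    measured_quartz = [0.80, 0.72, 0.68, 0.75, 0.82]
    library_quartz  = [0.81, 0.70, 0.69, 0.76, 0.80]
    score = rms_difference(measured_quartz, library_quartz)
    ```

    Ranking candidate minerals by this score against each library entry gives a simple spectral-matching scheme; real pipelines typically use more robust metrics such as spectral angle.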

  5. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, the future technologies of wavefront sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as a fast steering mirror, a deformable mirror, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integrations and tests.

  6. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis with the CSM Testbed; companion documents are the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  7. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user-community focused, with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently, the theme for ISGC 2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title: while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Secondly, the programme: ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF that highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented well OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  9. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long baseline neutrino experiments, like the NOvA experiment at Fermilab, require large scale, compute intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  10. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance inflight medical care and medical education, NASA has established and conducted several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence both the delivery of health care in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. As NASA continues to meet the challenges of space flight, the

  11. Control design for the SERC experimental testbeds

    NASA Technical Reports Server (NTRS)

    Jacques, Robert; Blackwood, Gary; Macmartin, Douglas G.; How, Jonathan; Anderson, Eric

    1992-01-01

    Viewgraphs on control design for the Space Engineering Research Center experimental testbeds are presented. Topics covered include: SISO control design and results; sensor and actuator location; model identification; control design; experimental results; preliminary LAC experimental results; active vibration isolation problem statement; base flexibility coupling into isolation feedback loop; cantilever beam testbed; and closed loop results.

  12. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent with the changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge which could be key in validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean with comparison to the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not been previously presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was the increased heat fluxes and induced turbulence from the warm ocean surface as expected. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  13. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation marks the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  14. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  15. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and takes 5-10 hours to process a full exome sequence, and costs approximately $30 and takes 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  16. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous work we have defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note, we further develop that application and design the details of an implementation of the model. First we present the environment of our application by describing the thermal management problem and an abstraction, called TESTBED, that specifies a function for each module in the architecture and the nature of the interfaces between each pair of blocks.

  17. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET) to fly on the International Space Station to test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  18. Experiments Program for NASA's Space Communications Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard

    2012-01-01

    NASA developed a testbed for communications and navigation that was launched to the International Space Station in 2012. The testbed promotes new software defined radio (SDR) technologies and addresses associated operational concepts for space-based SDRs, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. The experiments program consists of a mix of in-house and external experiments from partners in industry, academia, and government. The experiments will investigate key challenges in communications, networking, and global positioning system navigation both on the ground and on orbit. This presentation will discuss some of the key opportunities and challenges for the testbed experiments program.

  19. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures, to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher-order data products, and the user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high-performance disk storage (SSD) for the hot areas and less expensive, slower disk for the cold ones, thereby optimizing price-to-performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high-resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
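
    The hot/cold tiering idea described above reduces to ranking archive tiles by access frequency and assigning the most active fraction to fast storage. The sketch below illustrates that classification; the function name, tile identifiers, and the 20% hot fraction are illustrative assumptions, not OT's actual implementation:

```python
from collections import Counter

def assign_tiers(access_log, hot_fraction=0.2):
    """Classify archive tiles as 'ssd' (hot) or 'disk' (cold) from a
    log of tile accesses. hot_fraction is the share of tiles, ranked
    by access count, that is placed on the fast storage tier."""
    counts = Counter(access_log)
    ranked = [tile for tile, _ in counts.most_common()]
    n_hot = max(1, int(len(ranked) * hot_fraction))
    return {tile: ("ssd" if i < n_hot else "disk")
            for i, tile in enumerate(ranked)}

# "t1" is the most frequently accessed tile, so it lands on SSD
tiers = assign_tiers(["t1", "t1", "t1", "t2", "t3", "t1", "t2"])
```

    In a real system the access log would be aggregated spatiotemporally and the hot fraction tuned to the SSD budget; this sketch shows only the ranking step.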

  20. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994, M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others.
Models for attenuation and structural damage
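
    The Receiver Operating Characteristic backtesting mentioned above can be illustrated with a minimal skill computation for a probabilistic forecast of binary events. This is a generic sketch of one ROC point, not the Open Hazards code; all names are illustrative:

```python
def roc_point(forecast_probs, outcomes, threshold):
    """Hit rate and false-alarm rate for a probabilistic forecast
    binarized at the given threshold: one point on the ROC curve.
    outcomes[i] is True where the forecast event actually occurred."""
    hits = sum(1 for p, o in zip(forecast_probs, outcomes) if p >= threshold and o)
    misses = sum(1 for p, o in zip(forecast_probs, outcomes) if p < threshold and o)
    false_alarms = sum(1 for p, o in zip(forecast_probs, outcomes) if p >= threshold and not o)
    correct_nulls = sum(1 for p, o in zip(forecast_probs, outcomes) if p < threshold and not o)
    hit_rate = hits / (hits + misses) if hits + misses else 0.0
    far = false_alarms / (false_alarms + correct_nulls) if false_alarms + correct_nulls else 0.0
    return hit_rate, far

# A forecast that puts high probability exactly where events occurred
# sits at the ideal ROC corner: hit rate 1.0, false-alarm rate 0.0
hr, far = roc_point([0.9, 0.8, 0.1, 0.2], [True, True, False, False], 0.5)
```

    Sweeping the threshold from 0 to 1 traces the full ROC curve; a forecast with skill lies above the hit rate = false-alarm rate diagonal.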

  1. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    SciTech Connect

    Sun, C.H.; Thorne, L.R.

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the "local" turbulence is small compared with the advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
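
    Taylor's frozen-turbulence conversion described above amounts to mapping a temporal lag of the zenith time series to a spatial lag via the mean wind speed, x = U * t. A minimal sketch (function names and the 0.2 validity ratio are illustrative assumptions, not CART's procedure):

```python
def time_to_space_lag(time_lags_s, mean_wind_ms):
    """Map temporal lags (seconds) of a zenith time series to spatial
    lags (meters) under Taylor's hypothesis: x = U * t."""
    return [mean_wind_ms * t for t in time_lags_s]

def taylor_valid(turbulence_rms_ms, mean_wind_ms, max_ratio=0.2):
    """Crude validity check: the turbulent velocity fluctuation must be
    a small fraction of the advective (mean) flow for the frozen-field
    assumption to hold."""
    return turbulence_rms_ms / mean_wind_ms < max_ratio

# A 60 s lag under a 10 m/s mean wind corresponds to a 600 m spatial lag
lags = time_to_space_lag([0, 30, 60], 10.0)
```

    Broken cloud fields complicate this picture because cloud elements evolve as they advect, which is precisely the regime the study above sets out to characterize.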

  2. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    NASA Technical Reports Server (NTRS)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based autonomy testbeds that can enable the low-cost and rapid design, test, and integration of the spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space 1 comet/asteroid rendezvous mission.

  3. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  4. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  5. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fernando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  6. Reconstruction of passive open-path FTIR ambient spectra using meteorological measurements and its application for detection of aerosol cloud drift.

    PubMed

    Kira, Oz; Dubowski, Yael; Linker, Raphael

    2015-07-27

    Remote sensing of atmospheric aerosols is of great importance to public and environmental health. This research promotes a simple way of detecting an aerosol cloud using a passive Open Path FTIR (OP-FTIR) system, without utilizing radiative transfer models and without relying on an artificial light source. Meteorological measurements (temperature, relative humidity and solar irradiance), and chemometric methods (multiple linear regression and artificial neural networks) together with previous cloud-free OP-FTIR measurements were used to estimate the ambient spectrum in real time. The cloud detection process included a statistical comparison between the estimated cloud-free signal and the measured OP-FTIR signal. During the study we were able to successfully detect several aerosol clouds (water spray) in controlled conditions as well as during agricultural pesticide spraying in an orchard. PMID:26367691
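
    The detection scheme above — estimate the cloud-free ambient spectrum from meteorological measurements, then flag large departures of the measured spectrum — can be sketched with ordinary least squares. This is a simplified stand-in for the paper's chemometric methods (which also include neural networks); array shapes and variable names are illustrative:

```python
import numpy as np

def fit_ambient_model(met, spectra):
    """Least-squares fit predicting cloud-free spectra
    (n_samples x n_channels) from meteorological predictors
    (n_samples x n_features: e.g. temperature, relative humidity,
    solar irradiance), plus an intercept column."""
    X = np.hstack([met, np.ones((met.shape[0], 1))])
    coeffs, *_ = np.linalg.lstsq(X, spectra, rcond=None)
    return coeffs

def cloud_flag(met_now, spectrum_now, coeffs, threshold):
    """Flag an aerosol cloud when the measured spectrum departs from
    the predicted cloud-free spectrum by more than threshold (RMS)."""
    x = np.append(met_now, 1.0)
    predicted = x @ coeffs
    rms = np.sqrt(np.mean((spectrum_now - predicted) ** 2))
    return rms > threshold
```

    In use, the model would be trained on previous cloud-free OP-FTIR measurements, and the statistical comparison run in real time on each new spectrum.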

  7. Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal; Rizzo, Maxime; Thompson, Patrick

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept study and a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC would provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It would detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of Spitzer-selected debris disks. EPIC would be launched to a heliocentric Earth-trailing drift-away orbit, with a 5-year mission lifetime. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables starlight suppression in broadband light from 480-960 nm. To demonstrate the VNC approach and advance its technology readiness we have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  8. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain-specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two-month span in Fall of 2002, over 1 million official CMS GEANT-based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  9. A Kenyan Cloud School. Massive Open Online & Ongoing Courses for Blended and Lifelong Learning

    ERIC Educational Resources Information Center

    Jobe, William

    2013-01-01

    This research describes the predicted outcomes of a Kenyan Cloud School (KCS), which is a MOOC that contains all courses taught at the secondary school level in Kenya. This MOOC will consist of online, ongoing subjects in both English and Kiswahili. The KCS subjects offer self-testing and peer assessment to maximize scalability, and digital badges…

  10. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, database and applications. Network usage and demands are growing at a very fast rate and to meet the current requirements, there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because of the distributed nature of their decision making process for switching or routing, which are collocated on the same device. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate these challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate the network-management-related issues. A private cloud prototype testbed was set up to implement SDN on the OpenStack platform to test and evaluate network performance under the various configurations.

  11. A Laboratory-Based Microwave Radio-Interferometry Testbed

    NASA Technical Reports Server (NTRS)

    Moriarity, M.; Simon, N.; Leach, K.; Piepmeier, J. R.

    2003-01-01

    The Goddard Radio lnterferometry Testbed (GRIT) is an adaptable platform for laboratory testing of Synthetic Thinned Array Radiometers (STAR). Using this testbed, we demonstrate that the Doppler radiometer can image a point source in an observed scene.

  12. A Multi-Mission Testbed for Advanced Technologies

    NASA Technical Reports Server (NTRS)

    Chau, S. N.; Lang, M.

    2001-01-01

    The mission of the Center for Space Integrated Microsystem (CSIM) at the Jet Propulsion Laboratory is to develop advanced avionics systems for future deep space missions. The Advanced Micro Spacecraft (AMS) task is building a multi-mission testbed facility to enable the infusion of CSIM technologies into future missions. The testbed facility will also perform experimentation for advanced avionics technologies and architectures to meet the challenging power, performance, mass, volume, reliability, and fault-tolerance requirements of future missions. The testbed facility has two levels of testbeds: (1) a Proof-of-Concept (POC) Testbed and (2) an Engineering Model Testbed. The methodology of the testbed development and the process of technology infusion are presented in a separate paper in this conference. This paper focuses only on the design, implementation, and application of the POC testbed. Additional information is contained in the original extended abstract.

  13. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  14. A thermal spectral-spatial interferometric testbed

    NASA Astrophysics Data System (ADS)

    Savini, G.; Juanola-Parramon, R.; Stabbins, R.; Baccichet, N.; Donohoe, A.; Murphy, A.; O'Sullivan, C.

    2014-07-01

    We present an ongoing effort to achieve a Double Fourier Modulating (DFM) interferometer in the thermal infrared wavelength range. We describe a testbed designed to combine a sky simulator in the form of a miniaturized complex calibration source at the focus of a parabolic collimator with an interferometer baseline consisting of two parallel telescopes each mounted on a motorized linear stage. The two input arms are combined after one of them is modulated via a fast-scanning piezoelectric roof-top mirror. The optical design and layout of the testbed, the choice of interferometer parameters as well as the calibration scene adopted as source are described.

  15. Towards a testbed for malicious code detection

    SciTech Connect

    Lo, R.; Kerchen, P.; Crawford, R.; Ho, W.; Crossley, J.; Fink, G.; Levitt, K.; Olsson, R.; Archer, M. . Div. of Computer Science)

    1991-01-01

    This paper proposes an environment for detecting many types of malicious code, including computer viruses, Trojan horses, and time/logic bombs. This malicious code testbed (MCT) is based upon both static and dynamic analysis tools developed at the University of California, Davis, which have been shown to be effective against certain types of malicious code. The testbed extends the usefulness of these tools by using them in a complementary fashion to detect more general cases of malicious code. Perhaps more importantly, the MCT allows administrators and security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict. 5 refs., 2 figs., 2 tabs.
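
    As a toy illustration of the static-analysis side of such a testbed — checking a program for known-bad content before installation — a scanner can search a payload for suspicious byte signatures. This sketch is not the UC Davis tooling; the signature table and function names are invented for illustration:

```python
# Hypothetical signature table: byte pattern -> description
SIGNATURES = {
    b"rm -rf /": "destructive shell command",
    b"\x90\x90\x90\x90\x90\x90": "NOP sled (possible shellcode)",
}

def static_scan(payload: bytes):
    """Return a list of (offset, description) pairs, one per known-bad
    signature found in the payload. A real testbed would complement
    this static pass with dynamic analysis in a sandboxed environment,
    since signature matching alone misses novel or obfuscated code."""
    findings = []
    for sig, desc in SIGNATURES.items():
        pos = payload.find(sig)
        if pos != -1:
            findings.append((pos, desc))
    return findings

# A benign payload produces no findings
assert static_scan(b"echo hello") == []
```

    The complementary dynamic tools mentioned in the abstract would then execute the candidate program under instrumentation to catch behavior that static signatures cannot.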

  16. The design and implementation of the LLNL gigabit testbed

    SciTech Connect

    Garcia, D.

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit Testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of the testbed and the need for it, the tests that are performed in the testbed, and the tools used to implement those tests.

  17. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    NASA Astrophysics Data System (ADS)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit (about half a double rack) establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd-generation payloads, that is, payloads requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites, and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  18. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information, are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is in its early stages, and we will therefore value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  19. Contrast analysis and stability on the ExAO testbed

    SciTech Connect

    Evans, J; Thomas, S; Gavel, D; Dillon, D; Macintosh, B

    2008-06-10

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. The Laboratory for Adaptive Optics at UC Santa Cruz is investigating limits to high-contrast imaging in support of the Gemini Planet Imager. Previous contrast measurements were made with a simple single-opening prolate-spheroid-shaped pupil that produced a limited region of high contrast, particularly when wavefront errors were corrected with the 1024-actuator Boston Micromachines MEMS deformable mirror currently in use on the testbed. A more sophisticated shaped pupil is now being used that has a much larger region of interest, facilitating a better understanding of high-contrast measurements. In particular we examine the effect of heat sources in the testbed on PSF stability. We find that rms image motion scales as 0.02 λ/D per watt when the heat source is near the pupil plane. As a result, heat sources of greater than 5 watts should be avoided near pupil planes for GPI. The safest place to introduce heat is near a focal plane. Heat can also affect the standard deviation of the high-contrast region, but in the final instrument other sources of error should be more significant.
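
    The scaling quoted above implies a simple thermal budget: at 0.02 λ/D of rms image motion per watt near a pupil plane, a 5 W source already induces 0.1 λ/D of motion, which is why larger sources are kept away from pupil planes. A one-line sketch of that budget (the 0.02 coefficient comes from the measurement above; everything else is illustrative):

```python
def rms_image_motion(power_w, coeff=0.02):
    """rms image motion, in units of lambda/D, induced by a heat
    source of power_w watts near the pupil plane; coeff is the
    empirically measured scaling from the testbed."""
    return coeff * power_w

# 5 W near the pupil gives 0.1 lambda/D of rms image motion
motion = rms_image_motion(5.0)
```

    The same budget evaluated for a source near a focal plane would use a much smaller coefficient, consistent with the observation that focal planes are the safest place to introduce heat.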

  20. Flight Projects Office Information Systems Testbed (FIST)

    NASA Technical Reports Server (NTRS)

    Liggett, Patricia

    1991-01-01

    Viewgraphs on the Flight Projects Office Information Systems Testbed (FIST) are presented. The goal is to perform technology evaluation and prototyping of information systems to support SFOC and JPL flight projects in order to reduce risk in the development of operational data systems for such projects.

  1. Space Interferometry System Testbed-3: architecture

    NASA Technical Reports Server (NTRS)

    Alvarez-Salazar, Oscar S.; Renaud, Goullioud; Azizi, Ali

    2004-01-01

    The Space Interferometry Mission's System testbed-3 has recently integrated its Precision Support Structure and spacecraft backpack on a pseudo free-free 0.5 Hz passive isolation system. The Precision Support Structure holds a 3-baseline stellar interferometer instrument.

  2. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses students'…

  3. On the relationship between open cellular convective cloud patterns and the spatial distribution of precipitation

    NASA Astrophysics Data System (ADS)

    Yamaguchi, T.; Feingold, G.

    2014-10-01

    Precipitation is thought to be a necessary but insufficient condition for the transformation of stratocumulus-topped closed cellular convection to open cellular cumuliform convection. Here we test the hypothesis that the spatial distribution of precipitation is a key element of the closed-to-open cell transition. A series of idealized 3-dimensional simulations is conducted to evaluate the dependency of the transformation on the areal coverage of rain, and to explore the role of interactions between multiple rainy areas in the formation of the open cells. When rain is restricted to a small area, even substantial rain (of order a few mm per day) does not result in a transition. With increasing areal coverage of the rain, the transition becomes possible provided that the rain rate is sufficiently large. When multiple small rain regions interact with each other, the transition occurs and spreads over a wider area, provided that the distance between the rain regions is short. When the distance between the rain areas is large, the transition eventually occurs, albeit slowly. For much longer distances between rain regions the system is anticipated to remain in a closed-cell state. These results suggest a connection to the recently hypothesized remote control of open-cell formation. Finally, it is shown that phase trajectories of the mean and coefficient of variation of vertically integrated variables such as liquid water path align on one trajectory. This could be used as a diagnostic tool for global analyses of the statistics of closed- and open-cell occurrence and transitions between them.
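
    The diagnostic proposed in the final sentence, tracking the mean and coefficient of variation of a vertically integrated field such as liquid water path, can be sketched as follows (synthetic fields, not the paper's simulation output):

```python
import numpy as np

def lwp_phase_point(lwp):
    """Return (mean, coefficient of variation) for a 2-D liquid water path field."""
    m = lwp.mean()
    return m, (lwp.std() / m if m > 0 else 0.0)

# Synthetic fields (g m^-2): a closed-cell-like state with weak variability,
# and an open-cell-like state with patchy, strong variability.
rng = np.random.default_rng(0)
closed_cell = 100.0 + 5.0 * rng.standard_normal((64, 64))
open_cell = np.where(rng.random((64, 64)) < 0.2, 150.0, 10.0)

m_closed, cv_closed = lwp_phase_point(closed_cell)
m_open, cv_open = lwp_phase_point(open_cell)
# The open-cell state sits at a much larger coefficient of variation,
# so the (mean, cv) pair separates the two regimes.
```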

  4. Managing autonomy levels in the SSM/PMAD testbed. [Space Station Power Management and Distribution

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry R.

    1990-01-01

    It is pointed out that when autonomous operations are mixed with those of a manual nature, concepts concerning the boundary of operations and responsibility become clouded. The space station module power management and distribution (SSM/PMAD) automation testbed has the need for such mixed-mode capabilities. The concept of managing the SSM/PMAD testbed in the presence of changing levels of autonomy is examined. A knowledge-based approach to implementing autonomy management in the distributed SSM/PMAD utilizing a centralized planning system is presented. Its knowledge relations and system-wide interactions are discussed, along with the operational nature of the currently functioning SSM/PMAD knowledge-based systems.

  5. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers, which was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards), and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.
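
    The fault-injection capability described above follows a simple pattern that can be sketched schematically (all names here are hypothetical, not the RAPSSTER interfaces):

```python
# A schematic sketch of scripted fault injection in a stepped simulation;
# the class and field names are hypothetical, not the RAPSSTER API.
class RoverSim:
    """Toy stepped simulation with an injectable actuator fault."""

    def __init__(self):
        self.t = 0.0
        self.drive_ok = True   # simulated actuator health flag
        self.distance = 0.0

    def step(self, dt=1.0):
        # The rover advances only while the drive actuator is healthy.
        if self.drive_ok:
            self.distance += 1.0 * dt
        self.t += dt

sim = RoverSim()
for _ in range(5):
    sim.step()
sim.drive_ok = False           # inject a simulated actuator fault mid-run
for _ in range(5):
    sim.step()
# distance stops accumulating after the fault: 5.0 after 10 s of sim time
```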

  6. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  7. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those that are currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed, and to install some or all of the SPARSPAK package in the testbed. A suite of subroutines to extract from the data base the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.
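
    The flavour of comparison described, a state-of-the-art sparse solver against a dense alternative on the same matrix equations, can be reproduced in miniature with modern tools (SciPy stands in here for the SPARSPAK-style technology; this is not the testbed code):

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 200
# 1-D Laplacian: tridiagonal, symmetric positive definite, mostly zeros.
A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
b = np.ones(n)

x_sparse = spla.spsolve(A, b)              # sparse factorization, O(n) fill here
x_dense = np.linalg.solve(A.toarray(), b)  # dense factorization, O(n^2) storage
# Both give the same solution; only the work and storage differ.
```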

  8. THE HERSCHEL INVENTORY OF THE AGENTS OF GALAXY EVOLUTION IN THE MAGELLANIC CLOUDS, A HERSCHEL OPEN TIME KEY PROGRAM

    SciTech Connect

    Meixner, M.; Roman-Duval, J.; Seale, J.; Gordon, K.; Beck, T.; Boyer, M. L.; Panuzzo, P.; Hony, S.; Sauvage, M.; Okumura, K.; Chanial, P.; Babler, B.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Carlson, L. R.; Clayton, G. C.; and others

    2013-09-15

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 µm with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 µm band is the most sensitive and the source catalogs for this band have ~25,000 objects for the LMC and ~5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  9. The HERschel Inventory of the Agents of Galaxy Evolution in the Magellanic Clouds, a HERschel Open Time Key Program

    NASA Technical Reports Server (NTRS)

    Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.; Misselt, K.; Okumura, K.; Chanial, P.; Beck, T.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Boyer, M. L.; Carlson, L. R.; Clayton, G. C.; Chen, C.-H. R.; Cormier, D.; Fukui, Y.; Galametz, M.; Galliano, F.

    2013-01-01

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  10. Observed and simulated temperature dependence of the liquid water path of low clouds

    SciTech Connect

    Del Genio, A.D.; Wolf, A.B.

    1996-04-01

    Data being acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site can be used to examine the factors determining the temperature dependence of cloud optical thickness. We focus on cloud liquid water and physical thickness variations, which can be derived from existing ARM measurements.

  11. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2009-05-01

    OPTRA is developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill.

  12. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  13. Overview of the Telescience Testbed Program

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad; Leiner, Barry M.

    1991-01-01

    NASA's Telescience Testbed Program (TTP), conducted by the Ames Research Center, is described with particular attention to the objectives, the approach used to achieve these objectives, and the expected benefits of the program. The goal of the TTP is to gain operational experience for the Space Station Freedom and the Earth Observing System programs, using ground testbeds, and to define the information and communication systems requirements for the development and operation of these programs. The results of TTP are expected to include the requirements for the remote coaching, command and control, monitoring and maintenance, payload design, and operations management. In addition, requirements for technologies such as workstations, software, video, automation, data management, and networking will be defined.

  14. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  15. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  16. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2010-04-01

    OPTRA has developed an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize the design and build and detail system characterization and test of a prototype I-OP-FTIR instrument. System characterization includes radiometric performance and spectral resolution. Results from a series of tomographic reconstructions of sulfur hexafluoride plumes in a laboratory setting are also presented.
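
    The reconstruction step can be illustrated with a classical algebraic reconstruction technique (Kaczmarz sweeps) on a toy geometry; this is a generic sketch of tomographic inversion from path integrals, not the project's algorithm:

```python
import numpy as np

def art(A, y, n_iter=100, relax=1.0):
    """Kaczmarz sweeps: A (rays x pixels) maps the field x to path integrals y."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        for i in range(A.shape[0]):
            a = A[i]
            denom = a @ a
            if denom > 0.0:
                x += relax * (y[i] - a @ x) / denom * a
    return np.clip(x, 0.0, None)   # concentrations are non-negative

# Toy geometry: a 2x2 concentration grid crossed by four beams
# (two "row" sight lines and two "column" sight lines).
A = np.array([[1.0, 1.0, 0.0, 0.0],   # beam across the top row
              [0.0, 0.0, 1.0, 1.0],   # beam across the bottom row
              [1.0, 0.0, 1.0, 0.0],   # beam down the left column
              [0.0, 1.0, 0.0, 1.0]])  # beam down the right column
truth = np.array([0.0, 2.0, 1.0, 3.0])
x = art(A, A @ truth)    # recover the field from its path integrals
```

With many beams crossing a finer grid at varied angles, the same iteration scales up to the plume reconstructions described above.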

  17. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Engel, James R.; Vaillancourt, Robert; Todd, Lori; Mottus, Kathleen

    2008-04-01

    OPTRA and University of North Carolina are developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach will be considered as a candidate referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize progress to date and overall system performance projections based on the instrument, spectroscopy, and tomographic reconstruction accuracy. We then present a preliminary optical design of the I-OP-FTIR.

  18. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) more than an order of magnitude higher resolution than those currently available, LIDAR data allows earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
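
    The core point-cloud-to-DEM step that the portal automates can be sketched as a binned-mean gridder; the function name, gridding rule, and synthetic data are illustrative (production tools use more sophisticated local interpolation):

```python
import numpy as np

def grid_dem(x, y, z, cell=1.0):
    """Grid scattered (x, y, z) LIDAR returns into a mean-elevation DEM."""
    ix = ((x - x.min()) / cell).astype(int)
    iy = ((y - y.min()) / cell).astype(int)
    nx, ny = ix.max() + 1, iy.max() + 1
    sums = np.zeros((ny, nx))
    counts = np.zeros((ny, nx))
    np.add.at(sums, (iy, ix), z)      # accumulate elevations per cell
    np.add.at(counts, (iy, ix), 1)    # count returns per cell
    dem = np.full((ny, nx), np.nan)   # cells with no returns stay NaN
    mask = counts > 0
    dem[mask] = sums[mask] / counts[mask]
    return dem

# Synthetic tilted plane sampled at random point locations.
rng = np.random.default_rng(1)
x = rng.uniform(0.0, 10.0, 5000)
y = rng.uniform(0.0, 10.0, 5000)
z = 2.0 + 0.5 * x
dem = grid_dem(x, y, z, cell=1.0)
```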

  19. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  20. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471

  1. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience.

    PubMed

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
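
    The style of stochastic search that BluePyOpt wraps can be illustrated with a generic elitist (mu + lambda) loop fitting a toy two-parameter model to synthetic "experimental" data; this sketches the technique only and is not the BluePyOpt API:

```python
import numpy as np

def fitness(params, x, target):
    """Negative mean squared error of a toy model a*x**2 + b*x (higher is better)."""
    a, b = params
    return -np.mean((a * x**2 + b * x - target) ** 2)

rng = np.random.default_rng(42)
x = np.linspace(0.0, 1.0, 100)
target = 0.5 * x**2 + 1.0 * x          # "data" generated with true params (0.5, 1.0)

pop = rng.uniform(-2.0, 2.0, size=(20, 2))      # random initial population
for _ in range(100):
    scores = np.array([fitness(p, x, target) for p in pop])
    parents = pop[np.argsort(scores)[-10:]]     # keep the best half (elitism)
    children = parents + 0.05 * rng.standard_normal(parents.shape)
    pop = np.vstack([parents, children])        # (mu + lambda) population

best = pop[np.argmax([fitness(p, x, target) for p in pop])]
```

Real use cases replace the toy fitness with simulations of a neuron model evaluated against experimental constraints, which is where frameworks that manage distributed evaluation become essential.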

  2. The Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Hines, Braden; Bell, Charles; Shen, Tsae-Pyng; Bloemhof, Eric; Zhao, Feng; Regehr, Martin; Holmes, Howard; Irigoyen, Robert; Neat, Gregory

    2003-01-01

    The Micro-Arcsecond Metrology (MAM) testbed is a ground-based system of optical and electronic equipment for testing components, systems, and engineering concepts for the Space Interferometer Mission (SIM) and similar future missions, in which optical interferometers will be operated in outer space. In addition, the MAM testbed is of interest in its own right as a highly precise metrological system. The designs of the SIM interferometer and the MAM testbed reflect a requirement to measure both the position of the starlight central fringe and the change in the internal optical path of the interferometer with sufficient spatial resolution to generate astrometric data with angular resolution at the microarcsecond level. The internal path is to be measured by use of a small metrological laser beam of 1,319-nm wavelength, whereas the position of the starlight fringe is to be estimated by use of a charge-coupled-device (CCD) image detector sampling a large concentric annular beam. For the SIM to succeed, the optical path length determined from the interferometer fringes must be tracked by the metrological subsystem to within tens of picometers, through all operational motions of an interferometer delay line and siderostats. The purpose of the experiments performed on the MAM testbed is to demonstrate this agreement in a large-scale simulation that includes a substantial portion of the system in the planned configuration for operation in outer space. A major challenge in this endeavor is to align the metrological beam with the starlight beam in order to maintain consistency between the metrological and starlight subsystems at the system level. The MAM testbed includes an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrological sensors. The aforementioned subsystems are installed in a large vacuum chamber in order to suppress atmospheric and thermal disturbances. The MAM is divided into two

  3. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other processes among various modeling studies (e.g. grid configuration, meteorology, emission rates) are different as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  4. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder using commercial routers and modems. Since many NASA assets are in low Earth orbits (LEOs), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating with the Earth via a geosynchronous (GEO) communications satellite. Research results from this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.
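The varying propagation delay the testbed emulates follows directly from orbital geometry. As a rough illustration (not the actual ISINT configuration), the sketch below computes the one-way LEO-to-GEO-to-ground delay as the LEO spacecraft moves across the relay's field of view; the altitudes and the simplified geometry are assumptions for this example.

```python
import math

# Illustrative constants; the 400 km LEO altitude is an assumption.
C = 299_792.458          # speed of light, km/s
R_EARTH = 6371.0         # mean Earth radius, km
GEO_ALT = 35_786.0       # geostationary altitude, km
LEO_ALT = 400.0          # assumed LEO altitude, km

def slant_range(alt1_km, alt2_km, separation_deg):
    """Distance between two satellites given their altitudes and the
    geocentric angle between them (law of cosines on geocentric radii)."""
    r1 = R_EARTH + alt1_km
    r2 = R_EARTH + alt2_km
    a = math.radians(separation_deg)
    return math.sqrt(r1 * r1 + r2 * r2 - 2.0 * r1 * r2 * math.cos(a))

def one_way_delay_ms(separation_deg):
    """LEO -> GEO -> ground delay in milliseconds; the GEO-to-ground leg
    is taken as the sub-satellite (minimum) distance for simplicity."""
    leo_geo = slant_range(LEO_ALT, GEO_ALT, separation_deg)
    geo_ground = GEO_ALT
    return (leo_geo + geo_ground) / C * 1000.0

# As the LEO asset drifts across the GEO field of view, delay grows:
for sep in (0, 30, 60):
    print(f"{sep:3d} deg separation: {one_way_delay_ms(sep):.1f} ms")
```

A delay emulator in such a testbed would replay a profile like this, rather than a fixed round-trip time, against the routers and modems under test.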

  5. ISS Update: ISTAR -- International Space Station Testbed for Analog Research

    NASA Video Gallery

    NASA Public Affairs Officer Kelly Humphries interviews Sandra Fletcher, EVA Systems Flight Controller. They discuss the International Space Station Testbed for Analog Research (ISTAR) activity that...

  6. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  7. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation. PMID:23230155

  8. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research on the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator, and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 kHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0 km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed at either 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in the locations of the guide stars in the field of view. In total, four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  9. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some results from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

    With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere, and the freeze-up followed as the warm surface lost heat to the atmosphere.
While the surface type had a pronounced impact on boundary

  10. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
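The idea of predicting remaining life from valve timing alone can be illustrated with a simple trend-extrapolation sketch. This is not the authors' model-based algorithm; it only assumes, for illustration, that a degrading valve opens a little more slowly each cycle, fits a linear trend to the measured opening times, and extrapolates to a hypothetical failure threshold.

```python
# Hedged sketch: remaining-useful-life (RUL) estimate from valve opening
# times. The linear-degradation assumption and threshold are illustrative.

def predict_rul_cycles(open_times_s, failure_threshold_s):
    """Fit a least-squares line through (cycle index, opening time) and
    return the number of cycles remaining until the trend crosses the
    failure threshold, or None if the valve shows no upward trend."""
    n = len(open_times_s)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(open_times_s) / n
    sxx = sum((x - mean_x) ** 2 for x in xs)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, open_times_s))
    slope = sxy / sxx
    if slope <= 0:
        return None                      # not degrading: no finite RUL
    intercept = mean_y - slope * mean_x
    cycle_at_failure = (failure_threshold_s - intercept) / slope
    return max(0.0, cycle_at_failure - (n - 1))

# Synthetic data: nominal 2.0 s opening time drifting by 0.01 s per cycle,
# with failure declared at 3.0 s.
measured = [2.0 + 0.01 * k for k in range(20)]
print(predict_rul_cycles(measured, failure_threshold_s=3.0))
```

A model-based prognostic, by contrast, would map the timing change back to a physical fault parameter (e.g. a leak size) and propagate that fault model forward, but the measure-trend-extrapolate loop above is the same basic shape.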

  11. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  12. Near infrared missile warning testbed sensor

    NASA Astrophysics Data System (ADS)

    McDermott, D. J.; Johnson, R. S.; Montgomery, J. B.; Sanderson, R. B.; McCalmont, J. F.; Taylor, M. J.

    2008-04-01

    Multicolor discrimination is one of the most effective ways of improving the performance of infrared missile warning sensors, particularly in heavy-clutter situations. A new tactical airborne multicolor missile warning testbed was developed and fielded as part of a continuing Air Force Research Laboratory (AFRL) initiative focusing on clutter and missile signature measurements for effective missile warning algorithms. The sensor testbed comprises a multi-camera system with 1004x1004 FPAs coupled with optimized spectral filters integrated with the optics; a reduced-form-factor microprocessor-based video data recording system operating at 48 Hz; and a real-time field-programmable gate array processor for algorithm and video data processing, capable of 800B multiply/accumulate operations per second. A detailed radiometric calibration procedure was developed to overcome severe photon-limited operating conditions due to the sub-nanometer bandwidth of the spectral filters. This configuration allows the collection and real-time processing of temporally correlated, radiometrically calibrated video data in multiple spectral bands. The testbed was utilized to collect spectra of false-alarm sources and Man-Portable Air Defense System (MANPADS) signatures under a variety of atmospheric and solar illumination conditions. Signatures of approximately 100 missiles have been recorded.

  13. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial-off-the-shelf components, including airframe, autopilot, and a small turbine engine, to provide a low-cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, the AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine-powered platform (J-FLiC) builds on successes with the relatively smaller, slower, and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches through the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007, and those results will be included if possible.

  14. Gemini Planet Imager coronagraph testbed results

    NASA Astrophysics Data System (ADS)

    Sivaramakrishnan, Anand; Soummer, Rémi; Oppenheimer, Ben R.; Carr, G. Lawrence; Mey, Jacob L.; Brenner, Doug; Mandeville, Charles W.; Zimmerman, Neil; Macintosh, Bruce A.; Graham, James R.; Saddlemyer, Les; Bauman, Brian; Carlotti, Alexis; Pueyo, Laurent; Tuthill, Peter G.; Dorrer, Christophe; Roberts, Robin; Greenbaum, Alexandra

    2010-07-01

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8 m Gemini South telescope in 2011. GPI fields a 1500-channel AO system feeding an apodized-pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self-luminous planets up to 10 million times fainter than their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep-reactive-ion-etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below one part in a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  15. Telescience testbed in human space physiology

    NASA Astrophysics Data System (ADS)

    Watanabe, Satoru; Seo, Hisao; Iwase, Satoshi; Tanaka, Masafumi; Kaneko, Sayumi; Mano, Tadaaki; Matsui, Nobuo; Foldager, Niels; Bondepetersen, Flemming; Yamashita, Masamichi; Shoji, Takatoshi; Sudoh, Hideo

    The present telescience testbed study was conducted to evaluate the feasibility of physiological experimentation under restricted conditions, such as simulated weightlessness induced using a water immersion facility, a reduced capacity of laboratory facilities, delayed and desynchronized communication between investigator and operator, and the restriction that different kinds of experiments be practiced by only one operator following a limited timeline. The three days of experiments were carried out following the same protocols. The operator was changed every day, except that the same operator served on the first and third days. The operators were both medical doctors but not all-around experts in physiological experimentation. The experimental objectives were: 1) ECG changes with changing water immersion levels, 2) blood pressure changes, 3) ultrasonic echocardiographic changes, 4) laser Doppler skin blood flowmetry in a finger, and 5) blood sampling to examine electrolytic and humoral changes in the blood. The effectiveness of the testbed experiment was assessed by evaluating the quality of the obtained data and estimating the friendliness of the telescience operation to investigators and operators.

  16. Testbed for the development of intelligent robot control

    SciTech Connect

    Harrigan, R.W.

    1986-01-01

    The Sensor Driven Robot Systems Testbed has been constructed to provide a working environment to aid in the development of intelligent robot control software. The Testbed employs vision and force as the robot's means of interrogating its environment. The Testbed, which has been operational for approximately 24 months, consists of a PUMA-560 robot manipulator coupled to a 2-dimensional vision system and a force- and torque-sensing wrist. Recent work within the Testbed environment has led to a highly modularized control software concept with emphasis on the detection and resolution of error situations. The objective of the Testbed is to develop intelligent robot control concepts, incorporating planning and error recovery, which are transportable to a wide variety of robot applications. This is an ongoing, long-term development project and, as such, this paper represents a status report on the development work.

  17. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    SciTech Connect

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korean Atomic Energy Research Institute (KAERI) of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information discusses I&C testbed lessons learned and the impact of these experiences for KAERI.

  18. Expediting Experiments across Testbeds with AnyBed: A Testbed-Independent Topology Configuration System and Its Tool Set

    NASA Astrophysics Data System (ADS)

    Suzuki, Mio; Hazeyama, Hiroaki; Miyamoto, Daisuke; Miwa, Shinsuke; Kadobayashi, Youki

    Building an experimental network within a testbed has been a tiresome process for experimenters, due to the complexity of physical resource assignment and the configuration overhead. Moreover, the process cannot be carried over across testbeds, because the syntax of a configuration file varies with the specific hardware and software. Re-configuring an experimental topology for each testbed wastes time; at worst, an experimenter cannot complete his/her experiments within the limited lease time of a testbed. In this paper, we propose AnyBed, an experimental network-building system. The conceptual idea of AnyBed is: “If experimental network topologies can be made portable across any kind of testbed, then building an experimental network on a testbed can be expedited while experiments are manipulated with each testbed's support tools.” To achieve this concept, AnyBed divides an experimental network configuration into logical and physical network topologies. By mapping between these two topologies, AnyBed can build the intended logical network topology on any PC cluster. We have evaluated the AnyBed implementation using two distinct clusters. The evaluation shows that a BGP topology with 150 nodes can be constructed on a large-scale testbed in less than 113 seconds.
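The core of the approach described above is the separation of a logical experiment topology from the physical cluster it runs on. The following minimal sketch captures that idea under our own assumptions: the node names, the round-robin placement, and the data layout are invented for illustration and differ from the real AnyBed tool set.

```python
# Hypothetical sketch of logical-to-physical topology mapping, in the
# spirit of AnyBed. Names and the placement policy are our assumptions.

logical_topology = {
    "nodes": ["as1", "as2", "as3"],
    "links": [("as1", "as2"), ("as2", "as3")],   # e.g. BGP peerings
}

physical_hosts = ["pc01", "pc02", "pc03"]        # one testbed's resources

def map_topology(logical, hosts):
    """Assign each logical node to a physical host (round-robin here)
    and rewrite the logical links in terms of physical hosts."""
    if len(logical["nodes"]) > len(hosts):
        raise ValueError("not enough physical hosts for this topology")
    placement = {node: hosts[i] for i, node in enumerate(logical["nodes"])}
    phys_links = [(placement[a], placement[b]) for a, b in logical["links"]]
    return placement, phys_links

placement, links = map_topology(logical_topology, physical_hosts)
for node, host in placement.items():
    print(f"{node} -> {host}")
```

Because only `map_topology` depends on the host list, the same logical topology file can be replayed on a different cluster by swapping in that testbed's resources, which is the portability the paper argues for.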

  19. The Magellan Final Report on Cloud Computing

    SciTech Connect

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects, such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  20. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To support 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates the computational resources of all base stations into a cloud BBU (baseband unit) pool. Interactions between remote radio heads (RRHs) and BBUs, and resource scheduling among BBUs in the cloud, have become more frequent and complex as system scale and user requirements grow. This increases the networking demand among RRHs and BBUs and calls for elastic optical-fiber switching and networking. In such a network, multiple strata of resources (radio, optical, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software-defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  1. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  2. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at the Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions which communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from combining the NICE command language with SPAR computational modules.

  3. The NULLTIMATE Testbed: A Progress Report

    NASA Astrophysics Data System (ADS)

    Gabor, P.

    2010-10-01

    Nulling interferometry has been suggested as the underlying principle for an instrument which could provide direct detection and spectroscopy of Earth-like exoplanets, including searches for potential biomarkers (Darwin/TPF-I). Several aspects of this method require further research and development. The NULLTIMATE testbed at the Institut d’Astrophysique Spatiale in Orsay, France, is a new instrument, built in late 2008. It is designed to test different achromatic phase shifters (focus crossing, field reversal, dielectric plates) at 300 K using various sources ranging from 2 to 10 μm, with special attention to stabilization (optical path difference and beam intensity balance). Its operational parameters (null depth and stability) were tested with monochromatic laser sources at 2.32 and 3.39 μm and with a supercontinuum source in the K band. This poster presents a progress report on its performance with a focus crossing achromatic phase shifter.

  4. Construction and Modeling of a Controls Testbed

    NASA Technical Reports Server (NTRS)

    Nagle, James C.; Homaifar, Abdollah; Nasser, Ahmed A.; Bikdash, Marwan

    1997-01-01

    This paper describes the construction and modeling of a control system testbed to be used for the comparison of various control methodologies. We specifically wish to test fuzzy logic control and compare the performance of various fuzzy controllers, including Hybrid Fuzzy-PID (HFPID) and Hierarchical Hybrid Fuzzy-PID (HHFPID), to other controllers including localized rate feedback, LQR/LTR, and H2/H∞. The control problem is that of vibration suppression in a thin plate, with inputs coming from accelerometers and outputs going to piezoelectric actuators or 'patches'. A model based on experimental modal analysis of the plate is developed and compared with an analytical model. The analytical model uses a boundary condition that is a mix of clamped and simply supported.
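For readers unfamiliar with the baseline, the PID half of the hybrid fuzzy-PID controllers above is the classical three-term law. The sketch below is only an illustration of that law on a toy first-order plant; the gains and the plant are our assumptions, not the testbed's plate dynamics.

```python
# Minimal discrete PID controller sketch, the classical component that
# hybrid fuzzy-PID schemes extend. Gains and plant are illustrative only.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measurement):
        """One control step: proportional + integral + derivative terms."""
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Drive a toy first-order plant x' = -x + u toward zero displacement,
# loosely standing in for damping one vibration mode.
pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.01)
x = 1.0                                  # initial displacement
for _ in range(5000):                    # 50 s of simulated time
    u = pid.update(0.0, x)
    x += (-x + u) * 0.01                 # forward-Euler plant step
print(f"final displacement: {x:.4f}")
```

A fuzzy or hybrid fuzzy-PID controller would replace or reshape these fixed gains with rule-based ones; the testbed's purpose is to compare such variants against baselines like this on the same plate.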

  5. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that the distribution of satellite imagery by means of the Internet is feasible, and that the archiving of large data sets can be accomplished with low-cost storage systems allowing multiple users.

  6. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    SciTech Connect

    Maloney, T.M.; Voecks, G.E.

    1995-01-25

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete, and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway. © 1995 American Institute of Physics.

  7. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a localization system based on cameras, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when the sensors are exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor and verify gas distribution mapping experiments and thus provide insight into them. PMID:26690175
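A fixed sensor grid like the LGSA yields point samples, from which a continuous ground-truth concentration map can be interpolated at each time step. The sketch below uses inverse-distance weighting as one simple way to do this; the interpolation choice and the sensor layout are our assumptions, not the paper's method.

```python
# Hedged sketch: interpolating a concentration field from a fixed grid of
# gas sensors, one time step at a time. Inverse-distance weighting (IDW)
# is an illustrative choice, not the LGSA processing pipeline.

def concentration_at(point, sensors, power=2.0):
    """Inverse-distance-weighted estimate at `point` from a list of
    (x, y, reading) sensor samples taken at one time step."""
    num = den = 0.0
    for sx, sy, value in sensors:
        d2 = (point[0] - sx) ** 2 + (point[1] - sy) ** 2
        if d2 == 0.0:
            return value                 # query point sits on a sensor
        w = 1.0 / d2 ** (power / 2.0)
        num += w * value
        den += w
    return num / den

# Three sensors on a line; a query on a sensor returns its own reading,
# and a query between sensors lies between the neighboring readings.
samples = [(0.0, 0.0, 10.0), (1.0, 0.0, 20.0), (2.0, 0.0, 30.0)]
print(round(concentration_at((1.0, 0.0), samples), 2))   # on a sensor → 20.0
```

Repeating this over every logged time step produces the kind of time-dependent ground-truth map against which a mobile robot's gas-distribution estimate can be verified.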

  8. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document consists of the viewgraph slides of the presentation.
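
    The kind of link behavior such an emulator injects (propagation delay, packet loss) can be illustrated with a toy sketch. This is not QualNet's or MACHETE's actual API; the 1.25 s delay in the usage below merely approximates a one-way Earth-Moon light time:

```python
import random

def emulate_link(packets, delay_s, loss_prob, rng=None):
    """Pass (send_time_s, payload) packets through a link with a fixed
    propagation delay and independent Bernoulli loss; return surviving
    packets as (arrival_time_s, payload). A toy stand-in for an IP
    network emulator, for illustration only."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    out = []
    for t, payload in packets:
        if rng.random() >= loss_prob:  # packet survives the link
            out.append((t + delay_s, payload))
    return out
```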

  9. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research of issues such as zero-g robot control, dual arm teleoperation, simulations, and hierarchical control using a high level programming language. The testbed will be used to investigate these high risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual arm teleoperation of a pair of 7 degree-of-freedom (DOF) manipulators, each with their own 6-DOF mini-master hand controllers. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low level features. High speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real time dynamic simulation of the testbed operations, allowing a quick and safe means for testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  10. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a localization system based on cameras and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments. PMID:26690175

  11. Time-multiplexed open-path TDLAS spectrometer for dynamic, sampling-free, interstitial H2 18O and H2 16O vapor detection in ice clouds

    NASA Astrophysics Data System (ADS)

    Kühnreich, B.; Wagner, S.; Habig, J. C.; Möhler, O.; Saathoff, H.; Ebert, V.

    2015-04-01

    An advanced in situ diode laser hygrometer for simultaneous, sampling-free detection of interstitial H2 16O and H2 18O vapor was developed and tested in the Aerosol Interaction and Dynamics in the Atmosphere (AIDA) cloud chamber during dynamic cloud formation processes. The spectrometer to measure isotope-resolved water vapor concentrations comprises two rapidly time-multiplexed DFB lasers near 1.4 and 2.7 µm and an open-path White cell with 227-m absorption path length and 4-m mirror separation. A dynamic water concentration range from 2.6 ppb to 87 ppm for H2 16O and 87 ppt to 3.6 ppm for H2 18O could be achieved and was used to enable a fast and direct detection of dynamic isotope ratio changes during ice cloud formation in the AIDA chamber at temperatures between 190 and 230 K. Relative changes in the H2 18O/H2 16O isotope ratio of 1 % could be detected and resolved with a signal-to-noise ratio of 7. This converts to an isotope ratio resolution limit of 0.15 % at 1-s time resolution.
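
    The isotope-ratio change such an instrument resolves is conventionally expressed in delta notation, i.e. the relative deviation of a sample ratio from a reference ratio in per mil. A minimal sketch (function names are illustrative, not from the paper):

```python
def isotope_ratio(c_heavy, c_light):
    """Heavy-to-light isotopologue concentration ratio (same units)."""
    return c_heavy / c_light

def delta_permil(ratio_sample, ratio_reference):
    """Relative deviation of an isotope ratio from a reference, in per mil:
    delta = (R_sample / R_ref - 1) * 1000."""
    return (ratio_sample / ratio_reference - 1.0) * 1000.0
```

A 1 % relative ratio change thus corresponds to a 10 per mil delta shift.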

  12. Search Cloud

    MedlinePlus

    ... this page: https://medlineplus.gov/cloud.html Search Cloud ... Top 110 zoster vaccine Share the MedlinePlus search cloud with your users by embedding our search cloud ...

  13. Search Cloud

    MedlinePlus

    ... www.nlm.nih.gov/medlineplus/cloud.html Search Cloud ... Share the MedlinePlus search cloud with your users by embedding our search cloud ...

  14. Development of a space-systems network testbed

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.
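
    One ingredient of such fault detection schemes, a centralized heartbeat check that flags silent nodes, can be sketched in a few lines. This is a generic illustration, not the testbed's actual algorithm:

```python
def detect_faults(last_heartbeat, now, timeout_s):
    """Flag nodes whose most recent heartbeat is older than timeout_s.

    last_heartbeat: {node_id: last_report_time_s}. Returns the sorted
    list of suspect node ids. A minimal centralized stand-in for one
    part of a hybrid FDIR scheme (hypothetical, for illustration)."""
    return sorted(n for n, t in last_heartbeat.items() if now - t > timeout_s)
```

In a hybrid scheme, such a centralized sweep would be combined with distributed neighbor-to-neighbor checks so that no single monitor is a point of failure.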

  15. Development of a space-systems network testbed

    NASA Astrophysics Data System (ADS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration have been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  16. Development of Hardware-in-the-loop Microgrid Testbed

    SciTech Connect

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R; Liu, Guodong; Dowling, Kevin; Ollis, T Ben; Irminger, Philip; Xu, Yan; Dimitrovski, Aleksandar D

    2015-01-01

    This paper presents a hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling of the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of function testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  17. Custom data support for the FAst-physics System Testbed and Research (FASTER) Project

    SciTech Connect

    Toto, T.; Jensen, M.; Vogelmann, A.; Wagener, R.; Liu, Y.; Lin, W.

    2010-03-15

    The multi-institution FAst-physics System Testbed and Research (FASTER) project, funded by the DOE Earth System Modeling program, aims to evaluate and improve the parameterizations of fast processes (those involving clouds, precipitation and aerosols) in global climate models, using a combination of numerical prediction models, single column models, cloud resolving models, large-eddy simulations, full global climate model output and ARM active and passive remote sensing and in-situ data. This poster presents the Custom Data Support effort for the FASTER project. The effort will provide tailored datasets, statistics, best estimates and quality control data, as needed and defined by FASTER participants, for use in evaluating and improving parameterizations of fast processes in GCMs. The data support will include custom gridding and averaging, for the model of interest, using high time resolution and pixel level data from continuous ARM observations and complementary datasets. In addition to the FASTER team, these datasets will be made available to the ARM Science Team. Initial efforts with respect to data product development, priorities, availability and distribution are summarized here with an emphasis on cloud, atmospheric state and aerosol properties as observed during the Spring 2000 Cloud IOP and the Spring 2003 Aerosol IOP at the ARM Southern Great Plains site.
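
    At its core, the custom gridding and averaging step, mapping high-time-resolution observations onto a model grid, is a binned mean. A minimal sketch with hypothetical names, not the project's actual tooling:

```python
def grid_average(times, values, edges):
    """Average point observations into model grid cells.

    times, values: parallel lists of observation times and values.
    edges: ascending cell boundaries; cell i spans [edges[i], edges[i+1]).
    Returns one mean per cell, or None for cells with no observations.
    A minimal stand-in for custom time gridding, for illustration."""
    sums = [0.0] * (len(edges) - 1)
    counts = [0] * (len(edges) - 1)
    for t, v in zip(times, values):
        for i in range(len(edges) - 1):
            if edges[i] <= t < edges[i + 1]:
                sums[i] += v
                counts[i] += 1
                break
    return [s / c if c else None for s, c in zip(sums, counts)]
```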

  18. Comparison of millimeter-wave cloud radar measurements for the Fall 1997 Cloud IOP

    SciTech Connect

    Sekelsky, S.M.; Li, L.; Galloway, J.; McIntosh, R.E.; Miller, M.A.; Clothiaux, E.E.; Haimov, S.; Mace, G.; Sassen, K.

    1998-05-01

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time-operation 35 GHz CART Millimeter-wave Cloud Radar (MMCR), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air, the University of Utah 95 GHz radar and the dual-antenna Pennsylvania State University 94 GHz radar. In this paper the authors discuss several issues relevant to comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.
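
    Reflectivity intercomparisons of this kind work in logarithmic units. A minimal sketch of the dBZ conversion and a matched-sample calibration bias, illustrative rather than the study's actual processing chain:

```python
import math

def to_dbz(z_linear):
    """Convert linear radar reflectivity factor (mm^6 m^-3) to dBZ."""
    return 10.0 * math.log10(z_linear)

def mean_bias_db(dbz_a, dbz_b):
    """Mean reflectivity difference (dB) between matched samples from two
    radars, the basic quantity in a calibration intercomparison."""
    return sum(a - b for a, b in zip(dbz_a, dbz_b)) / len(dbz_a)
```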

  19. SMILES ice cloud products

    NASA Astrophysics Data System (ADS)

    Millán, L.; Read, W.; Kasai, Y.; Lambert, A.; Livesey, N.; Mendrok, J.; Sagawa, H.; Sano, T.; Shiotani, M.; Wu, D. L.

    2013-06-01

    Upper tropospheric water vapor and clouds play an important role in Earth's climate, but knowledge of them, in particular diurnal variation in deep convective clouds, is limited. An essential variable to understand them is cloud ice water content. The Japanese Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) on board the International Space Station (ISS) samples the atmosphere at different local times allowing the study of diurnal variability of atmospheric parameters. We describe a new ice cloud data set consisting of partial Ice Water Path and Ice Water Content. Preliminary comparisons with EOS-MLS, CloudSat-CPR and CALIOP-CALIPSO are presented. Then, the diurnal variation over land and over open ocean for partial ice water path is reported. Over land, a pronounced diurnal variation peaking strongly in the afternoon/early evening was found. Over the open ocean, little temporal dependence was encountered. This data set is publicly available for download in HDF5 format.

  20. Situational descriptions of behavioral procedures: the in situ testbed.

    PubMed Central

    Kemp, S M; Eckerman, D A

    2001-01-01

    We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's (1959) notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement and extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. PMID:11394484
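
    A reinforcement schedule such as fixed ratio is straightforward to express in code. A minimal sketch, not the testbed's Mechner-style notation:

```python
def fixed_ratio_reinforcements(n_responses, ratio):
    """Indices (0-based) of responses reinforced under a fixed-ratio
    schedule: every `ratio`-th response produces reinforcement."""
    return [i for i in range(n_responses) if (i + 1) % ratio == 0]
```

Under FR-4, for example, the 4th and 8th responses in a run of 10 are reinforced; a cumulative record would show a reinforcement mark at each of those indices.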

  1. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  2. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  3. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  4. Energy Aware Clouds

    NASA Astrophysics Data System (ADS)

    Orgerie, Anne-Cécile; de Assunção, Marcos Dias; Lefèvre, Laurent

    Cloud infrastructures are increasingly becoming essential components for providing Internet services. By benefiting from economies of scale, Clouds can efficiently manage and offer a virtually unlimited number of resources and can minimize the costs incurred by organizations when providing Internet services. However, as Cloud providers often rely on large data centres to sustain their business and offer the resources that users need, the energy consumed by Cloud infrastructures has become a key environmental and economical concern. This chapter presents an overview of techniques that can improve the energy efficiency of Cloud infrastructures. We propose a framework termed as Green Open Cloud, which uses energy efficient solutions for virtualized environments; the framework is validated on a reference scenario.

  5. Evaluation testbed for ATD performance prediction (ETAPP)

    NASA Astrophysics Data System (ADS)

    Ralph, Scott K.; Eaton, Ross; Snorrason, Magnús; Irvine, John; Vanstone, Steve

    2007-04-01

    Automatic target detection (ATD) systems process imagery to detect and locate targets in imagery in support of a variety of military missions. Accurate prediction of ATD performance would assist in system design and trade studies, collection management, and mission planning. A need exists for ATD performance prediction based exclusively on information available from the imagery and its associated metadata. We present a predictor based on image measures quantifying the intrinsic ATD difficulty on an image. The modeling effort consists of two phases: a learning phase, where image measures are computed for a set of test images, the ATD performance is measured, and a prediction model is developed; and a second phase to test and validate performance prediction. The learning phase produces a mapping, valid across various ATR algorithms, which is even applicable when no image truth is available (e.g., when evaluating denied area imagery). The testbed has plug-in capability to allow rapid evaluation of new ATR algorithms. The image measures employed in the model include: statistics derived from a constant false alarm rate (CFAR) processor, the Power Spectrum Signature, and others. We present performance predictors for two trained ATD classifiers, one constructed using GENIE Pro™, a tool developed at Los Alamos National Laboratory, and the other using eCognition™, developed by Definiens (http://www.definiens.com/products). We present analyses of the two performance predictions, and compare the underlying prediction models. The paper concludes with a discussion of future research.
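
    The learning phase, fitting a mapping from image difficulty measures to measured ATD performance, can be illustrated with a one-feature ordinary least-squares fit. This is a simplified stand-in for the paper's prediction model, not its actual form:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit y ~ a*x + b for a single predictor,
    e.g. one image difficulty measure vs. measured detection rate."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)               # predictor variance
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))  # covariance
    a = sxy / sxx
    return a, my - a * mx
```

Once fitted on the training images, `a*x + b` predicts performance for a new image from its measure alone, with no image truth required.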

  6. Ames life science telescience testbed evaluation

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  7. Further progress in watermark evaluation testbed (WET)

    NASA Astrophysics Data System (ADS)

    Kim, Hyung C.; Lin, Eugene T.; Guitart, Oriol; Delp, Edward J., III

    2005-03-01

    While Digital Watermarking has received much attention in recent years, it is still a relatively young technology. There are few accepted tools/metrics that can be used to evaluate the suitability of a watermarking technique for a specific application. This lack of a universally adopted set of metrics/methods has motivated us to develop a web-based digital watermark evaluation system called the Watermark Evaluation Testbed or WET. There have been further improvements over the first version of WET. We implemented batch mode with a queue that allows for user-submitted jobs. In addition to StirMark 3.1 as an attack module, we added attack modules based on StirMark 4.0. As a new image fidelity measure, we evaluate conditional entropy for different watermarking algorithms and different attacks. Also, we show the results of curve fitting the Receiver Operating Characteristic (ROC) analysis data using the Parzen window density estimation. The curve fits the data closely while having only two parameters to estimate.
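
    The Parzen-window density estimation mentioned above is compact to write down with a Gaussian kernel; a minimal sketch (the paper's exact kernel and bandwidth choices are not specified here):

```python
import math

def parzen_density(x, samples, h):
    """Parzen-window density estimate at x from 1-D samples, using a
    Gaussian kernel with bandwidth h: the average of kernels centered
    on each sample."""
    norm = 1.0 / (len(samples) * h * math.sqrt(2.0 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)
```

Smoothing the detector-score distributions this way is one route to the fitted ROC curves described in the abstract.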

  8. Optical testbed for the LISA phasemeter

    NASA Astrophysics Data System (ADS)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system, aka the phasemeter, was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon ApS. In particular, it provides the electronic readout of the main interferometer phases, besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three-signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.
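
    The zero phase combination rests on a simple identity: the three pairwise heterodyne beat frequencies (and likewise the beat phases) of three lasers sum to zero, so any residual in that sum exposes readout non-linearity. A minimal sketch of the identity:

```python
def beatnotes(f1, f2, f3):
    """Pairwise heterodyne beat frequencies of three lasers. Their signed
    sum (f1-f2) + (f2-f3) + (f3-f1) is identically zero; the same holds
    for the beat phases, which is the null exploited in a three-signal
    test of the phase readout."""
    return f1 - f2, f2 - f3, f3 - f1
```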

  9. A Hydrometeorological Testbed For Western Water Issues

    NASA Astrophysics Data System (ADS)

    Ralph, F. M.; Reynolds, D.; Martner, B. E.; Kingsmill, D. E.; White, A. B.; Whitaker, J. S.

    2002-12-01

    Based on experience gained between 1997 and 2002 in a series of three West Coast experiments focused on improved prediction of precipitation in land-falling Pacific winter storms, NOAA is leading the creation of a regional Hydrometeorological Testbed (HMT). The goal of this effort is both to advance the understanding of fundamental physical processes influencing primarily winter-season precipitation (rain and snow) in mountainous regions and to improve quantitative precipitation forecasting, main-stem river flood warnings and flash-flood warning lead time in such regions. The focus will be on processes spanning the weather-climate connection, from the mesoscale to tropical-extratropical connections that modulate regional short-term climate anomalies influencing precipitation. The geographic area covered by the initial HMT encompasses the flood-prone Russian River and Sacramento River watersheds in northern California. While these watersheds represent some of the greatest flood risks in the nation, the scientific and operational results developed there will have bearing on winter season hydrometeorological prediction in many other locations. These goals will be addressed through a joint effort between scientists, weather forecasters, hydrologists and forecast users that will define both the needs and methodologies to tackle this important problem. Annual field activities will begin in the winter of 2002/03, building on results from earlier studies in the region. Priorities and leveraging opportunities for wider participation in the winter 2003/04 season will be explored in upcoming planning meetings where broad input is encouraged.

  10. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  11. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear air portion of partly clouded fields of view permitting soundings to the surface or the assimilation of radiances as in the clear field of view case. However, the accuracy of the clear air radiance signal depends upon the cloud height and optical property uniformity across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud top levels, the accuracy near the cloud top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used.
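
    For reference, the classic two-field-of-view cloud-clearing step extrapolates to the clear-column radiance from two adjacent measurements whose cloud fractions differ. A minimal sketch of the standard N* formulation, not necessarily the exact form used in this work:

```python
def cloud_clear(r1, r2, n_star):
    """Two-field-of-view cloud clearing: estimate the clear-column
    radiance from adjacent radiances r1, r2 over FOVs with cloud
    fractions a1, a2 (a1 != a2), where n_star = a1 / a2:
        R_clear = (r1 - n_star * r2) / (1 - n_star)."""
    return (r1 - n_star * r2) / (1.0 - n_star)
```

For example, with a clear-column radiance of 100 and cloud radiance of 50, cloud fractions of 0.2 and 0.4 give r1 = 90 and r2 = 80, and the formula with n_star = 0.5 recovers 100.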

  12. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    Cloud radio access network (C-RAN) has become a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architecture cannot implement the resource optimization and scheduling for the high-level service guarantee due to the communication obstacle among them with the growing number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on OpenFlow-based enhanced SDN testbed. The performance of RIP scheme under heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.
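
    Provisioning over such an auxiliary graph typically reduces to a weighted shortest-path search across the layered resources. A generic Dijkstra sketch (node names are hypothetical and this is not the paper's actual RIP algorithm):

```python
import heapq

def shortest_path(graph, src, dst):
    """Dijkstra over a weighted auxiliary graph given as an adjacency
    map {node: [(neighbor, cost), ...]}. Returns (total_cost, path),
    or (inf, []) if dst is unreachable."""
    pq = [(0.0, src, [src])]  # (cost so far, node, path taken)
    seen = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nbr, w in graph.get(node, []):
            if nbr not in seen:
                heapq.heappush(pq, (cost + w, nbr, path + [nbr]))
    return float("inf"), []
```

With edge weights encoding, say, radio, optical and processing costs, the cheapest end-to-end chain from a remote radio head to a baseband unit falls out directly.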

  13. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario to accommodate high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architecture cannot implement the resource optimization and scheduling for the high-level service guarantee due to the communication obstacle among them with the growing number of mobile internet users. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on OpenFlow-based enhanced SDN testbed. The performance of RIP scheme under heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  14. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising architecture for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing-unit cloud have been decoupled from one another, so their resources are controlled independently. With the growing number of mobile internet users, the traditional architecture cannot optimize and schedule resources to guarantee high-level service, owing to the communication barriers among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under heavy traffic load is also quantitatively evaluated against other provisioning schemes in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, demonstrating the efficiency of the MDRI architecture. PMID:27465296
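The auxiliary-graph idea behind a resources integrated provisioning scheme can be sketched as a shortest-path search over a layered graph whose nodes span radio, optical, and processing resources. The node names and edge weights below are invented for illustration; this is a generic sketch, not the paper's actual algorithm:

```python
import heapq

def dijkstra(graph, src, dst):
    """Least-cost path over a weighted digraph given as {node: [(nbr, w), ...]}."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    path, node = [], dst
    while node != src:
        path.append(node)
        node = prev[node]
    path.append(src)
    return list(reversed(path)), dist[dst]

# Hypothetical auxiliary graph: a remote radio head (rrh) reaches a BBU pool
# through optical switches; edge weights stand in for combined spectrum,
# wavelength, and processing costs.
aux_graph = {
    "rrh1": [("osw1", 1.0), ("osw2", 2.5)],
    "osw1": [("osw2", 0.5), ("bbu1", 2.0)],
    "osw2": [("bbu1", 1.0), ("bbu2", 0.8)],
    "bbu1": [],
    "bbu2": [],
}
path, cost = dijkstra(aux_graph, "rrh1", "bbu2")
```

A provisioning request then maps to one shortest-path query over the layered graph, which is what makes joint (rather than per-domain) optimization possible.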

  15. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    The establishment of propfan technology readiness was determined, and candidate drive systems for propfan application were identified. Candidate aircraft were investigated for testbed suitability, and four were selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed, and the Boeing KC-135A and the Gulfstream American Gulfstream II were recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed, and cost and schedule data for the entire testbed program were generated. The total program cost was estimated, and a wind tunnel program cost and schedule were generated in support of the testbed program.

  16. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.; Patnaik, Surya N.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is multidisciplinary design optimization software. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of the structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity would be required for a near-zero rate of failure, corresponding to unity reliability. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool; the fast probabilistic integrator (FPI) module of the NESSUS software was the probabilistic calculator; and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated with an academic example and a real-life airframe component made of metallic and composite materials.
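The weight-versus-reliability behavior described above can be illustrated with a toy model in which the required safety margin is a standard-normal quantile of the target reliability. This is a hedged sketch, not CometBoards' actual formulation; the nominal weight and coefficient of variation are invented:

```python
from statistics import NormalDist

def design_weight(reliability, nominal_weight=100.0, cov=0.1):
    """Weight needed so the design meets a target reliability, assuming a
    normally distributed safety margin (purely illustrative)."""
    z = NormalDist().inv_cdf(reliability)   # standard-normal quantile
    return nominal_weight * (1.0 + cov * z)

# Sweeping reliability reproduces the qualitative behavior above:
# 50 percent reliability gives the nominal weight, weight grows without
# bound as reliability approaches one, and shrinks for failure-prone
# designs as reliability approaches zero.
curve = {r: design_weight(r) for r in (0.01, 0.5, 0.99)}
```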

  17. NASA's Coastal and Ocean Airborne Science Testbed

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Hooker, S.; Myers, J.; Kudela, R. M.; Dunagan, S.; Soulage, M.; Ellis, T.; Clinton, N. E.; Lobitz, B.; Martin, K.; Zell, P.; Berthold, R. W.; Smith, C.; Andrew, D.; Gore, W.; Torres, J.

    2011-12-01

    The Coastal and Ocean Airborne Science Testbed (COAST) Project is a NASA Earth-science flight mission that will advance coastal ecosystems research by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. Teaming NASA Ames scientists and engineers with Biospherical Instruments, Inc. (San Diego) and UC Santa Cruz, the airborne COAST instrument suite combines a customized imaging spectrometer, sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data will be accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Based on optical detectors called microradiometers, the NASA Ocean Biology and Biogeochemistry Calibration and Validation (cal/val) Office team has deployed advanced commercial off-the-shelf instrumentation that provides in situ measurements of the apparent optical properties at the land/ocean boundary, including optically shallow aquatic ecosystems (e.g., lakes, estuaries, coral reefs). A complementary microradiometer instrument package (Biospherical Instruments, Inc.), optimized for use above water, will be flown for the first time with the airborne instrument suite. Details of the October 2011 COAST airborne mission over Monterey Bay demonstrating this new airborne instrument suite capability will be presented, with associated preliminary data on coastal ocean color products, coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  18. X.509 Authentication/Authorization in FermiCloud

    SciTech Connect

    Kim, Hyunwoo; Timm, Steven

    2014-11-11

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.
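In grid middleware, X.509 authorization frequently reduces to mapping a certificate subject DN (after stripping proxy components) to a local account, grid-mapfile style. Below is a minimal sketch with invented DNs and usernames; it is not FermiCloud's actual configuration:

```python
# Hypothetical grid-mapfile-style mapping of certificate subject DNs to
# local accounts; every DN and username here is invented for illustration.
GRIDMAP = {
    "/DC=org/DC=example/OU=People/CN=Alice Adams": "alice",
    "/DC=org/DC=example/OU=Robots/CN=workflow-svc": "wfrobot",
}

def base_subject(dn):
    """Strip proxy components (trailing CN=proxy or numeric CN) from a DN."""
    parts = dn.split("/")
    while parts and (parts[-1] == "CN=proxy" or
                     (parts[-1].startswith("CN=") and parts[-1][3:].isdigit())):
        parts.pop()
    return "/".join(parts)

def authorize(subject_dn):
    """Map a (possibly proxy) subject DN to a local account, or None."""
    return GRIDMAP.get(base_subject(subject_dn))
```

Authentication (verifying the certificate chain) happens before this step; the lookup shown here is only the authorization half.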

  19. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology System of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal deformations. To verify these measurements, the Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system in which six lengths are measured but only five are needed to determine the system. The agreement in the over-constrained length needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have reached the goal only for on-axis observations. We describe the testbed improvements that have been made since our initial results and outline future Kite changes that will add further effects that SIM faces, in order to make the testbed more SIM-like.
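The five-of-six over-constraint matches four coplanar fiducials: eight coordinates minus three rigid-body motions leave five degrees of freedom, yet six pairwise lengths are measured. Assuming coplanar fiducials, a Cayley-Menger determinant of the squared lengths gives a closure check: it vanishes for mutually consistent gauge readings and departs from zero when a length is in error. The coordinates and injected error below are invented:

```python
def det(m):
    """Determinant via Gaussian elimination with partial pivoting."""
    m = [row[:] for row in m]
    n, d = len(m), 1.0
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(m[r][i]))
        if abs(m[p][i]) < 1e-12:
            return 0.0          # singular to working precision
        if p != i:
            m[i], m[p] = m[p], m[i]
            d = -d
        d *= m[i][i]
        for r in range(i + 1, n):
            f = m[r][i] / m[i][i]
            for c in range(i, n):
                m[r][c] -= f * m[i][c]
    return d

def cayley_menger(d2):
    """Cayley-Menger determinant from squared distances among 4 points;
    zero exactly when the six lengths are consistent with coplanar points."""
    m = [[0.0, 1.0, 1.0, 1.0, 1.0]]
    for i in range(4):
        m.append([1.0] + [float(d2[i][j]) for j in range(4)])
    return det(m)

# Four coplanar fiducials (a planar 'kite' layout, coordinates invented).
pts = [(0.0, 0.0), (2.0, 0.0), (1.0, 1.5), (1.0, -0.8)]
d2 = [[(ax - bx) ** 2 + (ay - by) ** 2 for bx, by in pts] for ax, ay in pts]
closure = cayley_menger(d2)   # ~0: the six gauge readings agree
d2[0][1] += 1e-2              # inject an error into one metrology gauge
d2[1][0] = d2[0][1]
residual = cayley_menger(d2)  # now departs from zero
```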

  20. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  1. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Astrophysics Data System (ADS)

    Kortenkamp, David; Izygon, Michel; Lawler, Dennis; Schreckenghost, Debra; Bonasso, R. Peter; Wang, Lui; Kennedy, Kriss

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
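The fault-injection and scripting capability described above can be sketched as a simulation loop in which scripted faults override simulated telemetry at chosen steps, with every step logged for replay. The subsystems, telemetry names, and thresholds here are all invented:

```python
def simulate(steps, faults):
    """Toy habitat simulation loop; `faults` maps a step index to state
    overrides, mimicking scripted fault injection into subsystem sims."""
    state = {"o2_partial_pressure": 21.0, "bus_voltage": 120.0}
    log, alarms = [], []
    for t in range(steps):
        state.update(faults.get(t, {}))        # inject scripted failures
        if state["o2_partial_pressure"] < 19.0:
            alarms.append((t, "O2_LOW"))
        if state["bus_voltage"] < 100.0:
            alarms.append((t, "POWER_LOW"))
        log.append(dict(state))                # snapshot for replay
    return log, alarms

# Script a power fault at step 3; autonomy software under test would be
# expected to detect and respond to the resulting POWER_LOW alarms.
log, alarms = simulate(6, {3: {"bus_voltage": 90.0}})
```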

  2. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel N.; Hottovy, Scott

    2016-06-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes -- open versus closed cells -- fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. With this new conceptual viewpoint, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions.
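The stochastic-diffusion viewpoint can be illustrated with Euler-Maruyama integration of diffusion in a double-well potential, a generic stand-in (not the paper's actual model) whose two minima play the role of two distinct regimes:

```python
import random

def simulate_regime(q0, steps=20000, dt=1e-3, sigma=0.6, seed=1):
    """Euler-Maruyama for dq = -V'(q) dt + sigma dW, with the double-well
    potential V(q) = (q^2 - 1)^2 / 4, whose minima at q = -1 and q = +1
    stand in for the open- and closed-cell regimes."""
    rng = random.Random(seed)
    q = q0
    for _ in range(steps):
        drift = -q * (q * q - 1.0)                    # -V'(q)
        q += drift * dt + sigma * dt ** 0.5 * rng.gauss(0.0, 1.0)
    return q
```

Without noise the state relaxes into the nearest well; with noise it fluctuates around a well and can occasionally hop between regimes, which is the qualitative picture of a bistable cloud system.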

  3. Laser Metrology in the Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    An, Xin; Marx, D.; Goullioud, Renaud; Zhao, Feng

    2004-01-01

    The Space Interferometer Mission (SIM), scheduled for launch in 2009, is a space-borne visible light stellar interferometer capable of micro-arcsecond-level astrometry. The Micro-Arcsecond Metrology testbed (MAM) is the ground-based testbed that incorporates all the functionalities of SIM minus the telescope, for mission-enabling technology development and verification. MAM employs a laser heterodyne metrology system using the Sub-Aperture Vertex-to-Vertex (SAVV) concept. In this paper, we describe the development and modification of the SAVV metrology launchers and the metrology instrument electronics, precision alignments and pointing control, locating cyclic error sources in the MAM testbed, methods to mitigate the cyclic errors, and the performance under the MAM performance metrics.

  4. Experimental Test-Bed for Intelligent Passive Array Research

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Torres, Miguel; David, Sunil; Isom, Adam; Cotto, Jose; Sharaiha, Samer

    2004-01-01

    This document describes the test-bed designed for the investigation of passive direction finding, recognition, and classification of speech and sound sources using sensor arrays. The test-bed forms the experimental basis of the Intelligent Small-Scale Spatial Direction Finder (ISS-SDF) project, aimed at furthering digital signal processing and intelligent sensor capabilities of sensor array technology in applications such as rocket engine diagnostics, sensor health prognostics, and structural anomaly detection. This form of intelligent sensor technology has potential for significant impact on NASA exploration, earth science and propulsion test capabilities. The test-bed consists of microphone arrays, power and signal distribution modules, web-based data acquisition, wireless Ethernet, modeling, simulation and visualization software tools. The Acoustic Sensor Array Modeler I (ASAM I) is used for studying steering capabilities of acoustic arrays and testing DSP techniques. Spatial sound distribution visualization is modeled using the Acoustic Sphere Analysis and Visualization (ASAV-I) tool.
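A core operation in passive direction finding with such arrays is estimating the inter-microphone delay by cross-correlation, from which a bearing follows given the array geometry. Below is a minimal sketch on synthetic signals; the signal and sampling details are invented:

```python
import random

def delay_by_xcorr(a, b, max_lag):
    """Delay of b relative to a, in samples (positive: b arrives later),
    found as the lag maximizing the cross-correlation."""
    def score(lag):
        return sum(b[i] * a[i - lag] for i in range(len(b))
                   if 0 <= i - lag < len(a))
    return max(range(-max_lag, max_lag + 1), key=score)

# Synthetic broadband source, arriving 7 samples later at the second mic.
rng = random.Random(0)
src = [rng.uniform(-1.0, 1.0) for _ in range(400)]
mic1 = src
mic2 = [0.0] * 7 + src[:-7]
lag = delay_by_xcorr(mic1, mic2, max_lag=20)
# Given mic spacing d, sound speed c, and sample rate fs, the bearing
# would follow as asin(c * lag / (fs * d)).
```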

  5. A Testbed for Deploying Distributed State Estimation in Power Grid

    SciTech Connect

    Jin, Shuangshuang; Chen, Yousu; Rice, Mark J.; Liu, Yan; Gorton, Ian

    2012-07-22

    With the increasing demand, scale, and data volume of power systems, fast distributed applications are becoming more important in power system operation and control. This paper proposes a testbed for evaluating power system distributed applications, considering data exchange among distributed areas. A high-performance computing (HPC) version of distributed state estimation is implemented and used as a distributed application example. The IEEE 118-bus system is used to deploy the parallel distributed state estimation, and the MeDICi middleware is used for data communication. The performance of the testbed demonstrates its capability to evaluate parallel distributed state estimation by leveraging the HPC paradigm. This testbed can also be applied to evaluate other distributed applications.
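The distributed state estimation idea can be sketched with two areas, each solving a tiny weighted least squares problem from local measurements and iterating with a shared tie-line measurement. The measurement values are invented, and the scalar model is a caricature of the real multi-bus estimator:

```python
def wls_scalar(measurements):
    """Weighted least squares for a scalar state given (h, z, w) triples,
    where each measurement obeys z = h*x + noise."""
    num = sum(w * h * z for h, z, w in measurements)
    den = sum(w * h * h for h, z, w in measurements)
    return num / den

def distributed_se(local1, local2, z_tie, w_tie=5.0, iters=20):
    """Each area refines its own state from local data plus the tie-line
    measurement z_tie ~ x1 - x2, treating the neighbor's estimate as
    fixed; areas exchange boundary values each iteration."""
    x1 = wls_scalar(local1)
    x2 = wls_scalar(local2)
    for _ in range(iters):
        x1 = wls_scalar(local1 + [(1.0, z_tie + x2, w_tie)])
        x2 = wls_scalar(local2 + [(1.0, x1 - z_tie, w_tie)])
    return x1, x2

# Invented measurements: true states x1=1.0, x2=0.4, tie-line flow 0.6.
area1 = [(1.0, 1.02, 1.0), (2.0, 1.98, 1.0)]   # (h, z, weight)
area2 = [(1.0, 0.41, 1.0), (1.0, 0.38, 1.0)]
x1, x2 = distributed_se(area1, area2, 0.6)
```

In a real deployment each area's solve would run in parallel on its own processor, with only the boundary values exchanged through the middleware.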

  6. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set of TTPP results contains the integrated results. Background on the program and highlights of its results are provided. The various testbed experiments and the programmatic approach are summarized. The results are summarized discipline by discipline, highlighting the lessons learned for each; the results are then integrated across disciplines, summarizing the lessons learned overall.

  7. High Contrast Imaging Testbed for the Terrestrial Planet Finder Coronagraph

    NASA Technical Reports Server (NTRS)

    Lowmman, Andrew E.; Trauger, John T.; Gordon, Brian; Green, Joseph J.; Moody, Dwight; Niessner, Albert F.; Shi, Fang

    2004-01-01

    The Terrestrial Planet Finder (TPF) mission is planning to launch a visible coronagraphic space telescope in 2014. To achieve TPF science goals, the coronagraph must have extreme levels of wavefront correction (less than 1 Angstrom rms over controllable spatial frequencies) and stability to get the necessary suppression of diffracted starlight (approximately 10(exp -10) contrast at an angular separation of approximately 4 lambda/D). TPF Coronagraph's primary platform for experimentation is the High Contrast Imaging Testbed, which will provide laboratory validation of key technologies as well as demonstration of a flight-traceable approach to implementation. Precision wavefront control in the testbed is provided by a high actuator density deformable mirror. Diffracted light control is achieved through use of occulting or apodizing masks and stops. Contrast measurements will establish the technical feasibility of TPF requirements, while model and error budget validation will demonstrate implementation viability. This paper describes the current testbed design, development approach, and recent experimental results.

  8. Design optimization of the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Salama, M.; Wette, M.; Chu, Cheng-Chih

    1993-01-01

    Increasingly complex spacecraft will benefit from integrated design and optimization of structural, optical, and control subsystems. Integrated design optimization will allow designers to make tradeoffs in objectives and constraints across these subsystems. The location, number, and types of passive and active devices distributed along the structure can have a dramatic impact on overall system performance. In addition, the manner in which structural mass is distributed can also serve as an effective mechanism for attenuating disturbance transmission between source and sensitive system components. This paper presents recent experience using optimization tools that have been developed for addressing some of these issues on a challenging testbed design problem. This particular testbed is one of a series of testbeds at the Jet Propulsion Laboratory under the sponsorship of the NASA Control Structure Interaction (CSI) Program to demonstrate nanometer level optical pathlength control on a flexible truss structure that emulates a spaceborne interferometer.

  9. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  10. Progress on an external occulter testbed at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Kim, Yunjong; Sirbu, Dan; Galvin, Michael; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-01-01

    An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The occulter testbed uses a 78 m optical propagation distance to realize the flight Fresnel numbers. We will use an etched silicon mask as the occulter. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present the first-light results of a sample design operating at a flight Fresnel number and describe the mechanical design of the testbed. We compare the experimental results with simulations that predict the ultimate contrast performance.
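A scaled occulter reproduces the flight shadow when it preserves the Fresnel number N = R^2 / (lambda * z). A small sketch with an invented flight geometry, sized for the testbed's 78 m propagation distance:

```python
import math

def fresnel_number(radius_m, wavelength_m, distance_m):
    """N = R^2 / (lambda * z), the parameter an occulter testbed preserves."""
    return radius_m ** 2 / (wavelength_m * distance_m)

# Hypothetical flight geometry: 25 m occulter radius, 50,000 km separation,
# 600 nm light (these numbers are illustrative, not a real mission design).
flight = fresnel_number(25.0, 600e-9, 5.0e7)

# Size the lab mask so the same Fresnel number holds over 78 m.
lab_radius = math.sqrt(flight * 600e-9 * 78.0)   # a few centimeters
lab = fresnel_number(lab_radius, 600e-9, 78.0)
```

Matching N is what makes the centimeter-scale shadow in the lab mathematically identical to the flight shadow, up to scale.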

  11. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a near completely automated, end-to-end, satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a very difficult and time-consuming one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration), it is important to reflect on these experiences and to collect the knowledge that has been gained so that improvements can be made to the existing system. It is also believed that these experiences are of value to others in the robotics community. Therefore, the primary objective here will be to use the Telerobot Testbed as a case study to identify real problems and technological gaps which exist in the areas of robotics and in particular systems integration. Such problems have surely hindered the development of what could be reasonably called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  12. UltraSciencenet: High- Performance Network Research Test-Bed

    SciTech Connect

    Rao, Nageswara S; Wing, William R; Poole, Stephen W; Hicks, Susan Elaine; DeNap, Frank A; Carter, Steven M; Wu, Qishi

    2009-04-01

    The high-performance networking requirements for next generation large-scale applications belong to two broad classes: (a) high bandwidths, typically multiples of 10Gbps, to support bulk data transfers, and (b) stable bandwidths, typically at much lower bandwidths, to support computational steering, remote visualization, and remote control of instrumentation. Current Internet technologies, however, are severely limited in meeting these demands because such bulk bandwidths are available only in the backbone, and stable control channels are hard to realize over shared connections. The UltraScience Net (USN) facilitates the development of such technologies by providing dynamic, cross-country dedicated 10Gbps channels for large data transfers, and 150 Mbps channels for interactive and control operations. Contributions of the USN project are two-fold: (a) Infrastructure Technologies for Network Experimental Facility: USN developed and/or demonstrated a number of infrastructure technologies needed for a national-scale network experimental facility. Compared to the Internet, USN's data-plane is different in that it can be partitioned into isolated layer-1 or layer-2 connections, and its control-plane is different in the ability of users and applications to set up and tear down channels as needed. Its design required several new components including a Virtual Private Network infrastructure, a bandwidth and channel scheduler, and a dynamic signaling daemon. The control-plane employs a centralized scheduler to compute the channel allocations and a signaling daemon to generate configuration signals to switches. In a nutshell, USN demonstrated the ability to build and operate a stable national-scale switched network. (b) Structured Network Research Experiments: A number of network research experiments have been conducted on USN that cannot be easily supported over existing network facilities, including test-beds and production networks. It settled an open matter by demonstrating

  13. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set of TTPP results is the executive summary.

  14. The Wide-Field Imaging Interferometry Testbed: Recent Progress

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.

    2010-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) at NASA's Goddard Space Flight Center was designed to demonstrate the practicality and application of techniques for wide-field spatial-spectral ("double Fourier") interferometry. WIIT is an automated system, and it is now producing substantial amounts of high-quality data from its state-of-the-art operating environment, Goddard's Advanced Interferometry and Metrology Lab. In this paper, we discuss the characterization and operation of the testbed and present the most recent results. We also outline future research directions. A companion paper within this conference discusses the development of new wide-field double Fourier data analysis algorithms.

  15. The Living With a Star Space Environment Testbed Program

    NASA Technical Reports Server (NTRS)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding to address the aspects of the connected Sun-Earth system that affect life and society. The program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET Program will infuse new technologies into the space programs through collection of data in space and subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  16. Telescience Testbed Pilot Project - Evaluation environment for Space Station operations

    NASA Technical Reports Server (NTRS)

    Wiskerchen, Michael J.; Leiner, Barry M.

    1988-01-01

    The objectives of the Telescience Testbed Pilot Program (TTPP) are discussed. The purpose of the TTPP, which involves 15 universities in cooperation with various NASA centers, is to demonstrate the utility of a user-oriented rapid prototyping testbed approach to developing and refining science requirements and validation concepts and approaches for the information systems of the Space Station era and beyond. It is maintained that the TTPP provides an excellent environment, with low programmatic schedule and budget risk, for testing and evaluating new operations concepts and technologies.

  17. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  18. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European Project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems encompassing information and communication security of SCADA systems for grid teleoperation, impact of attacks on inter-operator communications in power emergency conditions, impact of intentional faults on the secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effect of the attacks and prototyping resilient architectures.

  19. Testbed model and data assimilation for ARM. Progress report No. 3, 1 September 1992--30 April 1993

    SciTech Connect

    Louis, J.F.

    1993-04-28

    The ultimate objectives of this research are to further develop ALFA (AER Local Forecast and Assimilation) model originally designed at AER for local weather prediction and apply it to several related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation uses a variational technique that minimizes the difference between the model results and the observation during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data.
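    The adjoint approach described above can be illustrated on a toy problem (a sketch only; the scalar model, cost function, and step size below are invented for illustration and are not the ALFA code): nudging terms u are adjusted by gradient descent on the model-observation misfit, with the gradient supplied by a backward adjoint sweep.

```python
# Toy sketch of variational nudging with an adjoint gradient (invented
# scalar model, not the ALFA system). Forward model: x[k+1] = a*x[k] + u[k];
# misfit J = sum over k = 1..n of (x[k] - y[k])^2. y[0] aligns with the
# initial condition x0 and is not used.
def cost_and_grad(u, a, x0, y):
    n = len(u)
    x = [x0]
    for k in range(n):                        # forward model run
        x.append(a * x[k] + u[k])
    lam = [0.0] * (n + 1)                     # adjoint variables dJ/dx[k]
    lam[n] = 2.0 * (x[n] - y[n])
    for k in range(n - 1, 0, -1):             # backward (adjoint) sweep
        lam[k] = a * lam[k + 1] + 2.0 * (x[k] - y[k])
    J = sum((x[k] - y[k]) ** 2 for k in range(1, n + 1))
    grad = [lam[k + 1] for k in range(n)]     # dJ/du[k] = lam[k+1]
    return J, grad

def assimilate(a, x0, y, steps=200, lr=0.05):
    """Gradient descent on the nudging terms u to fit observations y."""
    u = [0.0] * (len(y) - 1)
    for _ in range(steps):
        J, g = cost_and_grad(u, a, x0, y)
        u = [ui - lr * gi for ui, gi in zip(u, g)]
    return u, J
```

    The adjoint sweep delivers the exact gradient of the quadratic misfit in a single backward pass, which is what makes this approach attractive at climate-model scale.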

  20. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting from totally distributed to centralized approaches. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  2. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

    An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near infrared spectroscopy (NIRS) and electroencelphalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measures and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that support multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures that are directly and not directly observable. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography. PMID:22438333

  3. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  4. A Portable MIMO Testbed and Selected Channel Measurements

    NASA Astrophysics Data System (ADS)

    Goud, Paul, Jr.; Hang, Robert; Truhachev, Dmitri; Schlegel, Christian

    2006-12-01

    A portable multiple-input multiple-output (MIMO) testbed that is based on field programmable gate arrays (FPGAs) and which operates in the 902-928 MHz industrial, scientific, and medical (ISM) band has been developed by the High Capacity Digital Communications (HCDC) Laboratory at the University of Alberta. We present a description of the HCDC testbed along with MIMO channel capacities that were derived from measurements taken with the HCDC testbed for three special locations: a narrow corridor, an athletics field that is surrounded by a metal fence, and a parkade. These locations are special because the channel capacities are different from what is expected for a typical indoor or outdoor channel. For two of the cases, a ray-tracing analysis has been performed and the simulated channel capacity values closely match the values calculated from the measured data. A ray-tracing analysis, however, requires accurate geometrical measurements and sophisticated modeling for each specific location. A MIMO testbed is ideal for quickly obtaining accurate channel capacity information.
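    Channel capacities of the kind reported here are conventionally computed from measured channel matrices with the equal-power MIMO formula C = log2 det(I + (snr/Nt) H H^H). A minimal sketch (function name and test values are illustrative, not taken from the paper):

```python
import numpy as np

# Equal-power MIMO capacity C = log2 det(I + (snr/Nt) * H H^H) in bit/s/Hz,
# evaluated for a (possibly measured) complex channel matrix H.
def mimo_capacity(H, snr_linear):
    nr, nt = H.shape
    G = H @ H.conj().T                       # receive-side Gram matrix
    M = np.eye(nr) + (snr_linear / nt) * G
    return float(np.real(np.log2(np.linalg.det(M))))
```

    For a 2x2 identity channel at a linear SNR of 10, this gives 2*log2(6), roughly 5.17 bit/s/Hz, illustrating the linear-in-antennas scaling that measured capacities are compared against.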

  5. Operation Duties on the F-15B Research Testbed

    NASA Technical Reports Server (NTRS)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  6. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  7. Extending the Information Commons: From Instructional Testbed to Internet2

    ERIC Educational Resources Information Center

    Beagle, Donald

    2002-01-01

    The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

  8. Smart Antenna UKM Testbed for Digital Beamforming System

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Misran, Norbahiah; Yatim, Baharudin

    2009-12-01

    A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is developed based on a modular design employing two novel designs: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna is developed using the novel LIEH microstrip patch element design arranged into a uniform linear array antenna. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm in an independent manner, thus allowing the smart antenna system to be developed and tested in parallel, hence reducing the design time. The DBS was developed using a high-performance floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to interface the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88-2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.
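    Digital beamforming on a four-element uniform linear array of the kind described reduces to applying complex weights across the elements. A hedged sketch (element spacing, look angles, and names are assumed for illustration, not from the paper):

```python
import numpy as np

# Narrowband delay-and-sum beamforming for a uniform linear array (ULA).
# Element count, half-wavelength spacing, and angles are illustrative.
def steering_vector(n_elements, d_over_lambda, theta_rad):
    k = np.arange(n_elements)
    # Progressive phase shift across the array for arrival angle theta
    return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta_rad))

def beamform(snapshots, weights):
    # snapshots: (n_elements, n_samples) complex baseband samples;
    # output is the weighted sum across elements for each sample
    return weights.conj() @ snapshots

# Steering the 4-element array toward 0.3 rad gives unit gain there
a = steering_vector(4, 0.5, 0.3)
w = a / 4.0
```

    Signals arriving from other directions combine with mismatched phases and are attenuated, which is the mechanism a DBS exploits in software.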

  9. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical-network and application-stratum resources, which allows services to be accommodated in optical networks. This study extends that work to consider the optimization of multiple resource dimensions, namely radio, optical, and BBU processing, in the 5G age. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical, and BBU resources to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load is also quantitatively evaluated in terms of resource occupation rate and path provisioning latency, and compared with another provisioning scheme. PMID:27137302

  10. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the next generation space telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing final obtained wavefronts from different tests. The developmental cryogenic active telescope testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryocapable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments, cryogenic low-hold-power actuators, or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design will be discussed.

  11. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  12. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    Stratus clouds are common in the Arctic during the summer months and help modulate the arctic climate. (Formats available at JPL; August 23, 2000.)

  13. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and deliver a simulated measurement to a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server), the PyData stack (numpy, scipy etc.) and RabbitMQ (an open-source queuing library). Sensor data is encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
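    The two-layer idea, a ground-truth model queried by synthetic sensors that add measurement error, can be sketched in a few lines. This stand-in omits the Flask/HTTP transport and JSON-LD encoding, and the grade field and noise model are invented for illustration:

```python
import json
import random

# Stand-in for the two-layer design: a ground-truth model layer queried by
# a synthetic sensor that adds measurement error and returns a JSON record.
def ground_truth_grade(x, y):
    # Hypothetical implicit geological model: a smooth grade field
    return 2.0 + 0.01 * x - 0.005 * y

def synthetic_sensor(x, y, noise_sd=0.1, rng=None):
    rng = rng or random.Random(0)
    observed = ground_truth_grade(x, y) + rng.gauss(0.0, noise_sd)
    return json.dumps({"x": x, "y": y, "grade": observed, "sd": noise_sd})
```

    Wrapping `synthetic_sensor` in a web endpoint would recover the microservice shape described in the abstract; the separation between the two functions is the point of the design, since the control layer never sees the ground truth directly.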

  14. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions which has traditionally been handled internally is Identity and Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has also been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are three ways used to evaluate and analyze the proposed model. PMID:24701149

  16. Climate zones for maritime clouds

    SciTech Connect

    White, A.B.; Ruffieux, D.; Fairall, C.W.

    1995-04-01

    In this paper we use a commercially available lidar ceilometer to investigate how the basic structure of marine boundary-layer clouds varies for four different marine climate regimes. We obtained most of the data used in this analysis from ship-based ceilometer measurements recorded during several different atmospheric and oceanographic field programs conducted in the Atlantic and Pacific oceans. For comparison, we show the results obtained at a mid-latitude continental location and at an ice camp on the Arctic ice shelf. For each analyzed case, we use an extended time series to generate meaningful cloud base and cloud fraction statistics. The Vaisala CT 12K ceilometer uses a GaAs diode laser to produce short (150 ns), high-intensity pulses of infrared radiation (904 nm wavelength). The return signals from a large number of consecutive pulses are coherently summed to boost the signal-to-noise ratio. Each resulting 30-s profile of backscattered power (15-m resolution) is analyzed to detect cloud layers using a specified cloud detection limit. In addition to measurements of cloud base, the ceilometer can also provide information on cloud fraction using a time series of the "cloud" or "no cloud" status reported in the 30-s data.
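    The cloud statistics described reduce to counting detections in the 30-s time series. A minimal sketch (the data representation and values are invented for illustration):

```python
# Cloud statistics from a ceilometer time series: each 30-s profile either
# reports a detected cloud-base height in metres, or None for "no cloud".
def cloud_stats(profiles):
    bases = [h for h in profiles if h is not None]
    fraction = len(bases) / len(profiles)          # cloud fraction
    mean_base = sum(bases) / len(bases) if bases else float("nan")
    return fraction, mean_base
```

    For example, a five-profile series with three detections at 450, 465, and 480 m yields a cloud fraction of 0.6 and a mean cloud base of 465 m.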

  17. Openings

    PubMed Central

    Selwyn, Peter A.

    2015-01-01

    Reviewing his clinic patient schedule for the day, a physician reflects on the history of a young woman he has been caring for over the past 9 years. What starts out as a routine visit then turns into a unique opening for communication and connection. A chance glimpse out the window of the exam room leads to a deeper meditation on parenthood, survival, and healing, not only for the patient but also for the physician. How many missed opportunities have we all had, without even realizing it, to allow this kind of fleeting but profound opening? PMID:26195687

  19. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  20. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential to significantly enhance scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, the third of a three-volume set containing the results of the TTPP, presents summaries of the experiments. The experiment summarized here involves the evaluation of the current Internet for file and image transfer between SIRTF instrument teams. The main issue addressed was current network response times.

  1. Amplitude variations on the Extreme Adaptive Optics testbed

    SciTech Connect

    Evans, J; Thomas, S; Dillon, D; Gavel, D; Phillion, D; Macintosh, B

    2007-08-14

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. At the Laboratory for Adaptive Optics on the Extreme Adaptive Optics testbed, we have already demonstrated wavefront control of better than 1 nm rms within controllable spatial frequencies. Corresponding contrast measurements, however, are limited by amplitude variations, including those introduced by the micro-electrical-mechanical-systems (MEMS) deformable mirror. Results from experimental measurements and wave-optics simulations of amplitude variations on the ExAO testbed are presented. We find systematic intensity variations of about 2% rms, and intensity variations with the MEMS of about 6%. Some errors are introduced by phase and amplitude mixing because the MEMS is not conjugate to the pupil, but independent measurements of MEMS reflectivity suggest that some error is introduced by small non-uniformities in the reflectivity.

  2. Development of a FDIR Validation Test-Bed

    NASA Astrophysics Data System (ADS)

    Andersson, Jan; Cederman, Daniel; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2014-08-01

    This paper describes work being performed by Aeroflex Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow testing of concepts, strategy mechanisms and tools related to FDIR. The resulting facility will have the capabilities to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing the activity is in the preliminary design phase.

  3. FDIR Validation Test-Bed Development and Results

    NASA Astrophysics Data System (ADS)

    Karlsson, Alexander; Sakthivel, Anandhavel; Aberg, Martin; Andersson, Jan; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2015-09-01

    This paper describes work being performed by Cobham Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow testing of concepts, strategy mechanisms and tools related to FDIR. The resulting facility will have the capabilities to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing the activity is in the validation phase.

  4. A single-axis testbed for slewing control experiments

    NASA Technical Reports Server (NTRS)

    Hamilton, Jonathan; Lee, Gordon K. F.; Juang, Jer-Nan

    1990-01-01

    A simple single-axis testbed is described, and initial experimental results are presented to illustrate collocated and noncollocated control for this structure. The testbed is made up of a pair of single-axis flexible beams attached to a DC servo motor. An optical encoder and strain gauges provide hub and beam position information, respectively. The system is driven by an IBM PC system; with a motor controller, a programmable digital filter processes position error information through user-selected gains and pole-zero configurations. A 25-kHz data acquisition system provides the necessary interface between processor and motor. The control approaches currently being investigated include collocated PD control and noncollocated phase compensation.
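    The collocated PD approach mentioned above can be sketched on the simplest rigid single-axis model, J*theta'' = u; the gains, inertia, and time step below are invented for illustration and omit the flexible-beam dynamics and noncollocated sensing the testbed studies:

```python
# Collocated PD hub control of a rigid single-axis model J*theta'' = u,
# simulated with a semi-implicit Euler integrator. Parameter values are
# illustrative, not the testbed's.
def simulate_pd(theta_ref, kp, kd, inertia=1.0, dt=0.001, t_end=5.0):
    theta, omega = 0.0, 0.0
    for _ in range(int(t_end / dt)):
        u = kp * (theta_ref - theta) - kd * omega   # PD law on hub angle
        omega += (u / inertia) * dt                 # integrate hub rate
        theta += omega * dt                         # then hub angle
    return theta
```

    With kp = 20 and kd = 8 on unit inertia, the closed-loop poles sit at -4 ± 2j, so a unit step settles to the reference well within the 5 s horizon; adding the flexible beams introduces the lightly damped modes that motivate the noncollocated phase compensation studied on the testbed.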

  5. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
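    AppScale's API compatibility with Google App Engine means an application written against the App Engine request model can be deployed unmodified on-premise or over other cloud fabrics. As an illustrative sketch only (this handler and the `invoke` helper are hypothetical, not taken from the AppScale codebase), a portable WSGI-style handler of the kind such a PaaS hosts might look like:

```python
def application(environ, start_response):
    """A tiny WSGI app routed on PATH_INFO, as a minimal GAE-style app would be."""
    path = environ.get("PATH_INFO", "/")
    body = b"Hello from a portable PaaS app" if path == "/" else b"Not found"
    status = "200 OK" if path == "/" else "404 Not Found"
    start_response(status, [("Content-Type", "text/plain"),
                            ("Content-Length", str(len(body)))])
    return [body]

def invoke(app, path):
    """Call a WSGI app directly with a minimal environ (no server required)."""
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(app({"PATH_INFO": path, "REQUEST_METHOD": "GET"}, start_response))
    return captured["status"], body
```

    Because the handler depends only on the WSGI contract, the same code runs under any compliant container, which is the portability property the record attributes to AppScale.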

  6. System integration of a Telerobotic Demonstration System (TDS) testbed

    NASA Technical Reports Server (NTRS)

    Myers, John K.

    1987-01-01

    The concept for and status of a telerobotic demonstration system testbed that integrates teleoperation and robotics is described. The components of the telerobotic system are described and the ongoing projects are discussed. The system can be divided into two sections: the autonomous subsystems, and the additional interface and support subsystems, including teleoperations. The workings of each subsystem by itself and how the subsystems integrate into a complete system are discussed.

  7. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios. PMID:21833092

  8. High energy laser testbed for accurate beam pointing control

    NASA Astrophysics Data System (ADS)

    Kim, Dojong; Kim, Jae Jun; Frist, Duane; Nagashima, Masaki; Agrawal, Brij

    2010-02-01

    Precision laser beam pointing is a key technology in High Energy Laser systems. In this paper, a laboratory High Energy Laser testbed developed at the Naval Postgraduate School is introduced. System identification is performed and a mathematical model is constructed to estimate system performance. New beam pointing control algorithms are designed based on this mathematical model. It is shown in both computer simulation and experiment that the adaptive filter algorithm can improve the pointing performance of the system.
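    The record credits an adaptive filter algorithm with improving pointing performance. A common choice for rejecting periodic jitter is an LMS adaptive filter driven by a reference sensor; the following is a toy sketch of that general technique (invented signals and gains, not the NPS testbed's actual algorithm), in which the filter learns to cancel a sinusoidal disturbance from a phase-shifted reference:

```python
import math

def lms_cancel(disturbance, reference, mu=0.05, taps=4):
    """LMS adaptive FIR filter: adapt tap weights so the filter's output
    tracks the disturbance, leaving a shrinking residual pointing error."""
    w = [0.0] * taps
    buf = [0.0] * taps
    errors = []
    for d, r in zip(disturbance, reference):
        buf = [r] + buf[:-1]                      # shift reference into tap line
        y = sum(wi * xi for wi, xi in zip(w, buf))
        e = d - y                                 # residual error after cancellation
        w = [wi + 2 * mu * e * xi for wi, xi in zip(w, buf)]
        errors.append(e)
    return errors

# Sinusoidal jitter; the reference (e.g. from an inertial sensor) is phase-shifted.
n = 2000
dist = [math.sin(0.1 * k) for k in range(n)]
ref = [math.sin(0.1 * k + 0.5) for k in range(n)]
errs = lms_cancel(dist, ref)
```

    Since a two-tap FIR can realize any amplitude and phase at a single frequency, the residual error converges toward zero, which is the qualitative behavior the record reports.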

  9. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.
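    The testbed's insistence on stable, long-lived interfaces (Ethernet, GPIB) is what makes remote, unattended operation practical: LAN-attached instruments generally answer newline-terminated SCPI queries over a plain TCP socket. The sketch below (hypothetical device strings and a stand-in "instrument" thread; real instruments commonly listen on TCP port 5025) shows the query pattern:

```python
import socket
import threading

def fake_instrument(server_sock):
    """Stand-in for a LAN-attached test instrument: answers one SCPI query.
    Real instruments speak the same newline-terminated protocol."""
    conn, _ = server_sock.accept()
    with conn:
        cmd = conn.makefile().readline().strip()
        if cmd == "*IDN?":
            conn.sendall(b"ExampleCo,Model42,SN123,1.0\n")

def scpi_query(host, port, command, timeout=5.0):
    """Send one SCPI command over TCP and return the instrument's reply line."""
    with socket.create_connection((host, port), timeout=timeout) as s:
        s.sendall(command.encode() + b"\n")
        return s.makefile().readline().strip()

# Wire client and fake instrument together on a loopback socket for demonstration.
srv = socket.socket()
srv.bind(("127.0.0.1", 0))
srv.listen(1)
port = srv.getsockname()[1]
threading.Thread(target=fake_instrument, args=(srv,), daemon=True).start()
reply = scpi_query("127.0.0.1", port, "*IDN?")
```

    Because the transport is ordinary TCP, a failed instrument can be swapped for any replacement that speaks the same standard protocol, which is the maintainability property the record emphasizes.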

  10. Testbed for ROADM and WXC Based Metro WDM Networks

    NASA Astrophysics Data System (ADS)

    Zong, Lei; Ji, Philip; Xu, Lei; Wang, Ting; Matsuda, Osamu; Cvijetic, Milorad

    2005-11-01

    A testbed for metro wavelength division multiplexing (WDM) networks is realized and tested. The testbed contains a reconfigurable optical add/drop multiplexer (ROADM) node, a 2x2 wavelength cross-connect (WXC) node, and two interconnected two-fiber bidirectional path protected switching ring networks (TF-BPSR). Both the ROADM and the WXC node are bidirectional, so they can select channels from the working and the protection ring networks simultaneously, and they support both protected and unprotected services. The ROADM node uses a flexible band tunable filter (FBTF) to drop a waveband from the input WDM signals and send the express channels directly to the output port. As a result, the physical impairment accumulated on the express channels can be minimized. It also has a modular structure, so additional modules can be cascaded to expand the capacity and functionality of the node without any interruption to current services. The WXC node is realized with interconnected ROADM modules that are comprised of wavelength selective switches (WSSes). Arbitrary wavelengths or wavelength sets can be either dropped in the node or cross-connected in a non-blocking manner. Multiple services, such as OC-48 and OC-192 SONET signals, gigabit Ethernet streams carrying interactive movie signals, and live video broadcasting services, are carried in the network, dropped in the ROADM and WXC nodes, and switched between the two ring networks. The testbed is controlled by a web-server-based network management system that facilitates remote control and monitoring. Experiments demonstrate that the performance of the nodes and the testbed meets the requirements of the services.

  11. Planning and reasoning in the JPL telerobot testbed

    NASA Technical Reports Server (NTRS)

    Peters, Stephen; Mittman, David; Collins, Carol; Omeara, Jacquie; Rokey, Mark

    1990-01-01

    The Telerobot Interactive Planning System is developed to serve as the highest autonomous-control level of the Telerobot Testbed. A recent prototype is described which integrates an operator interface for supervisory control, a task planner supporting disassembly and re-assembly operations, and a spatial planner for collision-free manipulator motion through the workspace. Each of these components is described in detail. Descriptions of the technical problem, approach, and lessons learned are included.

  12. The Photovoltaic Engineering Testbed: Design options and trade-offs

    NASA Astrophysics Data System (ADS)

    Landis, Geoffrey A.; Sexton, Andrew; Abramczyk, Richard; Francz, Joseph; Johnson, D. B.; Yang, Liu; Minjares, Daniel; Myers, James

    2000-01-01

    The Photovoltaic Engineering Testbed (PET) is a space-exposure test facility to fly on the International Space Station to calibrate, test, and qualify advanced solar cell types in the space environment. The purpose is to reduce the cost of validating new technologies and bringing them to spaceflight readiness by measuring them in the in-space environment. This paper reviews engineering options considered for flying PET on the International Space Station, and presents the current status of development.

  13. Development and experimentation of an eye/brain/task testbed

    NASA Technical Reports Server (NTRS)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit space and defense users, paraplegics, and those monitoring tedious displays (nuclear power plants, air defense, etc.).

  14. Cyber security analysis testbed : combining real, emulation, and simulation.

    SciTech Connect

    Villamarin, Charles H.; Eldridge, John M.; Van Leeuwen, Brian P.; Urias, Vincent E.

    2010-07-01

    Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed novel methods to combine these evaluation platforms into a hybrid testbed that combines real, emulated, and simulated components. The combination of real, emulated, and simulated components enables the analysis of security features and components of a networked information system. When performing cyber security analysis on a system of interest, it is critical to realistically represent the subject security components in high fidelity. In some experiments, the security component may be the actual hardware and software with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single, unified computing platform. This provides an 'experiment-in-a-box' capability. The result is rapidly-produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.

  15. Visually guided grasping to study teleprogrammation within the BAROCO testbed

    NASA Technical Reports Server (NTRS)

    Devy, M.; Garric, V.; Delpech, M.; Proy, C.

    1994-01-01

    This paper describes vision functionalities required in future orbital laboratories; in such systems, robots will be needed in order to execute the on-board scientific experiments or servicing and maintenance tasks under the remote control of ground operators. To this end, ESA has proposed a robotic configuration called EMATS; a testbed has been developed by ESTEC in order to evaluate the potential of an EMATS-like robot to execute scientific tasks in automatic mode. In the same context, CNES is developing the BAROCO testbed to investigate remote control and teleprogrammation, in which high-level primitives like 'Pick Object A' are provided as basic primitives. In nominal situations, the system has a priori knowledge about the position of all objects. These positions are not very accurate, but this knowledge is sufficient to predict the position of the object to be grasped with respect to the manipulator frame. Vision is required in order to ensure correct grasping and to guarantee good accuracy for the following operations. We describe our results on visually guided grasping of static objects. It may seem a very classical problem for which many results are available; in many cases, however, a realistic evaluation of the accuracy is lacking, because such an evaluation requires tedious experiments. We present several results on calibration of the experimental testbed, on the recognition algorithms required to locate a 3D polyhedral object, and on the grasping itself.

  16. Visible Nulling Coronagraphy Testbed Development for Exoplanet Detection

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-01-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few approaches that works with filled, segmented and sparse or diluted aperture telescope systems and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones of sequentially higher contrasts of 10(exp 8), 10(exp 9) and 10(exp 10) at an inner working angle of 2*lambda/D, and will ultimately culminate in spectrally broadband (>20%) high contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified to a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. Discussed will be the optical configuration, laboratory results, critical technologies and the null sensing and control approach.

  17. The Mini-Mast CSI testbed: Lessons learned

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-01-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.
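    The highly successful SISO controllers mentioned above were of the collocated type, where sensor and actuator share the same coordinate. As a toy illustration of collocated PD control (invented modal parameters and semi-implicit Euler integration, not the actual Mini-Mast model), damping a single lightly damped structural mode:

```python
def simulate_pd(kp, kd, steps=3000, dt=0.002):
    """Collocated PD control of one structural mode (unit modal mass,
    omega ~ 2*pi rad/s, 0.5% damping). Returns a final energy measure."""
    omega, zeta = 6.283, 0.005
    x, v = 1.0, 0.0                      # initial modal deflection and rate
    for _ in range(steps):
        u = -kp * x - kd * v             # collocated sensing: same coordinate as actuation
        a = u - 2 * zeta * omega * v - omega ** 2 * x
        v += a * dt                      # semi-implicit Euler step
        x += v * dt
    return (0.5 * v * v + 0.5 * omega ** 2 * x * x) ** 0.5

residual_closed = simulate_pd(kp=20.0, kd=8.0)
residual_open = simulate_pd(kp=0.0, kd=0.0)
```

    With collocation, the rate feedback term adds pure damping, so the closed-loop residual is orders of magnitude below the open-loop one; noncollocated loops lose this guarantee, consistent with the instabilities the record reports before accurate actuator and sensor models were obtained.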

  18. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    PubMed Central

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155
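    Survey responses destined for a NoSQL table such as DynamoDB are naturally shaped as flat, key-valued items. The sketch below is purely illustrative (the field names are a hypothetical schema, not Survalytics' actual item layout) of how one response record might be structured before upload:

```python
import json
import time
import uuid

def build_survey_item(device_id, question_id, answer, extra_metadata=None):
    """Shape one survey response as a JSON-serializable item of the kind a
    NoSQL table stores (hypothetical schema for illustration)."""
    return {
        "response_id": str(uuid.uuid4()),   # unique key candidate per response
        "device_id": device_id,
        "question_id": question_id,
        "answer": answer,
        "timestamp_utc": int(time.time()),  # when the response was captured
        "metadata": extra_metadata or {},   # arbitrary app analytics fields
    }

example = build_survey_item("device-001", "q_glucose_mmol", "5.4",
                            {"app_version": "1.2", "locale": "en_US"})
```

    Keeping every field JSON-serializable means the same item can be queued locally on the device and flushed to the cloud whenever connectivity allows.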

  19. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network.
Real-time video stream packets
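    Real-time video such as H.264/AVC is conventionally carried over RTP, whose 12-byte fixed header (RFC 3550) carries the sequence number and timestamp a receiver uses to reorder packets and detect loss across handovers. A minimal stdlib sketch of packing and unpacking that header (payload type 96 is a common dynamic assignment for H.264, not a mandated value):

```python
import struct

def pack_rtp_header(seq, timestamp, ssrc, payload_type=96, marker=0):
    """Build the 12-byte RTP fixed header (RFC 3550): version 2,
    no padding, no extension, no CSRC entries."""
    vpxcc = 2 << 6                                  # version=2, P=0, X=0, CC=0
    mpt = (marker << 7) | payload_type              # marker bit + 7-bit payload type
    return struct.pack("!BBHII", vpxcc, mpt,
                       seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)

def unpack_rtp_header(data):
    """Parse the fixed header fields back out of the first 12 bytes."""
    vpxcc, mpt, seq, ts, ssrc = struct.unpack("!BBHII", data[:12])
    return {"version": vpxcc >> 6, "marker": mpt >> 7,
            "payload_type": mpt & 0x7F, "seq": seq,
            "timestamp": ts, "ssrc": ssrc}
```

    Because the sequence number and timestamp survive a change of access network, the receiver can keep a continuous playout across a Wi-Fi-to-WiMAX handover of the kind the testbed evaluates.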

  20. 75 FR 64258 - Cloud Computing Forum & Workshop II

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-19

    ...NIST announces the Cloud Computing Forum & Workshop II to be held on November 4 and 5, 2010. This workshop will provide information on a Cloud Computing Roadmap Strategy as well as provide an updated status on NIST efforts to help develop open standards in interoperability, portability and security in cloud computing. The goals of this workshop are: Public announcement of the Cloud Computing......

  1. Development of Liquid Propulsion Systems Testbed at MSFC

    NASA Technical Reports Server (NTRS)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost-effective techniques of propulsion system design, manufacturing and test is imperative in the current budget-constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA-developed elements, but it can be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  2. The computational structural mechanics testbed generic structural-element processor manual

    NASA Technical Reports Server (NTRS)

    Stanley, Gary M.; Nour-Omid, Shahram

    1990-01-01

    The usage and development of structural finite element processors based on the CSM Testbed's Generic Element Processor (GEP) template is documented. By convention, such processors have names of the form ESi, where i is an integer. This manual is therefore intended for both Testbed users who wish to invoke ES processors during the course of a structural analysis, and Testbed developers who wish to construct new element processors (or modify existing ones).

  3. Sensor Networking Testbed with IEEE 1451 Compatibility and Network Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Gurkan, Deniz; Yuan, X.; Benhaddou, D.; Figueroa, F.; Morris, Jonathan

    2007-01-01

    Design and implementation of a testbed for testing and verifying IEEE 1451-compatible sensor systems with network performance monitoring is of significant importance. Measurement of performance parameters, together with the implementation of decision-support systems, will enhance the understanding of sensor systems with plug-and-play capabilities. The paper presents the design aspects of such a testbed environment under development at the University of Houston in collaboration with NASA Stennis Space Center - SSST (Smart Sensor System Testbed).

  4. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Technical Reports Server (NTRS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-01-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  5. Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test S

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test Sets and Networks Integration Management Office Testing for the Tracking and Data Relay Satellite System

  6. Wavefront Amplitude Variation of TPF's High Contrast Imaging Testbed: Modeling and Experiment

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Lowman, Andrew E.; Moody, Dwight C.; Niessner, Albert F.; Trauger, John T.

    2005-01-01

    Knowledge of wavefront amplitude is as important as knowledge of phase for a coronagraphic high contrast imaging system. Efforts have been made to understand the various contributions to the amplitude variation in Terrestrial Planet Finder's (TPF) High Contrast Imaging Testbed (HCIT). Modeling of HCIT with as-built mirror surfaces has shown an amplitude variation of 1.3% due to phase-amplitude mixing in the testbed's front-end optics. Experimental measurements on the testbed have shown that the amplitude variation is about 2.5%, with the testbed's illumination pattern making the major contribution to the low-order amplitude variation.

  7. Cloud Control

    ERIC Educational Resources Information Center

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  8. Cloud Cover

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  9. Cloud Control

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    Your learning curriculum needs a new technological platform, but you don't have the expertise or IT equipment to pull it off in-house. The answer is a learning system that exists online, "in the cloud," where learners can access it anywhere, anytime. For trainers, cloud-based coursework often means greater ease of instruction resulting in greater…

  10. EXPERT: An atmospheric re-entry test-bed

    NASA Astrophysics Data System (ADS)

    Massobrio, F.; Viotto, R.; Serpico, M.; Sansone, A.; Caporicci, M.; Muylaert, J.-M.

    2007-06-01

    In recognition of the importance of an independent European access to the International Space Station (ISS) and in preparation for the future needs of exploration missions, ESA is conducting parallel activities to generate flight data using atmospheric re-entry test-beds and to identify vehicle design solutions for human and cargo transportation vehicles serving the ISS and beyond. The EXPERT (European eXPErimental Re-entry Test-bed) vehicle represents the major on-going development in the first class of activities. In due time, its results may also benefit scientific missions to planets with an atmosphere and future reusable launcher programmes. The objective of EXPERT is to provide a test-bed for the validation of aerothermodynamics models, codes and ground test facilities in a representative flight environment, to improve the understanding of issues related to analysis, testing and extrapolation to flight. The vehicle will be launched on a sub-orbital trajectory using a Volna missile. The EXPERT concept is based on a symmetrical re-entry capsule whose shape is composed of simple geometrical elements. The suborbital trajectory will reach 120 km altitude and a re-entry velocity of 5-6 km/s. The capsule is 1.6 m high and 1.3 m in diameter; the overall mass is in the range of 250-350 kg, depending upon the mission parameters and the payload/instrumentation complement. A considerable number of scientific experiments are foreseen on-board, from an innovative air data system to shock wave/boundary layer interaction, and from sharp hot-structure characterisation to natural and induced regime transition. Currently the project is approaching completion of phase B, with Alenia Spazio leading the industrial team and CIRA coordinating the scientific payload development under ESA contract.

  11. The Living With a Star Space Environment Testbed Experiments

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  12. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    The dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high quality field measurements (e.g. ARM data).
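    The hindcast protocol above restarts the model from reanalysis at 00Z every day. A small illustrative sketch (the function name is invented, not part of the CAPT tooling) that enumerates those daily initialization times for one year:

```python
from datetime import datetime, timedelta

def forecast_init_times(year):
    """Yield the 00Z initialization time for each day of the given year,
    mirroring a daily-restart hindcast protocol."""
    t = datetime(year, 1, 1)              # first 00Z initialization
    while t.year == year:
        yield t
        t += timedelta(days=1)            # next day's 00Z restart

inits_1997 = list(forecast_init_times(1997))
```

    Each yielded time would correspond to one multi-day forecast launched from the reanalysis state valid at that instant.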

  13. The CSM testbed matrix processors internal logic and dataflow descriptions

    NASA Technical Reports Server (NTRS)

    Regelbrugge, Marc E.; Wright, Mary A.

    1988-01-01

    This report constitutes the final report for subtask 1 of Task 5 of NASA Contract NAS1-18444, Computational Structural Mechanics (CSM) Research. This report contains a detailed description of the coded workings of selected CSM Testbed matrix processors (i.e., TOPO, K, INV, SSOL) and of the arithmetic utility processor AUS. These processors and the current sparse matrix data structures are studied and documented. Items examined include: details of the data structures, interdependence of data structures, data-blocking logic in the data structures, processor data flow and architecture, and processor algorithmic logic flow.

  14. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  15. The Advanced Orbiting Systems Testbed Program: Results to date

    NASA Technical Reports Server (NTRS)

    Otranto, John F.; Newsome, Penny A.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Recommendations for Packet Telemetry (PT) and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's (GSFC's) AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations.

  16. Experimental validation of docking and capture using space robotics testbeds

    NASA Astrophysics Data System (ADS)

    Spofford, John

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  17. The Living With a Star Program Space Environment Testbed

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  18. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    NASA Technical Reports Server (NTRS)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  19. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview is provided of research into a low-cost hypersonic flight test capability intended to increase the amount of hypersonic flight data and to help bridge the large developmental gap between ground testing/analysis and major X-plane flight demonstrators. The major objectives included: developing an air-launched missile booster research testbed; accurately delivering research payloads to hypersonic test conditions through programmable guidance; keeping costs low; achieving a high flight rate (a minimum of two flights per year); and utilizing surplus air-launched missiles and NASA aircraft.

  20. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John

    1991-01-01

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  1. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. This required extensive software development to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both the command of the payload and the display of data created by the payload.
The verification of

  2. Performance of the PARCS Testbed Cesium Fountain Frequency Standard

    NASA Technical Reports Server (NTRS)

    Enzer, Daphna G.; Klipstein, William M.

    2004-01-01

    A cesium fountain frequency standard has been developed as a ground testbed for the PARCS (Primary Atomic Reference Clock in Space) experiment, an experiment intended to fly on the International Space Station. We report on the performance of the fountain and describe some of the implementations motivated in large part by flight considerations, but of relevance for ground fountains. In particular, we report on a new technique for delivering cooling and trapping laser beams to the atom collection region, in which a given beam is recirculated three times, effectively providing much more optical power than traditional configurations. Allan deviations down to 10 have been achieved with this method.
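
The Allan deviation quoted above is the standard stability measure for fountain frequency standards. As a hedged illustration (not the PARCS analysis code), a minimal non-overlapping Allan deviation estimator from fractional-frequency samples can be sketched as:

```python
import numpy as np

def allan_deviation(y, taus, rate=1.0):
    """Non-overlapping Allan deviation from fractional-frequency samples.

    y    : 1-D array of fractional frequency values, sampled at `rate` Hz
    taus : iterable of averaging times (s), each a multiple of 1/rate
    """
    y = np.asarray(y, dtype=float)
    out = []
    for tau in taus:
        m = int(round(tau * rate))       # samples per averaging bin
        n_bins = len(y) // m
        if n_bins < 2:
            out.append(np.nan)
            continue
        # average y over consecutive bins of length m
        ybar = y[: n_bins * m].reshape(n_bins, m).mean(axis=1)
        # Allan variance: half the mean squared difference of adjacent bins
        avar = 0.5 * np.mean(np.diff(ybar) ** 2)
        out.append(np.sqrt(avar))
    return np.array(out)
```

For a real fountain one would typically use the overlapping estimator for better confidence at long tau; this sketch shows only the defining formula.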

  3. The Wide-Field Imaging Interferometry Testbed: Recent Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen

    2006-01-01

    We present recent results from the Wide-Field Imaging Interferometry Testbed (WIIT). The data acquired with the WIIT are "double Fourier" data, including both spatial and spectral information within each data cube. We have been working with these data and have started to develop algorithms, implementations, and techniques for reducing them. Such algorithms and tools are of great importance for a number of proposed future missions, including the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Terrestrial Planet Finder Interferometer (TPF-I)/Darwin. Recent results are discussed and future study directions are described.

  4. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

  5. COMPARISON OF MILLIMETER-WAVE CLOUD RADAR MEASUREMENTS FOR THE FALL 1997 CLOUD IOP

    SciTech Connect

    SEKELSKY,S.M.; LI,L.; GALLOWAY,J.; MCINTOSH,R.E.; MILLER,M.A.; CLOTHIAUX,E.E.; HAIMOV,S.; MACE,G.; SASSEN,K.

    1998-03-23

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the continuously operating 35 GHz CART Millimeter-wave Cloud Radar (MMCR) (Moran, 1997), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS) (Sekelsky, 1996), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air (Galloway, 1996), the University of Utah 95 GHz radar, and the dual-antenna Pennsylvania State University 94 GHz radar (Clothiaux, 1995). In this paper the authors discuss several issues relevant to the comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.

  6. PORT: A Testbed Paradigm for On-line Digital Archive Development.

    ERIC Educational Resources Information Center

    Keeler, Mary; Kloesel, Christian

    1997-01-01

    Discusses the Peirce On-line Resource Testbed (PORT), a digital archive of primary data. Highlights include knowledge processing testbeds for digital resource development; Peirce's pragmatism in operation; PORT and knowledge processing; obstacles to archive access; and PORT as a paradigm for critical control in knowledge processing. (AEF)

  7. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... Pilot Program, 73 FR 76,002 (Dec. 15, 2008). \\3\\ The final Phase I test plan and additional information... National Telecommunications and Information Administration Spectrum Sharing Innovation Test-Bed Pilot... conduct in Phase II/III of the Spectrum Sharing Innovation Test-Bed pilot program to assess...

  8. Airborne Subscale Transport Aircraft Research Testbed: Aircraft Model Development

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Langford, William M.; Hill, Jeffrey S.

    2005-01-01

    The Airborne Subscale Transport Aircraft Research (AirSTAR) testbed being developed at NASA Langley Research Center is an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. An integral part of that testbed is a 5.5% dynamically scaled, generic transport aircraft. This remotely piloted vehicle (RPV) is powered by twin turbine engines and includes a collection of sensors, actuators, navigation, and telemetry systems. The downlink for the plane includes over 70 data channels, plus video, at rates up to 250 Hz. Uplink commands for aircraft control include over 30 data channels. The dynamic scaling requirement, which includes dimensional, weight, inertial, actuator, and data rate scaling, presents distinctive challenges in both the mechanical and electrical design of the aircraft. Discussion of these requirements and their implications on the development of the aircraft along with risk mitigation strategies and training exercises are included here. Also described are the first training (non-research) flights of the airframe. Additional papers address the development of a mobile operations station and an emulation and integration laboratory.

  9. Development of optical packet and circuit integrated ring network testbed.

    PubMed

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate < 1×10^-4) operation was achieved with optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated. PMID:22274025

  10. Off-road perception testbed vehicle design and evaluation

    NASA Astrophysics Data System (ADS)

    Spofford, John R.; Herron, Jennifer B.; Anhalt, David J.; Morgenthaler, Matthew K.; DeHerrera, Clinton

    2003-09-01

    Off-road robotics efforts such as DARPA's PerceptOR program have motivated the development of testbed vehicles capable of sustained operation in a variety of terrain and environments. This paper describes the retrofitting of a minimally-modified ATV chassis into such a testbed, which has been used by multiple programs for autonomous mobility development and sensor characterization. Modular mechanical interfaces for sensors and equipment enclosures enabled integration of multiple payload configurations. The electric power subsystem was capable of short-term operation on batteries, with refueled generation for continuous operation. Processing subsystems were mounted in sealed, shock-dampened enclosures with heat exchangers for internal cooling to protect against external dust and moisture. The computational architecture was divided into a real-time vehicle control layer and an expandable high-level processing and perception layer. The navigation subsystem integrated real-time kinematic GPS with a three-axis IMU for accurate vehicle localization and sensor registration. The vehicle software system was based on the MarsScape architecture developed under DARPA's MARS program. Vehicle mobility software capabilities included route planning, waypoint navigation, teleoperation, and obstacle detection and avoidance. The paper describes the vehicle design in detail and summarizes its performance during field testing.

  11. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
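
The framework above scores each diagnostic algorithm (DA) against scenarios with known fault-injection times. As a simplified sketch only (the metric names and `Scenario` interface are illustrative, not the actual ADAPT framework's API), core metrics such as detection rate, false-alarm rate, and detection delay can be computed like this:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Scenario:
    fault_time: Optional[float]   # ground-truth injection time (None = nominal run)
    detect_time: Optional[float]  # time the DA reported a fault (None = none reported)

def benchmark(scenarios):
    """Score a diagnostic algorithm over a set of test scenarios.

    Returns detection rate, false-alarm rate, and mean detection delay --
    a simplified subset of typical DA benchmarking metrics.
    """
    faulty = [s for s in scenarios if s.fault_time is not None]
    nominal = [s for s in scenarios if s.fault_time is None]
    detected = [s for s in faulty
                if s.detect_time is not None and s.detect_time >= s.fault_time]
    false_alarms = [s for s in nominal if s.detect_time is not None]
    delays = [s.detect_time - s.fault_time for s in detected]
    return {
        "detection_rate": len(detected) / len(faulty) if faulty else 0.0,
        "false_alarm_rate": len(false_alarms) / len(nominal) if nominal else 0.0,
        "mean_detection_delay": sum(delays) / len(delays) if delays else None,
    }
```

Real benchmarking frameworks add isolation accuracy and computational-cost metrics on top of these, but the scoring pattern is the same: compare DA reports against injected ground truth per scenario.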

  12. Extreme Adaptive Optics Testbed: Results and Future Work

    SciTech Connect

    Evans, J W; Sommargren, G; Poyneer, L; Macintosh, B; Severson, S; Dillon, D; Sheinis, A; Palmer, D; Kasdin, J; Olivier, S

    2004-07-15

    'Extreme' adaptive optics systems are optimized for ultra-high-contrast applications, such as ground-based extrasolar planet detection. The Extreme Adaptive Optics Testbed at UC Santa Cruz is being used to investigate and develop technologies for high-contrast imaging, especially wavefront control. A simple optical design allows us to minimize wavefront error and maximize the experimentally achievable contrast before progressing to a more complex set-up. A phase shifting diffraction interferometer is used to measure wavefront errors with sub-nm precision and accuracy. We have demonstrated RMS wavefront errors of <1.3 nm and a contrast of >10^-7 over a substantial region using a shaped pupil. Current work includes the installation and characterization of a 1024-actuator Micro-Electro-Mechanical-Systems (MEMS) deformable mirror, manufactured by Boston Micro-Machines, which will be used for wavefront control. In our initial experiments we can flatten the deformable mirror to 1.8-nm RMS wavefront error within a control radius of 5-13 cycles per aperture. Ultimately this testbed will be used to test all aspects of the system architecture for an extrasolar planet-finding AO system.

  13. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

    Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight, and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines, and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, an inertial measurement unit, a 3-axis electronic compass, and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere, and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces, as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
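
The NDVI mentioned above is computed per pixel from the calibrated red and near-infrared channels as (NIR − Red)/(NIR + Red). A minimal sketch (array names are illustrative):

```python
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index from co-registered
    red and near-infrared reflectance arrays (values in [0, 1])."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + eps)  # eps guards against divide-by-zero
```

Dense, healthy vegetation reflects strongly in the NIR and absorbs red, so its NDVI approaches 1; bare soil and water sit near or below 0.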

  14. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

    This paper describes an integration strategy for plug-and-play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall system. They define the critical dependencies that affect the individual modules and thus the behavior of the system. The specified roles, dependencies, and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system, so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. We describe this strategy in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional description of the software modules. A network of distributed objects interacts with one another over the Internet and comprises the plug-compatible software nodes that execute these functions. This paper explores the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.

  15. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  16. A Battery Certification Testbed for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, and external short testing; and battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and to identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  17. Arc Detection With GUIDAR: First Experimental Tests On MXP Testbed

    NASA Astrophysics Data System (ADS)

    Salvador, S. M.; Maggiora, R.; D'Inca, R.; Fuenfgelder, H.

    2011-12-01

    The GUIDAR technology has been proposed for the detection of electric arcs in the transmission lines of antennas for plasma heating and current drive. After a preliminary study to assess the feasibility of this technique, experimental tests with real arcs were conducted on the MXP testbed installed at IPP, Garching. The low-frequency (25 MHz) GUIDAR signal, made of a sequence of short phase-modulated impulses, is up-shifted to around 400 MHz and injected into the transmission line by means of a directional coupler. The echoes are then extracted with another directional coupler and down-shifted again for processing. The analysis is performed at a pulse repetition frequency of 120-165 kHz, enabling arc detection within 6-8 μs. Tests have shown encouraging results, demonstrating the capability of the GUIDAR system to easily detect both high-voltage and, most importantly, low-voltage arcs. The possibility of locating the arcs has also been addressed in the testbed with simulated arcs. The insensitivity of the method to slow changes of the line voltage standing wave ratio (mimicking antenna load variations) was also tested.
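
The quoted 6-8 μs detection latency corresponds to roughly one pulse repetition period at the stated 120-165 kHz rates (assuming detection within about one period); a quick arithmetic check:

```python
# One pulse repetition period at the quoted PRFs, in microseconds.
def pulse_period_us(prf_hz):
    return 1.0 / prf_hz * 1e6

for prf in (120e3, 165e3):
    print(f"PRF {prf / 1e3:.0f} kHz -> period {pulse_period_us(prf):.1f} us")
# -> 8.3 us at 120 kHz, 6.1 us at 165 kHz, matching the 6-8 us figure
```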

  18. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  19. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  20. The Hyperion Project: Partnership for an Advanced Technology Cluster Testbed

    SciTech Connect

    Seager, M; Leininger, M

    2008-04-28

    The Hyperion project offers a unique opportunity to participate in a community-driven testing and development resource at a scale beyond what can be accomplished by one entity alone. Hyperion is a new strategic technology partnership intended to support member-driven development and testing at scale. This partnership will allow commodity clusters to scale up to meet the growing demands of customers' multi-core petascale simulation environments. Hyperion will tightly couple the outstanding research and development capabilities of Lawrence Livermore National Laboratory with leading technology companies, including Cisco, Data Direct Networks, Dell, Intel, LSI, Mellanox, QLogic, RedHat, SuperMicro and Sun. The end goal of this project is to revolutionize cluster computing in fundamental ways by providing the critical software and hardware components for a highly scalable simulation environment. This environment will include support for high performance networking, parallel file systems, operating systems, and cluster management. This goal will be achieved by building a scalable technology cluster testbed that will be fully dedicated to the partners and provide: (1) a scalable development, testing, and benchmarking environment for critical enabling Linux cluster technologies; (2) an evaluation testbed for new hardware and software technologies; and (3) a vehicle for forming long-term collaborations.

  1. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    NASA Astrophysics Data System (ADS)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained from technologies such as LiDAR, SoNAR, SAR, SfM, and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to (x,y,z) data to calculate a suite of (spatial and spectral domain) statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification, on very large point clouds with great computational efficiency. Using tools such as these, it may be possible to detect geomorphic change in surfaces which have undergone minimal elevation difference, for example deflation surfaces which have coarsened but undergone no net elevation change, or surfaces which have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with multibeam echosounder, and topographic data collected with LiDAR.
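
The moving-window approach described above can be reduced to a minimal sketch: bin the (x, y, z) points onto a regular grid and compute one roughness statistic per cell. Function and parameter names here are assumptions; the actual TopCAT/PySESA toolkits compute a much richer suite of spatial- and frequency-domain statistics.

```python
import numpy as np

def gridded_roughness(xyz, cell=1.0):
    """Bin an (N, 3) point cloud onto a regular (x, y) grid and return
    the per-cell standard deviation of elevations about the cell mean --
    one simple roughness statistic of the kind described above
    (illustrative only)."""
    xyz = np.asarray(xyz, dtype=float)
    # Integer cell indices relative to the cloud's lower-left corner.
    ix = np.floor((xyz[:, 0] - xyz[:, 0].min()) / cell).astype(int)
    iy = np.floor((xyz[:, 1] - xyz[:, 1].min()) / cell).astype(int)
    out = {}
    for key in set(zip(ix, iy)):
        z = xyz[(ix == key[0]) & (iy == key[1]), 2]
        # std() is computed about the cell-mean elevation, i.e. a
        # mean-detrended roughness; single-point cells get zero.
        out[key] = z.std() if z.size > 1 else 0.0
    return out
```

A production implementation would vectorize the grouping and fit a planar (rather than constant) trend per window before computing residual statistics.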

  2. NASA/Goddard Space Flight Center's testbed for CCSDS compatible systems

    NASA Technical Reports Server (NTRS)

    Carper, Richard D.

    1993-01-01

    A testbed for flight and ground systems compatible with the Consultative Committee for Space Data Systems (CCSDS) Recommendations has been developed at NASA's Goddard Space Flight Center. The subsystems of an end-to-end CCSDS based data system are being developed. All return link CCSDS telemetry services (except Internet) and both versions of the CCSDS frame formats are being implemented. In key areas of uncertainty, multiple design approaches are being pursued. In addition, key flight-qualifiable hardware components, such as Reed-Solomon encoders, are being developed to complement the testbed element development. The testbed and its capabilities are described. The method of dissemination of the testbed results is given, as are plans to make the testbed capabilities available to outside users. Plans for the development of standardized conformance and compatibility tests are provided.

  3. NASA/Goddard Space Flight Center's testbed for CCSDS compatible systems

    NASA Astrophysics Data System (ADS)

    Carper, Richard D.

    1993-03-01

    A testbed for flight and ground systems compatible with the Consultative Committee for Space Data Systems (CCSDS) Recommendations has been developed at NASA's Goddard Space Flight Center. The subsystems of an end-to-end CCSDS based data system are being developed. All return link CCSDS telemetry services (except Internet) and both versions of the CCSDS frame formats are being implemented. In key areas of uncertainty, multiple design approaches are being pursued. In addition, key flight-qualifiable hardware components, such as Reed-Solomon encoders, are being developed to complement the testbed element development. The testbed and its capabilities are described. The method of dissemination of the testbed results is given, as are plans to make the testbed capabilities available to outside users. Plans for the development of standardized conformance and compatibility tests are provided.

  4. High performance testbed for four-beam infrared interferometric nulling and exoplanet detection.

    PubMed

    Martin, Stefan; Booth, Andrew; Liewer, Kurt; Raouf, Nasrat; Loya, Frank; Tang, Hong

    2012-06-10

    Technology development for a space-based infrared nulling interferometer capable of earthlike exoplanet detection and characterization started in earnest in the last 10 years. At the Jet Propulsion Laboratory, the planet detection testbed was developed to demonstrate the principal components of the beam combiner train for a high performance four-beam nulling interferometer. Early in the development of the testbed, the importance of "instability noise" for nulling interferometer sensitivity was recognized, and the four-beam testbed would produce this noise, allowing investigation of methods for mitigating this noise source. The testbed contains the required features of a four-beam combiner for a space interferometer and performs at a level matching that needed for the space mission. This paper describes in detail the design, functions, and controls of the testbed. PMID:22695670

  5. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; Pauls, T.

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  6. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth, high-latency communication links to study how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The open source IPsec software tested did not meet all of the requirements; software changes were suggested to meet them.

  7. Observational evidence linking precipitation and mesoscale cloud fraction in the southeast Pacific

    NASA Astrophysics Data System (ADS)

    Rapp, Anita D.

    2016-07-01

    Precipitation has been hypothesized to play an important role in the transition of low clouds from closed to open cell cumulus in regions of large-scale subsidence. A synthesis of A-Train satellite measurements is used to examine the relationship between precipitation and mesoscale cloud fraction across a transition region in the southeastern Pacific. Low cloud pixels are identified in 4 years of CloudSat/CALIPSO observations, and the along-track mean cloud fraction within 2.5-500 km surrounding each cloud is calculated. Results show that cloud fraction decreases more rapidly in areas surrounding precipitating clouds than around nonprecipitating clouds. The closed to open cell transition region appears especially sensitive, with the surrounding mesoscale cloud fraction decreasing 30% faster in the presence of precipitation compared to nonprecipitating clouds. There is also dependence on precipitation rate and cloud liquid water path (LWP), with higher rain rates or lower LWP showing larger decreases in surrounding cloud fraction.
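
The analysis above amounts to averaging a cloud mask in windows of growing size around each cloud pixel. A minimal 1-D along-track sketch follows; the function and parameter names are assumptions, and the published analysis works on full CloudSat/CALIPSO track data rather than this toy geometry.

```python
import numpy as np

def mean_cloud_fraction_by_distance(cloud_mask, center_idx, pixel_km, bins_km):
    """Along-track mean cloud fraction in growing windows around one
    cloud pixel, mimicking the 2.5-500 km analysis described above.

    cloud_mask : 1-D boolean array of low-cloud flags along the track
    center_idx : index of the cloud pixel of interest
    pixel_km   : along-track footprint size in km
    bins_km    : half-window sizes, e.g. [2.5, 10, 50, 500]
    """
    cloud_mask = np.asarray(cloud_mask, dtype=bool)
    fractions = []
    for half in bins_km:
        # Convert the half-window from km to a pixel count, clip to the track.
        n = max(1, int(round(half / pixel_km)))
        lo = max(0, center_idx - n)
        hi = min(cloud_mask.size, center_idx + n + 1)
        fractions.append(cloud_mask[lo:hi].mean())
    return fractions
```

Comparing these profiles between precipitating and nonprecipitating center pixels is what reveals the faster fall-off in surrounding cloud fraction reported above.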

  8. SPHERES: Design of a Formation Flying Testbed for ISS

    NASA Astrophysics Data System (ADS)

    Sell, S. W.; Chen, S. E.

    2002-01-01

    The SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) payload is an innovative formation-flying spacecraft testbed currently being developed for use internally aboard the International Space Station (ISS). The purpose of the testbed is to provide a cost-effective, long duration, replenishable, and easily reconfigurable platform with representative dynamics for the development and validation of metrology, formation flying, and autonomy algorithms. The testbed components consist of three 8-inch diameter free-flying "satellites," five ultrasound beacons, and an ISS laptop workstation. Each satellite is self-contained with on-board battery power, cold-gas propulsion (CO2), and processing systems. Satellites use two packs of eight standard AA batteries for approximately 90 minutes of lifetime, while beacons last the duration of the mission powered by a single AA battery. The propulsion system uses pressurized carbon dioxide gas, stored in replaceable tanks, distributed through an adjustable regulator and associated tubing to twelve thrusters located on the faces of the satellites. A Texas Instruments C6701 DSP handles control algorithm data while an FPGA manages all sensor data, timing, and communication processes on the satellite. All three satellites communicate with each other and with the controlling laptop via a wireless RF link. Five ultrasound beacons, located around a predetermined work area, transmit ultrasound signals that are received by each satellite. The system effectively acts as a pseudo-GPS system, allowing the satellites to determine position and attitude and to navigate within the test arena. The payload hardware is predominantly Commercial Off The Shelf (COTS) products with the exception of custom electronics boards, selected propulsion system adaptors, and beacon and satellite structural elements. Operationally, SPHERES will run in short duration test sessions with approximately two weeks between each session.
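
The pseudo-GPS scheme described above recovers a satellite's position from ranges to beacons at known locations. A linearized least-squares sketch is below; it is illustrative only, as the flight software also estimates attitude and works from raw ultrasound time-of-flight measurements.

```python
import numpy as np

def trilaterate(beacons, ranges):
    """Solve for a 3-D position from >= 4 beacon positions and measured
    ranges (the pseudo-GPS idea behind the SPHERES ultrasound system).

    Subtracting the first beacon's equation from the others turns the
    nonlinear system ||x - b_i||^2 = r_i^2 into a linear one in x.
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    b0, r0 = beacons[0], ranges[0]
    # 2 (b_i - b_0) . x = r_0^2 - r_i^2 + ||b_i||^2 - ||b_0||^2
    A = 2.0 * (beacons[1:] - b0)
    rhs = (r0**2 - ranges[1:]**2
           + np.sum(beacons[1:]**2, axis=1) - np.sum(b0**2))
    pos, *_ = np.linalg.lstsq(A, rhs, rcond=None)
    return pos
```

With the five-beacon geometry of the testbed the linear system is overdetermined, so the least-squares solve also averages down ranging noise.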

  9. Space Station technology testbed: 2010 deep space transport

    NASA Astrophysics Data System (ADS)

    Holt, Alan C.

    1993-12-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  10. Space Station technology testbed: 2010 deep space transport

    NASA Technical Reports Server (NTRS)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  11. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and is being adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud, and private cloud. Unlike the others, a private cloud can be implemented within a private network, delivering some of the benefits of cloud computing without their pitfalls. This paper compares typical open source platforms through which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also perform a performance estimation of cloud platform services and develop prototype software as cloud services.

  12. CLOUD CONDENSATION NUCLEI MEASUREMENTS WITHIN CLOUDS

    EPA Science Inventory

    Measurements of the spectra of cloud condensation nuclei (CCN) within and near the boundaries of clouds are presented. Some of the in-cloud measurements excluded the nuclei within cloud droplets (interstitial CCN) while others included all nuclei inside the cloud (total CCN). The...

  13. Astronomy In The Cloud: Using Mapreduce For Image Coaddition

    NASA Astrophysics Data System (ADS)

    Wiley, Keith; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-01-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving object tracking. Since such studies require the highest quality data, methods such as image coaddition, i.e., registration, stacking, and mosaicing, will be critical to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, e.g., asteroids, or transient objects, e.g., supernovae, these datastreams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance. This work is funded by
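
The adaptation of coaddition to MapReduce can be caricatured in a few lines: the map phase keys each registered image by the sky cell it covers, and the reduce phase averages all images sharing a key. The sketch below runs in plain Python with assumed record shapes; the paper's pipeline runs on Hadoop against multi-terabyte SDSS data, with the cell key derived from each image's WCS footprint.

```python
from collections import defaultdict
import numpy as np

def map_phase(image_records):
    """Map: emit (sky_cell, pixel_array) pairs for each input image.
    Records are (cell_key, pixels) here; names are illustrative."""
    for cell, pixels in image_records:
        yield cell, np.asarray(pixels, dtype=float)

def reduce_phase(mapped):
    """Reduce: average all registered images landing on the same sky
    cell -- the coaddition step."""
    groups = defaultdict(list)
    for cell, pixels in mapped:
        groups[cell].append(pixels)
    return {cell: np.mean(stack, axis=0) for cell, stack in groups.items()}
```

In Hadoop the shuffle between the two phases is what moves every image fragment for a given sky cell to the same reducer, so each coadd is built where its inputs already are.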

  14. The computational structural mechanics testbed architecture. Volume 2: The interface

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the third of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 3 describes the CLIP-processor interface and related topics. It is intended only for processor developers.

  15. SIM Interferometer Testbed (SCDU) Status and Recent Results

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; An, Xin; Goullioud, Renaud; Shao, Michael; Shen, Tsae-Pyng; Wehmeier, Udo J.; Weilert, Mark A.; Wang, Xu; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    SIM Lite is a space-borne stellar interferometer capable of searching for Earth-size planets in the habitable zones of nearby stars. This search will require measurement of astrometric angles with sub micro-arcsecond accuracy and optical pathlength differences to 1 picometer by the end of the five-year mission. One of the most significant technical risks in achieving this level of accuracy is from systematic errors that arise from spectral differences between candidate stars and nearby reference stars. The Spectral Calibration Development Unit (SCDU), in operation since 2007, has been used to explore this effect and demonstrate performance meeting SIM goals. In this paper we present the status of this testbed and recent results.

  16. Universal Linear Optics: A Testbed for Optical Quantum Logic

    NASA Astrophysics Data System (ADS)

    Sparrow, Chris; Carolan, Jacques; Harrold, Christopher; Russell, Nicholas; Marshall, Graham; Silverstone, Joshua; Thompson, Mark; Matthews, Jonathan; O'Brien, Jeremy; Laing, Anthony; Martin-Lopez, Enrique; Shadbolt, Peter; Matsuda, Nobuyuki; Oguma, Manabu; Itoh, Mikitaka; Hashimoto, Toshikazu

    Linear optics is a promising platform for scalable quantum information processing. We demonstrate a single reprogrammable optical circuit that is sufficient to implement all possible linear optical protocols up to the size of the circuit [Carolan et al., Science, 349, (2015)]. The system is an ideal testbed for rapidly prototyping new linear optical quantum gates, and for testing known protocols in experimentally realistic scenarios. We use the device to perform a series of postselected and heralded quantum logic gates, including a new scheme for heralded Bell state generation, a key primitive in measurement-based linear optical quantum computation. We propose and demonstrate techniques for efficiently and accurately characterising and verifying these gates' operation. The ability to rapidly reprogram linear optical devices promises to replace a multitude of existing and future prototype systems, pointing the way to applications across quantum technologies.

  17. Test applications for heterogeneous real-time network testbed

    SciTech Connect

    Mines, R.F.; Knightly, E.W.

    1994-07-01

    This paper investigates several applications for a heterogeneous real-time network testbed. The network is heterogeneous in terms of network devices, technologies, protocols, and algorithms. The network is real-time in that its services can provide per-connection end-to-end performance guarantees. Although different parts of the network use different algorithms, all components have the necessary mechanisms to provide performance guarantees: admission control and priority scheduling. Three applications for this network are described in this paper: a video conferencing tool, a tool for combustion modeling using distributed computing, and an MPEG video archival system. Each has minimum performance requirements that must be provided by the network. By analyzing these applications, we provide insights into the traffic characteristics and performance requirements of practical real-time loads.
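
The admission-control mechanism mentioned above can be illustrated with a toy rate-based test: a new connection is accepted only if the total reserved bandwidth stays within link capacity. This is a deliberate simplification with assumed names and capacity figure; real-time admission control of the kind the paper describes also accounts for burstiness and delay bounds.

```python
def admit(reserved_rates_mbps, new_rate_mbps, link_capacity_mbps=100.0):
    """Simple rate-based admission control for a single link.

    reserved_rates_mbps : rates already reserved by admitted connections
    new_rate_mbps       : rate requested by the candidate connection
    Returns True if the new reservation still fits within capacity.
    """
    total = sum(reserved_rates_mbps) + new_rate_mbps
    return total <= link_capacity_mbps
```

A request that would push the reserved total over capacity is refused, which is how the testbed can keep its per-connection guarantees for the connections it has already admitted.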

  18. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John; Schmitz, Eric; Hoff, William

    1991-01-01

    This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.

  19. Photovoltaic Engineering Testbed Designed for Calibrating Photovoltaic Devices in Space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2002-01-01

    Accurate prediction of the performance of solar arrays in space requires that the cells be tested in comparison with a space-flown standard. Recognizing that improvements in future solar cell technology will require an ever-increasing fidelity of standards, the Photovoltaics and Space Environment Branch at the NASA Glenn Research Center, in collaboration with the Ohio Aerospace Institute, designed a prototype facility to allow routine calibration, measurement, and qualification of solar cells on the International Space Station, and then the return of the cells to Earth for laboratory use. For solar cell testing, the Photovoltaic Engineering Testbed (PET) site provides a true air-mass-zero (AM0) solar spectrum. This allows solar cells to be accurately calibrated using the full spectrum of the Sun.

  20. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; Grendreau, Keith C.

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  1. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  2. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

    The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on airports with Super Dense Operations, where new runway configurations (e.g., parallel runways), sequencing, merging, and spacing are among the concepts considered. We model and simulate a complementary airborne and ground system for AM&S to increase the efficiency and capacity of these high density terminal areas. From a ground systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation Systems (ACES) software to model and simulate AM&S concepts and algorithms.

  3. Modular, Rapid Propellant Loading System/Cryogenic Testbed

    NASA Technical Reports Server (NTRS)

    Hatfield, Walter, Sr.; Jumper, Kevin

    2012-01-01

    The Cryogenic Test Laboratory (CTL) at Kennedy Space Center (KSC) has designed, fabricated, and installed a modular, rapid propellant-loading system to simulate rapid loading of a launch-vehicle composite or standard cryogenic tank. The system will also function as a cryogenic testbed for testing and validating cryogenic innovations and ground support equipment (GSE) components. The modular skid-mounted system is capable of flow rates of liquid nitrogen from 1 to 900 gpm (approx. 3.8 to 3,400 L/min), of pressures from ambient to 225 psig (approx. 1.5 MPa), and of temperatures to -320 F (approx. -195 C). The system can be easily validated to flow liquid oxygen at a different location, and could be easily scaled to any particular vehicle interface requirements.

  4. MIT-KSC space life sciences telescience testbed

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A Telescience Life Sciences Testbed is being developed. The first phase of this effort consisted of defining the experiments to be performed, investigating the various possible means of communication between KSC and MIT, and developing software and hardware support. The experiments chosen were two vestibular sled experiments: a study of ocular torsion produced by Y axis linear acceleration, based on the Spacelab D-1 072 Vestibular Experiment performed pre- and post-flight at KSC; and an optokinetic nystagmus (OKN)/linear acceleration interaction experiment. These two experiments were meant to simulate actual experiments that might be performed on the Space Station and to be representative of space life sciences experiments in general in their use of crew time and communications resources.

  5. Simulation to Flight Test for a UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  6. The computational structural mechanics testbed architecture. Volume 1: The language

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the first of a set of five volumes that describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP, and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 1 presents the basic elements of the CLAMP language and is intended for all users.

  7. Intelligent Elements for the ISHM Testbed and Prototypes (ITP) Project

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Park, Han; Schwabacher, Mark; Watson, Michael; Mackey, Ryan; Fijany, Amir; Trevino, Luis; Weir, John

    2005-01-01

    Deep-space manned missions will require advanced automated health assessment capabilities. Requirements such as in-space assembly, long dormant periods and limited accessibility during flight, present significant challenges that should be addressed through Integrated System Health Management (ISHM). The ISHM approach will provide safety and reliability coverage for a complete system over its entire life cycle by determining and integrating health status and performance information from the subsystem and component levels. This paper will focus on the potential advanced diagnostic elements that will provide intelligent assessment of the subsystem health and the planned implementation of these elements in the ISHM Testbed and Prototypes (ITP) Project under the NASA Exploration Systems Research and Technology program.

  8. Easy and hard testbeds for real-time search algorithms

    SciTech Connect

    Koenig, S.; Simmons, R.G.

    1996-12-31

    Although researchers have studied which factors influence the behavior of traditional search algorithms, currently not much is known about how domain properties influence the performance of real-time search algorithms. In this paper we demonstrate, both theoretically and experimentally, that Eulerian state spaces (a superset of undirected state spaces) are very easy for some existing real-time search algorithms to solve: even real-time search algorithms that can be intractable in general are efficient for Eulerian state spaces. Because traditional real-time search testbeds (such as the eight puzzle and gridworlds) are Eulerian, they cannot be used to distinguish between efficient and inefficient real-time search algorithms. It follows that one has to use non-Eulerian domains to demonstrate the general superiority of a given algorithm. To this end, we present two classes of hard-to-search state spaces and demonstrate the performance of various real-time search algorithms on them.
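
    A minimal sketch of the kind of algorithm under study may help: LRTA* (Learning Real-Time A*) run on a small four-connected gridworld, an undirected (hence Eulerian) "easy" testbed of the sort the abstract describes. The maze, function names, and unit edge costs are illustrative, not taken from the paper.

```python
# Minimal LRTA* (Learning Real-Time A*) sketch on a small undirected
# gridworld -- the kind of "easy" Eulerian testbed described above.
# The toy grid and all names are illustrative, not from the paper.

def lrta_star(neighbors, start, goal, h0, max_steps=10_000):
    """Run LRTA*: repeatedly move to the neighbor minimizing one-step
    cost-plus-heuristic, updating the heuristic table along the way.
    Returns the path of visited states (unit edge costs assumed)."""
    h = dict(h0)  # learned heuristic values, seeded with admissible estimates
    path, s = [start], start
    for _ in range(max_steps):
        if s == goal:
            return path
        # one-step lookahead: f(s') = 1 + h(s')
        best = min(neighbors[s], key=lambda n: 1 + h.get(n, 0))
        h[s] = max(h.get(s, 0), 1 + h.get(best, 0))  # LRTA* learning update
        s = best
        path.append(s)
    raise RuntimeError("step limit exceeded")

# 3x3 four-connected grid; Manhattan distance as the initial heuristic
cells = [(x, y) for x in range(3) for y in range(3)]
nbrs = {c: [(c[0] + dx, c[1] + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if (c[0] + dx, c[1] + dy) in cells] for c in cells}
goal = (2, 2)
h0 = {c: abs(c[0] - goal[0]) + abs(c[1] - goal[1]) for c in cells}
print(lrta_star(nbrs, (0, 0), goal, h0))
```

    With a consistent initial heuristic on this obstacle-free grid, the agent walks straight to the goal; the hard-to-search spaces the paper constructs are precisely those where this greedy local behavior degrades.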

  9. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Mike; Munson, Mike; Teate, George

    2006-01-01

    A new testbed for hypersonic flight research is proposed. Known as the Phoenix air-launched small missile (ALSM) flight testbed, it was conceived to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of two unique and very capable flight assets: the United States Navy Phoenix AIM-54 long-range, guided air-to-air missile and the NASA Dryden F-15B testbed airplane. The U.S. Navy retirement of the Phoenix AIM-54 missiles from fleet operation has presented an excellent opportunity for converting this valuable flight asset into a new flight testbed. This cost-effective new platform will fill an existing gap in the test and evaluation of current and future hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform. When launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will be valuable for the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite small-payload air-launched space boosters.

  10. Demo III: Department of Defense testbed for unmanned ground mobility

    NASA Astrophysics Data System (ADS)

    Shoemaker, Chuck M.; Bornstein, Jonathan A.; Myers, Scott D.; Brendle, Bruce E., Jr.

    1999-07-01

    Robotics has been identified by numerous recent Department of Defense (DOD) studies as a key enabling technology for future military operational concepts. The Demo III Program is a multiyear effort encompassing technology development and demonstration on testbed platforms, together with modeling, simulation, and experimentation directed toward optimization of operational concepts to employ this technology. The primary program focus is the advancement of capabilities for autonomous mobility through unstructured environments, concentrating on both perception and intelligent control technology. The scout mission will provide the military operational context for demonstration of this technology, although a significant emphasis is being placed upon both hardware and software modularity to permit rapid extension to other military missions. The Experimental Unmanned Vehicle (XUV) is a small (approximately 1150 kg, V-22 transportable) technology testbed vehicle designed for experimentation with multiple military operational concepts. Currently under development, the XUV is scheduled for roll-out in Summer 1999, with an initial troop experimentation to be conducted in September 1999. Though small and relatively lightweight, modeling has shown the chassis capable of automotive mobility comparable to the current Army lightweight high-mobility, multipurpose, wheeled vehicle (HMMWV). The XUV design couples multisensor perception with intelligent control to permit autonomous cross-country navigation at speeds of up to 32 kph during daylight and 16 kph during hours of darkness. A small, lightweight, highly capable user interface will permit intuitive control of the XUV by troops from current-generation tactical vehicles. When it concludes in 2002, Demo III will provide the military with both the technology and the initial experience required to develop and field the first generation of semi-autonomous tactical ground vehicles for combat, combat support, and logistics applications.

  11. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented, and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle, and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10(exp 8), 10(exp 9), and 10(exp 10), respectively, at inner working angles approaching 2 lambda/D. We discuss the optics, lab results, technologies, and null control, and show evidence that these milestones have been achieved.

  12. Aerosol-cloud interactions in ship tracks using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chun; Christensen, Matthew W.; Diner, David J.; Garay, Michael J.

    2015-04-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds for different cloud types and environmental conditions. By taking advantage of the high spatial resolution multiangle observations available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of the cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were utilized. Under open cell cloud structure the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+38%), cloud top height (+13%), and cloud albedo (+49%) for open cell clouds, whereas for closed cell clouds, little change in cloud properties was observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed was used to derive cloud top divergence. Statistically averaging the results from the identified plume segments to reduce random noise, we found evidence of cloud top divergence in the ship-polluted clouds, whereas the nearby unpolluted clouds showed cloud top convergence, providing observational evidence of a change in local mesoscale circulation associated with enhanced aerosols. Furthermore, open cell polluted clouds revealed stronger cloud top divergence as compared to closed cell clouds, consistent with different dynamical mechanisms driving their responses. 
These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling.

  13. Overview of New Cloud Optical Properties in Air Force Weather Worldwide Merged Cloud Analysis

    NASA Astrophysics Data System (ADS)

    Nobis, T. E.; Conner, M. D.

    2013-12-01

    Air Force Weather (AFW) has documented requirements for real-time cloud analysis to support DoD missions around the world. To meet these needs, AFW utilizes the Cloud Depiction and Forecast System (CDFS) II to develop an hourly cloud analysis. The system creates cloud masks at pixel level from 16 different satellite sources, diagnoses cloud layers, reconciles the pixel-level data to a regular grid by instrument class, and optimally merges the various instrument classes to create a final multi-satellite analysis. In January 2013, Northrop Grumman Corp. delivered a new CDFS II baseline which included the addition of new Cloud Optical Property (COP) variables developed by Atmospheric and Environmental Research, Inc. (AER). The new variables include phase (ice/water), optical depth, ice/water path, and particle size. In addition, the COP schemes have radically changed the derivation of cloud properties such as cloud top height and thickness. The Northrop-developed CDFS II Testbed was used to examine and characterize the behavior of these new variables in order to understand how they are performing, especially between instrument classes. Understanding this behavior allows performance tuning and uncertainty estimation, which will assist users seeking to reason with the data and will be necessary for use in model development and climatology development. This presentation will provide a basic overview of the CDFS II-produced COP variables and show results from experiments conducted on the CDFS II Testbed. Results will include a basic comparison of COP derived using different instrument classes as well as a comparison between pixel-level and derived gridded products, with an eye towards better characterization of uncertainty.

  14. CLOUD CHEMISTRY.

    SciTech Connect

    SCHWARTZ,S.E.

    2001-03-01

    Clouds present substantial concentrations of liquid-phase water, which can potentially serve as a medium for dissolution and reaction of atmospheric gases. The important precursors of acid deposition, SO2 and the nitrogen oxides NO and NO2, are only sparingly soluble in clouds without further oxidation to sulfuric and nitric acids. In the case of SO2, aqueous-phase reactions with hydrogen peroxide and, to a lesser extent, ozone are identified as important processes leading to this oxidation, and methods have been described by which to evaluate the rates of these reactions. The limited solubility of the nitrogen oxides precludes significant aqueous-phase reaction of these species, but gas-phase reactions in clouds can be important, especially at night.
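
    The "sparingly soluble" point can be illustrated with the effective Henry's law solubility of SO2, which grows with pH as dissolved SO2 dissociates to bisulfite. This is a back-of-the-envelope sketch only: the constants are approximate room-temperature textbook values, and the 1 ppb partial pressure is an assumed illustrative level, not a figure from the abstract.

```python
# Back-of-the-envelope effective Henry's law solubility of SO2 in cloud
# water. Constants are approximate 298 K textbook values, used purely
# for illustration.

H_SO2 = 1.2      # physical Henry's constant for SO2, M atm^-1 (approx.)
KA1   = 1.3e-2   # first dissociation, SO2*H2O <-> H+ + HSO3-, M (approx.)

def h_effective(pH):
    """Effective Henry's constant including dissociation to bisulfite."""
    h_plus = 10.0 ** (-pH)
    return H_SO2 * (1.0 + KA1 / h_plus)

p_so2 = 1e-9  # assumed 1 ppb SO2, expressed as partial pressure in atm
for pH in (3.0, 4.5, 6.0):
    s_iv = h_effective(pH) * p_so2  # dissolved S(IV), mol L^-1
    print(f"pH {pH}: H_eff = {h_effective(pH):9.1f} M/atm, "
          f"[S(IV)] = {s_iv:.2e} M")
```

    Even with the dissociation enhancement, the dissolved S(IV) concentrations remain tiny, which is why further aqueous-phase oxidation by H2O2 or O3 is needed to produce significant sulfate.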

  15. Neptune's clouds

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The bright cirrus-like clouds of Neptune change rapidly, often forming and dissipating over periods of several to tens of hours. In this sequence Voyager 2 observed cloud evolution in the region around the Great Dark Spot (GDS). The surprisingly rapid changes that occur between the panels show that in this region Neptune's weather is perhaps as dynamic and variable as that of the Earth. However, the scale is immense by our standards -- the Earth and the GDS are of similar size -- and in Neptune's frigid atmosphere, where temperatures are as low as 55 degrees Kelvin (-360 F), the cirrus clouds are composed of frozen methane rather than Earth's crystals of water ice. The Voyager Mission is conducted by JPL for NASA's Office of Space Science and Applications.

  16. Development of a hybrid cloud parameterization for general circulation models

    SciTech Connect

    Kao, C.Y.J.; Kristjansson, J.E.; Langley, D.L.

    1995-04-01

    We have developed a cloud package with state-of-the-art physical schemes that can parameterize low-level stratus or stratocumulus, penetrative cumulus, and high-level cirrus. Such parameterizations will improve cloud simulations in general circulation models (GCMs). The principal tool in this development comprises the physically based Arakawa-Schubert scheme for convective clouds and the Sundqvist scheme for layered, nonconvective clouds. The term "hybrid" addresses the fact that the generation of high-altitude layered clouds can be associated with preexisting convective clouds. Overall, the cloud parameterization package developed should better determine cloud heating and drying effects in the thermodynamic budget, realistic precipitation patterns, cloud coverage and liquid/ice water content for radiation purposes, and the cloud-induced transport and turbulent diffusion of atmospheric trace gases.

  17. Progress at the starshade testbed at Northrop Grumman Aerospace Systems: comparisons with computer simulations

    NASA Astrophysics Data System (ADS)

    Samuele, Rocco; Varshneya, Rupal; Johnson, Tim P.; Johnson, Adam M. F.; Glassman, Tiffany

    2010-07-01

    We report on progress at the Northrop Grumman Aerospace Systems (NGAS) starshade testbed. The starshade testbed is a 42.8-meter vacuum chamber that replicates the Fresnel number of an equivalent full-scale starshade mission, namely the flagship New Worlds Observer (NWO) configuration. This paper reports on recent upgrades to the testbed and comparisons of previously published experimental results with computer simulations, which show encouraging agreement to within a factor of 1.5. We also report on a new generation of sub-scale starshades that for the first time allow us to exactly match the Fresnel number of a full-scale mission.

  18. The Wide-Field Imaging Interferometry Testbed (WIIT): Recent Progress and Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.; Frey, Bradley J.; Leisawitz, David T.; Lyon, Richard G.; Maher, Stephen F.; Martino, Anthony J.

    2008-01-01

    Continued research with the Wide-Field Imaging Interferometry Testbed (WIIT) has achieved several important milestones. We have moved WIIT into the Advanced Interferometry and Metrology (AIM) Laboratory at Goddard, and have characterized the testbed in this well-controlled environment. The system is now completely automated and we are in the process of acquiring large data sets for analysis. In this paper, we discuss these new developments and outline our future research directions. The WIIT testbed, combined with new data analysis techniques and algorithms, provides a demonstration of the technique of wide-field interferometric imaging, a powerful tool for future space-borne interferometers.

  19. Multiconjugate adaptive optics results from the laboratory for adaptive optics MCAO/MOAO testbed.

    PubMed

    Laag, Edward A; Ammons, S Mark; Gavel, Donald T; Kupke, Renate

    2008-08-01

    We report on the development of wavefront reconstruction and control algorithms for multiconjugate adaptive optics (MCAO) and the results of testing them in the laboratory under conditions that simulate an 8 meter class telescope. The University of California Observatories (UCO) Lick Observatory Laboratory for Adaptive Optics multiconjugate testbed allows us to test wide-field-of-view adaptive optics systems as they might be instantiated in the near future on giant telescopes. In particular, we have been investigating the performance of MCAO using five laser beacons for wavefront sensing and a minimum-variance algorithm for control of two conjugate deformable mirrors. We have demonstrated improved Strehl ratio and enlarged field-of-view performance when compared to conventional AO techniques. We have demonstrated improved MCAO performance with the implementation of a routine that minimizes the generalized isoplanatism when turbulent layers do not correspond to deformable mirror conjugate altitudes. Finally, we have demonstrated suitability of the system for closed loop operation when configured to feed back conditional mean estimates of wavefront residuals rather than the directly measured residuals. This technique has recently been referred to as the "pseudo-open-loop" control law in the literature. PMID:18677374
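
    The minimum-variance approach named in the abstract is, at its core, a regularized least-squares estimate of the phase from wavefront-sensor measurements: x_hat = (A^T Cn^-1 A + Cx^-1)^-1 A^T Cn^-1 y. The sketch below uses randomly generated stand-in matrices rather than real AO geometry; the estimator form is standard, but every dimension and covariance here is an assumption for illustration.

```python
# Minimal minimum-variance wavefront reconstructor sketch (numpy).
# x is the unknown phase vector and y = A x + n the WFS measurements;
# the estimator trades measurement fit against prior phase statistics.
# All matrices are random stand-ins, not a real MCAO configuration.

import numpy as np

rng = np.random.default_rng(0)
n_phase, n_meas = 20, 60
A  = rng.normal(size=(n_meas, n_phase))   # measurement (interaction) matrix
Cn = 0.01 * np.eye(n_meas)                # WFS noise covariance
Cx = np.eye(n_phase)                      # prior phase covariance

# Precompute the linear reconstructor R, so that x_hat = R @ y
Cn_inv = np.linalg.inv(Cn)
R = np.linalg.solve(A.T @ Cn_inv @ A + np.linalg.inv(Cx), A.T @ Cn_inv)

x_true = rng.normal(size=n_phase)                    # simulated phase
y = A @ x_true + rng.normal(scale=0.1, size=n_meas)  # noisy measurements
x_hat = R @ y
print("residual rms:", np.linalg.norm(x_hat - x_true) / np.sqrt(n_phase))
```

    In a real MCAO system A encodes the laser-beacon geometry and deformable-mirror conjugate altitudes, and Cx comes from turbulence statistics; the "pseudo-open-loop" control mentioned above feeds conditional mean estimates of this form back into the loop.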

  20. Our World: Cool Clouds

    NASA Video Gallery

    Learn how clouds are formed and watch an experiment to make a cloud using liquid nitrogen. Find out how scientists classify clouds according to their altitude and how clouds reflect and absorb light...

  1. Complex Clouds

    Atmospheric Science Data Center

    2013-04-16

    The complex structure and beauty of polar clouds are highlighted by these images acquired by the Multi-angle Imaging SpectroRadiometer. The instrument observes the daylit Earth continuously from pole to pole, and every 9 days views the entire globe.

  2. Cloud Front

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA02171 Cloud Front

    These clouds formed in the south polar region. The faintness of the cloud system likely indicates that these are mainly ice clouds, with relatively little dust content.

    Image information: VIS instrument. Latitude -86.7, Longitude 212.3E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically or geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  3. Thin Clouds

    Atmospheric Science Data Center

    2013-04-18

    ... of this montage is a natural-color view of the Caribbean Sea east of the Yucatan Peninsula as seen by MISR's most steeply ... Thin, feathery clouds of ice crystals over the Caribbean Sea.

  4. Optical Network Testbed-Key Enabler in Developing Current and Future Network Solutions

    NASA Astrophysics Data System (ADS)

    Vukovic, Alex; Wu, Jing; Savoie, Michel; Hua, Heng; Campbell, Scott; Zhang, Hanxi

    2005-10-01

    The all-optical network (AON) demonstrator is a trial system-level testbed for the validation and verification of key network building blocks, scalable architectures, as well as control and management solutions for next-generation wavelength division multiplexing (WDM) networks. Developed at the Communications Research Centre (CRC) in Ottawa, ON, Canada, the AON testbed has already validated certain system-level concepts at the physical and upper layers. The paper describes the crucial role of the AON testbed in research, development, and "proof of concept" for both emerging optical technologies at the physical layer (performance characterization) and customer-managed networks at the upper layer (network management). Moreover, it is expected that the AON testbed will continue to be a valuable playground for future developments of emerging technologies, solutions, and applications.

  5. Description of the control system design for the SSF PMAD DC testbed

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Kimnach, Greg L.

    1991-01-01

    The Power Management and Distribution (PMAD) DC Testbed Control System for Space Station Freedom was developed using a top down approach based on classical control system and conventional terrestrial power utilities design techniques. The design methodology includes the development of a testbed operating concept. This operating concept describes the operation of the testbed under all possible scenarios. A unique set of operating states was identified and a description of each state, along with state transitions, was generated. Each state is represented by a unique set of attributes and constraints, and its description reflects the degree of system security within which the power system is operating. Using the testbed operating states description, a functional design for the control system was developed. This functional design consists of a functional outline, a text description, and a logical flowchart for all the major control system functions. Described here are the control system design techniques, various control system functions, and the status of the design and implementation.

  6. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

    A full-scale, two-story, two-by-one-bay steel braced frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings, and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter-story drifts, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and the installed nonstructural systems are investigated.

  7. Carrier Plus: A Sensor Payload for Living With a Star Space Environment Testbed (LWS/SET)

    NASA Technical Reports Server (NTRS)

    Marshall, Cheryl; Moss, Steven; Howard, Regan; LaBel, Kenneth; Grycewicz, Tom; Barth, Janet; Brewer, Dana

    2003-01-01

    The paper discusses the following: 1. Living with a Star (LWS) program: Space Environment Testbed (SET); natural space environment. 2. Carrier Plus: goals and benefits. 3. On-orbit sensor measurements. 4. Carrier Plus architecture. 5. Participation in Carrier Plus.

  8. Preliminary Design of a Galactic Cosmic Ray Shielding Materials Testbed for the International Space Station

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen; Sechkar, Edward A.; Panko, Scott R.

    2012-01-01

    The preliminary design of a testbed to evaluate the effectiveness of galactic cosmic ray (GCR) shielding materials, the MISSE Radiation Shielding Testbed (MRSMAT) is presented. The intent is to mount the testbed on the Materials International Space Station Experiment-X (MISSE-X) which is to be mounted on the International Space Station (ISS) in 2016. A key feature is the ability to simultaneously test nine samples, including standards, which are 5.25 cm thick. This thickness will enable most samples to have an areal density greater than 5 g/sq cm. It features a novel and compact GCR telescope which will be able to distinguish which cosmic rays have penetrated which shielding material, and will be able to evaluate the dose transmitted through the shield. The testbed could play a pivotal role in the development and qualification of new cosmic ray shielding technologies.
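
    The connection between the 5.25 cm sample thickness and the 5 g/sq cm areal-density figure is simple arithmetic: areal density = thickness x mass density. A quick sketch with round, illustrative densities (not material choices from the experiment):

```python
# Areal density = thickness x mass density. A quick check of the claim
# that 5.25 cm thick samples give most materials an areal density above
# 5 g/cm^2. Densities below are round illustrative values in g/cm^3.

THICKNESS_CM = 5.25

materials = {
    "polyethylene": 0.96,
    "water":        1.00,
    "aluminum":     2.70,
}

for name, rho in materials.items():
    areal = THICKNESS_CM * rho  # g/cm^2
    status = "meets" if areal > 5 else "below"
    print(f"{name:12s}: {areal:5.2f} g/cm^2 ({status} the 5 g/cm^2 target)")
```

    Low-density hydrogen-rich shields such as polyethylene sit right at the threshold at this thickness, which is presumably why the fixed 5.25 cm sample depth was chosen.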

  9. TPF Planet Detection Testbed: demonstrating deep, stable nulling and planet detection

    NASA Technical Reports Server (NTRS)

    Martin, Stefan

    2005-01-01

    The design of a testbed being built at the Jet Propulsion Laboratory is described. Simulating a dual chopped Bracewell interferometer, the testbed comprises a four-beam star and planet source and a nulling beam combiner. Since achieving a stable null is of great concern, the testbed has many control systems designed to maintain stability of alignment and optical path difference over long periods of time. Comparisons between the testbed and the flight system are drawn and key performance parameters are discussed. Also discussed is the interaction between designs for phaseplate systems that achromatically invert the electric field of one of each pair of the incoming beams to achieve the null and the choice of fringe-tracking schemes.

  10. Independent Technology Assessment within the Federation of Earth Science Information Partners (ESIP) Testbed

    NASA Astrophysics Data System (ADS)

    Burgess, A. B.; Robinson, E.; Graybeal, J.

    2015-12-01

    The Federation of Earth Science Information Partners (ESIP) is a community of science, data, and information technology practitioners. ESIP's mission is to support the networking and data dissemination needs of our members and the global community. We do this by linking the functional sectors of education, observation, research, and application with the ultimate use of Earth science. Among the services provided to ESIP members is the Testbed, a collaborative forum for the development of technology standards, services, protocols, and best practices. ESIP has partnered with the NASA Advanced Information Systems Technology (AIST) program to integrate independent assessment of Technology Readiness Level (TRL) into the ESIP Testbed. In this presentation we will 1) demonstrate TRL assessment in the ESIP Testbed using three AIST projects, 2) discuss challenges and insights into creating an independent validation/verification framework, and 3) outline the versatility of the ESIP Testbed as applied to other technology projects.

  11. A Real-Time Testbed for Satellite and Terrestrial Communications Experimentation and Development

    NASA Technical Reports Server (NTRS)

    Angkasa, K.; Hamkins, J.; Jao, J.; Lay, N.; Satorius, E.; Zevallos, A.

    1997-01-01

    This paper describes a programmable DSP-based testbed that is employed in the development and evaluation of blind demodulation algorithms to be used in wireless satellite or terrestrial communications systems. The testbed employs a graphical user interface (GUI) to provide independent, real-time control of modulator, channel, and demodulator parameters and also affords real-time observation of various diagnostic signals such as carrier and timing recovery and decoder metrics. This interactive flexibility enables an operator to tailor the testbed parameters and environment to investigate the performance of any arbitrary communications system and channel model. Furthermore, a variety of digital and analog interfaces allow the testbed to be used either as a stand-alone digital modulator or receiver, thereby extending its experimental utility from the laboratory to the field.

  12. Three-dimensional geospatial information service based on cloud computing

    NASA Astrophysics Data System (ADS)

    Zhai, Xi; Yue, Peng; Jiang, Liangcun; Wang, Linnan

    2014-01-01

    Cloud computing technologies can support high-performance geospatial services in various domains, such as smart city and agriculture. Apache Hadoop, an open-source software framework, can be used to build a cloud environment on commodity clusters for storage and large-scale processing of data sets. The Open Geospatial Consortium (OGC) Web 3-D Service (W3DS) is a portrayal service for three-dimensional (3-D) geospatial data. Its performance could be improved by cloud computing technologies. This paper investigates how OGC W3DS could be developed in a cloud computing environment. It adopts the Apache Hadoop as the framework to provide a cloud implementation. The design and implementation of the 3-D geospatial information cloud service is presented. The performance evaluation is performed over data retrieval tests running in a cloud platform built by Hadoop clusters. The evaluation results provide a valuable reference on providing high-performance 3-D geospatial information cloud services.
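
    The map/shuffle/reduce pattern that Hadoop applies to such workloads can be mimicked in a few lines of plain Python. The tile keys and log lines below are hypothetical, purely to illustrate the pattern; they are not drawn from the OGC W3DS specification or the paper.

```python
# Toy illustration of the map/shuffle/reduce pattern used by Hadoop,
# here counting requests per hypothetical 3-D tile key. Pure Python
# stand-in; the log format and keys are invented for illustration.

from collections import defaultdict

def map_phase(log_line):
    """Mapper: emit (tile_key, 1) for each access-log line."""
    tile_key = log_line.split()[1]   # e.g. "GET z3/x5/y2 200"
    yield tile_key, 1

def reduce_phase(key, values):
    """Reducer: sum the counts gathered for one tile key."""
    return key, sum(values)

logs = ["GET z3/x5/y2 200", "GET z3/x5/y2 200", "GET z4/x1/y9 200"]

# shuffle: group mapper output by key, as Hadoop does between phases
groups = defaultdict(list)
for line in logs:
    for k, v in map_phase(line):
        groups[k].append(v)

counts = dict(reduce_phase(k, vs) for k, vs in groups.items())
print(counts)  # {'z3/x5/y2': 2, 'z4/x1/y9': 1}
```

    In the real system the mapper and reducer run in parallel across the Hadoop cluster nodes, which is what makes the W3DS data retrieval scale.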

  13. Versatile simulation testbed for rotorcraft speech I/O system design

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.

    1986-01-01

    A versatile simulation testbed for the design of a rotorcraft speech I/O system is described in detail. The testbed will be used to evaluate alternative implementations of synthesized speech displays and speech recognition controls for the next generation of Army helicopters including the LHX. The message delivery logic is discussed as well as the message structure, the speech recognizer command structure and features, feedback from the recognizer, and random access to controls via speech command.

  14. Validation of the CERTS Microgrid Concept: The CEC/CERTS Microgrid Testbed

    SciTech Connect

    Nichols, David K.; Stevens, John; Lasseter, Robert H.; Eto, Joseph H.

    2006-06-01

    The development of test plans to validate the CERTS Microgrid concept is discussed, including the status of a testbed. Increased application of Distributed Energy Resources on the Distribution system has the potential to improve performance, lower operational costs and create value. Microgrids have the potential to deliver these high value benefits. This presentation will focus on operational characteristics of the CERTS microgrid, the partners in the project and the status of the CEC/CERTS microgrid testbed. Index Terms: Distributed Generation, Distributed Resource, Islanding, Microgrid, Microturbine

  15. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  16. A high-resolution, four-band SAR testbed with real-time image formation

    SciTech Connect

    Walker, B.; Sander, G.; Thompson, M.; Burns, B.; Fellerhoff, R.; Dubbert, D.

    1996-03-01

    This paper describes the Twin-Otter SAR Testbed developed at Sandia National Laboratories. This SAR is a flexible, adaptable testbed capable of operation on four frequency bands: Ka, Ku, X, and VHF/UHF. The SAR features real-time image formation at fine resolution in spotlight and stripmap modes. High-quality images are formed in real time using the overlapped subaperture (OSA) image-formation and phase gradient autofocus (PGA) algorithms.
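At the heart of PGA is a phase-gradient estimator that averages conjugate products of adjacent azimuth samples across range bins. Below is a minimal numpy sketch of that single step on synthetic data; it is a generic illustration, not Sandia's real-time implementation, which also performs center-shifting, windowing, and iteration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic range-compressed data: each of the range bins holds a complex
# scatterer return corrupted by a common azimuth phase-error history.
n_bins, n_pulses = 64, 256
phase_error = np.cumsum(rng.normal(0, 0.1, n_pulses))  # slowly varying error
amplitudes = rng.uniform(0.5, 2.0, n_bins)
data = amplitudes[:, None] * np.exp(1j * phase_error[None, :])

# PGA-style phase-gradient estimate: sum conjugate products of adjacent
# azimuth samples across all range bins, then take the angle.
products = np.sum(np.conj(data[:, :-1]) * data[:, 1:], axis=0)
grad_est = np.angle(products)

# Integrate the gradient to recover the phase-error history (up to a constant).
phase_est = np.concatenate(([0.0], np.cumsum(grad_est)))
```

On this noise-free synthetic data the estimator recovers the applied phase history exactly, up to an arbitrary constant offset.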

  17. Designing an autonomous helicopter testbed: From conception through implementation

    NASA Astrophysics Data System (ADS)

    Garcia, Richard D.

    Miniature Unmanned Aerial Vehicles (UAVs) are currently being researched for a wide range of tasks, including search and rescue, surveillance, reconnaissance, traffic monitoring, fire detection, pipe and electrical line inspection, and border patrol to name only a few of the application domains. Although small/miniature UAVs, including both Vertical Takeoff and Landing (VTOL) vehicles and small helicopters, have shown great potential in both civilian and military domains, including research and development, integration, prototyping, and field testing, these unmanned systems/vehicles are limited to only a handful of university labs. For VTOL type aircraft the number is less than fifteen worldwide! This lack of development is due to both the extensive time and cost required to design, integrate and test a fully operational prototype and the shortcomings of published materials to fully describe how to design and build a "complete" and "operational" prototype system. This dissertation overcomes existing barriers and limitations by describing and presenting in great detail every technical aspect of designing and integrating a small UAV helicopter including the on-board navigation controller, capable of fully autonomous takeoff, waypoint navigation, and landing. The presented research goes beyond previous works by designing the system as a testbed vehicle. This design aims to provide a general framework that will not only allow researchers the ability to supplement the system with new technologies but will also allow researchers to add innovation to the vehicle itself. Examples include modification or replacement of controllers, updated filtering and fusion techniques, addition or replacement of sensors, vision algorithms, Operating Systems (OS) changes or replacements, and platform modification or replacement.
This is supported by the testbed's design to not only adhere to the technology it currently utilizes but to be general enough to adhere to a multitude of

  18. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Michael T.; Munson, Michael J.; Teate, George F.

    2007-01-01

    The Phoenix Air-Launched Small Missile (ALSM) flight testbed was conceived and is proposed to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of the United States Navy Phoenix AIM-54 (Hughes Aircraft Company, now Raytheon Company, Waltham, Massachusetts) long-range, guided air-to-air missile and the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (Edwards, California) F-15B (McDonnell Douglas, now the Boeing Company, Chicago, Illinois) testbed airplane. The retirement of the Phoenix AIM-54 missiles from fleet operation has presented an opportunity for converting this flight asset into a new flight testbed. This cost-effective new platform will fill the gap in the test and evaluation of hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform; when launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will assist the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite-small-payload air-launched space boosters.

  19. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464
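The closed-loop performance such platforms must achieve can be illustrated with a minimal discrete PI loop on a first-order motor model. The plant constants and gains below are illustrative assumptions, not values from the study.

```python
def simulate_pi(setpoint=1.0, kp=2.0, ki=1.0, tau=0.5, dt=0.01, steps=5000):
    """Discrete PI control of a first-order plant tau*dx/dt = -x + u.

    A stand-in for a smartphone-controlled motor test-bed loop; all
    constants are illustrative.
    """
    x, integral = 0.0, 0.0
    for _ in range(steps):
        error = setpoint - x       # sensed error (camera/IMU measurement in the paper)
        integral += error * dt     # integral state removes steady-state error
        u = kp * error + ki * integral
        x += dt * (-x + u) / tau   # forward-Euler step of the motor model
    return x

final = simulate_pi()
```

With these gains the closed loop is stable and the integral action drives the output to the setpoint; in the paper's setting, sensing latency and sampling jitter on the phone are the main threats to this stability.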

  20. The Living With a Star Space Environment Testbeds

    NASA Astrophysics Data System (ADS)

    Brewer, D.; Barth, J.; Label, K.

    The Living With a Star (LWS) Space Environment Testbeds (SET) are a series of projects that contain investigations that collect data in space and use it to provide products that improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The improvements reduce requirements for design and operations margins to account for the uncertainties in the space environment and its effects. Reducing the requirements will increase the payload fraction, permit the use of a smaller launch vehicle (thereby reducing mission cost), and/or enable routine operations in new segments of the environment (such as medium Earth orbit, the region from 2000 km to 10,000 km) at costs similar to those for operations below 2000 km. A new SET project starts about every two years, when investigations are selected. Five categories of investigations included in SET projects are: (1) Characterization of the space environment in the presence of a spacecraft; (2) Definition of the mechanisms for materials' degradation and the performance characterization of materials designed for shielding from ionizing radiation; (3) Accommodation and/or mitigation of space environment effects for detectors/sensors; (4) Performance improvement methodology for microelectronics used in space; and, (5) Accommodation and/or mitigation of charging/discharging effects on spacecraft and spacecraft components. All SET projects use secondary access to space and partnering to leverage resources. An overview of the SET segment of the LWS program will be presented.

  1. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Astrophysics Data System (ADS)

    Elges, Michael R.; Ashworth, Barry R.

    When priorities of manually controlled resources are mixed with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and their adherence to cooperation toward a common goal is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information which may have an effect on the desired outcome. Second, the fault manager element, FRAMES, must be informed of the change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for the selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules adhering to the user-modified resource base.

  2. NASA'S Coastal and Ocean Airborne Science Testbed (COAST): Early Results

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Kudela, R. M.; Myers, J. S.; Livingston, J.; Lobitz, B.; Torres-Perez, J.

    2012-12-01

    The NASA Coastal and Ocean Airborne Science Testbed (COAST) project advances coastal ecosystems research and ocean color calibration and validation capability by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. The COAST instrument suite combines a customized imaging spectrometer, a sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data are accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Coastal Airborne In situ Radiometers (C-AIR, Biospherical Instruments, Inc.), developed for COAST airborne campaigns from field-deployed microradiometer instrumentation, provide measurements of apparent optical properties at the land/ocean boundary, including optically shallow aquatic ecosystems. Ship-based measurements allowed validation of airborne measurements. Radiative transfer modeling of in-water measurements from the HyperPro and Compact-Optical Profiling System (C-OPS, the in-water companion to C-AIR) profiling systems allows for comparison of airborne and in situ water-leaving radiance measurements. Results of the October 2011 Monterey Bay COAST mission include preliminary data on coastal ocean color products and coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  3. A Test-Bed Configuration: Toward an Autonomous System

    NASA Astrophysics Data System (ADS)

    Ocaña, F.; Castillo, M.; Uranga, E.; Ponz, J. D.; TBT Consortium

    2015-09-01

    In the context of the Space Situational Awareness (SSA) program of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. In order to fulfill all the security requirements for the TBT project, the use of an autonomous emergency system (AES) is foreseen to monitor the control system. The AES will remotely monitor the health of the observing system and the internal and external environment. It will incorporate both autonomous and interactive actuators to force the protection of the system (i.e., emergency dome close out).

  4. Articulated navigation testbed (ANT): an example of adaptable intrinsic mobility

    NASA Astrophysics Data System (ADS)

    Brosinsky, Chris A.; Hanna, Doug M.; Penzes, Steven G.

    2000-07-01

    An important but oft-overlooked aspect of any robotic system is the synergistic benefit of designing the chassis to have high intrinsic mobility which complements, rather than limits, its system capabilities. This novel concept continues to be investigated by the Defence Research Establishment Suffield (DRES) with the Articulated Navigation Testbed (ANT) Unmanned Ground Vehicle (UGV). The ANT demonstrates high mobility through the combination of articulated steering and a hybrid locomotion scheme which utilizes individually powered wheels on the ends of rigid legs; legs which are capable of approximately 450 degrees of rotation. The configuration can be minimally configured as a 4x4 and modularly expanded to 6x6, 8x8, and so on. This enhanced mobility configuration permits pose control and novel maneuvers such as stepping, bridging, crawling, etc. Resultant mobility improvements, particularly in unstructured and off-road environments, will reduce the resolution with which the UGV sensor systems must perceive the vehicle's surroundings and will decrease the computational requirements of the UGV's perception systems for successful semi-autonomous or autonomous terrain negotiation. This paper reviews critical vehicle developments leading up to the ANT concept, describes the basis for its configuration, and speculates on the impact of the intrinsic mobility concept for UGV effectiveness.

  5. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.

    2008-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched to a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  6. Development of a Testbed for Distributed Satellite Command and Control

    NASA Astrophysics Data System (ADS)

    Zetocha, Paul; Brito, Margarita

    2002-01-01

    At the Air Force Research Laboratory's Space Vehicles Directorate we are investigating and developing architectures for commanding and controlling a cluster of cooperating satellites through prototype development for the TechSat-21 program. The objective of this paper is to describe a distributed satellite testbed that is currently under development and to summarize near term prototypes being implemented for cluster command and control. To design, develop, and test our architecture we are using eight PowerPC 750 VME-based single board computers, representing eight satellites. Each of these computers is hosting the OSE(TM) real-time operating system from Enea Systems. At the core of our on-board cluster manager is ObjectAgent. ObjectAgent is an agent-based object-oriented framework for flight systems, which is particularly suitable for distributed applications. In order to handle communication with the ground as well as to assist with the cluster management we are using the Spacecraft Command Language (SCL). SCL is also at the centerpiece of our ground control station and handles cluster commanding, telemetry decommutation, state-of-health monitoring, and Fault Detection, Isolation, and Resolution (FDIR). For planning and scheduling activities we are currently using ASPEN from NASA/JPL. This paper will describe each of the above components in detail and then present the prototypes being implemented.

  7. Wavefront Control Toolbox for James Webb Space Telescope Testbed

    NASA Technical Reports Server (NTRS)

    Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin

    2007-01-01

    We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to optical models of the James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments present on the segmented primary mirror or the monolithic secondary mirror. The optical models are implemented in the Zemax optical design program, and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical model. Iterative and non-iterative algorithms have been developed that converge to a local minimum of the root-mean-square (rms) wavefront error using a singular value decomposition of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration and a wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between the frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
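The SVD-based correction step the abstract describes can be sketched generically: build a control matrix from the truncated pseudoinverse of the influence-function matrix, then command the actuator move that minimizes the rms wavefront error. The matrix sizes and data below are synthetic; this is a sketch of the technique, not the toolbox's code.

```python
import numpy as np

rng = np.random.default_rng(1)

# Influence matrix: wavefront response (200 samples) per actuator (12 DOF).
n_samples, n_dof = 200, 12
A = rng.normal(size=(n_samples, n_dof))

# A wavefront error produced by some unknown misalignment command.
true_cmd = rng.normal(size=n_dof)
wavefront = A @ true_cmd

# Control matrix via truncated SVD pseudoinverse: discard tiny singular
# values so the correction stays well conditioned.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
keep = s > 1e-8 * s[0]
A_pinv = (Vt.T[:, keep] / s[keep]) @ U[:, keep].T

command = -A_pinv @ wavefront        # actuator move that nulls the error
residual = wavefront + A @ command   # wavefront after applying the move

rms_before = np.sqrt(np.mean(wavefront**2))
rms_after = np.sqrt(np.mean(residual**2))
```

Because this synthetic error lies in the span of the influence functions, the correction drives the rms error to numerical zero; in practice the nonlinearity of the influence functions forces the iteration the abstract mentions.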

  8. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Technical Reports Server (NTRS)

    Elges, Michael R.; Ashworth, Barry R.

    1991-01-01

    When priorities of manually controlled resources are mixed with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and their adherence to cooperation toward a common goal is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information which may have an effect on the desired outcome. Second, the fault manager element, FRAMES, must be informed of the change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for the selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules adhering to the user-modified resource base.

  9. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    SciTech Connect

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
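TORCH's pairing of a kernel definition with an implementation-independent verification test can be sketched as follows: any candidate solver is accepted if its answer passes an algorithmic check (here a residual tolerance) rather than matching a stored reference output. The kernel choice and tolerance below are illustrative, not TORCH's actual kernels.

```python
import numpy as np

def solve_kernel(A, b):
    """Candidate implementation of a 'dense solve' problem. Any algorithm
    (LU factorization here) may be substituted; only the check below matters."""
    return np.linalg.solve(A, b)

def verify(A, b, x, tol=1e-8):
    """Implementation-agnostic verification in the spirit of TORCH: accept
    any x whose relative residual is below tol."""
    return np.linalg.norm(A @ x - b) <= tol * np.linalg.norm(b)

rng = np.random.default_rng(2)
A = rng.normal(size=(50, 50)) + 50 * np.eye(50)  # well-conditioned test matrix
b = rng.normal(size=50)
x = solve_kernel(A, b)
```

Decoupling the acceptance criterion from any particular output lets a co-designed hardware/software solution reorder operations or change precision, exactly the freedom the kernel definitions are meant to preserve.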

  10. Analyses of liquid rocket instabilities using a computational testbed

    SciTech Connect

    Grenda, J.M.; Venkateswaran, S.; Merkle, C.L.

    1994-12-31

    A synergistic hierarchy of numerical and analytical models is used to simulate three-dimensional combustion instability in liquid rocket engines. Existing phenomenological models for vaporization and atomization are used in quasi-steady form to describe the liquid phase processes. In addition to a complete nonlinear numerical model, linearized numerical and closed-form analytical models are used to validate the numerical solution and to obtain initial estimates of stable and unstable operating regimes. All three models are fully three dimensional. The simultaneous application of these approaches permits computationally inexpensive surveys to be performed in rapid parametric fashion for a wide variety of operating conditions. Stability maps obtained from the computations indicate that, when droplet temperature fluctuations are present, vaporization and atomization can drive instability. The presence of droplet temperature fluctuations introduces areas of instability for smaller drop sizes and colder drop temperatures. The computational procedures are demonstrated to accurately capture the three-dimensional wave propagation within the combustion chamber. The validated results indicate excellent amplitude and phase agreement for properly selected grid resolution. The nonlinear model demonstrates limit cycle behavior for growing waves and wave steepening for large-amplitude disturbances. The current work represents a validated computational testbed upon which more comprehensive physical modeling may be incorporated.

  11. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  12. Finite Element Modeling of the NASA Langley Aluminum Testbed Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Pritchard, Joselyn I.; Buehrle, Ralph D.; Pappa, Richard S.

    2002-01-01

    The NASA Langley Aluminum Testbed Cylinder (ATC) was designed to serve as a universal structure for evaluating structural acoustic codes, modeling techniques and optimization methods used in the prediction of aircraft interior noise. Finite element models were developed for the components of the ATC based on the geometric, structural and material properties of the physical test structure. Numerically predicted modal frequencies for the longitudinal stringer, ring frame and dome component models, and six assembled ATC configurations were compared with experimental modal survey data. The finite element models were updated and refined, using physical parameters, to increase correlation with the measured modal data. Excellent agreement, within an average of 1.5% to 2.9%, was obtained between the predicted and measured modal frequencies of the stringer, frame and dome components. The predictions for the modal frequencies of the assembled component Configurations I through V were within an average of 2.9% to 9.1%. Finite element modal analyses were performed for comparison with 3 psi and 6 psi internal pressurization conditions in Configuration VI. The modal frequencies were predicted by applying differential stiffness to the elements with pressure loading and creating reduced matrices for beam elements with offsets inside external superelements. The average disagreement between the measured and predicted differences for the 0 psi and 6 psi internal pressure conditions was less than 0.5%. Comparably good agreement was obtained for the differences between the 0 psi and 3 psi measured and predicted internal pressure conditions.

  13. Clutter and signatures from near infrared testbed sensor

    NASA Astrophysics Data System (ADS)

    Sanderson, R. B.; McCalmont, J. F.; Montgomery, J. B.; Johnson, R. S.; McDermott, D. J.

    2008-04-01

    A new tactical airborne multicolor missile warning testbed was developed as part of an Air Force Research Laboratory (AFRL) initiative focusing on the development of sensors operating in the near infrared, where commercially available silicon detectors can be used. At these wavelengths, the rejection of solar-induced false alarms is a critical issue. Multicolor discrimination provides one of the most promising techniques for improving the performance of missile warning sensors, particularly for heavy clutter situations. This, in turn, requires that multicolor clutter data be collected for both analysis and algorithm development. The developed sensor testbed, as described in previous papers, is a two-camera system with 1004x1004 FPAs coupled with optimized filters integrated with the optics. The collection portion includes a high-speed processor coupled with a high-capacity disk array capable of collecting up to 48 full frames per second. This configuration allows the collection of temporally correlated, radiometrically calibrated data in two spectral bands that provide a basis for evaluating the performance of spectral discrimination algorithms. The presentation will describe background and clutter data collected from ground and flight locations in both detection and guard bands and the statistical analysis to provide a basis for evaluation of sensor performance. In addition, measurements have been made of discrete targets, both threats and false alarms. The results of these measurements have shown the capability of these sensors to provide a useful discrimination capability to distinguish threats from false alarms.

  14. Virtual Testbed Aerospace Operations Center (VT-AOC)

    NASA Astrophysics Data System (ADS)

    Dunaway, Bradley; Broadstock, Tom

    2003-09-01

    The Air Force is conducting research in new technologies for next-generation Aerospace Operations Centers (AOCs). The Virtual Testbed Aerospace Operations Center (VT-AOC) will support advanced research in information technologies that operate in or are closely tied to AOCs. The VT-AOC will provide a context for developing, demonstrating, and testing new processes and tools in a realistic environment. To generate the environment, the VT-AOC will incorporate multiple mixed-resolution simulations that are capable of driving existing and future AOC command and control (C2) systems. The VT-AOC will provide the capability to capture existing or proposed C2 processes and then evaluate them operating in conjunction with new technologies. The VT-AOC will also be capable of connecting with other facilities to support increasingly more complex experiments and demonstrations. Together, these capabilities support key initiatives such as Agile Research and Development/Science and Technology (R&D/S&T), Predictive Battlespace Awareness, and Effects-Based Operations.

  15. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.

  16. NN-SITE: A remote monitoring testbed facility

    SciTech Connect

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-08-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following will discuss the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) will illuminate the role that this testbed facility can play in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide.

  17. High-performance testbed network with ATM technology for neuroimaging

    NASA Astrophysics Data System (ADS)

    Huang, H. K.; Arenson, Ronald L.; Dillon, William P.; Lou, Shyhliang A.; Bazzill, Todd M.; Wong, Albert W. K.; Gould, Robert G.

    1995-05-01

    Today's teleradiology transmits images over telephone lines (from 14,400 bits/sec to 1.5 Mbits/sec). However, the large amount of data commonly produced during an MR or CT procedure can limit some applications of teleradiology. This paper is a progress report on a high-speed (155 Mbits/sec) testbed teleradiology network using asynchronous transfer mode (ATM OC-3) technology for neuroradiology. The network connects the radiology departments of four affiliated hospitals and one MR imaging center within the San Francisco Bay Area with ATM switches through the Pacific Bell ATM main switch at Oakland, California; they are: University of California at San Francisco Hospital and Medical School (UCSF), Mt. Zion Hospital (MZH), San Francisco VA Medical Center (SFVAMC), San Francisco General Hospital (SFGH), and San Francisco Magnetic Resonance Imaging Center (SFMRC). UCSF serves as the expert center, and the ATM switch is connected to its PACS infrastructure; the others are considered satellite sites. Images and related patient data are transmitted from the four satellite sites to the expert center for interpretation and consultation.
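    The practical gap between the two link classes can be illustrated with a simple idealized transfer-time calculation. The 200 MB study size below is an assumed, typical MR/CT figure for illustration only, not a number from the paper, and the calculation ignores protocol overhead and congestion.

```python
# Rough illustration (assumed study size, idealized links): time to move a
# neuroimaging study over a 1.5 Mbit/s telephone-class link versus the
# 155 Mbit/s ATM OC-3 link described in the abstract.

def transfer_time_seconds(study_megabytes, link_megabits_per_sec):
    """Idealized transfer time: bits to send divided by line rate."""
    study_megabits = study_megabytes * 8
    return study_megabits / link_megabits_per_sec

study_mb = 200  # assumed size of a combined MR/CT study, for illustration
t_phone = transfer_time_seconds(study_mb, 1.5)    # ~1067 s (about 18 min)
t_atm   = transfer_time_seconds(study_mb, 155.0)  # ~10.3 s

print(f"1.5 Mbit/s: {t_phone:.0f} s, OC-3: {t_atm:.1f} s")
```

    Even with this crude model, the two-orders-of-magnitude rate difference turns a transfer that blocks a consultation into one that is effectively interactive.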

  18. An integrated dexterous robotic testbed for space applications

    NASA Technical Reports Server (NTRS)

    Li, Larry C.; Nguyen, Hai; Sauer, Edward

    1992-01-01

    An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview is presented of the system hardware and software configurations, and the implementation of subsystem functions is discussed.

  19. Preparing for LISA Data: The Testbed for LISA Analysis Project

    NASA Astrophysics Data System (ADS)

    Finn, Lee Samuel; Benacquista, Matthew J.; Larson, Shane L.; Rubbo, Louis J.

    2006-11-01

    The Testbed for LISA Analysis (TLA) Project aims to facilitate the development, validation, and comparison of different methods for LISA science data analysis by the broad LISA Science Community to meet the special challenges that LISA poses. It includes a well-defined Simulated LISA Data Product (SLDP), which provides a clean interface between the modeling of LISA, the preparation of LISA data, and the analysis of the LISA science data stream; a web-based clearinghouse (at ) providing SLDP software libraries, relevant software, papers and other documentation, and a repository for SLDP data sets; a set of mailing lists for communication between and among LISA simulator developers and LISA science analysts; a problem tracking system for SLDP support; and a program of workshops to allow the burgeoning LISA science community to further refine the SLDP definition, define specific LISA science analysis challenges, and report their results. This proceedings paper describes the TLA Project, the resources it provides immediately, and its future plans, and it invites the participation of the broader community in the furtherance of its goals.

  20. Linear Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA03667 Linear Clouds

    These clouds are located near the edge of the south polar region. The cloud tops are the puffy white features in the bottom half of the image.

    Image information: VIS instrument. Latitude -80.1, Longitude 52.1 East. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  1. Assessing the Performance of Computationally Simple and Complex Representations of Aerosol Processes using a Testbed Methodology

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Ma, P.; Easter, R. C.; Liu, X.; Zaveri, R. A.; Rasch, P.

    2012-12-01

    Predictions of aerosol radiative forcing in climate models still contain large uncertainties, resulting from a poor understanding of certain aerosol processes, the level of complexity of aerosol processes represented in models, and the ability of models to account for sub-grid scale variability of aerosols and processes affecting them. In addition, comparing the performance and computational efficiency of new aerosol process modules used in various studies is problematic because different studies often employ different grid configurations, meteorology, trace gas chemistry, and emissions that affect the temporal and spatial evolution of aerosols. To address this issue, we have developed an Aerosol Modeling Testbed (AMT) to systematically and objectively evaluate aerosol process modules. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from a global climate model, Community Atmosphere Model version 5 (CAM5), has also been ported to WRF so that these parameterizations can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. In this study, we evaluate simple and complex treatments of the aerosol size distribution and secondary organic aerosols using the AMT and measurements collected during three field campaigns: the Megacities Initiative Local and Global Observations (MILAGRO) campaign conducted in the vicinity of Mexico City during March 2006, the

  2. Cloud Interactions

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site]

    Released 1 July 2004 The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity has left its mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during mid-spring near the North Pole. The linear water-ice clouds are now regional in extent and often interact with neighboring cloud systems, as seen in this image. The bottom of the image shows how the interaction can destroy the linear nature. While the surface is still visible through most of the clouds, there is evidence that dust is also starting to enter the atmosphere.

    Image information: VIS instrument. Latitude 68.4, Longitude 258.8 East (101.2 West). 38 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration

  3. Modeling of clouds and radiation for developing parameterizations for general circulation models. Annual report, 1995

    SciTech Connect

    Toon, O.B.; Westphal, D.L.

    1996-07-01

    We have used a hierarchy of numerical models for cirrus and stratus clouds and for radiative transfer to improve the reliability of general circulation models. Our detailed cloud microphysical model includes all of the physical processes believed to control the lifecycles of liquid and ice clouds in the troposphere. We have worked on specific GCM parameterizations for the radiative properties of cirrus clouds, making use of a mesoscale model as the test-bed for the parameterizations. We have also modeled cirrus cloud properties with a detailed cloud physics model to better understand how the radiatively important properties of cirrus are controlled by their environment. We have used another cloud microphysics model to investigate the interactions between aerosols and clouds. This work is some of the first to follow the details of interactions between aerosols and cloud droplets and has shown some unexpected relations between clouds and aerosols. We have also used line-by-line radiative transfer results verified with ARM data to derive a GCMS.

  4. 76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-07

    ...NIST announces the Cloud Computing Forum & Workshop IV to be held on November 2, 3 and 4, 2011. This workshop will provide information on the U.S. Government (USG) Cloud Computing Technology Roadmap initiative. This workshop will also provide an updated status on NIST efforts to help develop open standards in interoperability, portability and security in cloud computing. This event is open to...

  5. Web Solutions Inspire Cloud Computing Software

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  6. Estimating Cloud Cover

    ERIC Educational Resources Information Center

    Moseley, Christine

    2007-01-01

    The purpose of this activity was to help students understand the percentage of cloud cover and make more accurate cloud cover observations. Students estimated the percentage of cloud cover represented by simulated clouds and assigned a cloud cover classification to those simulations. (Contains 2 notes and 3 tables.)

  7. Instrument measures cloud cover

    NASA Technical Reports Server (NTRS)

    Laue, E. G.

    1981-01-01

    Eight solar sensing cells comprise inexpensive monitoring instrument. Four cells always track Sun while other four face sky and clouds. On overcast day, cloud-irradiance sensors generate as much short-circuit current as Sun sensor cells. As clouds disappear, output of cloud sensors decreases. Ratio of two sensor type outputs determines fractional cloud cover.
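    The ratio principle in the abstract can be sketched in a few lines of code. This is an illustrative sketch, not the instrument's actual calibration: the clamping of the ratio into [0, 1] and the example current values are assumptions made for the demonstration.

```python
# Illustrative sketch of the ratio principle: fractional cloud cover from the
# ratio of the sky/cloud sensors' summed short-circuit current to the
# Sun-tracking sensors' summed current. Overcast: both groups read about the
# same (ratio ~1). Clear sky: cloud-sensor output is much smaller (ratio ~0).

def fractional_cloud_cover(cloud_currents, sun_currents):
    """Clamp the summed-current ratio into [0, 1] as a crude cover estimate."""
    ratio = sum(cloud_currents) / sum(sun_currents)
    return max(0.0, min(1.0, ratio))

overcast = fractional_cloud_cover([2.1, 2.0, 2.2, 2.1], [2.1, 2.1, 2.1, 2.1])
clear    = fractional_cloud_cover([0.2, 0.1, 0.2, 0.2], [2.1, 2.1, 2.1, 2.1])
print(f"overcast estimate: {overcast:.2f}, clear-sky estimate: {clear:.2f}")
```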

  8. Description of the SSF PMAD DC testbed control system data acquisition function

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Mackin, Michael; Wright, Theodore

    1992-01-01

    The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced scale representation of the end to end, sources to loads, Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for the Space Station Freedom. A key capability of the testbed is its ability to be configured to address system level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground based systems is evolving. Initially, ground based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF Ground based Control Center Operation. The lower level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. Data

  9. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.

  10. Feedback in Clouds II: UV Photoionisation and the first supernova in a massive cloud

    NASA Astrophysics Data System (ADS)

    Geen, Sam; Hennebelle, Patrick; Tremblin, Pascal; Rosdahl, Joakim

    2016-09-01

    Molecular cloud structure is regulated by stellar feedback in various forms. Two of the most important feedback processes are UV photoionisation and supernovae from massive stars. However, the precise response of the cloud to these processes, and the interaction between them, remains an open question. In particular, we wish to know under which conditions the cloud can be dispersed by feedback, which in turn can give us hints as to how feedback regulates the star formation inside the cloud. We perform a suite of radiative magnetohydrodynamic simulations of a 10^5 solar mass cloud with embedded sources of ionising radiation and supernovae, including multiple supernovae and a hypernova model. A UV source corresponding to 10% of the mass of the cloud is required to disperse the cloud, suggesting that the star formation efficiency should be on the order of 10%. A single supernova is unable to significantly affect the evolution of the cloud. However, energetic hypernovae and multiple supernovae are able to add significant quantities of momentum to the cloud, approximately 10^43 g cm/s of momentum per 10^51 ergs of supernova energy. We argue that supernovae alone are unable to regulate star formation in molecular clouds. We stress the importance of ram pressure from turbulence in regulating feedback in molecular clouds.
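    The scale of the quoted momentum injection can be checked with a back-of-envelope calculation (my interpretation of the abstract's numbers, not a result from the paper): spreading ~1e43 g cm/s over a 1e5 solar-mass cloud changes its bulk velocity by well under 1 km/s per 1e51 erg, whereas the escape speed of such a cloud at plausible radii is several km/s, consistent with the claim that a single supernova cannot disperse it.

```python
# Back-of-envelope scaling of the abstract's figures: momentum injected per
# supernova divided by the cloud mass gives the bulk velocity change.

M_SUN_G = 1.989e33              # solar mass in grams

cloud_mass_g = 1e5 * M_SUN_G    # the simulated 1e5 solar-mass cloud
momentum_per_sn = 1e43          # g cm/s per 1e51 erg, quoted in the abstract

delta_v_cm_s = momentum_per_sn / cloud_mass_g
delta_v_km_s = delta_v_cm_s / 1e5
print(f"bulk velocity change: ~{delta_v_km_s:.2f} km/s per 1e51 erg")
```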

  11. Aerosol-Cloud Interactions in Ship Tracks Using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Y. C.; Christensen, M.; Diner, D. J.; Garay, M. J.; Nelson, D. L.

    2014-12-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds under different cloud types and environmental conditions. Taking advantage of the high spatial resolution multiangle observations uniquely available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine the cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were applied. Under open cell cloud structure, the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+27%), cloud top height (+11%), and cloud albedo (+40%) for open cell clouds, whereas under closed cell clouds, little changes in cloud properties were observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed has been used to derive cloud top divergence. Statistically averaging the results from many plume segments to reduce random noise, we have found that in ship-polluted clouds there is stronger cloud top divergence, and in nearby unpolluted clouds, convergence occurs and leads to downdrafts, providing observational evidence for cloud top entrainment feedback. These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling studies to reduce uncertainties of aerosol indirect forcing. Reference: Chen, Y.-C. et al. Occurrence of lower cloud albedo in ship tracks, Atmos. Chem. Phys., 12, 8223-8235, doi:10.5194/acp-12

  12. Martian Clouds

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site]

    Released 28 June 2004 The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity has left its mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during early spring near the North Pole. The linear 'ripples' are transparent water-ice clouds. This linear form is typical for polar clouds. The black regions on the margins of this image are areas of saturation caused by the build up of scattered light from the bright polar material during the long image exposure.

    Image information: VIS instrument. Latitude 68.1, Longitude 147.9 East (212.1 West). 38 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS

  13. Emulating JWST Exoplanet Transit Observations in a Testbed laboratory experiment

    NASA Astrophysics Data System (ADS)

    Touli, D.; Beichman, C. A.; Vasisht, G.; Smith, R.; Krist, J. E.

    2014-12-01

    The transit technique is used for the detection and characterization of exoplanets. The combination of transit and radial velocity (RV) measurements gives information about a planet's radius and mass, respectively, leading to an estimate of the planet's density (Borucki et al. 2011) and therefore to its composition and evolutionary history. Transit spectroscopy can provide information on atmospheric composition and structure (Fortney et al. 2013). Spectroscopic observations of individual planets have revealed atomic and molecular species such as H2O, CO2 and CH4 in atmospheres of planets orbiting bright stars, e.g. Deming et al. (2013). Transit observations require extremely precise photometry. For instance, a Jupiter transit results in a 1% brightness decrease of a solar-type star, while the Earth causes only a 0.0084% decrease (84 ppm). Spectroscopic measurements require still greater precision, <30 ppm. The Precision Projector Laboratory (PPL) is a collaboration between the Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech) to characterize and validate detectors through emulation of science images. At PPL we have developed a testbed to project simulated spectra and other images onto a HgCdTe array in order to assess precision photometry for transits, weak lensing, etc. for missions and concepts such as JWST, WFIRST, and EUCLID. In our controlled laboratory experiment, the goal is to demonstrate the ability to extract weak transit spectra as expected for NIRCam, NIRISS, and NIRSpec. Two lamps of variable intensity, along with spectral line and photometric simulation masks, emulate the signals from a star only, from a planet only, and finally from a combination of a planet + star. Three masks have been used to simulate spectra in monochromatic light. These masks, which are fabricated at JPL, have a length of 1000 pixels and widths of 2 pixels, 10 pixels, and 1 pixel, corresponding respectively to the JWST instruments noted above. From many-hour long
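    The photometric figures quoted above follow directly from the transit depth formula, depth = (R_planet / R_star)^2. The sketch below reproduces them using standard planetary and solar radii (the radius values are textbook constants, not numbers from the abstract).

```python
# Transit depth is the squared radius ratio: depth = (R_planet / R_star)**2.
# With standard radii this reproduces the quoted ~1% dip for a Jupiter
# transit and ~84 ppm for an Earth transit across a Sun-like star.

R_SUN_KM     = 695_700.0
R_JUPITER_KM = 71_492.0
R_EARTH_KM   = 6_371.0

def transit_depth(r_planet_km, r_star_km=R_SUN_KM):
    """Fractional flux drop during a central transit (opaque disk model)."""
    return (r_planet_km / r_star_km) ** 2

print(f"Jupiter: {transit_depth(R_JUPITER_KM):.4%}")        # ~1.06%
print(f"Earth:   {transit_depth(R_EARTH_KM) * 1e6:.0f} ppm")  # ~84 ppm
```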

  14. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    NASA Technical Reports Server (NTRS)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  15. The Fourier-Kelvin Stellar Interferometer (FKSI): A Progress Report and Preliminary Results from Our Laboratory Testbed

    NASA Technical Reports Server (NTRS)

    Berry, Richard; Rajagopa, J.; Danchi, W. C.; Allen, R. J.; Benford, D. J.; Deming, D.; Gezari, D. Y.; Kuchner, M.; Leisawitz, D. T.; Linfield, R.

    2005-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer for the near-infrared to mid-infrared spectral region (3-8 microns). FKSI is conceived as a scientific and technological pathfinder to TPF/DARWIN as well as SPIRIT, SPECS, and SAFIR. It will also be a high angular resolution system complementary to JWST. The scientific emphasis of the mission is on the evolution of protostellar systems, from just after the collapse of the precursor molecular cloud core, through the formation of the disk surrounding the protostar, the formation of planets in the disk, and eventual dispersal of the disk material. FKSI will also search for brown dwarfs and Jupiter mass and smaller planets, and could also play a very powerful role in the investigation of the structure of active galactic nuclei and extra-galactic star formation. We report additional studies of the imaging capabilities of the FKSI with various configurations of two to five telescopes, studies of the capabilities of FKSI assuming an increase in long wavelength response to 10 or 12 microns (depending on availability of detectors), and preliminary results from our nulling testbed.

  16. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    NASA Astrophysics Data System (ADS)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on this data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with traditional packet compression routines to reduce data transfer per image as well as provide a first-order filtering of which data are most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data is no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking the data deemed most relevant in a double-blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft.
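
    The relevance-ranked downlink queueing described above can be sketched as follows. This is a hypothetical illustration, not AOSAT flight software: each frame gets a cheap relevance score from a connected-components pass (here, the size of the largest bright region), and only the top-ranked frames that fit the downlink budget are queued.

```python
import numpy as np
from scipy import ndimage

def relevance_score(image, threshold=0.9):
    """Cheap relevance proxy: size of the largest bright connected region."""
    mask = image > threshold
    labels, n_regions = ndimage.label(mask)     # connected-components analysis
    if n_regions == 0:
        return 0
    return int(np.bincount(labels.ravel())[1:].max())

def build_downlink_queue(images, budget_bytes, bytes_per_image):
    """Rank frames by relevance; keep only what fits the downlink budget."""
    ranked = sorted(range(len(images)),
                    key=lambda i: relevance_score(images[i]), reverse=True)
    return ranked[:budget_bytes // bytes_per_image]

rng = np.random.default_rng(0)
frames = [rng.random((32, 32)) for _ in range(10)]
frames[3][10:20, 10:20] = 1.0                   # one frame with a large feature
queue = build_downlink_queue(frames, budget_bytes=4096, bytes_per_image=1024)
print(queue)                                    # frame 3 is queued first
```

    On orbit, such a ranking could be recomputed as new frames arrive, so the queue always reflects the most scientifically promising data first; scoring by centroid motion or cluster statistics would slot into the same structure.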

  17. Earthbound Unmanned Autonomous Vehicles (UAVs) As Planetary Science Testbeds

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Bland, G.; Diaz, J. A.; Fladeland, M. M.

    2014-12-01

    Recent advances in the technology of unmanned vehicles have greatly expanded the range of contemplated terrestrial operational environments for their use, including aerial, surface, and submarine. The advances have been most pronounced in the areas of autonomy, miniaturization, durability, standardization, and ease of operation, most notably (especially in the popular press) for airborne vehicles. Of course, for a wide range of planetary venues, autonomy, at high cost in both money and risk, has always been a requirement. Most recently, missions to Mars have also featured an unprecedented degree of mobility. Combining the traditional planetary surface deployment operational and science imperatives with emerging, very accessible, and relatively economical small UAV platforms on Earth can provide flexible, rugged, self-directed, test-bed platforms for landed instruments and strategies that will ultimately be directed elsewhere, and, in the process, provide valuable earth science data. While the most direct transfer of technology from terrestrial to planetary venues is perhaps for bodies with atmospheres (and oceans), with appropriate technology and strategy accommodations, single and networked UAVs can be designed to operate on even airless bodies, under a variety of gravities. In this presentation, we present and use results and lessons learned from our recent earth-bound UAV volcano deployments, as well as our future plans for such, to conceptualize a range of planetary and small-body missions. We gratefully acknowledge the assistance of students and colleagues at our home institutions, and the government of Costa Rica, without which our UAV deployments would not have been possible. This work was carried out, in part, at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.

  18. Geo Light Imaging National Testbed (GLINT): past, present, and future

    NASA Astrophysics Data System (ADS)

    Ford, Stephen D.; Voelz, David G.; Gamiz, Victor L.; Storm, Susan L.; Czyzak, Stanley R.; Oldenettel, Jerry; Hunter, Allen

    1999-09-01

    Object identification in deep space is a surveillance mission crucial to our national defense. Satellite health/status monitoring is another important space surveillance task with both military and civilian applications. Deep space satellites provide challenging targets for ground-based optical sensors due to the extreme range imposed by geo-stationary and geo-synchronous orbits. The Air Force Research Laboratory, in partnership with Trex Enterprises and our other contractor partners, will build a new ground-based sensor to address these challenges. The Geo Light Imaging National Testbed (GLINT) is based on an active imaging concept known as Fourier telescopy. In this technique, the target satellite is illuminated by two or more laser sources. The corresponding fields interfere at the satellite to form interference fringes. These fringes may be made to move across the target by the introduction of a frequency shift between the laser beams. The resulting time-varying laser backscatter contains information about a Fourier component of the target reflectivity and may be collected with a large solar heliostat array. This large unphased receiver provides sufficient signal-to-noise ratio for each Fourier component using relatively low power laser sources. A third laser source allows the application of phase closure in the image reconstruction software. Phase closure removes virtually all low frequency phase distortion and guarantees that the phases of all fringes are relatively fixed. Therefore, the Fourier phase associated with each component can be recovered accurately. This paper briefly reviews the history of Fourier telescopy, the proposed design of the GLINT system, and the future of this research area.
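
    The phase-closure property invoked above can be checked numerically. In this toy sketch (illustrative only, not GLINT processing code), each measured fringe phase is corrupted by the difference of two per-beam phase errors; summing the three fringe phases around the laser triangle cancels those errors identically, leaving the true Fourier phase sum intact.

```python
import numpy as np

rng = np.random.default_rng(1)

# True target Fourier phases on the three baselines of a three-laser triangle
phi12, phi23, phi31 = rng.uniform(-np.pi, np.pi, 3)

# Unknown per-beam phase errors (atmosphere, laser path-length drift)
e1, e2, e3 = rng.uniform(-np.pi, np.pi, 3)

# Each measured fringe phase is corrupted by a difference of beam errors
psi12 = phi12 + (e1 - e2)
psi23 = phi23 + (e2 - e3)
psi31 = phi31 + (e3 - e1)

# Closure phase: the beam errors cancel identically around the triangle
closure_measured = psi12 + psi23 + psi31
closure_true = phi12 + phi23 + phi31
print(np.isclose(closure_measured, closure_true))  # True
```

    This is why low-frequency phase distortion drops out of the reconstruction: any error that attaches to an individual beam, however large, appears twice with opposite signs around the triangle.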

  19. The AMES Photometric Testbed for the Kepler Mission

    NASA Astrophysics Data System (ADS)

    Koch, D.; Witteborn, F.; Dunham, E.; Jenkins, J.; Borucki, W.; Webster, W.

    1999-09-01

    A testbed facility has been constructed to perform end-to-end laboratory tests of the photometric method for finding terrestrial-size planets. The main objective of the facility is to determine the effects of various induced noise sources on the capability of a CCD photometer to maintain an instrument relative precision of better than 1x10^-5. The photometry facility includes: a simulated star field with a source to approximate a solar spectrum, fast optics to simulate the telescope, a thinned back-illuminated CCD similar to those to be used on the spacecraft operating at 1 MHz read rate, shutterless operation, and computers to perform the onboard CCD control and data handling. The test structure is thermally and mechanically isolated. Each source of noise is introduced in a controlled fashion and evaluated as to its contribution to the total noise budget. Pointing noise or changing thermal gradients in the spacecraft can cause star-image motion at the milli-pixel level. These motions are imposed by piezo-electric devices that move the photometer relative to the star field. Signals as small as those associated with terrestrial-size transits of solar-like stars are produced in the facility. This is accomplished by electrical self-heating and expansion of fine wires placed across many of the star apertures. The effective small decrease in stellar brightness is used to demonstrate that terrestrial-size planets can be detected under realistic instrument noise conditions and at the shot-noise-limited level. Algorithms identical to both the onboard and ground processing systems are used to extract and process the data. These processes use differential photometry to construct light curves and search for transits. Examples are presented of the effects of imposing several noise sources and the resulting detectability of planets.
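
    The differential-photometry-and-transit-search step at the end of the abstract can be illustrated with a toy light curve. All numbers are invented, and the injected dip of 1e-3 is far deeper than a real terrestrial transit of a solar-type star (roughly 8e-5) so the toy stays robust: dividing the target flux by the ensemble mean of reference stars removes the common systematic trend, and a sliding box search then locates the dip.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 500
w = 20

# Common systematic trend shared by target and reference stars
trend = 1.0 + 0.01 * np.sin(np.linspace(0, 4 * np.pi, n))
target = trend * (1.0 + 5e-4 * rng.standard_normal(n))
refs = trend[None, :] * (1.0 + 5e-4 * rng.standard_normal((10, n)))

# Inject a transit-like dip of depth 1e-3 starting at sample 200
target[200:200 + w] *= 1.0 - 1e-3

# Differential photometry: divide by the ensemble mean of reference stars
light_curve = target / refs.mean(axis=0)
light_curve /= np.median(light_curve)

# Sliding-box transit search: deepest mean dip over a w-sample window
depths = np.array([1.0 - light_curve[i:i + w].mean() for i in range(n - w)])
print(int(depths.argmax()))  # lands near sample 200, where the dip starts
```

    The ratio cancels the trend exactly because the same multiplicative systematic enters target and references alike; only the uncorrelated per-star noise survives into the box search.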

  20. Crater Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA06085 Crater Clouds

    The crater on the right side of this image is affecting the local wind regime. Note the bright line of clouds streaming off the north rim of the crater.

    Image information: VIS instrument. Latitude -78.8N, Longitude 320.0E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  1. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    NASA Technical Reports Server (NTRS)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between them. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated versus real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  2. The Development of a Smart Distribution Grid Testbed for Integrated Information Management Systems

    SciTech Connect

    Lu, Ning; Du, Pengwei; Paulson, Patrick R.; Greitzer, Frank L.; Guo, Xinxin; Hadley, Mark D.

    2011-07-28

    This paper presents a smart distribution grid testbed to test or compare designs of integrated information management systems (I2MSs). An I2MS extracts and synthesizes information from a wide range of data sources to detect abnormal system behaviors, identify possible causes, assess the system status, and provide grid operators with response suggestions. The objective of the testbed is to provide a modeling environment with sufficient data sources for the I2MS design. The testbed includes five information layers and a physical layer; it generates multi-layer chronological data based on actual measurement playbacks or simulated data sets produced by the physical layer. The testbed models random hardware failures, human errors, extreme weather events, and deliberate tampering attempts to allow users to evaluate the performance of different I2MS designs. Initial results of I2MS performance tests showed that the testbed created a close-to-real-world environment that allowed key performance metrics of the I2MS to be evaluated.
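
    The testbed's generation of multi-layer chronological data with injected anomalies can be sketched as follows. The event classes and rates below are invented illustrations (not values from the paper): each class is an independent Poisson stream, and merging the streams yields a single chronological record an I2MS design could be replayed against.

```python
import heapq
import random

# Hypothetical event classes the testbed injects, with invented mean rates
EVENT_RATES_PER_HOUR = {
    "hardware_failure": 0.5,
    "human_error": 0.2,
    "extreme_weather": 0.05,
    "tampering_attempt": 0.1,
}

def generate_chronology(hours, seed=0):
    """Merge independent Poisson event streams, one per class, into a
    single chronological record of (time_h, event_type) tuples."""
    rng = random.Random(seed)
    heap = []
    for etype, rate in EVENT_RATES_PER_HOUR.items():
        t = 0.0
        while True:
            t += rng.expovariate(rate)          # exponential inter-arrival time
            if t > hours:
                break
            heapq.heappush(heap, (t, etype))
    return [heapq.heappop(heap) for _ in range(len(heap))]

log = generate_chronology(hours=100)
print(log == sorted(log))                       # chronologically ordered: True
```

    In the real testbed the same chronology would drive all six layers at once, so an I2MS under test sees a coherent cross-layer signature for each injected anomaly rather than an isolated event.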

  3. Preliminary results of the LLNL airborne experimental test-bed SAR system

    SciTech Connect

    Miller, M.G.; Mullenhoff, C.J.; Kiefer, R.D.; Brase, J.M.; Wieting, M.G.; Berry, G.L.; Jones, H.E.

    1996-01-16

    The Imaging and Detection Program (IDP) within Laser Programs at Lawrence Livermore National Laboratory (LLNL), in cooperation with the Hughes Aircraft Company, has developed a versatile, high-performance, airborne experimental test-bed (AETB) capability. The test-bed has been developed for a wide range of research and development experimental applications including radar and radiometry plus, with additional aircraft modifications, optical systems. The airborne test-bed capability has been developed within a Douglas EA-3B Skywarrior jet aircraft provided and flown by Hughes Aircraft Company. The current test-bed payload consists of an X-band radar system, a high-speed data acquisition system, and a real-time processing capability. The medium-power radar system is configured to operate in a high-resolution, synthetic aperture radar (SAR) mode and is highly configurable in terms of waveforms, PRF, bandwidth, etc. Antennas are mounted on a 2-axis gimbal in the belly radome of the aircraft which provides pointing and stabilization. Aircraft position and antenna attitude are derived from a dedicated navigational system and provided to the real-time SAR image processor for instant image reconstruction and analysis. This paper presents a further description of the test-bed and payload subsystems plus preliminary results of SAR imagery.

  4. The Development of a Smart Distribution Grid Testbed for Integrated Information Management Systems

    SciTech Connect

    Lu, Ning; Du, Pengwei; Paulson, Patrick R.; Greitzer, Frank L.; Guo, Xinxin; Hadley, Mark D.

    2011-01-31

    This paper presents a smart distribution grid testbed to test or compare designs of integrated information management systems (I2MSs). An I2MS extracts and synthesizes information from a wide range of data sources to detect abnormal system behaviors, identify possible causes, assess the system status, and provide grid operators with response suggestions. The objective of the testbed is to provide a modeling environment with sufficient data sources for the I2MS design. The testbed includes five information layers and a physical layer; it generates multi-layer chronological data based on actual measurement playbacks or simulated data sets produced by the physical layer. The testbed models random hardware failures, human errors, extreme weather events, and deliberate tampering attempts to allow users to evaluate the performance of different I2MS designs. Initial results of I2MS performance tests showed that the testbed created a close-to-real-world environment that allowed key performance metrics of the I2MS to be evaluated.

  5. Testbed for distributed scenario simulations with EW and its effects on C2

    NASA Astrophysics Data System (ADS)

    Tydén, L.; Wigren, C.; Andersson, H.; Olsson, S.

    2007-04-01

    The paper will present a simulation testbed in which a scenario can be set up, simulated and evaluated, and where planning tools, electronic warfare (EW) components and command and control (C2) functionality can be integrated. The testbed is HLA (high level architecture) compliant, allows for a distributed simulation with dynamically configurable federates, and can also be used for testing actual equipment in a simulated scenario. One of the key components in the testbed is a set of planning tools that can be used to show ranges for sensors, jamming and communication systems. These tools can be used not only for planning the mission (e.g. the best route) but also during the mission to show the location of possible threats or the range of own equipment (sensor, jamming, communication) in different situations. During a mission these tools can support decisions about what actions to take. One goal of developing the planning tools in the testbed is to learn how to use planning tools in real-life scenarios; therefore, the planning tools are constantly developed and tested with respect to technical and tactical use. Technical and tactical aspects of current and future EW and C2 equipment can also be tested and developed in the testbed.

  6. High Vertically Resolved Atmospheric and Surface/Cloud Parameters Retrieved with Infrared Atmospheric Sounding Interferometer (IASI)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, WIlliam L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the IASI on the MetOp satellite. IASI possesses an ultraspectral resolution of 0.25 cm^-1 and a spectral coverage from 645 to 2760 cm^-1. Ultraspectral-resolution infrared spectral radiances obtained from near-nadir observations provide atmospheric, surface, and cloud property information. An advanced retrieval algorithm with a fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. This physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiances observed with ultraspectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A one-dimensional (1-D) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression retrieval. The solution is iterated in order to account for non-linearity in the 1-D variational solution. It is shown that relatively accurate temperature and moisture retrievals are achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud-top level are obtained. For both optically thin and thick cloud situations, the cloud-top height can be retrieved with relatively high accuracy (i.e., error < 1 km). Preliminary retrievals of atmospheric soundings, surface properties, and cloud optical/microphysical properties from the IASI observations are obtained and presented. These retrievals will be further inter-compared with those obtained from airborne FTS systems, such as the NPOESS Airborne Sounder Testbed - Interferometer (NAST-I), dedicated dropsondes, radiosondes, and ground-based Raman lidar.
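
    The iterated 1-D variational step can be sketched on a toy problem. Assuming the standard Gauss-Newton form of the update (background x_a, background-error covariance S_a, observation-error covariance S_e, forward model F and Jacobian K), and using an invented two-variable forward model as a stand-in for a radiative transfer code:

```python
import numpy as np

def forward(x):
    """Invented nonlinear 'forward model': 3 channel radiances from 2
    state variables (a stand-in for a fast radiative transfer model)."""
    return np.array([x[0] + 0.5 * x[1] ** 2,
                     np.exp(0.3 * x[0]) - x[1],
                     x[0] * x[1]])

def jacobian(x):
    """Analytic Jacobian K = dF/dx of the toy forward model."""
    return np.array([[1.0, x[1]],
                     [0.3 * np.exp(0.3 * x[0]), -1.0],
                     [x[1], x[0]]])

x_true = np.array([1.2, 0.8])             # "true" state to recover
x_a = np.array([1.0, 1.0])                # background (regression first guess)
Sa_inv = np.linalg.inv(1.0 * np.eye(2))   # inverse background-error covariance
Se_inv = np.linalg.inv(1e-4 * np.eye(3))  # inverse observation-error covariance
y = forward(x_true)                       # noise-free synthetic observation

# Iterated 1-D variational (Gauss-Newton) solution:
# x_{i+1} = x_a + (Sa^-1 + K^T Se^-1 K)^-1 K^T Se^-1 [y - F(x_i) + K (x_i - x_a)]
x = x_a.copy()
for _ in range(10):                       # iteration handles the non-linearity
    K = jacobian(x)
    A = Sa_inv + K.T @ Se_inv @ K
    b = K.T @ Se_inv @ (y - forward(x) + K @ (x - x_a))
    x = x_a + np.linalg.solve(A, b)
print(np.allclose(x, x_true, atol=1e-3))  # True
```

    With tight observation errors the retrieval is pulled almost entirely to the observations; inflating S_e relative to S_a shifts the solution back toward the background, which is the mechanism that stabilizes retrievals below optically thick cloud.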

  7. Testing cloud microphysics parameterizations in NCAR CAM5 with ISDAC and M-PACE observations

    SciTech Connect

    Liu X.; Lin W.; Xie, S.; Boyle, J.; Klein, S. A.; Shi, X.; Wang, Z.; Ghan, S. J.; Earle, M.; Liu, P. S. K.; Zelenyuk, A.

    2011-12-24

    Arctic clouds simulated by the National Center for Atmospheric Research (NCAR) Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus and multilayer or deep frontal clouds. However, for low-level stratocumulus, the model significantly underestimates the observed cloud liquid water content in both seasons. As a result, CAM5 significantly underestimates the surface downward longwave radiative fluxes by 20-40 W m^-2. Introducing a new ice nucleation parameterization slightly improves the model performance for low-level mixed-phase clouds by increasing cloud liquid water content through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen process. The CAM5 single-column model testing shows that changing the instantaneous freezing temperature of rain to form snow from -5 °C to -40 °C causes a large increase in modeled cloud liquid water content through the slowing down of cloud liquid and rain-related processes (e.g., autoconversion of cloud liquid to rain). The underestimation of aerosol concentrations in CAM5 in the Arctic also plays an important role in the low bias of cloud liquid water in the single-layer mixed-phase clouds. In addition, numerical issues related to the coupling of model physics and time stepping in CAM5 are responsible for the model biases and will be explored in future studies.

  8. Testing Cloud Microphysics Parameterizations in NCAR CAM5 with ISDAC and M-PACE Observations

    SciTech Connect

    Liu, Xiaohong; Xie, Shaocheng; Boyle, James; Klein, Stephen A.; Shi, Xiangjun; Wang, Zhien; Lin, Wuyin; Ghan, Steven J.; Earle, Michael; Liu, Peter; Zelenyuk, Alla

    2011-12-24

    Arctic clouds simulated by the NCAR Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed (CAPT) framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus, and multilayer or deep frontal clouds. However, for low-level clouds, the model significantly underestimates the observed cloud liquid water content in both seasons and cloud fraction in the spring season. As a result, CAM5 significantly underestimates the surface downward longwave (LW) radiative fluxes by 20-40 W m^-2. The model with a new ice nucleation parameterization moderately improves the simulations by increasing cloud liquid water content in mixed-phase clouds through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen (WBF) process. The CAM5 single-column model testing shows that a change in the homogeneous freezing temperature of rain to form snow from -5 °C to -40 °C has a substantial impact on the modeled liquid water content through the slowing down of liquid and rain-related processes. In contrast, collections of cloud ice by snow and cloud liquid by rain are of minor importance for single-layer boundary-layer mixed-phase clouds in the Arctic.

  9. Atmospheric Correction of AVIRIS Data of Monterey Bay Contaminated by Thin Cirrus Clouds

    NASA Technical Reports Server (NTRS)

    van den Bosch, Jeannette; Davis, Curtiss O.; Mobley, Curtis D.; Rhea, W. Joseph

    1993-01-01

    AVIRIS scenes are often rejected when cloud cover exceeds 10 percent. However, if the cloud cover is determined to be primarily cirrus rather than cumulus, in-water optical properties may still be extracted over the open ocean.

  10. Corona-producing ice clouds: A case study of a cold mid-latitude cirrus layer

    SciTech Connect

    Sassen, K.; Mace, G.G.; Hallett, J.; Poellot, M.R.

    1998-03-01

    A high (14.0-km), cold (-71.0 °C) cirrus cloud was studied by ground-based polarization lidar and millimeter radar and aircraft probes on the night of 19 April 1994 from the Cloud and Radiation Testbed site in northern Oklahoma. A rare cirrus cloud lunar corona was generated by this 1-2-km-deep cloud, thus providing an opportunity to measure the composition in situ, which had previously been assumed only on the basis of lidar depolarization data and simple diffraction theory for spheres. In this case, corona ring analysis indicated an effective particle diameter of ~22 μm. A variety of in situ data corroborates the approximate ice-particle size derived from the passive retrieval method, especially near the cloud top, where impacted cloud samples show simple solid crystals. The homogeneous freezing of sulfuric acid droplets of stratospheric origin is assumed to be the dominant ice-particle nucleation mode acting in corona-producing cirrus clouds. It is speculated that this process results in a previously unrecognized mode of acid-contaminated ice-particle growth and that such small-particle cold cirrus clouds are potentially a radiatively distinct type of cloud. © 1998 Optical Society of America
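
    The diffraction relationship behind the corona ring analysis can be sketched numerically. Assuming the simple-Airy-pattern treatment the abstract alludes to, the first corona ring (first diffraction minimum) for particles of diameter d appears at sin(theta) = 1.22*wavelength/d. The ring radius below is a hypothetical value chosen to show how a ~22-μm effective diameter falls out, not a measurement from the paper.

```python
import numpy as np

# Fraunhofer diffraction by a sphere of diameter d puts the first corona
# ring (first Airy minimum) at sin(theta) = 1.22 * wavelength / d, so a
# measured ring radius yields an effective particle diameter.
wavelength_um = 0.55        # visible (moon)light, assumed
ring_radius_deg = 1.75      # hypothetical measured first-ring angular radius

theta = np.radians(ring_radius_deg)
d_eff_um = 1.22 * wavelength_um / np.sin(theta)
print(round(d_eff_um))      # 22
```

    The inverse dependence on diameter is why coronas are diagnostic of small particles: a ring only a couple of degrees across already implies particles a few tens of micrometers in size.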

  11. Cloud Infrastructure & Applications - CloudIA

    NASA Astrophysics Data System (ADS)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University has established a new project called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and reports our early experiences in building a private cloud using an existing infrastructure.

  12. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  13. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open-source software platform, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across the mainland of China. Users can use and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.

  14. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.
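
    The intercomparison-of-successive-images idea can be sketched as follows. The sketch substitutes a minimal k-means classifier for the paper's typical-shape-function analysis (a plainly simpler stand-in, not the original method) and estimates the time rate of change of cloud fraction by classifying two successive synthetic "images" against the same class centers; all pixel values are invented.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means classifier (stand-in for the paper's statistical
    cloud classification based on typical shape functions)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        centers = np.array([X[labels == j].mean(0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

# Two successive synthetic "images": (visible brightness, IR temperature)
rng = np.random.default_rng(3)
clear = rng.normal([0.2, 290.0], [0.05, 3.0], (500, 2))
cloud = rng.normal([0.8, 240.0], [0.05, 3.0], (500, 2))
img_t0 = np.vstack([clear, cloud[:100]])   # earlier image, 1/6 cloudy
img_t1 = np.vstack([clear, cloud[:300]])   # later image, 3/8 cloudy

labels0, centers = kmeans(img_t0, k=2)
# Classify the later image against the same class centers, then difference
labels1 = np.argmin(((img_t1[:, None] - centers[None]) ** 2).sum(-1), axis=1)
cloud_class = centers[:, 1].argmin()       # cloud class = colder IR channel
growth = (labels1 == cloud_class).mean() - (labels0 == cloud_class).mean()
print(growth > 0.15)                       # cloud fraction is increasing
```

    Holding the class centers fixed between images is what turns a pair of independent classifications into a time-derivative estimate: the change in class occupancy, not the classes themselves, carries the tendency information.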

  15. Testbed System of Inter-Radio System Switching for Cognitive Radio

    NASA Astrophysics Data System (ADS)

    Hanaoka, Seishi; Yano, Masashi; Hirata, Tetsuhiko

    The cognitive radio system consists of multiple wireless access systems that cover overlapping areas and cognitive terminals that use one or more of the wireless accesses simultaneously. In this paper, we describe the architecture of the cognitive radio system and the inter-system handover protocols. In the architecture, each cognitive terminal, which can access multiple radio systems, operates with a single local IP address. The control sequence and packet format are designed to achieve fast handover among the radio systems. Based on the architecture, we have developed a testbed system. On this system, we demonstrate that data can be delivered continuously and radio systems can be switched correctly without any packet loss. In addition, we present the results of the evaluation of end-to-end latency on the testbed system. These testbed results demonstrate that the system architecture described in the paper can realize a cognitive radio system.

  16. Terahertz standoff imaging testbed design and performance for concealed weapon and device identification model development

    NASA Astrophysics Data System (ADS)

    Franck, Charmaine C.; Lee, Dave; Espinola, Richard L.; Murrill, Steven R.; Jacobs, Eddie L.; Griffin, Steve T.; Petkie, Douglas T.; Reynolds, Joe

    2007-04-01

    This paper describes the design and performance of the U.S. Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate's (NVESD) active 0.640-THz imaging testbed, developed in support of the Defense Advanced Research Projects Agency's (DARPA) Terahertz Imaging Focal-Plane Technology (TIFT) program. The laboratory measurements and standoff images were acquired during the development of an NVESD and Army Research Laboratory terahertz imaging performance model. The imaging testbed is based on a 12-inch-diameter Off-Axis Elliptical (OAE) mirror designed with one focal length at 1 m and the other at 10 m. This paper will describe the design considerations of the OAE-mirror, dual-capability, active imaging testbed, as well as measurement/imaging results used to further develop the model.

  17. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Test-Bed Telescope (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real-time. This contribution describes the current status of the project.

  18. Model-based beam control for illumination of remote objects, part II: laboratory testbed

    NASA Astrophysics Data System (ADS)

    Basu, Santasri; Voelz, David; Chandler, Susan M.; Lukesh, Gordon W.; Sjogren, Jon

    2004-10-01

    When a laser beam propagates through the atmosphere, it is subject to corrupting influences including mechanical vibrations, turbulence and tracker limitations. As a result, pointing errors can occur, causing loss of energy or signal at the target. Nukove Scientific Consulting has developed algorithms to estimate these pointing errors from the statistics of the return photons from the target. To prove the feasibility of this approach for real-time estimation, an analysis tool called RHINO was developed by Nukove. Associated with this effort, New Mexico State University developed a laboratory testbed, the ultimate objective being to test the estimation algorithms under controlled conditions and to stream data into RHINO to prove the feasibility of real-time operation. This paper describes the testbed and the results obtained with RHINO when the testbed was used to evaluate the estimation approach.
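    The abstract does not give Nukove's actual estimator, but the idea of recovering pointing statistics from return-signal statistics can be illustrated with a simple moment-based sketch. Assuming Gaussian per-axis jitter on a Gaussian beam (all names and numbers below are hypothetical, not from RHINO), the mean return intensity alone determines the jitter variance:

```python
import numpy as np

rng = np.random.default_rng(0)
w = 1.0        # beam radius at the target (arbitrary units, hypothetical)
sigma = 0.2    # true rms pointing jitter per axis
n = 200_000    # number of simulated return samples

# Per-pulse pointing offsets and the resulting return from a Gaussian
# beam profile, I = exp(-2 r^2 / w^2), peak-normalized.
dx = rng.normal(0.0, sigma, n)
dy = rng.normal(0.0, sigma, n)
I = np.exp(-2.0 * (dx**2 + dy**2) / w**2)

# For Gaussian jitter, E[I] = 1 / (1 + 4 sigma^2 / w^2), so the jitter
# can be estimated from the mean return intensity alone.
sigma_est = 0.5 * w * np.sqrt(1.0 / I.mean() - 1.0)
```

    Because the estimator uses only a running mean of the return, it is cheap enough to evaluate in real time, which is the feasibility question the testbed was built to answer.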

  19. Designing a Distributed Systems Architecture Testbed for Real-Time Power Grid Systems

    SciTech Connect

    Liu, Yan; Gorton, Ian; Chen, Yousu; Jin, Shuangshuang

    2011-07-09

    Power engineers who are striving to improve the real-time attributes of power grid applications are ill-equipped with software engineering methods and tools that would allow them to rigorously evaluate their designs, taking into account data communication, geographic locations, and high-performance computing capacity. This paper presents a technical approach to designing a testbed for embedding real-time monitoring and computation functionalities into the power grid system. The approach focuses on integrating the parallel computational models with the data management infrastructure for near-real-time power grid state estimation. We study and summarize various forces and requirements that drive the design decisions in the distributed systems architecture. Given the continental scale of the power grid, it is important for the testbed to be extensible and scalable within a complex topology of physical entities, controlled by an overlaid network of power utilities and regulatory balancing authorities. This paper outlines the technical steps and software toolkits needed to develop this testbed.
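    The testbed's state-estimation pipeline is far larger than this, but the core computation it parallelizes, weighted least-squares (WLS) state estimation, can be sketched on a toy DC power-flow model. The network, measurement matrix, and numbers below are illustrative, not from the paper:

```python
import numpy as np

# Toy DC state estimation: bus 1 is the angle reference (theta1 = 0) and
# the states are the remaining bus angles. H maps states to line-flow and
# injection measurements; all numbers are illustrative.
rng = np.random.default_rng(1)
theta_true = np.array([-0.05, -0.12])        # angles at buses 2 and 3 (rad)
H = np.array([[-1.0,  0.0],                  # flow 1-2:  theta1 - theta2
              [ 1.0, -1.0],                  # flow 2-3:  theta2 - theta3
              [ 0.0, -1.0],                  # flow 1-3:  theta1 - theta3
              [-1.0, -1.0]])                 # injection at bus 1
sigma = np.array([0.01, 0.01, 0.02, 0.02])   # per-meter noise std
z = H @ theta_true + rng.normal(0.0, sigma)

# Weighted least squares: minimize (z - H theta)^T W (z - H theta),
# solved via the normal equations (H^T W H) theta = H^T W z.
W = np.diag(1.0 / sigma**2)
theta_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
```

    At continental scale the gain matrix H^T W H is huge and sparse, which is why the paper pairs the solver with parallel computational models and a data management infrastructure.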

  20. SPHERES tethered formation flight testbed: advancements in enabling NASA's SPECS mission

    NASA Astrophysics Data System (ADS)

    Chung, Soon-Jo; Adams, Danielle; Saenz-Otero, Alvar; Kong, Edmund; Miller, David W.; Leisawitz, David; Lorenzini, Enrico; Sell, Steve

    2006-06-01

    This paper reports on efforts to control a tethered formation flight spacecraft array for NASA's SPECS mission using the SPHERES test-bed developed by the MIT Space Systems Laboratory. Specifically, advances in methodology and experimental results realized since the 2005 SPIE paper are emphasized. These include a new test-bed setup with a reaction wheel assembly, a novel relative attitude measurement system using force torque sensors, and modeling of non-ideal tethers to account for tether vibration modes. The nonlinear equations of motion of multi-vehicle tethered spacecraft with elastic flexible tethers are derived from Lagrange's equations. The controllability analysis indicates that both array resizing and spin-up are fully controllable by the reaction wheels and the tether motor, thereby reducing thruster fuel consumption. Based upon this analysis, linear and nonlinear controllers have been successfully implemented on the tethered SPHERES testbed, and tested at the NASA MSFC's flat floor facility using two and three SPHERES configurations.
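    The controllability claim above rests on a standard linear-systems test: a system (A, B) is controllable if and only if the controllability matrix [B, AB, ..., A^(n-1)B] has full rank. A minimal sketch on a toy two-state oscillatory mode with a single torque input (illustrative only, not the paper's multi-vehicle tethered equations of motion):

```python
import numpy as np

def ctrb_rank(A, B):
    """Rank of the controllability matrix [B, AB, A^2 B, ...]."""
    n = A.shape[0]
    blocks = [B]
    for _ in range(n - 1):
        blocks.append(A @ blocks[-1])
    return np.linalg.matrix_rank(np.hstack(blocks))

# Toy second-order mode with a single torque input -- illustrative only.
A = np.array([[0.0, 1.0],
              [-4.0, -0.1]])   # [angle, rate] dynamics
B = np.array([[0.0],
              [1.0]])          # wheel torque enters the rate state
controllable = ctrb_rank(A, B) == A.shape[0]
```

    Showing full rank with only the wheel and tether-motor input columns is what lets the authors conclude that resizing and spin-up maneuvers need no thruster firings.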

  1. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    NASA Technical Reports Server (NTRS)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  2. Flight Testing of Guidance, Navigation and Control Systems on the Mighty Eagle Robotic Lander Testbed

    NASA Technical Reports Server (NTRS)

    Hannan, Mike; Rickman, Doug; Chavers, Greg; Adam, Jason; Becker, Chris; Eliser, Joshua; Gunter, Dan; Kennedy, Logan; O'Leary, Patrick

    2015-01-01

    During 2011 a series of progressively more challenging flight tests of the Mighty Eagle autonomous terrestrial lander testbed were conducted primarily to validate the GNC system for a proposed lunar lander. With the successful completion of this GNC validation objective the opportunity existed to utilize the Mighty Eagle as a flying testbed for a variety of technologies. In 2012 an Autonomous Rendezvous and Capture (AR&C) algorithm was implemented in flight software and demonstrated in a series of flight tests. In 2012 a hazard avoidance system was developed and flight tested on the Mighty Eagle. Additionally, GNC algorithms from Moon Express and a MEMs IMU were tested in 2012. All of the testing described herein was above and beyond the original charter for the Mighty Eagle. In addition to being an excellent testbed for a wide variety of systems the Mighty Eagle also provided a great learning opportunity for many engineers and technicians to work a flight program.

  3. Definition study for variable cycle engine testbed engine and associated test program

    NASA Technical Reports Server (NTRS)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  4. Venus: uniformity of clouds, and photography.

    PubMed

    Keene, G T

    1968-01-19

    Photographs of Earth at a resolution of about 600 kilometers were compared to pictures of Venus taken from Earth at about the same resolution. Under these conditions Earth appears very heavily covered by clouds. Since details on the surface of Earth can be recorded from Earth orbit, it may be possible to photograph portions of the surface of Venus, through openings in the clouds, from an orbiting satellite. PMID:17799560

  5. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  6. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system's behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and

  7. Statistical Analyses of Satellite Cloud Object Data From CERES. Part 4; Boundary-layer Cloud Objects During 1998 El Nino

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce A.; Parker, Lindsay

    2006-01-01

    Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions, among different size categories, and among small geographic areas. Both the frequencies of occurrence and statistical distributions of cloud physical properties are analyzed. In terms of frequencies of occurrence, stratocumulus clouds dominate the entire boundary layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly, within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic

  8. Interferometric Testbed for Nanometer Level Stabilization of Environmental Motion Over Long Timescales

    NASA Technical Reports Server (NTRS)

    Numata, Kenji; Camp, Jordan

    2008-01-01

    We developed an interferometric testbed to stabilize environmental motions over timescales of several hours and a lengthscale of 1m. Typically, thermal and seismic motions on the ground are larger than 1 micron over these scales, affecting the precision of more sensitive measurements. To suppress such motions, we built an active stabilization system composed of interferometric sensors, a hexapod actuator, and a frequency stabilized laser. With this stabilized testbed, environmental motions were suppressed down to nm level. This system will allow us to perform sensitive measurements, such as ground testing of LISA (Laser Interferometer Space Antenna), in the presence of environmental noise.

  9. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental test-bed to investigate the integration and control of perception in a continuously operating vision system is described. The test-bed integrates a 12-axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) be continuously operating; (2) integrate software contributions from geographically dispersed laboratories; (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects; (4) be capable of supporting diverse experiments in gaze control, visual servoing, navigation, and object surveillance; and (5) be dynamically reconfigurable.

  10. Virtual Pipeline System Testbed to Optimize the U.S. Natural Gas Transmission Pipeline System

    SciTech Connect

    Kirby S. Chapman; Prakash Krishniswami; Virg Wallentine; Mohammed Abbaspour; Revathi Ranganathan; Ravi Addanki; Jeet Sengupta; Liubo Chen

    2005-06-01

    The goal of this project is to develop a Virtual Pipeline System Testbed (VPST) for natural gas transmission. This study uses a fully implicit finite difference method to analyze transient, nonisothermal compressible gas flow through a gas pipeline system. The inertia term of the momentum equation is included in the analysis. The testbed simulates compressor stations, the pipes that connect them, the supply sources, and the end-user demand markets. Each compressor station is described by identifying the make, model, and number of engines, gas turbines, and compressors. System operators and engineers can analyze the impact of system changes on the dynamic deliverability of gas and on the environment.
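    The VPST's fully implicit formulation of the compressible-flow equations is not reproduced in the abstract. As a sketch of why implicit differencing suits pipeline transients, here is a backward-Euler step for a much-simplified, linearized slow-transient ("linepack") model, p_t = D p_xx, which remains stable at the minutes-long time steps a pipeline simulator needs; an explicit scheme at this step size would blow up. All values are illustrative, not from the VPST:

```python
import numpy as np

# Backward-Euler (fully implicit) step for a linearized slow-transient
# pipeline model, p_t = D * p_xx, with fixed supply/demand pressures.
nx = 51                         # grid points along the pipe
L = 50_000.0                    # pipe length (m)
D = 2.0e3                       # effective diffusivity (m^2/s), illustrative
dx = L / (nx - 1)
dt = 60.0                       # 60 s step -- far beyond explicit stability
p = np.full(nx, 5.0e6)          # initial pressure, 5 MPa everywhere
p_in, p_out = 5.2e6, 4.8e6      # supply and demand boundary pressures

# Implicit step: (I - r * Laplacian) p_new = p_old, r = D*dt/dx^2
r = D * dt / dx**2
A = ((1 + 2 * r) * np.eye(nx)
     - r * np.eye(nx, k=1)
     - r * np.eye(nx, k=-1))
A[0, :] = 0.0
A[-1, :] = 0.0
A[0, 0] = A[-1, -1] = 1.0       # Dirichlet boundary rows

for _ in range(100):            # march 100 minutes of transient
    rhs = p.copy()
    rhs[0], rhs[-1] = p_in, p_out
    p = np.linalg.solve(A, rhs)
```

    The real testbed solves the coupled nonlinear mass/momentum/energy equations, but the pattern is the same: each time step becomes one (sparse) linear solve, trading per-step cost for unconditional stability.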

  11. Progress in testing exo-planet signal extraction on the TPF-I Planet Detection Testbed

    NASA Technical Reports Server (NTRS)

    Martin, Stefan R.; Szwaykowski, Piotr; Loya, Frank M.; Liewer, Kurt

    2006-01-01

    The TPF Interferometer (TPF-I) concept is being studied at the Jet Propulsion Laboratory and the TPF-I Planet Detection Testbed has been developed to simulate the detection process for an earthlike planet orbiting a star within about 15 pc. The testbed combines four beams of infrared light simulating the operation of a dual chopped Bracewell interferometer observing a star and a faint planet. This paper describes the results obtained this year including nulling of the starlight on four input beams at contrast ratios up to 250,000 to 1, and detection of faint planet signals at contrast ratios with the star of 2 million to 1.

  12. Regenerative Fuel Cell System Testbed Program for Government and Commercial Applications

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Electrochemical Technology Branch has led a multiagency effort to design, fabricate, and operate a regenerative fuel cell (RFC) system testbed. Key objectives of this program are to evaluate, characterize, and demonstrate fully integrated RFC's for space, military, and commercial applications. The Lewis-led team is implementing the program through a unique international coalition that encompasses both Government and industry participants. Construction of the 25-kW RFC testbed at the NASA facility at Edwards Air Force Base was completed in January 1995, and the system has been operational since that time.

  13. Experimental demonstration of a classical approach for flexible structure control - The ACES testbed

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    1991-01-01

    This paper describes the results of an active structural control experiment performed for the Advanced Control Evaluation for Structures (ACES) testbed at NASA-Marshall as part of the NASA Control-Structure Interaction Guest Investigator Program. The experimental results successfully demonstrate the effectiveness of a 'dipole' concept for line-of-sight control of a pointing system mounted on a flexible structure. The simplicity and effectiveness of a classical 'single-loop-at-a-time' approach for the active structural control design for a complex structure, such as the ACES testbed, are demonstrated.

  14. The Oort cloud

    NASA Technical Reports Server (NTRS)

    Marochnik, Leonid S.; Mukhin, Lev M.; Sagdeev, Roald Z.

    1991-01-01

    Views of the large-scale structure of the solar system, consisting of the Sun, the nine planets and their satellites, changed when Oort demonstrated that a gigantic cloud of comets (the Oort cloud) is located on the periphery of the solar system. The following subject areas are covered: (1) the Oort cloud's mass; (2) Hill's cloud mass; (3) angular momentum distribution in the solar system; and (4) the cometary cloud around other stars.

  15. Chapter 25: Cloud-Resolving Modeling: ARM and the GCSS Story

    NASA Technical Reports Server (NTRS)

    Krueger, Steven K.; Morrison, Hugh; Fridlind, Ann M.

    2016-01-01

    The Global Energy and Water Cycle Experiment (GEWEX) Cloud System Study (GCSS) was created in 1992. As described by Browning et al., "The focus of GCSS is on cloud systems spanning the mesoscale rather than on individual clouds. Observations from field programs will be used to develop and validate the cloud-resolving models, which in turn will be used as test-beds to develop the parameterizations for the large-scale models." The most important activities that GCSS promoted were the following: identify key questions about cloud systems relating to parameterization issues and suggest approaches to address them, and organize model intercomparison studies relevant to cloud parameterization. Four different cloud system types were chosen for GCSS to study: boundary layer, cirrus, frontal, and deep precipitating convective. A working group (WG) was formed for each of the cloud system types. The WGs organized model intercomparison studies and meetings to present results of the intercomparisons. The first such intercomparison study took place in 1994.

  16. An Unattended Cloud-Profiling Radar for Use in Climate Research.

    NASA Astrophysics Data System (ADS)

    Moran, Kenneth P.; Martner, Brooks E.; Post, M. J.; Kropfli, Robert A.; Welsh, David C.; Widener, Kevin B.

    1998-03-01

    A new millimeter-wave cloud radar (MMCR) has been designed to provide detailed, long-term observations of nonprecipitating and weakly precipitating clouds at Cloud and Radiation Testbed (CART) sites of the Department of Energy's Atmospheric Radiation Measurement (ARM) program. Scientific requirements included excellent sensitivity and vertical resolution to detect weak and thin multiple layers of ice and liquid water clouds over the sites and long-term, unattended operations in remote locales. In response to these requirements, the innovative radar design features a vertically pointing, single-polarization, Doppler system operating at 35 GHz (Ka band). It uses a low-peak-power transmitter for long-term reliability, and a high-gain antenna and pulse-compressed waveforms to maximize sensitivity and resolution. The radar uses the same kind of signal processor as that used in commercial wind profilers. The first MMCR began operations at the CART site in northern Oklahoma in late 1996 and has operated continuously there for thousands of hours. It routinely provides remarkably detailed images of the ever-changing cloud structure and kinematics over this densely instrumented site. Examples of the data are presented. The radar measurements will greatly improve quantitative documentation of cloud conditions over the CART sites and will bolster ARM research to understand how clouds impact climate through their effects on radiative transfer. Millimeter-wave radars such as the MMCR also have potential applications in the fields of aviation weather, weather modification, and basic cloud physics research.
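    The pairing of a low-peak-power transmitter with pulse-compressed waveforms works because a matched filter concentrates a long coded pulse into a peak roughly 1/bandwidth wide, recovering the range resolution of a short pulse without its peak power. A sketch with a linear-FM chirp (parameters are illustrative, not the MMCR's actual waveform):

```python
import numpy as np

# Pulse compression: a long linear-FM (chirp) pulse is matched-filtered on
# receive, concentrating its energy into a peak about 1/bandwidth wide.
fs = 1.0e6                 # sample rate (Hz)
T = 2.0e-3                 # transmitted pulse length (s) -> 2000 samples
Bw = 100.0e3               # frequency sweep (Hz)
t = np.arange(int(fs * T)) / fs
chirp = np.exp(1j * np.pi * (Bw / T) * t**2)   # 0 -> Bw sweep over T

delay = 500                # echo delay in samples
echo = np.concatenate([np.zeros(delay), chirp, np.zeros(delay)])
mf = np.correlate(echo, chirp, mode="same")    # matched filter (conjugated)

peak = int(np.argmax(np.abs(mf)))
# Width of the compressed mainlobe at half the peak magnitude; compare the
# ~fs/Bw = 10-sample compressed width with the 2000-sample transmitted pulse.
width = int(np.sum(np.abs(mf) > 0.5 * np.abs(mf[peak])))
```

    The compression gain (time-bandwidth product, here T*Bw = 200) is what lets a reliable low-peak-power transmitter meet the sensitivity requirement.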

  17. Hydrometeorology Testbed in the American River Basin of Northern California

    NASA Astrophysics Data System (ADS)

    Kingsmill, D.; Lundquist, J.; Jorgensen, D.; McGinley, J.; Werner, K.

    2006-12-01

    In California, most precipitation occurs in the winter, as a mixture of rain at lower elevations and snow in the higher mountains. Storms from the Pacific carry large amounts of moisture, and put people and property at risk from flooding because of the vast urban development and infrastructure in low-lying areas of the central valley of California. Improved flood prediction at finer spatial and temporal resolutions can help minimize these risks. The first step is to accurately measure and predict spatially-distributed precipitation. This is particularly true for river basins with complex orography where the processes that lead to the development of precipitation and determine its distribution and fate on the ground are not well understood. To make progress in this important area, the U.S. National Oceanic and Atmospheric Administration (NOAA) is leading a Hydrometeorology Testbed (HMT) effort designed to accelerate the testing and infusion of new technologies, models, and scientific results from the research community into daily forecasting operations. HMT is a national effort (http://hmt.noaa.gov) that will be implemented in different regions of the U.S. over the next decade. In each region, the focus will be on individual experimental test basins. The first full-scale implementation of HMT, called HMT-West, targets northern California's flood-vulnerable American River Basin (4740 km2) on the west slopes of the Sierra Nevada between Sacramento and Lake Tahoe. The deployment strategy is focused on the North Fork of the basin (875 km2), which is the least- controlled portion of the entire catchment. This basin was selected as a test basin because it has reliable streamflow records dating back to 1941 and has been well characterized by prior field studies (e.g. the Sierra Cooperative Pilot Project) and modeling efforts, focusing on both short-term operations and long-term climate scenarios. 
Intensive field activities in the North Fork of the American River started in

  18. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    NASA Technical Reports Server (NTRS)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC), to promote a cross-subsystem approach to developing Mars-enabling technologies with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at a low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between the life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, infrastructure, simulation, and modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear-ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg will be required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions for different subsystems. If hardware could be developed that meets the requirements of both life support and ISRU, it could result in the reduction of primary hardware and/or spares. Commodities delivered to the surface of Mars can be minimized by creating higher-efficiency systems that produce little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise for reducing demands on life support hardware, there are concerns about the purity and transportation of those commodities. To date, ISRU has been focused on production rates and purities for

  19. Virtual infrastructure management in private and hybrid clouds.

    SciTech Connect

    Sotomayor, B.; Montero, R. S.; Llorente, I. M.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago; Univ. Complutense of Madrid

    2009-01-01

    One of the many definitions of 'cloud' is that of an infrastructure-as-a-service (IaaS) system, in which IT infrastructure is deployed in a provider's data center as virtual machines. With the growing popularity of IaaS clouds, tools and technologies are emerging that can transform an organization's existing infrastructure into a private or hybrid cloud. OpenNebula is an open-source virtual infrastructure manager that deploys virtualized services on both a local pool of resources and external IaaS clouds. Haizea, a resource lease manager, can act as a scheduling back end for OpenNebula, providing features not found in other cloud software or in virtualization-based data center management software.

  20. James Webb Space Telescope Optical Simulation Testbed I: overview and first results

    NASA Astrophysics Data System (ADS)

    Perrin, Marshall D.; Soummer, Rémi; Choquet, Élodie; N'Diaye, Mamadou; Levecq, Olivier; Lajoie, Charles-Philippe; Ygouf, Marie; Leboulleux, Lucie; Egron, Sylvain; Anderson, Rachel; Long, Chris; Elliott, Erin; Hartig, George; Pueyo, Laurent; van der Marel, Roeland; Mountain, Matt

    2014-08-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop workbench to study aspects of wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing optomechanical testbeds for JWST (e.g. the Ball Aerospace Testbed Telescope, TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science & Operations Center. We have developed an optical design that reproduces the physics of JWST's three-mirror anastigmat using three aspheric lenses; it provides image quality similar to that of JWST (80% Strehl ratio) over a field equivalent to a NIRCam module, but at HeNe wavelength. A segmented deformable mirror stands in for the segmented primary mirror and allows control of the 18 segments in piston, tip, and tilt, while the secondary can be controlled in tip, tilt and x, y, z position. This will be sufficient to model many commissioning activities, to investigate field dependence and multiple field point sensing & control, to evaluate alternate sensing algorithms, and develop contingency plans. Testbed data will also be usable for cross-checking of the WFS&C Software Subsystem, and for staff training and development during JWST's five- to ten-year mission.

  1. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.
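    The thesis's probabilistic Nelder-Mead extension of JSBSim is not detailed in the abstract; as a plain illustration of the underlying idea, the derivative-free Nelder-Mead simplex method can fit response-model parameters to noisy measured data. The decaying-response model and all numbers below are made up, standing in for matching simulated dynamics to wind-tunnel data:

```python
import numpy as np
from scipy.optimize import minimize

# Derivative-free Nelder-Mead fit of y = A * exp(-lam * t) to noisy samples.
rng = np.random.default_rng(2)
t = np.linspace(0.0, 5.0, 200)
A_true, lam_true = 2.0, 0.8
y = A_true * np.exp(-lam_true * t) + rng.normal(0.0, 0.02, t.size)

def cost(p):
    """Sum-of-squares misfit between the model and the data."""
    return np.sum((p[0] * np.exp(-p[1] * t) - y) ** 2)

res = minimize(cost, x0=[1.0, 0.3], method="Nelder-Mead")
A_fit, lam_fit = res.x
```

    Because the simplex method needs no gradients, it tolerates the noisy, non-smooth cost surfaces that arise when the "model" is a full flight-dynamics simulation rather than a closed-form expression.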

  2. System identification and structural control on the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Chu, Cheng-Chih; Obrien, John F.; Lurie, Boris J.

    1993-01-01

    The primary objective of NASA's CSI program at JPL is to develop and demonstrate the CSI technology required to achieve high precision structural stability on large complex optical class spacecraft. The focus mission for this work is an orbiting interferometer telescope. Toward the realization of such a mission, a series of evolutionary testbed structures are being constructed. JPL's CSI Phase B testbed is the second structure constructed in this series; it is designed to study the pathlength control problem of the optical train of a stellar interferometer telescope mounted on a large flexible structure. A detailed description of this testbed can be found elsewhere. This paper describes our efforts in the first phase of active structural control experiments on the Phase B testbed, in which a single piezoelectric active member is used as an actuation device and the measurements include both colocated and noncolocated sensors. Our goal for this experiment is to demonstrate the feasibility of active structural control using both colocated and noncolocated measurements by means of successive control design and loop closing. More specifically, the colocated control loop was designed and closed first to provide good damping improvement over the frequency range of interest. The noncolocated controller was then designed with respect to the partially controlled structure to further improve the performance. Based on this approach, experimental closed-loop results have demonstrated significant performance improvement with excellent stability margins.

  3. INFORM Lab: a testbed for high-level information fusion and resource management

    NASA Astrophysics Data System (ADS)

    Valin, Pierre; Guitouni, Adel; Bossé, Eloi; Wehn, Hans; Happe, Jens

    2011-05-01

    DRDC Valcartier and MDA have created an advanced simulation testbed for the purpose of evaluating the effectiveness of Network Enabled Operations in a Coastal Wide Area Surveillance situation, with algorithms provided by several universities. This INFORM Lab testbed allows experimenting with high-level distributed information fusion, dynamic resource management and configuration management, given multiple constraints on the resources and their communications networks. This paper describes the architecture of INFORM Lab, the essential concepts of goals and situation evidence, a selected set of algorithms for distributed information fusion and dynamic resource management, as well as auto-configurable information fusion architectures. The testbed provides general services which include a multilayer plug-and-play architecture, and a general multi-agent framework based on John Boyd's OODA loop. The testbed's performance is demonstrated on 2 types of scenarios/vignettes for 1) cooperative search-and-rescue efforts, and 2) a noncooperative smuggling scenario involving many target ships and various methods of deceit. For each mission, an appropriate subset of Canadian airborne and naval platforms are dispatched to collect situation evidence, which is fused, and then used to modify the platform trajectories for the most efficient collection of further situation evidence. These platforms are fusion nodes which obey a Command and Control node hierarchy.

  4. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our contrast assessment and the development, sensing and control of the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet detection and characterization. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center has an established effort to develop VNC technologies, and an incremental sequence of testbeds to advance this approach and its critical technologies. We discuss the development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the laboratory results, optical configuration, critical technologies and the null sensing and control approach.

  5. Description of New Inflatable/Rigidizable Hexapod Structure Testbed for Shape and Vibration Control

    NASA Technical Reports Server (NTRS)

    Adetona, O.; Keel, L. H.; Horta, L. G.; Cadogan, D. P.; Sapna, G. H.; Scarborough, S. E.

    2002-01-01

    Larger and more powerful space-based instruments are needed to meet increasingly sophisticated scientific demand. To support this need, concepts for telescopes with apertures of 100 meters are being investigated, but the required technologies are not in hand today. Due to the capacity limits of launch vehicles, the idea of deploying, erecting, or inflating large structures in space is being considered. Recently, rigidization concepts of large inflatable structures have demonstrated the capability of weight reductions of up to 50% from current concepts with packaging efficiencies near 80%. One of the important aspects of inflatable structures is vibration mitigation and line-of-sight control. Such control tasks are possible only after actuators/sensors are properly integrated into a rigidizable concept. To study these issues, we have developed an inflatable/rigidizable hexapod structure testbed. The testbed integrates state-of-the-art piezo-electric self-sensing actuators into an inflatable/rigidizable structure and a flat membrane reflector. Using this testbed, we plan to experimentally demonstrate achievable vibration and line-of-sight control. This paper contains a description of the testbed and an outline of the test plan.

  6. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    SciTech Connect

    Melin, Alexander M.; Kisner, Roger A.; Drira, Anis; Reed, Frederick K.

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging due to restrictions on sensors and materials. As a part of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development program's Advanced Sensors and Instrumentation topic, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high temperature (700 °C) pumps for liquid salt reactors that operate in an extreme environment and provide many engineering challenges that can be overcome with embedded instrumentation and control. This report will give details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.
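As a toy illustration of why active magnetic bearings need embedded control, the sketch below simulates a 1D linearized bearing, which is open-loop unstable (negative position stiffness), stabilized by a PD current law. All parameters and gains are invented for illustration and are not taken from the report's design.

```python
# Toy 1D linearized active magnetic bearing: m*x'' = k_x*x + k_i*i.
# The positive k_x term makes the open loop unstable; a PD feedback
# current i = -(kp*x + kd*v) restores stability. All values below are
# illustrative assumptions, not the report's design parameters.
m, k_x, k_i = 1.0, 2.0e4, 50.0   # mass [kg], position stiffness, current gain
kp, kd = 1.0e3, 10.0             # assumed PD gains

x, v = 1e-4, 0.0                 # start 0.1 mm off-center
dt = 1e-5
for _ in range(200_000):         # simulate 2 s with explicit Euler
    i = -(kp * x + kd * v)       # feedback current from measured state
    a = (k_x * x + k_i * i) / m  # net acceleration
    x += v * dt
    v += a * dt

# With k_i*kp > k_x the closed-loop stiffness is positive and x decays to zero.
```

The design choice mirrors the report's premise: without the embedded feedback loop (set kp = kd = 0), the same simulation diverges exponentially.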

  7. Particle-In-Cell Multi-Algorithm Numerical Test-Bed

    NASA Astrophysics Data System (ADS)

    Meyers, M. D.; Yu, P.; Tableman, A.; Decyk, V. K.; Mori, W. B.

    2015-11-01

    We describe a numerical test-bed that allows for the direct comparison of different numerical simulation schemes using only a single code. It is built from the UPIC Framework, which is a set of codes and modules for constructing parallel PIC codes. In this test-bed code, Maxwell's equations are solved in Fourier space in two dimensions. One can readily examine the numerical properties of a real space finite difference scheme by including its operators' Fourier space representations in the Maxwell solver. The fields can be defined at the same location in a simulation cell or can be offset appropriately by half-cells, as in the Yee finite difference time domain scheme. This allows for the accurate comparison of numerical properties (dispersion relations, numerical stability, etc.) across finite difference schemes, or against the original spectral scheme. We have also included different options for the charge and current deposits, including a strict charge-conserving current deposit. The test-bed also includes options for studying the analytic time domain scheme, which eliminates numerical dispersion errors in vacuum. We will show examples from the test-bed that illustrate how the properties of some numerical instabilities vary between different PIC algorithms. Work supported by NSF grant ACI 1339893 and DOE grant DE-SC0008491.
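The cross-scheme comparisons described above hinge on evaluating each scheme's operators in Fourier space. As a stand-alone sketch (not UPIC code; grid parameters are illustrative), the snippet below compares the 1D vacuum dispersion relation of a Yee-style finite-difference scheme against the exact relation omega = c*k that a spectral solver reproduces:

```python
import numpy as np

# 1D vacuum dispersion: the Yee FDTD scheme satisfies
#   sin^2(omega*dt/2)/(c*dt)^2 = sin^2(k*dx/2)/dx^2,
# while a spectral solver reproduces omega = c*k exactly.
c, dx = 1.0, 1.0
dt = 0.5 * dx / c                       # below the Courant limit dt <= dx/c

k = np.linspace(1e-6, np.pi / dx, 200)  # resolvable wavenumbers
omega_exact = c * k                     # spectral scheme
omega_yee = (2.0 / dt) * np.arcsin(c * dt / dx * np.sin(k * dx / 2))

# The finite-difference frequency lags the exact one, worst near the
# grid Nyquist wavenumber k = pi/dx: this is numerical dispersion.
lag = 1.0 - omega_yee / omega_exact
```

Swapping in another scheme's Fourier-space operator (e.g. a higher-order stencil) changes only the `omega_yee` line, which is the sense in which a single spectral code can benchmark many finite-difference schemes.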

  8. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Steincamp, James; Taylor, Jaime

    2003-01-01

    A reduced surrogate, one point crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.
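The GA ingredients named above (one-point crossover, random rank-based selection) can be sketched as follows. The bit-string fitness function is a toy stand-in for the testbed's image-based phase metric, and the population size, generation count, and mutation rate are arbitrary choices, not values from the paper:

```python
import random

# Minimal GA with one-point crossover and linear rank-based selection.
# Fitness here is a toy target-matching score, NOT the testbed's metric.
random.seed(0)
TARGET = [1, 0, 1, 1, 0, 1, 0, 0]

def fitness(ind):
    return sum(a == b for a, b in zip(ind, TARGET))

def rank_select(pop):
    # Rank-based: selection weight grows linearly with rank (worst = 1).
    ranked = sorted(pop, key=fitness)
    ranks = list(range(1, len(ranked) + 1))
    return random.choices(ranked, weights=ranks, k=1)[0]

def one_point_crossover(p1, p2):
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for _ in range(50):
    pop = [one_point_crossover(rank_select(pop), rank_select(pop)) for _ in pop]
    pop = [[b ^ (random.random() < 0.02) for b in ind] for ind in pop]  # mutation

best = max(pop, key=fitness)
```

Rank-based selection is attractive in phase retrieval because raw fitness values can span orders of magnitude; ranking keeps selection pressure stable regardless of the metric's scale.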

  9. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control, and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9 and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  10. Model-Based Diagnosis in a Power Distribution Test-Bed

    NASA Technical Reports Server (NTRS)

    Scarl, E.; McCall, K.

    1998-01-01

    The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.

  11. Development and validation of a low-cost mobile robotics testbed

    NASA Astrophysics Data System (ADS)

    Johnson, Michael; Hayes, Martin

    2012-03-01

    This paper considers the design, construction and validation of a low-cost experimental robotic testbed, which allows for the localisation and tracking of multiple robotic agents in real time. The testbed system is suitable for research and education in a range of different mobile robotic applications, for validating theoretical as well as practical research work in the field of digital control, mobile robotics, graphical programming and video tracking systems. It provides a reconfigurable floor space for mobile robotic agents to operate within, while tracking the position of multiple agents in real time using the overhead vision system. The overall system provides a highly cost-effective solution to the topical problem of providing students with practical robotics experience within severe budget constraints. Several problems encountered in the design and development of the mobile robotic testbed and associated tracking system, such as radial lens distortion and the selection of robot identifier templates, are clearly addressed. The testbed performance is quantified and several experiments involving LEGO Mindstorms NXT and Merlin System MiaBot robots are discussed.

  12. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  13. Aerosol and cloud droplet number concentrations observed in marine stratocumulus

    SciTech Connect

    Vong, R.J.; Covert, D.S.

    1995-12-01

    The relationship between measurements of cloud droplet number concentration and cloud condensation nuclei (CCN) concentration, as inferred from aerosol size spectra, was investigated at a "clean air" marine site (Cheeka Peak) located near the coast of the Olympic Peninsula in Washington State. Preliminary results demonstrated that cloud droplet number increased and droplet diameter decreased as aerosol number concentration (CCN) increased. These results support predictions of a climate cooling due to any future increases in marine aerosol concentrations.

  14. Microphysical and macrophysical responses of marine stratocumulus polluted by underlying ships: Evidence of cloud deepening

    NASA Astrophysics Data System (ADS)

    Christensen, Matthew W.; Stephens, Graeme L.

    2011-02-01

    Ship tracks observed by the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) were analyzed to determine the extent to which aerosol plumes from ships passing below marine stratocumulus alter the microphysical and macrophysical properties of the clouds. Moderate Resolution Imaging Spectroradiometer (MODIS) imagery was used to distinguish ship tracks embedded in closed, open, and undefined mesoscale cellular cloud structures. The impact of aerosol on the microphysical cloud properties in both the closed and open cell regimes was consistent with the changes predicted by the Twomey hypothesis. For the macrophysical changes, differences were observed between regimes. In the open cell regime, polluted clouds had significantly higher cloud tops (16%) and more liquid water (39%) than nearby unpolluted clouds. However, in the closed cell regime, polluted clouds exhibited no change in cloud top height and had less liquid water (-6%). Both microphysical (effective radius) and macrophysical (liquid water path) cloud properties contribute to a fractional change in cloud optical depth; in the closed cell regime the microphysical contribution was 3 times larger than the macrophysical contribution. However, the opposite was true in the open cell regime, where the macrophysical contribution was nearly 2 times larger than the microphysical contribution because the aerosol probably increased cloud coverage. The results presented here demonstrate key differences in the microphysical and macrophysical responses of boundary layer clouds to aerosol between mesoscale stratocumulus convective regimes.
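The decomposition of optical-depth changes described above can be made concrete with the standard relation tau ∝ LWP / r_e for a vertically homogeneous liquid cloud, under which fractional (logarithmic) changes separate exactly into a macrophysical LWP term and a microphysical r_e term. The numbers below are invented for illustration (only the +39% LWP change echoes the open-cell result quoted above):

```python
import math

# For a vertically homogeneous liquid cloud, tau ~ (3/2)*LWP/(rho_w*r_e),
# so to first order  d(ln tau) = d(ln LWP) - d(ln r_e):
# a macrophysical (LWP) plus a microphysical (r_e) contribution.
def tau(lwp_g_m2, r_e_um, rho_w=1.0e6):   # rho_w in g/m^3, r_e converted to m
    return 1.5 * lwp_g_m2 / (rho_w * r_e_um * 1e-6)

lwp_clean, re_clean = 100.0, 12.0          # unpolluted cloud (illustrative)
lwp_poll, re_poll = 139.0, 11.0            # polluted: +39% LWP, smaller droplets

macro = math.log(lwp_poll / lwp_clean)     # LWP contribution to d(ln tau)
micro = -math.log(re_poll / re_clean)      # r_e contribution
total = math.log(tau(lwp_poll, re_poll) / tau(lwp_clean, re_clean))
# The two contributions sum exactly to the total log change in tau.
```

In this invented open-cell-like case the LWP term dominates the r_e term, the same qualitative ordering the abstract reports for the open cell regime.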

  15. Cloud CCN feedback

    SciTech Connect

    Hudson, J.G.

    1992-12-31

    Cloud microphysics affects cloud albedo, precipitation efficiency, and the extent of cloud feedback in response to global warming. Compared to other cloud parameters, microphysics is unique in its large range of variability and the fact that much of the variability is anthropogenic. Probably the most important determinant of cloud microphysics is the spectrum of cloud condensation nuclei (CCN), which displays considerable variability and has a large anthropogenic component. When analyzed in combination, three field observation projects display the interrelationship between CCN and cloud microphysics. CCN were measured with the Desert Research Institute (DRI) instantaneous CCN spectrometer. Cloud microphysical measurements were obtained with the National Center for Atmospheric Research Lockheed Electra. Since CCN and cloud microphysics each affect the other, a positive feedback mechanism can result.

  16. SCDU testbed automated in-situ alignment, data acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-07-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  17. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  18. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  19. Laser Communications Airborne Testbed: Potential For An Air-To-Satellite Laser Communications Link

    NASA Astrophysics Data System (ADS)

    Feldmann, Robert J.

    1988-05-01

    The Laser Communications Airborne Testbed (LCAT) offers an excellent opportunity for testing of an air-to-satellite laser communications link with the NASA Advanced Communications Technology Satellite (ACTS). The direct detection laser portion of the ACTS is suitable for examining the feasibility of an airborne terminal. Development of an airborne laser communications terminal is not currently part of the ACTS program; however, an air-to-satellite link is of interest. The Air Force performs airborne laser communications experiments to examine the potential usefulness of this technology to future aircraft. Lasers could be used, for example, by future airborne command posts and reconnaissance aircraft to communicate via satellite over long distances and transmit large quantities of data in the fastest way possible from one aircraft to another or to ground sites. Lasers are potentially secure, jam-resistant, and hard to detect, and in this regard they increase the survivability of the users. Under a contract awarded by Aeronautical Systems Division's Avionics Laboratory, a C-135E testbed aircraft belonging to ASD's 4950th Test Wing will be modified to create a Laser Communications Airborne Testbed. The contract is for development and fabrication of laser testbed equipment and support of the aircraft modification effort by the Test Wing. The plane to be modified is already in use as a testbed for other satellite communications projects, and the LCAT effort will expand those capabilities. This analysis examines the characteristics of an LCAT to ACTS direct detection communications link. The link analysis provides a measure of the feasibility of developing an airborne laser terminal which will interface directly to the LCAT. Through the existence of the LCAT, the potential for development of an air-to-satellite laser communications terminal for experimentation with the ACTS system is greatly enhanced.

  20. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed. PMID:23742532

  1. Frequency and causes of failed MODIS cloud property retrievals for liquid phase clouds over global oceans

    NASA Astrophysics Data System (ADS)

    Cho, Hyoun-Myoung; Zhang, Zhibo; Meyer, Kerry; Lebsock, Matthew; Platnick, Steven; Ackerman, Andrew S.; Di Girolamo, Larry; C.-Labonnote, Laurent; Cornet, Céline; Riedi, Jerome; Holz, Robert E.

    2015-05-01

    Moderate Resolution Imaging Spectroradiometer (MODIS) retrieves cloud droplet effective radius (r_e) and optical thickness (τ) by projecting observed cloud reflectances onto a precomputed look-up table (LUT). When observations fall outside of the LUT, the retrieval is considered "failed" because no combination of τ and r_e within the LUT can explain the observed cloud reflectances. In this study, the frequency and potential causes of failed MODIS retrievals for marine liquid phase (MLP) clouds are analyzed based on 1 year of Aqua MODIS Collection 6 products and collocated CALIOP and CloudSat observations. The retrieval based on the 0.86 μm and 2.1 μm MODIS channel combination has an overall failure rate of about 16% (10% for the 0.86 μm and 3.7 μm combination). The failure rates are lower over stratocumulus regimes and higher over the broken trade wind cumulus regimes. The leading type of failure is the "r_e too large" failure accounting for 60%-85% of all failed retrievals. The rest is mostly due to the "r_e too small" or τ retrieval failures. Enhanced retrieval failure rates are found when MLP cloud pixels are partially cloudy or have high subpixel inhomogeneity, are located at special Sun-satellite viewing geometries such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles, or are subject to cloud masking, cloud overlapping, and/or cloud phase retrieval issues. The majority (more than 84%) of failed retrievals along the CALIPSO track can be attributed to at least one or more of these potential reasons. The collocated CloudSat radar reflectivity observations reveal that the remaining failed retrievals are often precipitating. It remains an open question whether the extremely large r_e values observed in these clouds are the consequence of true cloud microphysics or still due to artifacts not included in this study.
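The LUT-projection retrieval and its boundary failure modes can be sketched with a toy forward model. Everything below is an invented stand-in: the real MODIS LUTs come from radiative-transfer calculations, not these closed-form expressions.

```python
import numpy as np

# Toy sketch of a bispectral LUT retrieval: a forward model maps (tau, r_e)
# to two reflectances; the retrieval picks the closest LUT grid point and is
# flagged "failed" when the best fit is pinned to the r_e grid boundary
# (the "r_e too small/large" failures discussed above).
def forward(tau, r_e):
    # Invented stand-in for radiative transfer, NOT the MODIS forward model.
    r_vis = tau / (tau + 7.0)            # non-absorbing band: driven by tau
    r_nir = r_vis * np.exp(-0.03 * r_e)  # absorbing band: darkens with r_e
    return r_vis, r_nir

taus = np.linspace(1, 50, 60)
res = np.linspace(4, 30, 40)             # LUT covers r_e = 4-30 microns
T, R = np.meshgrid(taus, res, indexing="ij")
lut = np.stack(forward(T, R), axis=-1)   # shape (60, 40, 2)

def retrieve(obs_vis, obs_nir):
    d = (lut[..., 0] - obs_vis) ** 2 + (lut[..., 1] - obs_nir) ** 2
    i, j = np.unravel_index(np.argmin(d), d.shape)
    failed = j in (0, len(res) - 1)      # observation falls off the LUT edge
    return taus[i], res[j], failed

# An observation consistent with r_e ~ 40 microns lies outside the LUT:
_, _, failed = retrieve(*forward(10.0, 40.0))
```

This mirrors the failure statistics above in spirit: drizzle-sized droplets push the observed absorbing-band reflectance below anything the LUT can produce, so the fit saturates at the largest tabulated r_e.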

  2. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources: i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach.
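
The co-addition described above maps naturally onto MapReduce: mappers emit astrometrically registered (sky-pixel, flux) pairs, and reducers perform the per-pixel combination. The following is a minimal local sketch of that pattern, not the paper's Hadoop code; the function names and toy pixel layout are illustrative.

```python
from collections import defaultdict

def map_image(image_id, pixels):
    """Map step: emit (sky coordinate, flux) pairs for each
    astrometrically registered pixel of one input image.
    `pixels` maps (ra_bin, dec_bin) -> flux."""
    for coord, flux in pixels.items():
        yield coord, flux

def reduce_coadd(coord, fluxes):
    """Reduce step: per-pixel combination (here a mean stack)
    of all fluxes that landed on the same sky coordinate."""
    fluxes = list(fluxes)
    return coord, sum(fluxes) / len(fluxes)

def coadd(images):
    """Driver that emulates the MapReduce shuffle locally."""
    groups = defaultdict(list)
    for image_id, pixels in images.items():
        for coord, flux in map_image(image_id, pixels):
            groups[coord].append(flux)
    return dict(reduce_coadd(c, fs) for c, fs in groups.items())

stack = coadd({
    "img1": {(0, 0): 1.0, (0, 1): 2.0},
    "img2": {(0, 0): 3.0},
})
# (0, 0) averages two exposures; (0, 1) has only one
```

On a real cluster the shuffle performed here by the `defaultdict` would be handled by Hadoop, and the reducer could weight each flux by exposure time or noise.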

  3. The Climate-G testbed: towards a large scale data sharing environment for climate change

    NASA Astrophysics Data System (ADS)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

The Climate-G testbed provides an experimental large scale data environment for climate change addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists and it currently involves several partners such as: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, Live Access Server as well as the OGC compliant Web Map Service and they can be downloaded, visualized, accessed into the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: wide data access/sharing environment for climate change; P2P/grid metadata approach; production-level Climate-G DDC; high quality tools for

  4. FY 2011 Second Quarter: Demonstration of New Aerosol Measurement Verification Testbed for Present-Day Global Aerosol Simulations

    SciTech Connect

    Koch, D

    2011-03-20

The regional-scale Weather Research and Forecasting (WRF) model is being used by a DOE Earth System Modeling (ESM) project titled “Improving the Characterization of Clouds, Aerosols and the Cryosphere in Climate Models” to evaluate the performance of atmospheric process modules that treat aerosols and aerosol radiative forcing in the Arctic. We are using a regional-scale modeling framework for three reasons: (1) It is easier to produce a useful comparison to observations with a high resolution model; (2) We can compare the behavior of the CAM parameterization suite with some of the more complex and computationally expensive parameterizations used in WRF; (3) We can explore the behavior of this parameterization suite at high resolution. Climate models like the Community Atmosphere Model version 5 (CAM5) being used within the Community Earth System Model (CESM) will not likely be run at mesoscale spatial resolutions (10–20 km) until 5–10 years from now. The performance of the current suite of physics modules in CAM5 at such resolutions is not known, and current computing resources do not permit high-resolution global simulations to be performed routinely. We are taking advantage of two tools recently developed under PNNL Laboratory Directed Research and Development (LDRD) projects for this activity. The first is the Aerosol Modeling Testbed (Fast et al., 2011b), a new computational framework designed to streamline the process of testing and evaluating aerosol process modules over a range of spatial and temporal scales. The second is the CAM5 suite of physics parameterizations that have been ported into WRF so that their performance and scale dependency can be quantified at mesoscale spatial resolutions (Gustafson et al., 2010; with more publications in preparation).

  5. Cloud Processed CCN Affect Cloud Microphysics

    NASA Astrophysics Data System (ADS)

    Hudson, J. G.; Noble, S. R., Jr.; Tabor, S. S.

    2015-12-01

Variations in the bimodality/monomodality of CCN spectra (Hudson et al. 2015) exert opposite effects on cloud microphysics in two aircraft field projects. The figure shows two examples, droplet concentration, Nc, and drizzle liquid water content, Ld, against a classification of CCN spectral modality. Low ratings go to balanced, separated bimodal spectra; high ratings go to single-mode spectra, with strictly monomodal spectra rated 8. Intermediate ratings go to merged modes, e.g., one mode forming a shoulder of another. Bimodality is caused by mass or hygroscopicity increases that accrue only to those CCN that were activated into cloud droplets. In the Ice in Clouds Experiment-Tropical (ICE-T), small cumuli with lower Nc and greater droplet mean diameters, MD, effective radii, re, spectral widths, σ, cloud liquid water contents, Lc, and Ld corresponded to more bimodal (lower modal ratings) below-cloud CCN spectra, whereas clouds with higher Nc and smaller MD, re, σ, and Ld corresponded to more monomodal CCN (higher modal ratings). In polluted stratus clouds of the MArine Stratus/Stratocumulus Experiment (MASE), clouds with greater Nc and smaller MD, re, σ, Lc, and Ld corresponded to more bimodal CCN spectra, whereas clouds with lower Nc and greater MD, re, σ, Lc, and Ld corresponded to more monomodal CCN. These relationships are opposite because the dominant ICE-T cloud processing was coalescence, whereas chemical transformations (e.g., SO2 to SO4) were dominant in MASE. Coalescence reduces Nc and thus also CCN concentrations (NCCN) when droplets evaporate. In subsequent clouds the reduced competition increases MD and σ, which further enhance coalescence and drizzle. Chemical transformations do not change Nc, but added sulfate enhances droplet and CCN solubility. Thus, lower critical supersaturation (S) CCN can produce more cloud droplets in subsequent cloud cycles, especially for the low W and effective S of stratus. The increased competition reduces MD, re, and σ, which inhibit coalescence and thus reduce drizzle.

  6. Atmospheric cloud physics laboratory project study

    NASA Technical Reports Server (NTRS)

    Schultz, W. E.; Stephen, L. A.; Usher, L. H.

    1976-01-01

    Engineering studies were performed for the Zero-G Cloud Physics Experiment liquid cooling and air pressure control systems. A total of four concepts for the liquid cooling system was evaluated, two of which were found to closely approach the systems requirements. Thermal insulation requirements, system hardware, and control sensor locations were established. The reservoir sizes and initial temperatures were defined as well as system power requirements. In the study of the pressure control system, fluid analyses by the Atmospheric Cloud Physics Laboratory were performed to determine flow characteristics of various orifice sizes, vacuum pump adequacy, and control systems performance. System parameters predicted in these analyses as a function of time include the following for various orifice sizes: (1) chamber and vacuum pump mass flow rates, (2) the number of valve openings or closures, (3) the maximum cloud chamber pressure deviation from the allowable, and (4) cloud chamber and accumulator pressure.

  7. Cloud computing for geophysical applications (Invited)

    NASA Astrophysics Data System (ADS)

    Zhizhin, M.; Kihn, E. A.; Mishin, D.; Medvedev, D.; Weigel, R. S.

    2010-12-01

Cloud computing offers a scalable on-demand resource allocation model for evolving needs in data-intensive geophysical applications, where computational needs in CPU and storage can vary over time depending on the modeling or field campaign. Separate, sometimes incompatible cloud platforms and services are already available from major computing vendors (Amazon AWS, Microsoft Azure, Google App Engine), government agencies (NASA Nebula) and the Open Source community (Eucalyptus). Multiple cloud platforms with layered virtualization patterns (hardware-, platform-, software-, data-, or everything-as-a-service) provide a feature-rich environment and encourage experimentation with distributed data modeling, processing and storage. However, application and especially database development in the Cloud is different from the desktop and the compute cluster. In this presentation we will review scientific cloud applications relevant to geophysical research and present our results in building software components and cloud services for a virtual geophysical data center. We will discuss in depth the economy, scalability and reliability of the distributed array and image data stores, synchronous and asynchronous RESTful services to access and model georeferenced data, virtual observatory services for metadata management, and data visualization for web applications in the Cloud.

  8. Absorption of solar radiation in broken clouds

    SciTech Connect

    Zuev, V.E.; Titov, G.A.; Zhuravleva, T.B.

    1996-04-01

It is recognized now that the plane-parallel model unsatisfactorily describes the transfer of radiation through broken clouds and that, consequently, the radiation codes of general circulation models (GCMs) must be refined. However, before any refinement in a GCM code is made, it is necessary to investigate the dependence of radiative characteristics on the effects caused by the random geometry of cloud fields. Such studies for mean fluxes of downwelling and upwelling solar radiation in the visible and near-infrared (IR) spectral range were performed by Zuev et al. In this work, we investigate the mean spectral and integrated absorption of solar radiation by broken clouds (in what follows, the term "mean" will be implied but not used, for convenience). To evaluate the potential effect of stochastic geometry, we will compare the absorption by cumulus (0.5 ≤ γ ≤ 2) to that by equivalent stratus (γ ≪ 1) clouds; here γ = H/D, where H is the cloud layer thickness and D the characteristic horizontal cloud size. The equivalent stratus clouds differ from cumulus only in the aspect ratio γ, all the other parameters coinciding.
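
As a rough illustration of the aspect-ratio comparison above, a sketch under stated assumptions: the cumulus range 0.5 ≤ γ ≤ 2 is taken from the abstract, while the numeric cutoff standing in for γ ≪ 1 is an arbitrary choice for this example.

```python
def aspect_ratio(H, D):
    """Cloud aspect ratio gamma = H / D, with H the layer
    thickness and D the characteristic horizontal size."""
    return H / D

def cloud_regime(gamma):
    """Classify by the regimes quoted in the abstract:
    cumulus for 0.5 <= gamma <= 2; equivalent stratus for
    gamma << 1 (taken here as gamma < 0.1, an assumption)."""
    if 0.5 <= gamma <= 2:
        return "cumulus"
    if gamma < 0.1:
        return "stratus"
    return "intermediate"

print(cloud_regime(aspect_ratio(1.0, 1.0)))   # cumulus
print(cloud_regime(aspect_ratio(0.3, 10.0)))  # stratus
```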

  9. Physically-Retrieving Cloud and Thermodynamic Parameters from Ultraspectral IR Measurements

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Mango, Stephen A.; Huang, Hung-Lung

    2007-01-01

    A physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiance observed with ultraspectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression-retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). NPOESS Airborne Sounder Testbed Interferometer (NAST-I) retrievals from the Atlantic-THORPEX Regional Campaign are compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL). This work was motivated by the need to obtain solutions for atmospheric soundings from infrared radiances observed for every individual field of view, regardless of cloud cover, from future ultraspectral geostationary satellite sounding instruments, such as the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Hyperspectral Environmental Suite (HES). However, this retrieval approach can also be applied to the ultraspectral sounding instruments to fly on Polar satellites, such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite, the Cross-track Infrared Sounder (CrIS) on the NPOESS Preparatory Project and the following NPOESS series of satellites.
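
The iterated 1-d variational solution sketched in the abstract is typically a Gauss-Newton update that blends a background state with observations, weighted by their error covariances. Below is a generic toy version of such an update, not the NAST-I retrieval code; the forward model, covariances, and values are all illustrative.

```python
import numpy as np

def onedvar_retrieve(y, x_b, B, R, forward, jacobian, n_iter=10):
    """Iterated 1-d variational (Gauss-Newton) solution:
    x_{i+1} = x_b + (K^T R^-1 K + B^-1)^-1 K^T R^-1
              (y - F(x_i) + K (x_i - x_b)),
    iterated to account for nonlinearity in the forward model F."""
    x = x_b.copy()
    Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
    for _ in range(n_iter):
        K = jacobian(x)                       # linearize about x_i
        A = K.T @ Rinv @ K + Binv
        b = K.T @ Rinv @ (y - forward(x) + K @ (x - x_b))
        x = x_b + np.linalg.solve(A, b)
    return x

# Toy nonlinear forward model F(x) = x^2 (element-wise).
forward = lambda x: x**2
jacobian = lambda x: np.diag(2 * x)
x_true = np.array([2.0, 3.0])
y = forward(x_true)                # perfect observations
x_b = np.array([1.5, 2.5])         # background (first guess) state
B = 0.5 * np.eye(2)                # background error covariance
R = 1e-4 * np.eye(2)               # observation error covariance
x = onedvar_retrieve(y, x_b, B, R, forward, jacobian)
# converges close to x_true because the observations dominate
```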

  10. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets free radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  11. Noctilucent Cloud Sightings

    NASA Video Gallery

    Polar Mesospheric Clouds form during each polar region's summer months in the coldest place in the atmosphere, 50 miles above Earth's surface. Noctilucent Clouds were first observed in 1885 by an a...

  12. Closed Small Cell Clouds

    Atmospheric Science Data Center

    2013-04-19

article title: Closed Small Cell Clouds in the South Pacific ... the Multi-angle Imaging SpectroRadiometer (MISR). Closed cell clouds are formed under conditions of widespread sinking of the air above. ...

  13. Development and testing of an aerosol-stratus cloud parameterization scheme for middle and high latitudes

    SciTech Connect

    Olsson, P.Q.; Meyers, M.P.; Kreidenweis, S.; Cotton, W.R.

    1996-04-01

The aim of this new project is to develop an aerosol/cloud microphysics parameterization of mixed-phase stratus and boundary layer clouds. Our approach is to create, test, and implement a bulk-microphysics/aerosol model using data from Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) sites and large-eddy simulation (LES) explicit bin-resolving aerosol/microphysics models. The primary objectives of this work are twofold. First, we need the prediction of number concentrations of activated aerosol which are transferred to the droplet spectrum, so that the aerosol population directly affects the cloud formation and microphysics. Second, we plan to couple the aerosol model to the gas and aqueous-chemistry module that will drive the aerosol formation and growth. We begin by exploring the feasibility of performing cloud-resolving simulations of Arctic stratus clouds over the North Slope CART site. These simulations using Colorado State University's regional atmospheric modeling system (RAMS) will be useful in designing the structure of the cloud-resolving model and in interpreting data acquired at the North Slope site.

  14. Embedded Clusters in Molecular Clouds

    NASA Astrophysics Data System (ADS)

    Lada, Charles J.; Lada, Elizabeth A.

    Stellar clusters are born embedded within giant molecular clouds (GMCs) and during their formation and early evolution are often only visible at infrared wavelengths, being heavily obscured by dust. Over the past 15 years advances in infrared detection capabilities have enabled the first systematic studies of embedded clusters in galactic molecular clouds. In this article we review the current state of empirical knowledge concerning these extremely young protocluster systems. From a survey of the literature we compile the first extensive catalog of galactic embedded clusters. We use the catalog to construct the mass function and estimate the birthrate for embedded clusters within 2 kpc of the sun. We find that the embedded cluster birthrate exceeds that of visible open clusters by an order of magnitude or more indicating a high infant mortality rate for protocluster systems. Less than 4-7% of embedded clusters survive emergence from molecular clouds to become bound clusters of Pleiades age. The vast majority (90%) of stars that form in embedded clusters form in rich clusters of 100 or more members with masses in excess of 50 M⊙. Moreover, observations of nearby cloud complexes indicate that embedded clusters account for a significant (70-90%) fraction of all stars formed in GMCs. We review the role of embedded clusters in investigating the nature of the initial mass function (IMF) that, in one nearby example, has been measured over the entire range of stellar and substellar mass, from OB stars to substellar objects near the deuterium burning limit. We also review the role embedded clusters play in the investigation of circumstellar disk evolution and the important constraints they provide for understanding the origin of planetary systems. Finally, we discuss current ideas concerning the origin and dynamical evolution of embedded clusters and the implications for the formation of bound open clusters.

  15. Tropical thermostats and low cloud cover

    SciTech Connect

    Miller, R.L.

    1997-03-01

The ability of subtropical stratus low cloud cover to moderate or amplify the tropical response to climate forcing such as increased CO2 is considered. Cloud radiative forcing over the subtropics is parameterized using an empirical relation between stratus cloud cover and the difference in potential temperature between 700 mb (a level that is above the trade inversion) and the surface. This relation includes the empirical negative correlation between SST and low cloud cover and is potentially a positive feedback to climate forcing. Since potential temperature above the trade inversion varies in unison across the Tropics as a result of the large-scale circulation, and because moist convection relates tropospheric temperature within the convecting region to variations in surface temperature and moisture, the subtropical potential temperature at 700 mb depends upon surface conditions within the convecting region. As a result, subtropical stratus cloud cover and the associated feedback depend upon the entire tropical climate and not just the underlying SST. A simple tropical model is constructed, consisting of separate budgets of dry static energy and moisture for the convecting region (referred to as the "warm" pool) and the subtropical descending region (the "cold" pool). The cold pool is the location of stratus low clouds in the model. Dynamics is implicitly included through the assumption that temperature above the boundary layer is horizontally uniform as a result of the large-scale circulation. The tropopause and warm pool surface are shown to be connected by a moist adiabat in the limit of vanishingly narrow convective updrafts. Stratus low cloud cover is found to be a negative feedback, increasing in response to doubled CO2 and reducing the tropically averaged warming in comparison to the warming with low cloud cover held fixed. 72 refs., 13 figs., 2 tabs.
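
The empirical relation between stratus cover and the 700 mb-to-surface potential temperature difference might be sketched as a clipped linear function; the slope and intercept below are placeholders for illustration, not coefficients from the paper.

```python
def stratus_cloud_fraction(theta_700, theta_sfc, a=0.057, b=-0.55):
    """Empirical low-cloud fraction from the potential-temperature
    difference (in K) between 700 mb and the surface, i.e. a
    lower-tropospheric stability measure. The linear form follows
    the abstract; slope `a` and intercept `b` are illustrative
    placeholders. Result is clipped to the physical range [0, 1]."""
    stability = theta_700 - theta_sfc
    return min(1.0, max(0.0, a * stability + b))

# Greater stability above the trade inversion -> more stratus
print(stratus_cloud_fraction(312.0, 295.0))  # more stable, cloudier
print(stratus_cloud_fraction(305.0, 295.0))  # less stable
```

Because θ700 varies in unison across the Tropics while SST does not, the same function evaluated with a tropics-wide θ700 reproduces the abstract's point that the feedback depends on the whole tropical climate, not just the local SST.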

  16. Intercomparison of model simulations of mixed-phase clouds observed during the ARM Mixed-Phase Arctic Cloud Experiment. II: Multi layered cloud

    SciTech Connect

Morrison, H.; McCoy, Renata; Klein, Stephen A.; Xie, Shaocheng; Luo, Yali; Avramov, Alexander; Chen, Mingxuan; Cole, Jason N.; Falk, Michael; Foster, Mike; Del Genio, Anthony D.; Harrington, Jerry Y.; Hoose, Corinna; Khairoutdinov, Marat; Larson, Vince; Liu, Xiaohong; McFarquhar, Greg; Poellot, M. R.; Von Salzen, Knut; Shipway, Ben; Shupe, Matthew D.; Sud, Yogesh C.; Turner, David D.; Veron, Dana; Walker, Gregory K.; Wang, Zhien; Wolf, Audrey; Xu, Kuan-Man; Yang, Fanglin; Zhang, G.

    2009-05-21

    Results are presented from an intercomparison of single-column and cloud resolving model simulations of a deep, multi-layered, mixed-phase cloud system observed during the ARM Mixed-Phase Arctic Cloud Experiment. This cloud system was associated with strong surface turbulent sensible and latent heat fluxes as cold air flowed over the open Arctic Ocean, combined with a low pressure system that supplied moisture at mid-level. The simulations, performed by 13 single-column and 4 cloud-resolving models, generally overestimate the liquid water path and strongly underestimate the ice water path, although there is a large spread among the models. This finding is in contrast with results for the single-layer, low-level mixed-phase stratocumulus case in Part I of this study, as well as previous studies of shallow mixed-phase Arctic clouds, that showed an underprediction of liquid water path. The overestimate of liquid water path and underestimate of ice water path occur primarily when deeper mixed-phase clouds extending into the mid-troposphere were observed. These results suggest important differences in the ability of models to simulate Arctic mixed-phase clouds that are deep and multi-layered versus shallow and single-layered. In general, the cloud-resolving models and models with a more sophisticated, two-moment treatment of the cloud microphysics produce a somewhat smaller liquid water path that is closer to observations. The cloud-resolving models also tend to produce a larger cloud fraction than the single column models. The liquid water path and especially the cloud fraction have a large impact on the cloud radiative forcing at the surface, which is dominated by the longwave flux for this case.

  17. Analysis and testing of a space crane articulating joint testbed

    NASA Technical Reports Server (NTRS)

    Sutter, Thomas R.; Wu, K. Chauncey

    1992-01-01

    The topics are presented in viewgraph form and include: space crane concept with mobile base; mechanical versus structural articulating joint; articulating joint test bed and reference truss; static and dynamic characterization completed for space crane reference truss configuration; improved linear actuators reduce articulating joint test bed backlash; 1-DOF space crane slew maneuver; boom 2 tip transient response finite element dynamic model; boom 2 tip transient response shear-corrected component modes torque driver profile; peak root member force vs. slew time torque driver profile; and open loop control of space crane motion.

  18. Csi-star: a Low-cost CSI Orbital Testbed

    NASA Technical Reports Server (NTRS)

    Edberg, D.

    1992-01-01

The topics are presented in viewgraph form and include the following: rationale for an on-orbit control-structure interaction (CSI) test facility; CSI flight experiment objectives; feasibility study objectives; CSI free-flyer solution; feasibility study technical status summary; CSI-Star - a low-cost CSI free flyer; conceptual experiment design - option 2 configuration; Delta 2 - Quickstar interface clampband capability; open and closed loop response of baselined truss with active struts; experiment weight baseline (option 1) configuration; experiment weight option 2 configuration; experiment power baseline (option 1) configuration; experiment power option 2 configuration; CSI Quickstar capabilities/requirements; and remaining work.

  19. Computer animation of clouds

    SciTech Connect

    Max, N.

    1994-01-28

    Computer animation of outdoor scenes is enhanced by realistic clouds. I will discuss several different modeling and rendering schemes for clouds, and show how they evolved in my animation work. These include transparency-textured clouds on a 2-D plane, smooth shaded or textured 3-D clouds surfaces, and 3-D volume rendering. For the volume rendering, I will present various illumination schemes, including the density emitter, single scattering, and multiple scattering models.

  20. Private Cloud Communities for Faculty and Students

    ERIC Educational Resources Information Center

    Tomal, Daniel R.; Grant, Cynthia

    2015-01-01

    Massive open online courses (MOOCs) and public and private cloud communities continue to flourish in the field of higher education. However, MOOCs have received criticism in recent years and offer little benefit to students already enrolled at an institution. This article advocates for the collaborative creation and use of institutional, program…

  1. Cloud Scaling Properties and Cloud Parameterization

    NASA Technical Reports Server (NTRS)

    Cahalan, R. F.; Morcrette, J. J.

    1998-01-01

Cloud liquid and cloud fraction variability is studied as a function of horizontal scale in the ECMWF forecast model during several 10-day runs at the highest available model resolution, recently refined from approximately 60 km (T213) down to approximately 20 km (T639). At higher resolutions, model plane-parallel albedo biases are reduced, so that models may be tuned to have larger, more realistic cloud liquid water amounts. However, the distribution of cloud liquid assumed within each gridbox, for radiative and thermodynamic computations, depends on ad hoc assumptions that are not necessarily consistent with observed scaling properties, or with scaling properties produced by the model at larger scales. To study the larger-scale cloud properties, ten locations on the Earth are chosen to coincide with locations having considerable surface data available for validation and representing a variety of climatic regimes. Scaling exponents are determined from a range of scales down to model resolution and are re-computed every three hours, separately for low, medium and high clouds, as well as for column-integrated cloudiness. Cloud variability fluctuates in time, due to diurnal, synoptic and other processes, but scaling exponents are found to be relatively stable. Various approaches are considered for applying computed cloud scaling to subgrid cloud distributions used for radiation, beyond the simple random or maximal overlap now in common use. Considerably more work is needed to compare model cloud scaling with observations. This will be aided by increased availability of high-resolution surface, aircraft and satellite data, and by the increasing resolution of global models.
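
One common way to obtain a scaling exponent like those discussed above is to regress log variance against log averaging scale; the estimator below is a generic sketch of that idea, not the diagnostic actually applied to the ECMWF fields.

```python
import numpy as np

def scaling_exponent(field, max_power=6):
    """Estimate a scaling exponent for a 1-d field by regressing
    log(variance of block means) against log(block size) over
    dyadic scales. A generic estimator in the spirit of the
    abstract, not the ECMWF model diagnostic itself."""
    n = len(field)
    sizes, variances = [], []
    for p in range(max_power + 1):
        size = 2**p
        if n // size < 2:
            break
        blocks = field[: (n // size) * size].reshape(-1, size)
        sizes.append(size)
        variances.append(blocks.mean(axis=1).var())
    slope, _ = np.polyfit(np.log(sizes), np.log(variances), 1)
    return slope

rng = np.random.default_rng(0)
white = rng.normal(size=4096)
# For white noise the variance of block means falls off as 1/size,
# so the fitted exponent comes out near -1; correlated (scaling)
# cloud fields give shallower slopes.
print(scaling_exponent(white))
```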

  2. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  3. Cloud Coverage and Height Distribution from the GLAS Polar Orbiting Lidar: Comparison to Passive Cloud Retrievals

    NASA Technical Reports Server (NTRS)

Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.

    2004-01-01

The Geoscience Laser Altimeter System (GLAS) began full on-orbit operations in September 2003. A main application of the two-wavelength GLAS lidar is highly accurate detection and profiling of global cloud cover. Initial analysis indicates that cloud and aerosol layers are consistently detected on a global basis down to cross-sections of 10^-6 per meter. Images of the lidar data dramatically and accurately show the vertical structure of cloud and aerosol to the limit of signal attenuation. The GLAS lidar has made the most accurate measurement of global cloud coverage and height to date. In addition to the calibrated lidar signal, GLAS data products include multi-level boundaries and optical depth of all transmissive layers. Processing includes a multi-variable separation of cloud and aerosol layers. An initial application of the data results is to compare monthly cloud means from several months of GLAS observations in 2003 to existing cloud climatologies from other satellite measurements. In some cases direct comparison to passive cloud retrievals is possible. A limitation of the lidar measurements is nadir-only sampling. However, monthly means exhibit reasonably good global statistics, and coverage results at other than polar regions compare well with other measurements but show significant differences in height distribution. For polar regions, where passive cloud retrievals are problematic and where orbit track density is greatest, the GLAS results are particularly an advance in cloud cover information. Direct comparison to MODIS retrievals shows better than 90% agreement in cloud detection for daytime, but less than 60% at night. Height retrievals are in much less agreement. GLAS is a part of the NASA EOS project and data products are thus openly available to the science community (see http://glo.gsfc.nasa.gov).

  4. Evaluation of a New Mixed-Phase Cloud Microphysics Parameterization with a Single Column Model, CAPT Forecasts and M-PACE Observations

    NASA Astrophysics Data System (ADS)

    Liu, X.; Xie, S.; Boyle, J.; Klein, S.; Ghan, S.

    2007-12-01

    Most global climate models generally prescribe the partitioning of condensed water into liquid droplets and ice crystals in mixed-phase clouds according to a temperature-dependent function, which affects modeled cloud phase, cloud lifetime and radiative properties. In this study we evaluate a new mixed-phase cloud microphysics parameterization (for ice nucleation and water vapor deposition) against the Atmospheric Radiation Measurement (ARM) Mixed-phase Arctic Cloud Experiment (M-PACE) observations using the NCAR Community Atmospheric Model Version 3 (CAM3) running in the single column mode (SCAM) and in the CCPP-ARM Parameterization Testbed (CAPT) forecasts. It is found that SCAM with the new physically-based cloud microphysical scheme produces a more realistic simulation of the cloud phase structure and the partitioning of condensed water into liquid droplets against observations during the M-PACE than the standard CAM with an oversimplified cloud microphysics. CAM3 in the CAPT forecasts significantly underestimates the observed boundary layer mixed- phase cloud fraction. The simulation of the boundary layer mixed-phase clouds and their microphysical properties is considerably improved in CAM3 when the new scheme is used. The new scheme also leads to an improved simulation of the surface and top of the atmosphere longwave radiative fluxes. Both SCAM simulations and CAPT forecasts suggest that the ice number concentration could play an important role in the simulated mixed-phase cloud microphysics, and thereby needs to be realistically represented in global climate models.
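
The temperature-dependent partitioning that the abstract contrasts with the new scheme can be sketched as a simple linear ramp between an all-ice and an all-liquid temperature; the ramp endpoints below are common choices, not values taken from CAM3.

```python
def liquid_fraction(T_celsius, t_ice=-40.0, t_liq=0.0):
    """Prescribed partitioning of condensate into liquid and ice,
    of the kind the abstract says most GCMs use: all ice below
    t_ice, all liquid above t_liq, linear ramp in between. The
    endpoints are common choices, not CAM3's actual values."""
    if T_celsius <= t_ice:
        return 0.0
    if T_celsius >= t_liq:
        return 1.0
    return (T_celsius - t_ice) / (t_liq - t_ice)

print(liquid_fraction(-20.0))  # 0.5: half liquid, half ice
```

The point of the evaluated scheme is precisely that such a fixed ramp ignores ice nucleation and vapor deposition physics, which is why the physically based parameterization better reproduces the M-PACE phase structure.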

  5. The Roles of Cloud Drop Effective Radius and LWP in Determining Rain Properties in Marine Stratocumulus

    SciTech Connect

    Rosenfeld, Daniel; Wang, Hailong; Rasch, Philip J.

    2012-07-04

Numerical simulations described in previous studies showed that adding cloud condensation nuclei to marine stratocumulus can prevent their breakup from closed into open cells. Additional analyses of the same simulations show that the suppression of rain is well described in terms of cloud drop effective radius (re). Rain is initiated when re near cloud top is around 12-14 um. Cloud water starts to get depleted when column-maximum rain intensity (Rmax) exceeds 0.1 mm h-1. This happens when cloud-top re reaches 14 um. Rmax is mostly less than 0.1 mm h-1 at re < 14 um, regardless of the cloud water path, but increases rapidly when re exceeds 14 um. This is in agreement with recent aircraft observations and theoretical predictions for convective clouds, so the mechanism is not limited to marine stratocumulus. These results support the hypothesis that the onset of significant precipitation is determined by the number of nucleated cloud drops and the height (H) above cloud base that is required for cloud drops to reach re of 14 um. In turn, this can explain the conditions for the initiation of significant drizzle and the opening of closed cells, providing the basis for a simple parameterization for GCMs that unifies the representation of both precipitating and non-precipitating clouds as well as the transition between them. Furthermore, satellite global observations of cloud depth (from base to top) and cloud-top re can be used to derive and validate this parameterization.
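
The proposed parameterization can be reduced to a threshold rule on cloud-top effective radius, as in this hedged sketch; the linear growth of re with height in the second function is a hypothetical simplification for illustration, not the paper's formulation.

```python
def drizzle_onset(re_cloud_top_um, threshold_um=14.0):
    """Onset rule suggested by the abstract: significant rain
    (column-maximum intensity above ~0.1 mm/h) begins once the
    cloud-top effective radius reaches about 14 um, largely
    independent of liquid water path."""
    return re_cloud_top_um >= threshold_um

def height_to_onset(re_base_um, growth_um_per_km, threshold_um=14.0):
    """Cloud depth H (km) above cloud base needed for drops to
    reach the 14-um threshold, assuming (hypothetically) that
    effective radius grows linearly with height; slower growth,
    i.e. more nucleated drops, means a deeper cloud is needed."""
    return max(0.0, (threshold_um - re_base_um) / growth_um_per_km)

print(drizzle_onset(12.5))        # False: below the threshold
print(height_to_onset(8.0, 2.0))  # 3.0 km of cloud depth needed
```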

  6. Testbed for development of a DSP-based signal processing subsystem for an Earth-orbiting radar scatterometer

    NASA Technical Reports Server (NTRS)

    Clark, Douglas J.; Lux, James P.; Shirbacheh, Mike

    2002-01-01

    A testbed for evaluation of general-purpose digital signal processors in earth-orbiting radar scatterometers is discussed. Because a general-purpose DSP represents a departure from the radar signal processing techniques previously used on scatterometers, key elements of the system had to be demonstrated to verify feasibility for potential future scatterometer instruments. Construction of the testbed also facilitated identification of an appropriate software development environment and the skills mix necessary to perform the work.

  7. Corona Discharge in Clouds

    NASA Astrophysics Data System (ADS)

    Sin'kevich, A. A.; Dovgalyuk, Yu. A.

    2014-04-01

    We present a review of the results of theoretical studies and laboratory modeling of corona discharge initiation in clouds. The influence of corona discharges on the evolution of the cloud microstructure and electrification is analyzed. It is shown that corona discharges are initiated when large hydrometeors approach each other, and that in some cases corona discharges can appear from crystals, ice pellets, and hailstones. The corona discharges lead to significant air ionization, charging of cloud particles, and charge separation in clouds, and they initiate streamers and lightning. The influence of corona discharges on changes in the phase composition of clouds is analyzed.

  8. Cloud microstructure studies

    NASA Technical Reports Server (NTRS)

    Blau, H. H., Jr.; Fowler, M. G.; Chang, D. T.; Ryan, R. T.

    1972-01-01

    Over two thousand individual cloud droplet size distributions were measured with an optical cloud particle spectrometer flown on the NASA Convair 990 aircraft. Representative droplet spectra and liquid water content, L (gm/cu m) were obtained for oceanic stratiform and cumuliform clouds. For non-precipitating clouds, values of L range from 0.1 gm/cu m to 0.5 gm/cu m; with precipitation, L is often greater than 1 gm/cu m. Measurements were also made in a newly formed contrail and in cirrus clouds.

  9. Model evaluation, recommendation and prioritizing of future work for the manipulator emulator testbed

    NASA Technical Reports Server (NTRS)

    Kelly, Frederick A.

    1989-01-01

    The Manipulator Emulator Testbed (MET) provides a facility capable of hosting the simulation of various manipulator configurations to support concept studies, evaluation, and other engineering development activities. Specifically, the testbed is intended to support development of the Space Station Remote Manipulator System (SSRMS) and related systems. The objective of this study is to evaluate the math models developed for the MET simulation of a manipulator's rigid body dynamics and the servo systems for each of the driven manipulator joints. Specifically, the math models are examined with regard to their amenability to pipeline and parallel processing. Based on this evaluation and the project objectives, a set of prioritized recommendations is offered for future work.

  10. The implementation of the Human Exploration Demonstration Project (HEDP), a systems technology testbed

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Korsmeyer, David J.

    1993-01-01

    The Human Exploration Demonstration Project (HEDP) is an ongoing task at NASA's Ames Research Center to address the advanced technology requirements necessary to implement an integrated working and living environment for a planetary surface habitat. The integrated environment consists of life support systems, physiological monitoring of project crew, a virtual environment work station, and centralized data acquisition and habitat systems health monitoring. The HEDP is an integrated technology demonstrator, as well as an initial operational testbed. Several robotic systems operate in a simulated planetary landscape external to the habitat environment to provide representative workloads for the crew. This paper describes the evolution of the HEDP from initial concept to operational project; the status of the HEDP after two years; the final facilities composing the HEDP; the project's role as a NASA Ames Research Center systems technology testbed; and the interim demonstration scenarios that have been run to feature the developing technologies in 1993.

  11. TEXSYS [a knowledge-based system for the Space Station Freedom thermal control system test-bed]

    NASA Technical Reports Server (NTRS)

    Bull, John

    1990-01-01

    The Systems Autonomy Demonstration Project has recently completed a major test and evaluation of TEXSYS, a knowledge-based system (KBS) which demonstrates real-time control and FDIR for the Space Station Freedom thermal control system test-bed. TEXSYS is the largest KBS ever developed by NASA and offers a unique opportunity for the study of technical issues associated with the use of advanced KBS concepts including: model-based reasoning and diagnosis, quantitative and qualitative reasoning, integrated use of model-based and rule-based representations, temporal reasoning, and scale-up performance issues. TEXSYS represents a major achievement in advanced automation that has the potential to significantly influence Space Station Freedom's design for the thermal control system. An overview of the Systems Autonomy Demonstration Project, the thermal control system test-bed, the TEXSYS architecture, preliminary test results, and thermal domain expert feedback are presented.

  12. Towards an Experimental Testbed Facility for Cyber-Physical Security Research

    SciTech Connect

    Edgar, Thomas W.; Manz, David O.; Carroll, Thomas E.

    2012-01-07

    Cyber-Physical Systems (CPSs) are under great scrutiny due to large Smart Grid investments and recent high profile security vulnerabilities and attacks. Research into improved security technologies, communication models, and emergent behavior is necessary to protect these systems from sophisticated adversaries and new risks posed by the convergence of CPSs with IT equipment. However, cyber-physical security research is limited by the lack of access to universal cyber-physical testbed facilities that permit flexible, high-fidelity experiments. This paper presents a remotely-configurable and community-accessible testbed design that integrates elements from the virtual, simulated, and physical environments. Fusing data between the three environments enables the creation of realistic and scalable environments where new functionality and ideas can be exercised. This novel design will enable the research community to analyze and evaluate the security of current environments and design future, secure, cyber-physical technologies.

  13. The Langley Research Center CSI phase-0 evolutionary model testbed-design and experimental results

    NASA Technical Reports Server (NTRS)

    Belvin, W. K.; Horta, Lucas G.; Elliott, K. B.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology is described. The design philosophy, capabilities, and early experimental results are presented to introduce some of the ongoing CSI research at NASA-Langley. The testbed, referred to as the Phase 0 version of the CSI Evolutionary Model (CEM), is the first stage of model complexity designed to show the benefits of CSI technology and to identify weaknesses in current capabilities. Early closed-loop test results have shown that non-model-based controllers can provide an order-of-magnitude increase in damping in the first few flexible vibration modes. Model-based controllers for higher performance will need to be robust to model uncertainty, as verified by system identification tests. Data are presented showing that finite element model predictions of frequency differ from those obtained from tests. Plans are also presented for evolution of the CEM to study integrated controller and structure design as well as multiple payload dynamics.

  14. NASA's flight-technological development program - A 650 Mbps laser communications testbed

    NASA Technical Reports Server (NTRS)

    Hayden, W. L.; Fitzmaurice, M. W.; Nace, D. A.; Lokerson, D. C.; Minott, P. O.; Chapman, W. W.

    1991-01-01

    A 650 Mbps laser communications testbed under construction for the development of flight qualifiable hardware suitable for near-term operation on geosynchronous-to-geosynchronous crosslink missions is presented. The program's primary purpose is to develop and optimize laser communications unique subsystems. Requirements for the testbed experiments are to optimize the acquisition processes, to fully simulate the long range (up to 21,000 km) and the fine tracking characteristics of two narrow-beam laser communications terminals, and to fully test communications performance which will include average and burst bit error rates, effects of laser diode coalignment, degradation due to internal and external stray light, and the impact of drifts in the optical components.

  15. Visible-band testbed projector with a replicated diffractive optical element.

    PubMed

    Chen, C B; Hegg, R G; Johnson, W T; King, W B; Rock, D F; Spande, R

    1999-12-01

    Raytheon has designed, fabricated, and tested a diffractive-optical-element-based (DOE-based) testbed projector for direct and indirect visual optical applications. By use of a low-cost replicated DOE surface from Rochester Photonics Corporation for color correction, the projector optics bettered the modulation transfer function of an equivalent commercial camera lens. The testbed demonstrates that a practical DOE-based optical system is suitable for both visual applications (e.g., head-mounted displays) and visual projection (e.g., tactical sensors). The need for and the proper application of DOEs in visual optical systems, the nature and performance of the projector optical design, and test results are described. PMID:18324257

  16. An investigation of DTNS2D for use as an incompressible turbulence modelling test-bed

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J., Jr.

    1992-01-01

    This paper documents an investigation of a two-dimensional, incompressible Navier-Stokes solver for use as a test-bed for turbulence modelling. DTNS2D is the code under consideration for use at the Center for Modelling of Turbulence and Transition (CMOTT). The code was created by Gorski at the David Taylor Research Center and incorporates the pseudo-compressibility method. Two laminar benchmark flows are used to measure the performance and implementation of the method: the classical Blasius boundary layer solution is used to validate the flat plate flow, while experimental data are incorporated in the validation of the backward-facing step flow. Velocity profiles, convergence histories, and reattachment lengths are used to quantify these calculations. The organization and adaptability of the code are also examined in light of its role as a numerical test-bed.
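    The Blasius flat-plate solution used above as a validation benchmark can itself be reproduced numerically with a shooting method. The sketch below is an independent illustration, not DTNS2D code: it integrates the Blasius equation f''' + (1/2) f f'' = 0, with f(0) = f'(0) = 0 and f'(inf) = 1, using RK4, and bisects on the unknown wall shear f''(0), whose accepted value is about 0.332.

```python
def blasius_fpp0(eta_max=10.0, n=2000):
    """Estimate f''(0) for the Blasius equation via shooting + bisection.

    For a guessed wall shear s = f''(0), integrate the ODE system with RK4
    out to eta_max and compare f'(eta_max) against the free-stream value 1.
    f'(inf) increases monotonically with s, so bisection converges.
    """
    def fprime_at_inf(s):
        h = eta_max / n
        y = (0.0, 0.0, s)  # (f, f', f'') at the wall

        def deriv(y):
            # f' , f'' , f''' = -0.5 * f * f''
            return (y[1], y[2], -0.5 * y[0] * y[2])

        for _ in range(n):  # classical 4th-order Runge-Kutta march in eta
            k1 = deriv(y)
            k2 = deriv(tuple(y[i] + 0.5 * h * k1[i] for i in range(3)))
            k3 = deriv(tuple(y[i] + 0.5 * h * k2[i] for i in range(3)))
            k4 = deriv(tuple(y[i] + h * k3[i] for i in range(3)))
            y = tuple(y[i] + h / 6.0 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
                      for i in range(3))
        return y[1]  # f' at eta_max, should approach 1

    lo, hi = 0.1, 1.0  # bracket for f''(0)
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if fprime_at_inf(mid) < 1.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    The returned wall shear can be checked against the tabulated value f''(0) ≈ 0.33206, which is the same kind of comparison a solver validation against the Blasius profile performs.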

  17. System engineering techniques for establishing balanced design and performance guidelines for the advanced telerobotic testbed

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Matijevic, J. R.

    1987-01-01

    Novel system engineering techniques have been developed and applied to establishing structured design and performance objectives for the Telerobotics Testbed that reduce technical risk while still allowing the testbed to demonstrate an advancement in state-of-the-art robotic technologies. To establish the appropriate tradeoff structure and balance of technology performance against technical risk, an analytical data base was developed which drew on: (1) automation/robot-technology availability projections, (2) typical or potential application mission task sets, (3) performance simulations, (4) project schedule constraints, and (5) project funding constraints. Design tradeoffs and configuration/performance iterations were conducted by comparing feasible technology/task set configurations against schedule/budget constraints as well as original program target technology objectives. The final system configuration, task set, and technology set reflected a balanced advancement in state-of-the-art robotic technologies, while meeting programmatic objectives and schedule/cost constraints.

  18. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and the characteristics of an optical system. Genetic algorithms (GAs) were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
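    The first problem above, estimating polynomial phase coefficients with a GA, can be sketched with a minimal real-coded GA. This is our own simplified illustration, not the paper's algorithm: to keep it self-contained, the fitness compares candidate polynomials against sampled "measured" phase values directly, rather than recovering the phase from an image intensity distribution. The population size, selection scheme, and mutation schedule are all illustrative choices.

```python
import random

def ga_fit_phase(target_coeffs, x, pop=60, gens=200, sigma=0.3, seed=1):
    """Estimate polynomial phase coefficients with a minimal real-coded GA.

    Fitness is the squared error between a candidate polynomial and the
    'measured' phase samples (lower is better). Uses truncation selection,
    blend crossover, and Gaussian mutation that shrinks over generations.
    """
    rng = random.Random(seed)
    ncoef = len(target_coeffs)

    def phase(c, xi):
        return sum(cj * xi ** j for j, cj in enumerate(c))

    measured = [phase(target_coeffs, xi) for xi in x]

    def err(c):
        return sum((phase(c, xi) - m) ** 2 for xi, m in zip(x, measured))

    popn = [[rng.uniform(-1, 1) for _ in range(ncoef)] for _ in range(pop)]
    for g in range(gens):
        popn.sort(key=err)
        elite = popn[:pop // 4]      # truncation selection; elites survive
        children = list(elite)
        while len(children) < pop:
            a, b = rng.sample(elite, 2)
            step = sigma * (1 - g / gens) + 1e-3  # shrinking mutation scale
            children.append([(ai + bi) / 2 + rng.gauss(0, step)
                             for ai, bi in zip(a, b)])
        popn = children
    popn.sort(key=err)
    return popn[0]
```

    With a quadratic target phase sampled on [-1, 1], the GA recovers the three coefficients to within a few hundredths, echoing the abstract's point that the approach works when the number of coefficients is correctly specified.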

  19. Anti-interference of the tribological test-bed for thrust bearings used in submersible pumps

    NASA Astrophysics Data System (ADS)

    Yu, Jianwei; You, Tao; Yu, Xiaofen; Jiao, Minghua; Xie, Ting

    2006-11-01

    This test-bed was developed to investigate the tribological behavior of thrust bearings, the key parts of submersible pumps, under simulated oil-well working conditions. Electromagnetic interference (EMI) is unavoidable under conditions of very high load and speed and can lead to system failure or abnormal data. Based on analysis and investigation of the interference sources and modes in the system, such as power interference, the vector-control inverter, mechanical contact, and static interference, several hardware measures were employed in the measurement and control system of the test-bed, including a new power EMI filter, electromagnetic shielding, RC links, and impedance matching, together with a novel digital filtering algorithm. The results showed that combining these anti-interference methods achieves optimal reliability and precision.
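    The abstract does not specify which digital filtering algorithm was used. As a generic illustration of how impulsive EMI spikes of the kind described can be rejected in software, the sketch below implements a sliding-window median filter, a common choice because a median is insensitive to isolated outliers; it is our example, not the paper's method.

```python
def median_filter(samples, window=5):
    """Sliding-window median filter, robust to impulsive EMI spikes.

    'window' should be odd; near the ends of the record the window is
    shrunk to the available samples rather than padded.
    """
    half = window // 2
    out = []
    for i in range(len(samples)):
        lo = max(0, i - half)
        hi = min(len(samples), i + half + 1)
        w = sorted(samples[lo:hi])
        out.append(w[len(w) // 2])  # median of the local window
    return out
```

    A single large spike in an otherwise steady load or speed signal is replaced by the local median, while slowly varying trends pass through unchanged.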

  20. Utilizing the EUVE Innovative Technology Testbed to Reduce Operations Cost for Present and Future Orbiting Mission

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report summarizes work done under a Cooperative Agreement (CA) on the following testbed projects: TERRIERS - The development of the ground systems to support the TERRIERS satellite mission at Boston University (BU). HSTS - The application of ARC's Heuristic Scheduling Testbed System (HSTS) to the EUVE satellite mission. SELMON - The application of NASA's Jet Propulsion Laboratory's (JPL) Selective Monitoring (SELMON) system to the EUVE satellite mission. EVE - The development of the EUVE Virtual Environment (EVE), a prototype three-dimensional (3-D) visualization environment for the EUVE satellite and its sensors, instruments, and communications antennae. FIDO - The development of the Fault-Induced Document Officer (FIDO) system, a prototype application to respond to anomalous conditions by automatically searching for, retrieving, and displaying relevant documentation for an operator's use.