Science.gov

Sample records for open cloud testbed

  1. The Open Cloud Testbed: Supporting Open Source Cloud Computing Systems Based on Large Scale High Performance, Dynamic Network Services

    NASA Astrophysics Data System (ADS)

    Grossman, Robert; Gu, Yunhong; Sabala, Michal; Bennet, Colin; Seidman, Jonathan; Mambretti, Joe

    Recently, a number of cloud platforms and services have been developed for data intensive computing, including Hadoop, Sector, CloudStore (formerly KFS), HBase, and Thrift. In order to benchmark the performance of these systems, to investigate their interoperability, and to experiment with new services based on flexible compute node and network provisioning capabilities, we have designed and implemented a large scale testbed called the Open Cloud Testbed (OCT). Currently OCT has 120 nodes in 4 data centers: Baltimore, Chicago (two locations), and San Diego. In contrast to other cloud testbeds, which are in small geographic areas and which are based on commodity Internet services, the OCT is a wide area testbed and the 4 data centers are connected with a high performance 10Gb/s network, based on a foundation of dedicated lightpaths. This testbed can address the requirements of extremely large data streams that challenge other types of distributed infrastructure. We have also developed several utilities to support the development of cloud computing systems and services, including novel node and network provisioning services, a monitoring system, and an RPC system. In this paper, we describe the OCT concepts, architecture, infrastructure, a few benchmarks that were developed for this platform, interoperability studies, and results.
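
    The monitoring and provisioning utilities mentioned above are not detailed in the abstract. As a rough illustration of the kind of wide-area measurement such a testbed depends on, the sketch below is a minimal point-to-point TCP throughput probe between two nodes; the port number, payload size, and script structure are assumptions for illustration only, not the OCT tools themselves.

```python
# Hypothetical sketch: a minimal TCP throughput probe of the kind a testbed
# monitoring utility might run between two nodes. The port and the 64 MiB
# payload size are illustrative assumptions.
import socket
import sys
import time

CHUNK = 1 << 20          # 1 MiB send/receive buffer
TOTAL = 64 * CHUNK       # 64 MiB test payload
PORT = 5201              # arbitrary test port (assumption)

def serve(host="0.0.0.0"):
    """Accept one connection and discard whatever the client sends."""
    with socket.create_server((host, PORT)) as srv:
        conn, addr = srv.accept()
        with conn:
            received = 0
            while True:
                data = conn.recv(CHUNK)
                if not data:
                    break
                received += len(data)
            print(f"received {received / 2**20:.0f} MiB from {addr[0]}")

def probe(host):
    """Send TOTAL bytes and report the achieved application-level throughput."""
    payload = b"\x00" * CHUNK
    start = time.time()
    with socket.create_connection((host, PORT)) as conn:
        sent = 0
        while sent < TOTAL:
            conn.sendall(payload)
            sent += len(payload)
    elapsed = time.time() - start
    print(f"{sent * 8 / elapsed / 1e9:.2f} Gb/s to {host}")

if __name__ == "__main__":
    serve() if len(sys.argv) == 1 else probe(sys.argv[1])
```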

  2. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus(TM) research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
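
    The abstract does not describe the benchmark itself. As a loose illustration of how I/O throughput against an S3-compatible object store (such as Amazon S3 or a Eucalyptus/Walrus front end) might be timed, here is a minimal sketch using boto3; the endpoint URL, bucket name, and object size are assumptions.

```python
# Illustrative sketch only: timing bulk PUT/GET against an S3-compatible
# endpoint. The endpoint URL, bucket name, and object size are assumptions;
# the paper's actual benchmark setup is not reproduced here.
import time
import boto3

ENDPOINT = "https://s3.example.org"   # hypothetical S3-compatible endpoint
BUCKET = "io-benchmark"               # hypothetical, must already exist
SIZE_MB = 256

def time_put_get(s3, key, payload):
    """Upload and download one object, returning the elapsed times."""
    t0 = time.time()
    s3.put_object(Bucket=BUCKET, Key=key, Body=payload)
    t_put = time.time() - t0

    t0 = time.time()
    body = s3.get_object(Bucket=BUCKET, Key=key)["Body"].read()
    t_get = time.time() - t0
    assert len(body) == len(payload)
    return t_put, t_get

if __name__ == "__main__":
    s3 = boto3.client("s3", endpoint_url=ENDPOINT)
    payload = b"\x00" * (SIZE_MB * 2**20)
    t_put, t_get = time_put_get(s3, "probe.bin", payload)
    print(f"PUT {SIZE_MB / t_put:.1f} MB/s, GET {SIZE_MB / t_get:.1f} MB/s")
```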

  3. Southern Great Plains cloud and radiation testbed site

    SciTech Connect

    1996-09-01

    This document presents information about the Cloud and Radiation Testbed Site and the Atmospheric Radiation Measurement program. Topics include: measuring methods, general circulation methods, milestones, instrumentation, meteorological observations, and computing facilities.

  4. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    SciTech Connect

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W.

    1996-04-01

    Boundary layer clouds (stratus and fair-weather cumulus) are closely coupled to the underlying surface; this coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period to consider a case in which low-level stratus was observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  5. Clouds over Open Ocean

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The heavy concentration of these cirrocumulus and nimbostratus clouds over open ocean (location unknown) indicates that a heavy downpouring of rain is occurring on the Earth's surface below. Towering anvils, seen rising high above the base cloud cover and casting long shadows, also indicate high winds and possible tornado activity.

  6. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble, to every extent possible, the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), use of the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
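
    The actual OSG/PanDA toolset is not reproduced here. The following is a purely hypothetical sketch of the bookkeeping side of such a harness: a small repository of generic job templates is expanded into synthetic jobs and per-site outcomes are aggregated into a report. Template names, the run_job stub, and site names are invented for illustration.

```python
# Hypothetical sketch of a validation harness's bookkeeping: generic job
# templates are expanded into synthetic jobs and per-site success/failure
# outcomes are aggregated. Nothing here reproduces the real PanDA/OSG tools.
import random
from collections import Counter
from dataclasses import dataclass

@dataclass
class JobTemplate:
    name: str
    services: tuple        # testbed services exercised (CE, WN env, SE, ...)

TEMPLATES = [
    JobTemplate("ce-hello", ("CE",)),
    JobTemplate("wn-env-check", ("CE", "WN")),
    JobTemplate("stage-in-out", ("CE", "WN", "SE")),
]

def run_job(site: str, template: JobTemplate) -> str:
    """Stand-in for real submission; returns a synthetic outcome."""
    return random.choice(["done", "failed:stage-out", "failed:env"])

def validate(sites, n_per_template=10):
    """Run every template n times per site and tally the outcomes."""
    report = {}
    for site in sites:
        outcomes = Counter()
        for tpl in TEMPLATES:
            for _ in range(n_per_template):
                outcomes[(tpl.name, run_job(site, tpl))] += 1
        report[site] = outcomes
    return report

if __name__ == "__main__":
    for site, outcomes in validate(["ITB_SITE_A", "ITB_SITE_B"]).items():
        print(site, dict(outcomes))
```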

  7. Site/Systems Operations, Maintenance and Facilities Management of the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site

    SciTech Connect

    Wu, Susan

    2005-08-01

    This contract covered the site/systems operations, maintenance, and facilities management of the DOE Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site.

  8. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-12-31

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  9. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-01-01

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  10. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  11. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical Data Handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing data management through qualified processors and interfaces such as Leon2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity aims at extending RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller, which act as masters communicating with the CCIPC boards. CANopen serves as the upper application layer for systems based on CAN, is defined within the CAN-in-Automation standards, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  12. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described. A governance and financial model, together with the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles is also described.

  13. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  14. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations have started to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting Geosciences. This paper provides a comprehensive study of three open-source cloud solutions, including OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in the central processing unit (CPU), memory and I/O of virtual machines created and managed by the different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.
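
    The paper's benchmark suite is not described in the abstract. A minimal sketch of the kind of in-guest microbenchmark one might run inside VMs created by different cloud stacks (timing a CPU-bound loop, a memory copy, and a buffered disk write) is shown below; all sizes and iteration counts are arbitrary assumptions, not the paper's setup.

```python
# Minimal in-guest microbenchmark sketch (assumptions throughout): times a
# CPU-bound loop, a memory copy, and a synced disk write. It is not the
# benchmark suite used in the paper.
import os
import time
import numpy as np

def cpu_seconds(n=2_000_000):
    """Time a pure-Python arithmetic loop."""
    t0 = time.perf_counter()
    s = 0.0
    for i in range(n):
        s += i * 0.5
    return time.perf_counter() - t0

def mem_bandwidth_gbs(size_mb=512):
    """Approximate memory bandwidth from one array copy (read + write)."""
    a = np.zeros(size_mb * 2**20 // 8)
    t0 = time.perf_counter()
    b = a.copy()
    dt = time.perf_counter() - t0
    return 2 * size_mb / 1024 / dt

def disk_write_mbs(size_mb=256, path="bench.tmp"):
    """Sequential write throughput, fsync'd so the data reaches the disk."""
    buf = os.urandom(2**20)
    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size_mb):
            f.write(buf)
        f.flush()
        os.fsync(f.fileno())
    dt = time.perf_counter() - t0
    os.remove(path)
    return size_mb / dt

if __name__ == "__main__":
    print(f"CPU loop: {cpu_seconds():.2f} s, "
          f"memory: {mem_bandwidth_gbs():.1f} GB/s, "
          f"disk write: {disk_write_mbs():.0f} MB/s")
```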

  15. Analytical study of the effects of the Low-Level Jet on moisture convergence and vertical motion fields at the Southern Great Plains Cloud and Radiation Testbed site

    SciTech Connect

    Bian, X.; Zhong, S.; Whiteman, C.D.; Stage, S.A.

    1996-04-01

    The Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) is located in a region that is strongly affected by a prominent meteorological phenomenon--the Great Plains Low-Level Jet (LLJ). Observations have shown that the LLJ plays a vital role in spring and summertime cloud formation and precipitation over the Great Plains. An improved understanding of the LLJ characteristics and its impact on the environment is necessary for addressing the fundamental issue of development and testing of radiational transfer and cloud parameterization schemes for the general circulation models (GCMs) using data from the SGP CART site. A climatological analysis of the summertime LLJ over the SGP has been carried out using hourly observations from the National Oceanic and Atmospheric Administration (NOAA) Wind Profiler Demonstration Network and from the ARM June 1993 Intensive Observation Period (IOP). The hourly data provide an enhanced temporal and spatial resolution relative to earlier studies which used 6- and 12-hourly rawinsonde observations at fewer stations.

  16. A comparison of radiometric fluxes influenced by parameterized cirrus clouds with observed fluxes at the Southern Great Plains (SGP) cloud and radiation testbed (CART) site

    SciTech Connect

    Mace, G.G.; Ackerman, T.P.; George, A.T.

    1996-04-01

    The data from the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site are a valuable resource. We have developed an operational data processing and analysis methodology that allows us to examine continuously the influence of clouds on the radiation field and to test new and existing cloud and radiation parameterizations.

  17. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-01

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com. PMID:23305951

  18. Kontur: Observations of cloud streets and open cellular structures

    NASA Astrophysics Data System (ADS)

    Brümmer, B.; Bakan, S.; Hinzpeter, H.

    1985-08-01

    In September and October 1981 the experiment KonTur (Convection and turbulence) was conducted over the North Sea. Its objectives were to investigate organized convective patterns, like cloud streets (boundary layer rolls) and cellular cloud structures. Two aircraft (British Hercules C-130 and German Falcon 20) performed detailed measurements within these patterns. Several cases of cloud streets and open cells were observed. Boundary layer rolls appear to be connected with an inflection point in the cross-roll wind component. The aspect ratio of the rolls (wavelength versus depth) is between three and four in accordance with other observations and linear stability analysis. Four scales of motion are involved: the mean flow, the roll circulation, individual clouds and turbulence. The vertical transports are dominated at lower levels by turbulence and at higher levels by roll-scale motions. Open cellular cloud structures are connected with large air-sea temperature differences due to cold air outbreaks from the northwest. The aspect ratio of the cells is of the order of 10. The bulk contribution to the total transport of heat and momentum originates from the cloudy walls of the cells. A vertical cross section through a composite open cell is presented.

  19. Open-cell cloud formation over the Bahamas

    NASA Technical Reports Server (NTRS)

    2002-01-01

    What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  20. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843

  1. Open reading frame phylogenetic analysis on the cloud.

    PubMed

    Hung, Che-Lun; Lin, Chun-Yuan

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
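
    As a simple, self-contained illustration of the open-reading-frame step that precedes alignment and tree building (not the paper's Hadoop-based pipeline), the sketch below scans the forward strand of a nucleotide sequence for ORFs; the minimum length cutoff is an arbitrary assumption.

```python
# Simple illustration (not the paper's pipeline): scanning a nucleotide
# sequence for open reading frames. Start/stop codons follow the standard
# genetic code; the default minimum length of 90 nt is an assumption.
STOPS = {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_len=90):
    """Return (start, end, frame) for ORFs on the forward strand."""
    seq = seq.upper()
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if codon == "ATG" and start is None:
                start = i                       # first start codon in frame
            elif codon in STOPS and start is not None:
                if i + 3 - start >= min_len:
                    orfs.append((start, i + 3, frame))
                start = None                    # reset and keep scanning
    return orfs

if __name__ == "__main__":
    demo = "CCATGAAATTTGGGCCCAAATTTGGGCCCAAATTTGGGCCCTAGCC"
    print(find_orfs(demo, min_len=30))          # -> [(2, 44, 2)]
```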

  2. Cloud-Based Model Calibration Using OpenStudio: Preprint

    SciTech Connect

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Cloud service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
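
    A conceptual sketch of the calibration loop described above follows: a few parameters are varied and the combination minimizing the discrepancy between simulated and actual monthly energy use is kept. The simulate() stub, parameter names, and bill values are assumptions; OpenStudio itself drives EnergyPlus simulations rather than this toy model.

```python
# Toy calibration sketch: grid-search a stand-in building model against
# monthly utility data using CV(RMSE) as the goodness-of-fit metric. The
# simulate() stub, parameter names, and numbers are illustrative only.
import itertools

ACTUAL_KWH = [910, 850, 800, 700, 650, 780, 950, 990, 820, 700, 760, 880]

def simulate(infiltration_ach, lighting_w_m2):
    """Stand-in for a whole-building simulation returning monthly kWh."""
    base = [900, 840, 790, 695, 640, 770, 940, 980, 810, 690, 750, 870]
    scale = 0.9 + 0.1 * infiltration_ach + 0.02 * lighting_w_m2
    return [b * scale for b in base]

def cvrmse(actual, model):
    """Coefficient of variation of RMSE, a common calibration metric (%)."""
    n = len(actual)
    rmse = (sum((a - m) ** 2 for a, m in zip(actual, model)) / n) ** 0.5
    return 100 * rmse / (sum(actual) / n)

if __name__ == "__main__":
    grid = itertools.product([0.3, 0.5, 0.8], [8.0, 10.0, 12.0])
    best = min(grid, key=lambda p: cvrmse(ACTUAL_KWH, simulate(*p)))
    print("best parameters:", best,
          "CV(RMSE) = %.1f%%" % cvrmse(ACTUAL_KWH, simulate(*best)))
```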

  3. Optical interferometer testbed

    NASA Technical Reports Server (NTRS)

    Blackwood, Gary H.

    1991-01-01

    Viewgraphs on optical interferometer testbed presented at the MIT Space Research Engineering Center 3rd Annual Symposium are included. Topics covered include: space-based optical interferometer; optical metrology; sensors and actuators; real time control hardware; controlled structures technology (CST) design methodology; identification for MIMO control; FEM/ID correlation for the naked truss; disturbance modeling; disturbance source implementation; structure design: passive damping; low authority control; active isolation of lightweight mirrors on flexible structures; open loop transfer function of mirror; and global/high authority control.

  4. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
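
    A rough numerical sketch of the encounter model summarized above is given below, under stated assumptions: stars are drawn from a Plummer sphere, each receives the impulsive kick from a point-mass clump on a straight-line path (no tidal approximation), and the internal heating is accumulated after removing the centre-of-mass kick. Parameter values are illustrative, not those of the paper.

```python
# Rough sketch only: impulsive, rectilinear encounter between a star cluster
# (Plummer sphere) and a point-mass clump. All parameter values are
# illustrative assumptions, not the paper's.
import numpy as np

G = 4.30e-3  # gravitational constant in pc (km/s)^2 / Msun

def plummer_positions(n, a, rng):
    """Random positions from a Plummer sphere of scale radius a (pc)."""
    u = rng.random(n)
    r = a / np.sqrt(u ** (-2.0 / 3.0) - 1.0)
    vec = rng.normal(size=(n, 3))
    vec /= np.linalg.norm(vec, axis=1, keepdims=True)
    return r[:, None] * vec

def impulsive_kicks(pos, m_clump, b, v_rel):
    """Impulse-approximation kicks (km/s) from a clump moving along the
    x-axis, offset by impact parameter b (pc) in the y-direction."""
    # perpendicular offset of each star from the clump's straight-line path
    d = pos[:, 1:3] - np.array([b, 0.0])
    d2 = np.sum(d ** 2, axis=1, keepdims=True)
    kicks = np.zeros_like(pos)
    kicks[:, 1:3] = 2.0 * G * m_clump / v_rel * (-d) / d2   # toward the path
    return kicks

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, m_star = 500, 0.5                 # number of stars, Msun each
    pos = plummer_positions(n, a=1.0, rng=rng)
    dv = impulsive_kicks(pos, m_clump=1.0e3, b=5.0, v_rel=10.0)
    dv -= dv.mean(axis=0)                # remove centre-of-mass kick
    dE = 0.5 * m_star * np.sum(dv ** 2)  # heating term for static stars
    print(f"internal energy gain ~ {dE:.2e} Msun (km/s)^2")
```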

  5. Fast Physics Testbed for the FASTER Project

    SciTech Connect

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  6. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time and so CLOUD computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of Cloud Computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow CLOUD middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic-RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed) since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  7. A Business-to-Business Interoperability Testbed: An Overview

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad; Monica, Martin; Jones, Albert

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned to form the requirements that drive the next generation testbed development. We also give an overview of a promising testing framework architecture in which to drive the testbed developments. We outline the future plans for the testbed development.

  8. Open Source Cloud Computing for Transiting Planet Discovery

    NASA Astrophysics Data System (ADS)

    McCullough, Peter R.; Fleming, Scott W.; Zonca, Andrea; Flowers, Jack; Nguyen, Duy Cuong; Sinkovits, Robert; Machalek, Pavel

    2014-06-01

    We provide an update on the development of the open-source software suite designed to detect exoplanet transits using high-performance and cloud computing resources (https://github.com/openEXO). Our collaboration continues to grow as we are developing algorithms and codes related to the detection of transit-like events, especially in Kepler data, Kepler 2.0 and TESS data when available. Extending the work of Berriman et al. (2010, 2012), we describe our use of the XSEDE-Gordon supercomputer and Amazon EMR cloud to search for aperiodic transit-like events in Kepler light curves. Such events may be caused by circumbinary planets or transiting bodies, either planets or stars, with orbital periods comparable to or longer than the observing baseline such that only one transit is observed. As a bonus, we use the same code to find stellar flares too; whereas transits reduce the flux in a box-shaped profile, flares increase the flux in a fast-rise, exponential-decay (FRED) profile that nevertheless can be detected reliably with a square-wave finder.
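
    The openEXO code itself is not reproduced here; as an illustration of the single-event box search described above, the sketch below slides a box-shaped dip template across a synthetic light curve and reports the most significant depth. The cadence, trial durations, and injected signal are assumptions.

```python
# Illustrative single-event box search: slide a box-shaped dip template
# across a light curve and report the deepest, most significant dip. The
# synthetic light curve and the duration grid are assumptions, not the
# project's pipeline.
import numpy as np

def box_search(time, flux, durations):
    """Return (t0, duration, depth, snr) of the deepest box-shaped dip."""
    sigma = np.std(np.diff(flux)) / np.sqrt(2.0)   # point-to-point noise
    best = (None, None, 0.0, 0.0)
    for dur in durations:
        half = dur / 2.0
        for t0 in time:
            in_box = np.abs(time - t0) < half
            if in_box.sum() < 3:
                continue
            depth = np.mean(flux[~in_box]) - np.mean(flux[in_box])
            snr = depth / (sigma / np.sqrt(in_box.sum()))
            if snr > best[3]:
                best = (t0, dur, depth, snr)
    return best

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 30.0, 1500)               # 30 days of data
    f = 1.0 + 2e-4 * rng.normal(size=t.size)
    f[(t > 17.2) & (t < 17.5)] -= 1e-3             # one injected transit
    t0, dur, depth, snr = box_search(t, f, durations=[0.2, 0.3, 0.4])
    print(f"event near t={t0:.2f} d, duration {dur:.1f} d, "
          f"depth {depth*1e3:.2f} ppt, SNR {snr:.1f}")
```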

  9. Open Source Software Reuse in the Airborne Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Khudikyan, S. E.; Hart, A. F.; Hardman, S.; Freeborn, D.; Davoodi, F.; Resneck, G.; Mattmann, C. A.; Crichton, D. J.

    2012-12-01

    Earth science airborne missions play an important role in helping humans understand our climate. A challenge for airborne campaigns in contrast to larger NASA missions is that their relatively modest budgets do not permit the ground-up development of data management tools. These smaller missions generally consist of scientists whose primary focus is on the algorithmic and scientific aspects of the mission, which often leaves data management software and systems to be addressed as an afterthought. The Airborne Cloud Computing Environment (ACCE), developed by the Jet Propulsion Laboratory (JPL) to support the Earth Science Airborne Program, is a reusable, multi-mission data system environment for NASA airborne missions. ACCE provides missions with a cloud-enabled platform for managing their data. The platform consists of a comprehensive set of robust data management capabilities that cover everything from data ingestion and archiving, to algorithmic processing, and to data delivery. Missions interact with this system programmatically as well as via browser-based user interfaces. The core components of ACCE are largely based on Apache Object Oriented Data Technology (OODT), an open source information integration framework at the Apache Software Foundation (ASF). Apache OODT is designed around a component-based architecture that allows for selective combination of components to create highly configurable data management systems. The diverse and growing community that currently contributes to Apache OODT fosters on-going growth and maturation of the software. ACCE's key objective is to reduce cost and risks associated with developing data management systems for airborne missions. Software reuse plays a prominent role in mitigating these problems. By providing a reusable platform based on open source software, ACCE enables airborne missions to allocate more resources to their scientific goals, thereby opening the doors to increased scientific discovery.

  10. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags carry the risk of misplacement or disappearance due to mischief or severe windstorms/thunderstorms, respectively. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on different-sized devices: smartphones, tablets, desktops and laptops.

  11. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle for adoption of cloud computing by healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should obtain a security level equivalent to the traditional computing model.
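
    For readers unfamiliar with the protocol, a minimal sketch of the OpenID Connect authorization-code token exchange such a service performs is shown below; the issuer URL, client credentials, and redirect URI are placeholders, and a production DI system would additionally validate the ID token signature against the provider's published keys.

```python
# Sketch of the OpenID Connect authorization-code token exchange performed
# after the user has authenticated. Issuer, client credentials, and redirect
# URI are placeholders; signature validation is deliberately omitted here.
import base64
import json
import requests

ISSUER = "https://openid.example.org"          # hypothetical OIDC provider
TOKEN_ENDPOINT = ISSUER + "/token"
CLIENT_ID = "di-viewer"
CLIENT_SECRET = "not-a-real-secret"
REDIRECT_URI = "https://di.example.org/callback"

def exchange_code(auth_code):
    """Exchange an authorization code for access and ID tokens."""
    resp = requests.post(TOKEN_ENDPOINT, data={
        "grant_type": "authorization_code",
        "code": auth_code,
        "redirect_uri": REDIRECT_URI,
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
    }, timeout=10)
    resp.raise_for_status()
    return resp.json()                          # access_token, id_token, ...

def id_token_claims(id_token):
    """Decode the ID token payload (signature check omitted in this sketch)."""
    payload = id_token.split(".")[1]
    payload += "=" * (-len(payload) % 4)        # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(payload))
```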

  12. Aspects of the quality of data from the Southern Great Plains (SGP) cloud and radiation testbed (CART) site broadband radiation sensors

    SciTech Connect

    Splitt, M.E.; Wesely, M.L.

    1996-04-01

    A systematic evaluation of the performance of broadband radiometers at the Cloud and Radiation Testbed (CART) site is needed to estimate the uncertainties of the irradiance observations. Here, net radiation observed with the net radiometer in the energy balance Bowen ratio (EBBR) station at the central facility is compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. In addition, data obtained from the central facility pyranometers, pyrgeometers, and pyrheliometers are examined for April 1994, when intensive operations periods were being carried out. The data used in this study are from central facility radiometers in a solar and infrared observation station, an EBBR station, the so-called "BSRN" set of upward-pointing radiometers, and a set of radiometers pointed down at the 25-m level of a 60-m tower.

  13. Modeling Aerosol-Cloud Interactions in Marine Open- and Closed-Cell Stratocumulus

    NASA Astrophysics Data System (ADS)

    Wang, H.; Feingold, G.

    2008-12-01

    Satellite imagery shows the recurrence of striking images of cellular structures exhibiting both closed- and open-cell patterns in marine stratocumulus fields. The open-cell region has much lower cloud albedo than closed cells. Aside from that, previous observational and modeling studies have suggested that open- and closed-cell regions are different in many other aspects, such as concentration of cloud condensation nuclei (CCN), cloud droplet number and size, precipitation efficiency, and cloud dynamics. In this work, aerosol-cloud interactions and dynamical feedbacks are investigated within a large eddy simulation (LES) modeling framework to study the activation, cloud scavenging, mixing and transport of CCN in the open- and closed-cell boundary layer and near the open/closed-cell boundaries. The model domain size of 120 km by 60 km is large enough to represent mesoscale organizations that are associated with different cellular structures and that are promoted by CCN perturbation from ship emissions. Simulation results show that depletion of CCN by collision and coalescence in clouds is critical to the formation of precipitation and open-cell structure in a stratocumulus deck. Once the open cellular structure has formed in the clean environment, a substantial increase in CCN transported from a neighboring polluted environment or from ship emissions does not close it during the 12-hour simulation due to the lack of dynamical and moisture support in the open-cell cloud-free region. However, the contaminated open cells are not able to self-sustain as a result of shutoff of precipitation. This points to the critical role of precipitation-triggered circulations in maintaining an open-cellular structure.

  14. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles for adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards, with the potential to become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the "Kerberos of cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model. PMID:27340682

  15. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  16. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing, storage power, accessibility, availability, etc. The only feasible solution lies in the web and cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public-private cloud running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon Web Services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), 3) user management. The web services are running on two VMs that are communicating over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  17. AutoGNC Testbed

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Vaughan, Andrew T.; Bayard, David S.; Riedel, Joseph E.; Balaram, J.

    2010-01-01

    A simulation testbed architecture was developed and implemented for the integration, test, and development of a TRL-6 flight software set called AutoGNC. The AutoGNC software will combine the TRL-9 Deep Impact AutoNAV flight software suite, the TRL-9 Virtual Machine Language (VML) executive, and the TRL-3 G-REX guidance, estimation, and control algorithms. The AutoGNC testbed was architected to provide software interface connections among the AutoNAV and VML flight code written in C, the G-REX algorithms in MATLAB and C, stand-alone image rendering algorithms in C, and other Fortran algorithms, such as the OBIRON landmark tracking suite. The testbed architecture incorporates software components for propagating a high-fidelity truth model of the environment and the spacecraft dynamics, along with the flight software components for onboard guidance, navigation, and control (GN&C). The interface allows for the rapid integration and testing of new algorithms prior to development of the C code for implementation in flight software. This testbed is designed to test autonomous spacecraft proximity operations around small celestial bodies, moons, or other spacecraft. The software is baselined for upcoming comet and asteroid sample return missions. This architecture and testbed will provide a direct improvement upon the onboard flight software utilized for missions such as Deep Impact, Stardust, and Deep Space 1.

  18. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.

  19. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  20. Use of AVHRR-derived spectral reflectances to estimate surface albedo across the Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Qiu, J.; Gao, W.

    1997-03-01

    Substantial variations in surface albedo across a large area cause difficulty in estimating regional net solar radiation and atmospheric absorption of shortwave radiation when only ground point measurements of surface albedo are used to represent the whole area. Information on spatial variations and site-wide averages of surface albedo, which vary with the underlying surface type and conditions and the solar zenith angle, is important for studies of clouds and atmospheric radiation over a large surface area. In this study, a bidirectional reflectance model was used to inversely retrieve surface properties such as leaf area index and then the bidirectional reflectance distribution was calculated by using the same radiation model. The albedo was calculated by converting the narrowband reflectance to broadband reflectance and then integrating over the upper hemisphere.
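
    For reference, the hemispheric integration step corresponds to the standard directional-hemispherical (black-sky) albedo; with f_r the modeled bidirectional reflectance distribution function, solar zenith angle theta_s, and view angles (theta_v, phi), the relation can be sketched as:

```latex
\alpha(\theta_s) \;=\; \int_{0}^{2\pi}\!\!\int_{0}^{\pi/2}
  f_r(\theta_s,\theta_v,\phi)\,\cos\theta_v\,\sin\theta_v\,
  d\theta_v\, d\phi
```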

  1. Point Cloud Visualization in an Open Source 3D Globe

    NASA Astrophysics Data System (ADS)

    De La Calle, M.; Gómez-Deck, D.; Koehler, O.; Pulido, F.

    2011-09-01

    In recent years the use of 3D applications in GIS has become more popular. Since the appearance of Google Earth, users have become familiar with 3D environments. At the same time, computers with 3D acceleration are now common, broadband access is widespread, and the amount of public information that can be used by GIS clients able to consume data from the Internet is constantly increasing. There are currently several libraries suitable for this kind of application. Based on these facts, and using libraries that are already developed and connected to our own developments, we are working on the implementation of a real 3D GIS with analysis capabilities. Since a 3D GIS such as this can be very interesting for tasks like rendering and analysing LiDAR or laser scanner point clouds, special attention is given to optimal handling of very large data sets. Glob3 will be a multidimensional GIS in which 3D point clouds can be explored and analysed, even if they consist of several million points. The latest addition to our visualization libraries is the development of a point cloud server that works regardless of the cloud's size. The server receives and processes requests from a 3D client (for example Glob3, but it could be any other, such as one based on WebGL) and delivers the data in the form of pre-processed tiles, depending on the required level of detail.
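
    As a hedged illustration of the tile-and-level-of-detail idea described above (not the Glob3 server code), the following Python sketch buckets a large point cloud into square tiles and returns a decimated copy of one tile for a requested level of detail. The tile layout, decimation rule and request format are assumptions.

      # Hedged sketch: serve a decimated tile of a large point cloud by level of detail.
      import random

      def build_tiles(points, tile_size=100.0):
          """Bucket (x, y, z) points into square tiles keyed by (ix, iy)."""
          tiles = {}
          for x, y, z in points:
              key = (int(x // tile_size), int(y // tile_size))
              tiles.setdefault(key, []).append((x, y, z))
          return tiles

      def get_tile(tiles, key, lod, max_points_full_detail=10000):
          """Return one tile's points, thinned more aggressively at coarser lod (lod=0 is coarsest)."""
          pts = tiles.get(key, [])
          budget = max(1, max_points_full_detail // (4 ** max(0, 3 - lod)))
          if len(pts) <= budget:
              return pts
          return random.sample(pts, budget)

      # Example: synthetic cloud, client asks for tile (2, 3) at lod 1.
      cloud = [(random.uniform(0, 1000), random.uniform(0, 1000), random.uniform(0, 50))
               for _ in range(200_000)]
      tiles = build_tiles(cloud)
      print(len(get_tile(tiles, (2, 3), lod=1)), "points returned")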

  2. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  3. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is making them a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was written for Amazon EC2, to allow connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC toward a new conceptual designation as interware, integrating different middleware. Users from different communities do not need to worry about the installation of the standard software that is available on the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.
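
    As a hedged sketch of the pluggable cloud-manager idea described above (not DIRAC's actual Virtual Machine engine), the following Python example defines one endpoint interface with two illustrative backends. Class and method names are hypothetical; real OCCI and CloudStack calls would replace the print stubs.

      # Hedged sketch: a pluggable "cloud endpoint" abstraction with two backends.
      from abc import ABC, abstractmethod

      class CloudEndpoint(ABC):
          @abstractmethod
          def start_instance(self, image, contextualization):
              """Boot one VM and return an opaque instance id."""

          @abstractmethod
          def stop_instance(self, instance_id):
              """Terminate a running VM."""

      class OcciOpenNebulaEndpoint(CloudEndpoint):
          def start_instance(self, image, contextualization):
              # A real implementation would POST an OCCI compute description,
              # e.g. a CernVM image plus contextualization data.
              print(f"OCCI: create compute from {image} with {contextualization}")
              return "one-42"

          def stop_instance(self, instance_id):
              print(f"OCCI: delete compute {instance_id}")

      class CloudStackEndpoint(CloudEndpoint):
          def start_instance(self, image, contextualization):
              # A real implementation would call the CloudStack deployVirtualMachine
              # API with userdata carrying the contextualization.
              print(f"CloudStack: deployVirtualMachine template={image}")
              return "cs-7"

          def stop_instance(self, instance_id):
              print(f"CloudStack: destroyVirtualMachine id={instance_id}")

      for endpoint in (OcciOpenNebulaEndpoint(), CloudStackEndpoint()):
          vm = endpoint.start_instance("cernvm-2.6", {"cvmfs": "enabled"})
          endpoint.stop_instance(vm)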

  4. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

    The continuation of the EOSDIS testbed ('Testbed') has evolved from a multi-task, X-Windows-driven system into a fully functional stand-alone data archive and distribution center accessible to all types of users and computers via the World Wide Web. Throughout the past months, the Testbed has evolved into a completely new system. The current system is now accessible through Netscape, Mosaic, and all other clients that can reach the World Wide Web. On October 1, 1995 we will open to the public and we expect that the statistics of the type of user, where they are located, and what they are looking for will drastically change. The most important change to the Testbed has been the Web interface. This interface will allow more users access to the system and walk them through the data types with more ease than before. All of the callbacks are written in such a way that icons can be used to easily move around in the program's interface. The homepage offers the user the opportunity to get more information about each satellite data type and also information on free programs. These programs are grouped into categories by the types of computers that the programs are compiled for, along with information on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which will take them to the world map and allow them to select an area that they would like coverage on by simply clicking that area of the map. This creates a new map where other similar choices can be made to get the latitude and longitude of the region the satellite data will cover. Once a selection has been made the search parameters page will appear to be filled out. Afterwards, the browse image will be called for once the search is completed and the images for viewing can be selected. There are several other option pages

  5. Establishment of an NWP testbed using ARM data

    SciTech Connect

    O'Connor, E.; Liu, Y.; Hogan, R.

    2010-03-15

    The aim of the FAst-physics System TEstbed and Research (FASTER) project is to evaluate and improve the parameterizations of fast physics (involving clouds, precipitation, aerosol) in numerical models using ARM measurements. One objective within FASTER is to evaluate model representations of fast physics with long-term continuous cloud observations by use of an 'NWP testbed'. This approach was successful in the European Cloudnet project. NWP model data (NCEP, ECMWF, etc.) is routinely output at ARM sites, and model evaluation can potentially be achieved in quasi-real time. In this poster, we will outline our progress in the development of the NWP testbed and discuss the successful integration of ARM algorithms, such as ARSCL, with algorithms and lessons learned from Cloudnet. Preliminary results will be presented of the evaluation of the ECMWF, NCEP, and UK Met Office models over the SGP site using this approach.
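
    As a hedged illustration of the model-versus-observation comparison such an NWP testbed performs, the following Python sketch scores a model cloud-fraction time series against an ARSCL-like observed series on a common time grid. The threshold, the skill metrics and the synthetic data are illustrative assumptions, not the FASTER evaluation code.

      # Hedged sketch: simple contingency-table scores for model vs. observed cloud fraction.
      import numpy as np

      def contingency_scores(model_cf, obs_cf, threshold=0.05):
          model_cloudy = model_cf >= threshold
          obs_cloudy = obs_cf >= threshold
          hits = np.sum(model_cloudy & obs_cloudy)
          misses = np.sum(~model_cloudy & obs_cloudy)
          false_alarms = np.sum(model_cloudy & ~obs_cloudy)
          pod = hits / max(hits + misses, 1)                 # probability of detection
          far = false_alarms / max(hits + false_alarms, 1)   # false alarm ratio
          bias = np.mean(model_cf - obs_cf)                  # mean cloud-fraction bias
          return pod, far, bias

      hours = 24 * 30
      rng = np.random.default_rng(0)
      obs = np.clip(rng.normal(0.3, 0.25, hours), 0, 1)           # "observed" cloud fraction
      model = np.clip(obs + rng.normal(0.05, 0.15, hours), 0, 1)  # "model" with a slight bias
      pod, far, bias = contingency_scores(model, obs)
      print(f"POD={pod:.2f}  FAR={far:.2f}  mean bias={bias:+.3f}")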

  6. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.

  7. Testbed for LISA Photodetectors

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe; Livas, Jeffrey; Silverberg, Robert

    2009-01-01

    The Laser Interferometer Space Antenna (LISA) is a gravitational wave observatory consisting of three spacecraft separated by 5 million km in an equilateral triangle whose center follows the Earth in orbit around the Sun but offset in orbital phase by 20 degrees. LISA is designed to observe sources in the frequency range of 0.1 mHz-100 mHz by measuring fluctuations of the inter-spacecraft separation with laser interferometry. Quadrant photodetectors are used to measure both separation and angular orientation. Noise level, phase and amplitude inhomogeneities of the semiconductor response, and channel cross-talk between quadrant cells need to be assessed in order to ensure the 10 pm/Square root(Hz) sensitivity required for the interferometric length measurement in LISA. To this end, we are currently developing a testbed that allows us to evaluate photodetectors to the sensitivity levels required for LISA. A detailed description of the testbed and preliminary results will be presented.

  8. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  9. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS.

  10. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  11. Telescience testbed pilot program

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1988-01-01

    The Universities Space Research Association (USRA), under sponsorship from the NASA Office of Space Science and Applications, is conducting a Telescience Testbed Pilot Program. Fifteen universities, under subcontract to USRA, are conducting a variety of scientific experiments using advanced technology to determine the requirements and evaluate the tradeoffs for the information system of the Space Station era. An interim set of recommendations based on the experiences of the first six months of the pilot program is presented.

  12. Enabling Open Cloud Markets Through WS-Agreement Extensions

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn

    Research into computing resource markets has mainly considered the question of which market mechanisms provide a fair resource allocation. However, while developing such markets, the definition of the unit of trade (i.e. the definition of a resource) has not been given much attention. In this paper, we analyze the requirements for tradable resource goods. Based on the results, we suggest a detailed goods definition, which is easy to understand, can be used with many market mechanisms, and addresses the needs of a Cloud resource market. The goods definition captures the complete system resource, including hardware specifications, software specifications, the terms of use, and a pricing function. To demonstrate the usefulness of such a standardized goods definition, we present its application in the form of a WS-Agreement template for a number of market mechanisms for commodity system resources.
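
    As a hedged sketch of what such a goods definition might look like in code (field names and the pricing form are assumptions, not the paper's WS-Agreement schema), the following Python example captures hardware, software, terms of use and a pricing function in one structured record that could later be serialized into an agreement template.

      # Hedged sketch: a structured record for a tradable Cloud resource good.
      from dataclasses import dataclass
      from typing import Callable, Dict

      @dataclass
      class CloudGood:
          hardware: Dict[str, str]          # e.g. cores, RAM, disk, network
          software: Dict[str, str]          # e.g. OS image, pre-installed stack
          terms_of_use: Dict[str, str]      # e.g. availability, liability, duration
          price: Callable[[float], float]   # price as a function of usage hours

      standard_vm = CloudGood(
          hardware={"cores": "4", "ram": "8 GiB", "disk": "100 GiB"},
          software={"os": "Linux", "stack": "Hadoop"},
          terms_of_use={"availability": "99.5%", "max_duration": "720 h"},
          price=lambda hours: 0.12 * hours,   # flat hourly rate (illustrative)
      )
      print("price for 100 h:", standard_vm.price(100.0))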

  13. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor and modification of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  14. Building a Parallel Cloud Storage System using OpenStack’s Swift Object Store and Transformative Parallel I/O

    SciTech Connect

    Burns, Andrew J.; Lora, Kaleb D.; Martinez, Esteban; Shorter, Martel L.

    2012-07-30

    Our project consists of bleeding-edge research into replacing traditional storage archives with a parallel, cloud-based storage solution. It used OpenStack's Swift Object Store cloud software and benchmarked Swift for write speed and scalability. Our project is unique because Swift is typically used for reads and we are mostly concerned with write speeds. Cloud storage is a viable archive solution because: (1) container management for larger parallel archives might ease the migration workload; (2) many tools that are written for cloud storage could be utilized for a local archive; and (3) current large-scale cloud storage practices in industry could be utilized to manage a scalable archive solution.
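
    As a hedged sketch of a write benchmark of this kind (not the project's own harness), the following Python example times repeated object uploads to a Swift cluster using the python-swiftclient library; the auth URL, credentials, container name and object sizes are placeholders.

      # Hedged sketch: aggregate write-throughput test against a Swift object store.
      import os, time
      from swiftclient import client as swift

      conn = swift.Connection(authurl="http://swift.example.org:8080/auth/v1.0",
                              user="test:tester", key="testing", auth_version="1")
      conn.put_container("write-bench")

      object_size = 64 * 1024 * 1024          # 64 MiB per object (placeholder)
      payload = os.urandom(object_size)
      n_objects = 16

      start = time.time()
      for i in range(n_objects):
          conn.put_object("write-bench", f"obj-{i:04d}", contents=payload)
      elapsed = time.time() - start

      mib_written = n_objects * object_size / (1024 * 1024)
      print(f"wrote {mib_written:.0f} MiB in {elapsed:.1f} s "
            f"({mib_written / elapsed:.1f} MiB/s aggregate)")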

  15. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schrenkenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  16. LISA Optical Bench Testbed

    NASA Astrophysics Data System (ADS)

    Lieser, M.; d'Arcio, L.; Barke, S.; Bogenstahl, J.; Diekmann, C.; Diepholz, I.; Fitzsimons, E. D.; Gerberding, O.; Henning, J.-S.; Hewitson, M.; Hey, F. G.; Hogenhuis, H.; Killow, C. J.; Lucarelli, S.; Nikolov, S.; Perreur-Lloyd, M.; Pijnenburg, J.; Robertson, D. I.; Sohmer, A.; Taylor, A.; Tröbs, M.; Ward, H.; Weise, D.; Heinzel, G.; Danzmann, K.

    2013-01-01

    The optical bench (OB) is a part of the LISA spacecraft, situated between the telescope and the test mass. For measuring the inter-spacecraft distances there are several interferometers on the OB. The elegant breadboard of the OB for LISA is being developed for the European Space Agency (ESA) by EADS Astrium, TNO Science & Industry, the University of Glasgow and the Albert Einstein Institute (AEI); the performance tests will then be done at the AEI. Here we present the testbed that will be used for the performance tests, with a focus on the thermal environment and the laser infrastructure.

  17. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

    Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.
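
    As a hedged sketch of an elasticity strategy of the kind described (not the site's actual implementation), the following Python example scales a small application's worker count with its job queue while never shrinking below a floor. The thresholds and the launch/terminate hooks, which would call OpenNebula in practice, are placeholder assumptions.

      # Hedged sketch: one pass of a simple scale-up / scale-down control loop.
      def launch_worker():      # placeholder for an OpenNebula instantiate call
          print("launching one worker VM")

      def terminate_worker():   # placeholder for an OpenNebula terminate call
          print("terminating one idle worker VM")

      def control_step(queued_jobs, running_workers, idle_workers,
                       jobs_per_worker=4, min_workers=0, max_workers=20):
          # Target enough workers to drain the queue, capped by the site limits.
          wanted = min(max_workers, max(min_workers, -(-queued_jobs // jobs_per_worker)))
          if wanted > running_workers:
              for _ in range(wanted - running_workers):
                  launch_worker()
          elif wanted < running_workers and idle_workers > 0:
              # Only release workers that are actually idle.
              for _ in range(min(idle_workers, running_workers - wanted)):
                  terminate_worker()

      # Example snapshot of the batch system state.
      control_step(queued_jobs=13, running_workers=2, idle_workers=0)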

  18. Single link flexible beam testbed project. Thesis

    NASA Technical Reports Server (NTRS)

    Hughes, Declan

    1992-01-01

    This thesis describes the single link flexible beam testbed at the CLaMS laboratory in terms of its hardware, software, and linear model, and presents two controllers, each including a hub angle proportional-derivative (PD) feedback compensator and one augmented by a second static gain full state feedback loop, based upon a synthesized strictly positive real (SPR) output, that increases specific flexible mode pole damping ratios w.r.t. the PD-only case and hence reduces unwanted residual oscillation effects. Restricting full state feedback gains so as to produce an SPR open loop transfer function ensures that the associated compensator has an infinite gain margin and a phase margin of at least (-90, 90) degrees. Both experimental and simulation data are evaluated in order to compare the performance of different observers when applied to the real testbed and to the linear model when uncompensated flexible modes are included.

  19. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).

  20. PROOF on the Cloud for ALICE using PoD and OpenNebula

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Bagnasco, S.; Brunetti, R.; Lusso, S.

    2012-06-01

    In order to optimize the use and management of computing centres, their conversion to cloud facilities is becoming increasingly popular. In a medium to large cloud facility, many different virtual clusters may compete for the same resources: unused resources can be freed either by turning off idle virtual machines, or by lowering resources assigned to a virtual machine at runtime. PROOF, a ROOT-based parallel and interactive analysis framework, is officially endorsed in the computing model of the ALICE experiment as complementary to the Grid, and it has become very popular over the last three years. The locality of PROOF-based analysis facilities forces system administrators to scavenge resources, yet the chaotic nature of user analysis tasks leaves them unstable and inconsistently used, making PROOF a typical use case for HPC cloud computing. Currently, PoD dynamically and easily provides a PROOF-enabled cluster by submitting agents to a job scheduler. Unfortunately, a Tier-2 does not comfortably share the same queue between interactive and batch jobs, due to the very large average time to completion of the latter: an elastic cloud approach would enable interactive virtual machines to temporarily take resources from the batch ones, without a noticeable impact on them. In this work we describe our setup of a dynamic PROOF-based cloud analysis facility based on PoD and OpenNebula, orchestrated by a simple and lightweight control daemon that makes virtualization transparent for the user.

  1. UAV-Based Photogrammetric Point Clouds - Tree Stem Mapping in Open Stands in Comparison to Terrestrial Laser Scanner Point Clouds

    NASA Astrophysics Data System (ADS)

    Fritz, A.; Kattenborn, T.; Koch, B.

    2013-08-01

    In both ecology and forestry, there is a high demand for structural information of forest stands. Forest structures, due to their heterogeneity and density, are often difficult to assess. Hence, a variety of technologies are being applied to account for this "difficult to come by" information. Common techniques are aerial images or ground- and airborne-Lidar. In the present study we evaluate the potential use of unmanned aerial vehicles (UAVs) as a platform for tree stem detection in open stands. A flight campaign over a test site near Freiburg, Germany covering a target area of 120 × 75 [m2] was conducted. The dominant tree species of the site is oak (Quercus robur) with almost no understory growth. Over 1000 images with a tilt angle of 45° were shot. The flight pattern applied consisted of two antipodal staggered flight routes at a height of 55 [m] above the ground. We used a Panasonic G3 consumer camera equipped with a 14-42 [mm] standard lens and a 16.6 megapixel sensor. The data collection took place in leaf-off state in April 2013. The area was prepared with artificial ground control points for transformation of the structure-from-motion (SFM) point cloud into real world coordinates. After processing, the results were compared with a terrestrial laser scanner (TLS) point cloud of the same area. In the 0.9 [ha] test area, 102 individual trees above 7 [cm] diameter at breast height were located in the TLS cloud. We chose the software CMVS/PMVS-2 since its algorithms are developed with a focus on dense reconstruction. The processing chain for the UAV-acquired images consists of six steps: a. cleaning the data: removing blurry, under- or overexposed and off-site images; b. applying the SIFT operator [Lowe, 2004]; c. image matching; d. bundle adjustment; e. clustering; and f. dense reconstruction. In total, 73 stems were considered as reconstructed and located within one meter of the reference trees. In general, stems were far less accurate and complete as
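
    As a hedged sketch of the six-step processing chain listed above (not the authors' scripts), the following Python example drives the chain as a small pipeline. The image-cleaning rule is a crude placeholder, and in a real run each later step would shell out to the feature-matching, bundle-adjustment and CMVS/PMVS-2 tools.

      # Hedged sketch: a pipeline driver for the a-f processing chain.
      from pathlib import Path

      def clean_images(image_dir: Path, min_size_bytes=200_000):
          # Step a: drop obviously bad frames (tiny files stand in here for
          # blurry, under- or overexposed, and off-site images).
          return [p for p in sorted(image_dir.glob("*.jpg"))
                  if p.stat().st_size >= min_size_bytes]

      def run_pipeline(image_dir: Path):
          images = clean_images(image_dir)                   # a. cleaning
          steps = ["extract SIFT features",                  # b. SIFT operator
                   "match image pairs",                      # c. image matching
                   "bundle adjustment",                       # d. sparse reconstruction
                   "cluster views (CMVS)",                    # e. clustering
                   "dense reconstruction (PMVS-2)"]           # f. dense point cloud
          print(f"{len(images)} images retained after cleaning")
          for step in steps:
              print("running:", step)   # each step would invoke the external tool

      run_pipeline(Path("./uav_images"))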

  2. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

    The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation of flight test to a ground test unit with contaminant dosing to determine applicability of ground testing.

  3. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My main two duties were to develop and animate models for use in the HRL environments and work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for these tools so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensor will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Toward the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  4. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation (see diagram) include: a) Boot Mode, which is initiated at application of power to the processor card and runs memory diagnostics; it may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode, which allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode, a predefined, minimum power safehold configuration with power to experiments removed and carrier functionality minimized; it is used to troubleshoot problems that occur during flight. d) Operations Mode, which is used for normal experiment carrier operations; it may be entered only via ground command from Safe Mode.
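
    As a hedged illustration of the mode logic described above, the following Python sketch encodes it as a small state machine. The transition table is loosely inferred from the abstract and is an assumption, not the flight software's actual transition rules.

      # Hedged sketch: SET-1 operating modes as a simple state machine.
      ALLOWED = {
          "BOOT": {"MAINTENANCE", "SAFE"},
          "MAINTENANCE": {"SAFE"},
          "SAFE": {"OPERATIONS", "MAINTENANCE"},
          "OPERATIONS": {"SAFE"},   # faults or ground command fall back to Safe
      }

      class SetController:
          def __init__(self):
              self.mode = "BOOT"    # power-on entry point, runs memory diagnostics

          def command(self, target):
              if target in ALLOWED[self.mode]:
                  print(f"{self.mode} -> {target}")
                  self.mode = target
              else:
                  print(f"rejected: {self.mode} -> {target}")

      ctl = SetController()
      for target in ("SAFE", "OPERATIONS", "MAINTENANCE"):
          ctl.command(target)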

  5. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

    The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis,Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  6. Optical Network Testbeds Workshop

    SciTech Connect

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  7. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  8. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type to quantify and characterize geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities ubiquitously available that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing the reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  9. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to advance in exploring space frontiers, technology must also advance. Faster data recovery and data processing are crucial, and the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification are required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs are to test anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high speed and high density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4Mbit memory chips totaling 4Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. This series of high-altitude flights, up to 70,000 feet, was effective at yielding the conditions in which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective; orders of magnitude more memory were needed. Therefore, a project which could test more memory within a given time was created. The goal of this project was not only to test more memory within a given time, but also to have a system with faster processing speed that used fewer peripherals. This paper will describe procedures used to build an
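
    As a hedged sketch of the kind of test loop described above (not CARR's flight software), the following Python example fills a buffer with a known pattern, periodically re-reads it, and logs any bit flips. An ordinary byte array stands in for the SRAM under test, and one bit is flipped artificially so the loop has something to report.

      # Hedged sketch: pattern-scrub loop for detecting single-event upsets.
      import time

      PATTERN = 0x55                                        # alternating-bit test pattern
      memory = bytearray([PATTERN]) * (1024 * 1024)         # 1 MB stands in for the SRAM
      memory[123456] ^= 0x04                                # simulate one upset bit

      def scrub(mem, pattern=PATTERN):
          """Return (address, value) for every mismatching byte, rewriting the pattern."""
          errors = []
          for addr, value in enumerate(mem):
              if value != pattern:
                  errors.append((addr, value))
                  mem[addr] = pattern                       # later passes start clean
          return errors

      for cycle in range(3):
          for addr, value in scrub(memory):
              print(f"cycle {cycle}: bit error at 0x{addr:06X}, read 0x{value:02X}")
          time.sleep(0.1)                                   # a real system paces its passes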

  10. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  11. Near infrared testbed sensor

    NASA Astrophysics Data System (ADS)

    Sanderson, R. B.; McCalmont, J. F.; Montgomery, J. B.; Johnson, R. S.; McDermott, D. J.

    2007-04-01

    A new tactical airborne multicolor missile warning testbed was developed and fielded as part of an Air Force Research Laboratory (AFRL) initiative focusing on clutter and missile signature measurements for algorithm development. Multicolor discrimination is one of the most effective ways of improving the performance of infrared missile warning sensors, particularly for heavy clutter situations. Its utility has been demonstrated in multiple fielded sensors. Traditionally, multicolor discrimination has been performed in the mid-infrared, 3-5 μm band, where the molecular emission of CO and CO2 characteristic of a combustion process is readily distinguished from the continuum of a black body radiator. Current infrared warning sensor development is focused on near infrared (NIR) staring mosaic detector arrays that provide similar spectral discrimination in different bands to provide a cost effective and mechanically simpler system. This, in turn, has required that multicolor clutter data be collected for both analysis and algorithm development. The developed sensor testbed is a multi-camera system with 1004x1004 FPAs coupled with optimized filters integrated with the optics. The collection portion includes a ruggedized field-programmable gate array processor coupled with an integrated controller/tracker and a fast disk array capable of real-time processing and collection of up to 60 full frames per second. This configuration allowed the collection and real-time processing of temporally correlated, radiometrically calibrated data in multiple spectral bands that was then compared to background and target imagery taken previously.

  12. Experimental demonstration of an OpenFlow based software-defined optical network employing packet, fixed and flexible DWDM grid technologies on an international multi-domain testbed.

    PubMed

    Channegowda, M; Nejabati, R; Rashidi Fard, M; Peng, S; Amaya, N; Zervas, G; Simeonidou, D; Vilalta, R; Casellas, R; Martínez, R; Muñoz, R; Liu, L; Tsuritani, T; Morita, I; Autenrieth, A; Elbers, J P; Kostecki, P; Kaczmarek, P

    2013-03-11

    Software defined networking (SDN) and flexible grid optical transport technology are two key technologies that allow network operators to customize their infrastructure based on application requirements, thereby minimizing the extra capital and operational costs required for hosting new applications. In this paper, for the first time we report on the design, implementation and demonstration of a novel OpenFlow based SDN unified control plane allowing seamless operation across heterogeneous state-of-the-art optical and packet transport domains. We verify and experimentally evaluate OpenFlow protocol extensions for flexible DWDM grid transport technology along with its integration with fixed DWDM grid and layer-2 packet switching. PMID:23482120

  13. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  14. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT (Computational Tomography) or MRI (Magnetic Resonance Imaging) guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled 'Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification' is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  15. Comparison of the cloud activation potential of open ocean and coastal aerosol in the Pacific Ocean

    NASA Astrophysics Data System (ADS)

    Vidaurre, G.; Brooks, S. D.; Thornton, D. C.

    2010-12-01

    Continuous measurements of aerosol concentration, particle size distribution, and cloud activation potential between 0.15 and 1.2% supersaturation were performed for open ocean and coastal air during the Halocarbon Air Sea Transect - Pacific (HalocAST) campaign. The nearly 7000 mile transect, aboard the R/V Thomas G. Thompson, started in Punta Arenas, Chile and ended in Seattle, Washington. Air mass source regions were identified on the basis of air mass back trajectories. For air masses in the southern hemisphere, aerosols sampled over the open ocean acted as cloud condensation nuclei at supersaturations between 0.5 and 1%, while coastal aerosols required higher supersaturations. In the pristine open ocean, observed aerosol concentrations were very low, typically below 200 cm-3, with an average particle diameter of approximately 0.4 μm. On the other hand, coastal aerosol concentrations were above 1000 cm-3 with an average particle diameter of 0.7 μm. Air masses originating in the northern hemisphere had much higher aerosol loads, between 500 and 2000 cm-3 over the ocean and above 4000 cm-3 at the coast. In both cases, the average particle diameters were approximately 0.5 μm. Measurements suggest that the northern hemisphere, substantially more polluted than the southern hemisphere, is characterized by alternating regions of high and medium aerosol number concentration. In addition, measurements of microorganism and organic matter concentration in the surface layer of the ocean water were conducted along the cruise track, to test the hypothesis that biogenic aerosol containing marine organic matter contribute to cloud activation potential. There was a significant correlation between mean aerosol diameter and prokaryote concentration in surface waters (r = 0.585, p < 0.01, n = 24), and between critical supersaturation and prokaryote concentration in surface waters (r = 0.538, p < 0.01, n = 24). This correlation indicates that larger aerosols occurred over water

  16. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making an AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
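
    As a hedged sketch of the VM-launch step in such a setup (not the site's actual scripts), the following Python example boots one worker with the openstacksdk client and then hands it off to the batch system. The cloud name, image, flavor and network identifiers are placeholders, and the final registration command is a stand-in for the site's Torque extension and Puppet configuration.

      # Hedged sketch: launch a worker VM on OpenStack and register it with the batch system.
      import subprocess
      import openstack

      conn = openstack.connect(cloud="research-cloud")     # reads clouds.yaml (placeholder name)

      server = conn.compute.create_server(
          name="t2-worker-001",
          image_id="<scientific-linux-image-id>",           # placeholder
          flavor_id="<4-core-flavor-id>",                    # placeholder
          networks=[{"uuid": "<tenant-network-id>"}],        # placeholder
      )
      server = conn.compute.wait_for_server(server)
      ip = server.access_ipv4 or "unknown"
      print("worker booted with IP", ip)

      # Placeholder for the site-specific step that adds the node to the dynamic
      # Torque cluster and mounts the remote file systems.
      subprocess.run(["echo", f"register node {ip} with Torque"], check=True)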

  17. The ac power system testbed

    NASA Technical Reports Server (NTRS)

    Mildice, J.; Sundberg, R.

    1987-01-01

    The object of this program was to design, build, test, and deliver a high frequency (20 kHz) Power System Testbed which would electrically approximate a single, separable power channel of an IOC Space Station. That program is described, including the technical background, and the results are discussed showing that the major assumptions about the characteristics of this class of hardware (size, mass, efficiency, control, etc.) were substantially correct. This testbed equipment was completed and delivered and is being operated as part of the Space Station Power System Test Facility.

  18. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  19. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
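
    As a hedged illustration of the kind of spatial query such a PostgreSQL/PostGIS stack supports (not the actual OBIS schema), the following Python sketch counts species records inside a bounding box; the table and column names are assumptions.

      # Hedged sketch: top species by record count within a lon/lat bounding box.
      import psycopg2

      conn = psycopg2.connect(dbname="obis", user="reader", host="localhost")
      cur = conn.cursor()
      cur.execute(
          """
          SELECT scientific_name, COUNT(*) AS n_records
          FROM occurrences
          WHERE ST_Within(geom, ST_MakeEnvelope(%s, %s, %s, %s, 4326))
          GROUP BY scientific_name
          ORDER BY n_records DESC
          LIMIT 10;
          """,
          (-70.0, 30.0, -60.0, 45.0),   # xmin, ymin, xmax, ymax in WGS84
      )
      for name, count in cur.fetchall():
          print(f"{name}: {count}")
      cur.close()
      conn.close()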

  20. High-contrast imaging testbed

    SciTech Connect

    Baker, K; Silva, D; Poyneer, L; Macintosh, B; Bauman, B; Palmer, D; Remington, T; Delgadillo-Lariz, M

    2008-01-23

    Several high-contrast imaging systems are currently under construction to enable the detection of extra-solar planets. In order for these systems to achieve their objectives, however, there is considerable developmental work and testing which must take place. Given the need to perform these tests, a spatially-filtered Shack-Hartmann adaptive optics system has been assembled to evaluate new algorithms and hardware configurations which will be implemented in these future high-contrast imaging systems. In this article, construction and phase measurements of a membrane 'woofer' mirror are presented. In addition, results from closed-loop operation of the assembled testbed with static phase plates are presented. The testbed is currently being upgraded to enable operation at speeds approaching 500 Hz and to enable studies of the interactions between the woofer and tweeter deformable mirrors.

  1. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  2. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  3. Optical Coating Thermal Noise Testbed

    NASA Astrophysics Data System (ADS)

    Hartman, Michael T.; Eichholz, Johannes; Tanner, David B.; Mueller, Guido

    2015-04-01

    Interferometric gravitational-wave detectors measure the length strain of a passing gravitational-wave as differential arm length changes in kilometer-long Michelson interferometers. The second-generation detectors, such as Advanced LIGO (aLIGO), will achieve strain sensitivities which are limited by Brownian thermal noise in the optical coatings of the interferometers' arm-cavity mirror test masses. Brownian coating thermal noise (CTN) is the apparent motion on the mirror surface on the order of 10^-17 to 10^-20 m resulting from thermal fluctuations in the coating and the coating's internal friction. The result is a source of length noise in optical resonators that is a function of the coating temperature and the coating material's mechanical loss. At the University of Florida we are constructing the THermal noise Optical Resonator (THOR), a testbed for the direct measurement of CTN in the aLIGO test mass coating as well as future coating candidates. The material properties of the coating (namely mechanical loss) are temperature dependent, making cryogenic mirrors a prospect for future gravitational-wave detectors. To explore this option we are simultaneously building a cryogenic CTN testbed, CryoTHOR. This is a presentation on the status of these testbeds. This work is supported by NSF Grants PHY-0969935 and PHY-1306594.

  4. Quantum well earth science testbed

    NASA Astrophysics Data System (ADS)

    Johnson, William R.; Hook, Simon J.; Mouroulis, Pantazis; Wilson, Daniel W.; Gunapala, Sarath D.; Hill, Cory J.; Mumolo, Jason M.; Eng, Bjorn T.

    2009-11-01

    A thermal hyperspectral imager is under development which utilizes the compact Dyson optical configuration and the broadband (8-12 μm) quantum well infrared photodetector (QWIP) focal plane array technology. The Dyson configuration uses a single monolithic prism-like grating design which allows for a high throughput instrument (F/1.6) with minimal ghosting and stray light, and a large swath width. The configuration has the potential to be the optimal high resolution imaging spectroscopy solution for aerial and space remote sensing applications due to its small form factor and relatively low power requirements. The planned instrument specifications are discussed as well as thermal design trade-offs. The current design uses a single high power cryocooler which allows operation of the QWIP at 40 K with adequate temperature stability. Calibration testing results (noise equivalent temperature difference, spectral linearity and spectral bandwidth) and laboratory emissivity plots from samples are shown using an operational testbed unit which has similar specifications to the final airborne system. Field testing of the testbed unit was performed to acquire plots of emissivity for various known standard minerals (quartz, opal, alunite). A comparison is made using data from the ASTER spectral library. The current single band (8-9 μm) testbed utilizes the high uniformity and operability of the QWIP array and shows excellent laboratory and field spectroscopic results.

  5. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, the future technologies of wavefront sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as fast steering mirrors, deformable mirrors, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integration and tests.

  6. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis, and is intended to be used in conjunction with the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  7. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user community focused with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently, the theme for ISGC2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title - while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Secondly, the programme - ISGC has always included topical workshops and tutorials. But 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF that highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  9. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long baseline neutrino experiments, like the NOvA experiment at Fermilab, require large scale, compute intensive simulations of their neutrino beam fluxes and backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  10. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance inflight medical care and medical education, NASA has established and conducted several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence the delivery of health care both in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. As NASA continues to meet the challenges of space flight, the

  11. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent with the changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge which could be key in validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean in comparison with the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not been previously presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was the increased heat fluxes and induced turbulence from the warm ocean surface, as expected. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  12. Control design for the SERC experimental testbeds

    NASA Technical Reports Server (NTRS)

    Jacques, Robert; Blackwood, Gary; Macmartin, Douglas G.; How, Jonathan; Anderson, Eric

    1992-01-01

    Viewgraphs on control design for the Space Engineering Research Center experimental testbeds are presented. Topics covered include: SISO control design and results; sensor and actuator location; model identification; control design; experimental results; preliminary LAC experimental results; active vibration isolation problem statement; base flexibility coupling into isolation feedback loop; cantilever beam testbed; and closed loop results.

  13. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  14. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  15. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  16. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET) to fly on the International Space Station to test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  17. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous works we have defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note, we will further develop that application and design the details of the implementation of such a model. First, we present the environment of our application by describing the thermal management problem and an abstraction, which was called TESTBED, that includes a specific function for each module in the architecture, and the nature of the interfaces between each pair of blocks.

  18. Experiments Program for NASA's Space Communications Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard

    2012-01-01

    NASA developed a testbed for communications and navigation that was launched to the International Space Station in 2012. The testbed promotes new software defined radio (SDR) technologies and addresses associated operational concepts for space-based SDRs, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. The experiments program consists of a mix of in-house and external experiments from partners in industry, academia, and government. The experiments will investigate key challenges in communications, networking, and global positioning system navigation both on the ground and on orbit. This presentation will discuss some of the key opportunities and challenges for the testbed experiments program.

  19. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data, higher order data products, as well as user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high performance disk storage (SSD) for the hot areas and less expensive slower disk for the cold ones, thereby optimizing price to performance. From a compute perspective, OT is looking at cloud based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute intensive workloads like parallel computation of hydrologic routing on high resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
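
    The hot/cold tiering idea described above can be illustrated with a toy classifier that ranks dataset tiles by access count and assigns the most frequently accessed fraction to fast storage. The tile identifiers and the 10% cutoff below are invented for the example and are not OT's actual policy.

```python
# Toy hot/cold tiering from an access log: the most frequently accessed
# fraction of tiles goes to fast (SSD) storage, the rest to slower disk.
# Tile names and the hot fraction are illustrative assumptions.
import math
from collections import Counter

def tier_tiles(access_log, hot_fraction=0.1):
    """access_log: iterable of tile ids, one entry per access."""
    counts = Counter(access_log)
    ranked = [tile for tile, _ in counts.most_common()]
    n_hot = max(1, math.ceil(hot_fraction * len(ranked)))
    return set(ranked[:n_hot]), set(ranked[n_hot:])

if __name__ == "__main__":
    log = ["socal_lidar"] * 40 + ["nz_alps_lidar"] * 15 + ["misc_tile"] * 2
    hot, cold = tier_tiles(log)
    print("hot:", sorted(hot))
    print("cold:", sorted(cold))
```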

  20. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994, M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30 year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others. Models for attenuation and structural damage

  1. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    SciTech Connect

    Sun, C.H.; Thorne, L.R.

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the "local" turbulence is small compared with advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
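
    Stated as a worked relation (a paraphrase of the frozen-turbulence idea summarized above, with U denoting the mean advective wind speed), Taylor's hypothesis lets a temporal lag stand in for an along-wind spatial separation:

```latex
% Frozen-turbulence reading of Taylor's hypothesis: advection by the mean
% wind U maps a time lag \tau onto a spatial separation r = U\tau.
\phi(x,\,t+\tau) \approx \phi(x - U\tau,\,t)
\quad\Longrightarrow\quad
C_x(r) \approx C_t\!\left(\tau = r/U\right)
```

    so the spatial covariance C_x of a cloud field can be estimated from the temporal covariance C_t of the zenith time series whenever the local turbulence is weak relative to U.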

  2. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  3. A Reconfigurable Testbed Environment for Spacecraft Autonomy

    NASA Technical Reports Server (NTRS)

    Biesiadecki, Jeffrey; Jain, Abhinandan

    1996-01-01

    A key goal of NASA's New Millennium Program is the development of technology for increased spacecraft on-board autonomy. Achievement of this objective requires the development of a new class of ground-based autonomy testbeds that can enable the low-cost and rapid design, test, and integration of the spacecraft autonomy software. This paper describes the development of an Autonomy Testbed Environment (ATBE) for the NMP Deep Space I comet/asteroid rendezvous mission.

  4. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  5. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  6. Reconstruction of passive open-path FTIR ambient spectra using meteorological measurements and its application for detection of aerosol cloud drift.

    PubMed

    Kira, Oz; Dubowski, Yael; Linker, Raphael

    2015-07-27

    Remote sensing of atmospheric aerosols is of great importance to public and environmental health. This research promotes a simple way of detecting an aerosol cloud using a passive Open Path FTIR (OP-FTIR) system, without utilizing radiative transfer models and without relying on an artificial light source. Meteorological measurements (temperature, relative humidity and solar irradiance), and chemometric methods (multiple linear regression and artificial neural networks) together with previous cloud-free OP-FTIR measurements were used to estimate the ambient spectrum in real time. The cloud detection process included a statistical comparison between the estimated cloud-free signal and the measured OP-FTIR signal. During the study we were able to successfully detect several aerosol clouds (water spray) in controlled conditions as well as during agricultural pesticide spraying in an orchard. PMID:26367691
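
    A minimal sketch of the multiple-linear-regression step described above is given below: each spectral channel of previous cloud-free spectra is regressed on the meteorological predictors, the expected cloud-free spectrum is predicted for the current conditions, and a cloud is flagged when too many channels deviate strongly. The array shapes, predictor ordering, and the 3-sigma / 10%-of-channels thresholds are assumptions for the example, not the paper's tuned algorithm.

```python
# Sketch of the regression-based cloud detection idea: predict the
# cloud-free spectrum from temperature, relative humidity and solar
# irradiance, then flag a cloud when residuals grow. Shapes and
# thresholds are illustrative assumptions.
import numpy as np

def fit_baseline(met_train, spectra_train):
    """met_train: (n, 3) [T, RH, irradiance]; spectra_train: (n, n_channels)."""
    X = np.hstack([np.ones((met_train.shape[0], 1)), met_train])  # add intercept
    coeffs, *_ = np.linalg.lstsq(X, spectra_train, rcond=None)    # (4, n_channels)
    resid_std = (spectra_train - X @ coeffs).std(axis=0)
    return coeffs, resid_std

def cloud_flag(coeffs, resid_std, met_now, spectrum_now, n_sigma=3.0):
    x = np.concatenate(([1.0], met_now))
    z = np.abs(spectrum_now - x @ coeffs) / resid_std
    return np.mean(z > n_sigma) > 0.1   # cloud if >10% of channels deviate

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    met = rng.normal(size=(200, 3))
    spectra = met @ rng.normal(size=(3, 50)) + rng.normal(scale=0.05, size=(200, 50))
    coeffs, sigma = fit_baseline(met, spectra)
    print(cloud_flag(coeffs, sigma, met[0], spectra[0] + 1.0))  # perturbed spectrum
```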

  7. Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal; Rizzo, Maxime; Thompson, Patrick

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept study and a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC would provide insights into the physical nature of a variety of planets in other solar systems complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer selected debris disks. EPIC would be launched to a heliocentric Earth trailing drift-away orbit, with a 5-year mission lifetime. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables starlight suppression in broadband light from 480-960 nm. To demonstrate the VNC approach and advance its technology readiness we have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  8. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent based Mona Lisa. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  9. A Kenyan Cloud School. Massive Open Online & Ongoing Courses for Blended and Lifelong Learning

    ERIC Educational Resources Information Center

    Jobe, William

    2013-01-01

    This research describes the predicted outcomes of a Kenyan Cloud School (KCS), which is a MOOC that contains all courses taught at the secondary school level in Kenya. This MOOC will consist of online, ongoing subjects in both English and Kiswahili. The KCS subjects offer self-testing and peer assessment to maximize scalability, and digital badges…

  10. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, database and applications. Network usage and demands are growing at a very fast rate and to meet the current requirements, there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because of the distributed nature of their decision making process for switching or routing which are collocated on the same device. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate the challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate the network management related issues. A private cloud prototype test bed was set up to implement the SDN on the OpenStack platform to test and evaluate the network performance provided by various configurations.

  11. A Multi-Mission Testbed for Advanced Technologies

    NASA Technical Reports Server (NTRS)

    Chau, S. N.; Lang, M.

    2001-01-01

    The mission of the Center for Space Integrated Microsystem (CSIM) at the Jet Propulsion Laboratory is to develop advanced avionics systems for future deep space missions. The Advanced Micro Spacecraft (AMS) task is building a multi-mission testbed facility to enable the infusion of CSIM technologies into future missions. The testbed facility will also perform experimentation for advanced avionics technologies and architectures to meet challenging power, performance, mass, volume, reliability, and fault tolerance of future missions. The testbed facility has two levels of testbeds: (1) a Proof-of-Concept (POC) Testbed and (2) an Engineering Model Testbed. The methodology of the testbed development and the process of technology infusion are presented in a separate paper in this conference. This paper focuses only on the design, implementation, and application of the POC testbed. Additional information is contained in the original extended abstract.

  12. A Laboratory-Based Microwave Radio-Interferometry Testbed

    NASA Technical Reports Server (NTRS)

    Moriarity, M.; Simon, N.; Leach, K.; Piepmeier, J. R.

    2003-01-01

    The Goddard Radio Interferometry Testbed (GRIT) is an adaptable platform for laboratory testing of Synthetic Thinned Array Radiometers (STAR). Using this testbed, we demonstrate that the Doppler radiometer can image a point source in an observed scene.

  13. Towards a testbed for malicious code detection

    SciTech Connect

    Lo, R.; Kerchen, P.; Crawford, R.; Ho, W.; Crossley, J.; Fink, G.; Levitt, K.; Olsson, R.; Archer, M. (Div. of Computer Science)

    1991-01-01

    This paper proposes an environment for detecting many types of malicious code, including computer viruses, Trojan horses, and time/logic bombs. This malicious code testbed (MCT) is based upon both static and dynamic analysis tools developed at the University of California, Davis, which have been shown to be effective against certain types of malicious code. The testbed extends the usefulness of these tools by using them in a complementary fashion to detect more general cases of malicious code. Perhaps more importantly, the MCT allows administrators and security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict. 5 refs., 2 figs., 2 tabs.
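
    To make the flavor of a pre-installation static check concrete, here is a deliberately naive byte-signature scan; the signature strings are made-up placeholders, and this is not one of the UC Davis tools or the MCT itself.

```python
# Deliberately naive static signature scan, for illustration only; the
# "signatures" are placeholder byte strings, not real malware patterns,
# and this is not the MCT's static analysis tooling.
import sys

SIGNATURES = {
    "example-logic-bomb": b"rm -rf /",           # placeholder pattern
    "example-backdoor": b"hardcoded_password",   # placeholder pattern
}

def scan_file(path):
    with open(path, "rb") as f:
        data = f.read()
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

if __name__ == "__main__":
    for target in sys.argv[1:]:
        hits = scan_file(target)
        print(f"{target}: {', '.join(hits) if hits else 'no matches (toy check only)'}")
```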

  14. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software based set of hierarchical tools was chosen to provide maximum flexibility, to ease in moving to new computers as technology improves and to take advantage of the inherent reliability and availability of commercially available computing systems.

  15. A thermal spectral-spatial interferometric testbed

    NASA Astrophysics Data System (ADS)

    Savini, G.; Juanola-Parramon, R.; Stabbins, R.; Baccichet, N.; Donohoe, A.; Murphy, A.; O'Sullivan, C.

    2014-07-01

    We present an ongoing effort to achieve a Double Fourier Modulating (DFM) interferometer in the thermal infrared wavelength range. We describe a testbed designed to combine a sky simulator in the form of a miniaturized complex calibration source at the focus of a parabolic collimator with an interferometer baseline consisting of two parallel telescopes each mounted on a motorized linear stage. The two input arms are combined after one of them is modulated via a fast-scanning piezoelectric roof-top mirror. The optical design and layout of the testbed, the choice of interferometer parameters as well as the calibration scene adopted as source are described.

  16. The design and implementation of the LLNL gigabit testbed

    SciTech Connect

    Garcia, D.

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of and the need for the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  17. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages, and we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  18. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    NASA Astrophysics Data System (ADS)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options for how to best combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit - about a half double rack - establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd generation payload, that is, payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-realtime payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice coms and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This Equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The Equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites and operates in Ka-Band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  19. Contrast analysis and stability on the ExAO testbed

    SciTech Connect

    Evans, J; Thomas, S; Gavel, D; Dillon, D; Macintosh, B

    2008-06-10

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. The Laboratory for Adaptive Optics at UC Santa Cruz is investigating limits to high-contrast imaging in support of the Gemini Planet Imager. Previous contrast measurements were made with a simple single-opening prolate spheroid shaped pupil that produced a limited region of high-contrast, particularly when wavefront errors were corrected with the 1024-actuator Boston Micromachines MEMS deformable mirror currently in use on the testbed. A more sophisticated shaped pupil is now being used that has a much larger region of interest facilitating a better understanding of high-contrast measurements. In particular we examine the effect of heat sources in the testbed on PSF stability. We find that rms image motion scales as 0.02 λ/D per watt when the heat source is near the pupil plane. As a result heat sources of greater than 5 watts should be avoided near pupil planes for GPI. The safest place to introduce heat is near a focal plane. Heat can also affect the standard deviation of the high-contrast region but in the final instrument other sources of error should be more significant.
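
    Read as a worked number (an illustration of the scaling quoted above, not an additional measurement), the reported sensitivity implies:

```latex
% Worked reading of the quoted heat sensitivity near a pupil plane.
\sigma_{\mathrm{image\ motion}} \approx 0.02\,\frac{\lambda}{D}\ \text{per watt}
\;\;\Rightarrow\;\;
P = 5\ \mathrm{W} \;\text{gives}\;
\sigma_{\mathrm{image\ motion}} \approx 0.1\,\frac{\lambda}{D}
```

    which is consistent with the recommendation to keep heat sources above about 5 W away from pupil planes.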

  20. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses students'…

  1. Flight Projects Office Information Systems Testbed (FIST)

    NASA Technical Reports Server (NTRS)

    Liggett, Patricia

    1991-01-01

    Viewgraphs on the Flight Projects Office Information Systems Testbed (FIST) are presented. The goal is to perform technology evaluation and prototyping of information systems to support SFOC and JPL flight projects in order to reduce risk in the development of operational data systems for such projects.

  2. Space Interferometry System Testbed-3: architecture

    NASA Technical Reports Server (NTRS)

    Alvarez-Salazar, Oscar S.; Renaud, Goullioud; Azizi, Ali

    2004-01-01

    The Space Interferometry Mission's System testbed-3 has recently integrated its Precision Support Structure and spacecraft backpack on a pseudo free-free 0.5 Hz passive isolation system. The Precision Support Structure holds a 3-baseline stellar interferometer instrument.

  3. On the relationship between open cellular convective cloud patterns and the spatial distribution of precipitation

    NASA Astrophysics Data System (ADS)

    Yamaguchi, T.; Feingold, G.

    2014-10-01

    Precipitation is thought to be a necessary but insufficient condition for the transformation of stratocumulus-topped closed cellular convection to open cellular cumuliform convection. Here we test the hypothesis that the spatial distribution of precipitation is a key element of the closed-to-open cell transition. A series of idealized 3-dimensional simulations is conducted to evaluate the dependency of the transformation on the areal coverage of rain, and to explore the role of interactions between multiple rainy areas in the formation of the open cells. When rain is restricted to a small area, even substantial rain (of order a few mm day^-1) does not result in a transition. With increasing areal coverage of the rain, the transition becomes possible provided that the rain rate is sufficiently large. When multiple small rain regions interact with each other, the transition occurs and spreads over a wider area, provided that the distance between the rain regions is short. When the distance between the rain areas is large, the transition eventually occurs, albeit slowly. For much longer distances between rain regions the system is anticipated to remain in a closed-cell state. These results suggest a connection to the recently hypothesized remote control of open-cell formation. Finally, it is shown that phase trajectories of the mean and coefficient of variation of vertically integrated variables such as liquid water path align on one trajectory. This could be used as a diagnostic tool for global analyses of the statistics of closed- and open-cell occurrence and transitions between them.
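
    A minimal sketch of the diagnostic pair mentioned above (domain mean and coefficient of variation of a vertically integrated field such as liquid water path) is shown below; the uniform air density and layer thickness are simplifying assumptions, not the setup of the simulations.

```python
# Domain mean and coefficient of variation of liquid water path (LWP),
# where LWP is the vertical integral of liquid water content. Uniform
# density and layer thickness are simplifying assumptions.
import numpy as np

def lwp_stats(q_liquid, rho_air=1.0, dz=50.0):
    """q_liquid: (nz, ny, nx) liquid water mixing ratio in kg/kg."""
    lwp = np.sum(rho_air * q_liquid * dz, axis=0)   # kg m^-2 per column
    mean = lwp.mean()
    cv = lwp.std() / mean if mean > 0 else np.nan
    return mean, cv

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    q = np.clip(rng.normal(2e-4, 1e-4, size=(40, 64, 64)), 0.0, None)
    print(lwp_stats(q))
```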

  4. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers, which was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing, and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards), and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.

  5. Managing autonomy levels in the SSM/PMAD testbed. [Space Station Power Management and Distribution

    NASA Technical Reports Server (NTRS)

    Ashworth, Barry R.

    1990-01-01

    It is pointed out that when autonomous operations are mixed with those of a manual nature, concepts concerning the boundary of operations and responsibility become clouded. The space station module power management and distribution (SSM/PMAD) automation testbed has the need for such mixed-mode capabilities. The concept of managing the SSM/PMAD testbed in the presence of changing levels of autonomy is examined. A knowledge-based approach to implementing autonomy management in the distributed SSM/PMAD utilizing a centralized planning system is presented. Its knowledge relations and system-wide interactions are discussed, along with the operational nature of the currently functioning SSM/PMAD knowledge-based systems.

  6. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  7. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those that are currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed, and to install some or all of the SPARSPAK package in the testbed. A suite of subroutines to extract from the data base the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.
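
    For readers unfamiliar with the workflow being compared, the snippet below shows a generic sparse factor-and-solve in SciPy on a small symmetric positive definite system; it only illustrates the kind of operation involved and is neither SPARSPAK nor the CSM Testbed processor INV.

```python
# Generic sparse factor-and-solve example (illustration only; not SPARSPAK
# and not the CSM Testbed processor INV).
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

n = 1000
main = 2.0 * np.ones(n)
off = -1.0 * np.ones(n - 1)
K = sp.diags([off, main, off], offsets=[-1, 0, 1], format="csc")  # 1-D Laplacian

f = np.ones(n)           # load vector
lu = spla.splu(K)        # sparse LU factorization (SuperLU)
u = lu.solve(f)          # solve K u = f

print("residual norm:", np.linalg.norm(K @ u - f))
```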

  8. THE HERSCHEL INVENTORY OF THE AGENTS OF GALAXY EVOLUTION IN THE MAGELLANIC CLOUDS, A HERSCHEL OPEN TIME KEY PROGRAM

    SciTech Connect

    Meixner, M.; Roman-Duval, J.; Seale, J.; Gordon, K.; Beck, T.; Boyer, M. L.; Panuzzo, P.; Hony, S.; Sauvage, M.; Okumura, K.; Chanial, P.; Babler, B.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Carlson, L. R.; Clayton, G. C.; and others

    2013-09-15

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 μm with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 μm band is the most sensitive and the source catalogs for this band have ~25,000 objects for the LMC and ~5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  9. The HERschel Inventory of the Agents of Galaxy Evolution in the Magellanic Clouds, a HERschel Open Time Key Program

    NASA Technical Reports Server (NTRS)

    Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.; Misselt, K.; Okumura, K.; Chanial, P.; Beck, T.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Boyer, M. L.; Carlson, L. R.; Clayton, G. C.; Chen, C.-H. R.; Cormier, D.; Fukui, Y.; Galametz, M.; Galliano, F.

    2013-01-01

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  10. Observed and simulated temperature dependence of the liquid water path of low clouds

    SciTech Connect

    Del Genio, A.D.; Wolf, A.B.

    1996-04-01

    Data being acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site can be used to examine the factors determining the temperature dependence of cloud optical thickness. We focus on cloud liquid water and physical thickness variations which can be derived from existing ARM measurements.

  11. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2009-05-01

    OPTRA is developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill.

  12. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US, through the NSF-funded cyberinfrastructure, and the European Union, through the ICT Calls, aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had raised at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable: they can't compete with a service outsourced to industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resource sharing; and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  13. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2010-04-01

    OPTRA has developed an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize the design and build and detail system characterization and test of a prototype I-OP-FTIR instrument. System characterization includes radiometric performance and spectral resolution. Results from a series of tomographic reconstructions of sulfur hexafluoride plumes in a laboratory setting are also presented.
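    The tomographic step described above can be illustrated with a small, hedged sketch of an algebraic reconstruction technique (ART): given path-integrated measurements along a set of beams, a gridded concentration field is updated ray by ray. The grid size, ray-weight matrix, and relaxation factor below are illustrative assumptions, not the OPTRA/UNC algorithm.

    # Illustrative ART reconstruction from path-integrated measurements;
    # geometry and parameters are assumptions for demonstration only.
    import numpy as np

    rng = np.random.default_rng(0)
    npix, nrays = 16 * 16, 120

    # Hypothetical ray-weight matrix: each row holds the path length of one
    # beam through each pixel (random sparse weights stand in for real geometry).
    A = rng.random((nrays, npix)) * (rng.random((nrays, npix)) < 0.1)

    x_true = np.zeros(npix)
    x_true[100:140] = 1.0                       # a synthetic "plume"
    b = A @ x_true                              # simulated path-integrated data

    x = np.zeros(npix)
    lam = 0.5                                   # relaxation (damping) factor
    for sweep in range(50):
        for i in range(nrays):
            ai = A[i]
            norm = ai @ ai
            if norm > 0:
                x += lam * (b[i] - ai @ x) / norm * ai
        np.clip(x, 0.0, None, out=x)            # concentrations are non-negative

    print("relative reconstruction error:",
          np.linalg.norm(x - x_true) / np.linalg.norm(x_true))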

  14. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Engel, James R.; Vaillancourt, Robert; Todd, Lori; Mottus, Kathleen

    2008-04-01

    OPTRA and University of North Carolina are developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach will be considered as a candidate referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize progress to date and overall system performance projections based on the instrument, spectroscopy, and tomographic reconstruction accuracy. We then present a preliminary optical design of the I-OP-FTIR.

  15. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) with more than an order of magnitude higher resolution than those currently available, LIDAR data allows Earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful cyberinfrastructure resources, rather than in their own labs, provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
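    As a concrete illustration of one task such a portal exposes, here is a minimal sketch of gridding a LIDAR point cloud into a DEM by binning returns onto a regular grid and averaging elevations per cell; the cell size, synthetic data, and averaging rule are assumptions, not the OpenTopography implementation.

    # Illustrative point-cloud-to-DEM gridding (mean elevation per cell);
    # parameters and method are assumptions, not the OpenTopography algorithms.
    import numpy as np

    def grid_dem(x, y, z, cell=1.0):
        """Average z per (cell x cell) bin; empty cells become NaN."""
        ix = ((x - x.min()) / cell).astype(int)
        iy = ((y - y.min()) / cell).astype(int)
        sums = np.zeros((iy.max() + 1, ix.max() + 1))
        counts = np.zeros_like(sums)
        np.add.at(sums, (iy, ix), z)
        np.add.at(counts, (iy, ix), 1)
        with np.errstate(invalid="ignore"):
            return np.where(counts > 0, sums / counts, np.nan)

    # Synthetic example: 100,000 random returns over a gentle slope
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 100, 100_000)
    y = rng.uniform(0, 100, 100_000)
    z = 0.05 * x + 0.02 * y + rng.normal(0, 0.1, x.size)
    dem = grid_dem(x, y, z, cell=2.0)
    print(dem.shape, float(np.nanmean(dem)))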

  16. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  17. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  18. Overview of the Telescience Testbed Program

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad; Leiner, Barry M.

    1991-01-01

    The NASA's Telescience Testbed Program (TTP) conducted by the Ames Research Center is described with particular attention to the objectives, the approach used to achieve these objectives, and the expected benefits of the program. The goal of the TTP is to gain operational experience for the Space Station Freedom and the Earth Observing System programs, using ground testbeds, and to define the information and communication systems requirements for the development and operation of these programs. The results of TTP are expected to include the requirements for the remote coaching, command and control, monitoring and maintenance, payload design, and operations management. In addition, requirements for technologies such as workstations, software, video, automation, data management, and networking will be defined.

  19. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471

  20. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience.

    PubMed

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
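    The sort of evolutionary parameter search that BluePyOpt automates can be sketched in a few lines of plain NumPy; this toy example is not the BluePyOpt API, and the model, target features, and hyperparameters are illustrative assumptions.

    # Toy (mu, lambda) evolutionary optimisation of model parameters against
    # target features; illustrative only, not BluePyOpt's DEAP-based machinery.
    import numpy as np

    rng = np.random.default_rng(42)
    target = np.array([1.5, -0.7])               # stand-in "experimental" features

    def fitness(params):
        """Lower is better: squared distance between model features and target."""
        feats = np.array([params[0] ** 2, np.tanh(params[1])])   # toy model
        return float(np.sum((feats - target) ** 2))

    mu, lam, sigma = 10, 40, 0.3                 # parents, offspring, mutation scale
    pop = rng.normal(0.0, 1.0, (mu, 2))
    for gen in range(100):
        parents = pop[rng.integers(0, mu, lam)]                  # pick random parents
        offspring = parents + rng.normal(0.0, sigma, (lam, 2))   # mutate
        scores = np.array([fitness(p) for p in offspring])
        pop = offspring[np.argsort(scores)[:mu]]                 # keep the best mu
    best = pop[0]
    print("best parameters:", best, "fitness:", fitness(best))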

  1. Developments at the Advanced Design Technologies Testbed

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    2003-01-01

    A report presents background and historical information, as of August 1998, on the Advanced Design Technologies Testbed (ADTT) at Ames Research Center. The ADTT is characterized as an activity initiated to facilitate improvements in aerospace design processes; provide a proving ground for product-development methods and computational software and hardware; develop bridging methods, software, and hardware that can facilitate integrated solutions to design problems; and disseminate lessons learned to the aerospace and information technology communities.

  2. The Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Hines, Braden; Bell, Charles; Shen, Tsae-Pyng; Bloemhof, Eric; Zhao, Feng; Regehr, Martin; Holmes, Howard; Irigoyen, Robert; Neat, Gregory

    2003-01-01

    The Micro-Arcsecond Metrology (MAM) testbed is a ground-based system of optical and electronic equipment for testing components, systems, and engineering concepts for the Space Interferometer Mission (SIM) and similar future missions, in which optical interferometers will be operated in outer space. In addition, the MAM testbed is of interest in its own right as a highly precise metrological system. The designs of the SIM interferometer and the MAM testbed reflect a requirement to measure both the position of the starlight central fringe and the change in the internal optical path of the interferometer with sufficient spatial resolution to generate astrometric data with angular resolution at the microarcsecond level. The internal path is to be measured by use of a small metrological laser beam of 1,319-nm wavelength, whereas the position of the starlight fringe is to be estimated by use of a charge-coupled-device (CCD) image detector sampling a large concentric annular beam. For the SIM to succeed, the optical path length determined from the interferometer fringes must be tracked by the metrological subsystem to within tens of picometers, through all operational motions of an interferometer delay line and siderostats. The purpose of the experiments performed on the MAM testbed is to demonstrate this agreement in a large-scale simulation that includes a substantial portion of the system in the planned configuration for operation in outer space. A major challenge in this endeavor is to align the metrological beam with the starlight beam in order to maintain consistency between the metrological and starlight subsystems at the system level. The MAM testbed includes an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrological sensors. The aforementioned subsystems are installed in a large vacuum chamber in order to suppress atmospheric and thermal disturbances. The MAM is divided into two

  3. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other processes among various modeling studies (e.g. grid configuration, meteorology, emission rates) are different as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  4. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder by using commercial routers and modems. Since many NASA assets are in low Earth orbits (LEO's), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating to the Earth via a geosynchronous orbiting (GEO) communications satellite. Research results from using this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.

  5. ISS Update: ISTAR -- International Space Station Testbed for Analog Research

    NASA Video Gallery

    NASA Public Affairs Officer Kelly Humphries interviews Sandra Fletcher, EVA Systems Flight Controller. They discuss the International Space Station Testbed for Analog Research (ISTAR) activity that...

  6. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research--GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation. PMID:23230155

  7. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  8. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research in the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 kHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0 km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed at either 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in locations of the guide stars in the field of view. In total four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  9. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some result from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

    With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition, from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere and the freeze-up followed as the warm surface lost heat to the atmosphere. While the surface type had a pronounced impact on boundary

  10. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
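    A hedged sketch of the timing-based idea described above: fit a trend to observed valve opening times and extrapolate to a failure threshold to estimate remaining useful life. The linear degradation model, threshold, and synthetic data are illustrative assumptions, not the authors' prognostic model.

    # Illustrative remaining-useful-life (RUL) estimate from valve timing data;
    # the degradation model and threshold are assumptions for demonstration.
    import numpy as np

    cycles = np.arange(0, 200, 10)               # valve actuation cycle counts
    rng = np.random.default_rng(3)
    open_time = 2.0 + 0.004 * cycles + rng.normal(0, 0.02, cycles.size)  # seconds
    THRESHOLD = 3.0                              # opening time deemed end-of-life

    slope, intercept = np.polyfit(cycles, open_time, 1)   # linear degradation fit
    if slope > 0:
        cycles_at_failure = (THRESHOLD - intercept) / slope
        rul = cycles_at_failure - cycles[-1]     # remaining useful life in cycles
        print(f"estimated RUL: {rul:.0f} cycles")
    else:
        print("no degradation trend detected")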

  11. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  12. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial-off-the-shelf components including airframe, autopilot, and a small turbine engine to provide a low cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine powered platform (J-FLiC) builds on the successes using the relatively smaller, slower and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches with the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007 and those results will be included, if possible.

  13. Gemini Planet Imager coronagraph testbed results

    NASA Astrophysics Data System (ADS)

    Sivaramakrishnan, Anand; Soummer, Rémi; Oppenheimer, Ben R.; Carr, G. Lawrence; Mey, Jacob L.; Brenner, Doug; Mandeville, Charles W.; Zimmerman, Neil; Macintosh, Bruce A.; Graham, James R.; Saddlemyer, Les; Bauman, Brian; Carlotti, Alexis; Pueyo, Laurent; Tuthill, Peter G.; Dorrer, Christophe; Roberts, Robin; Greenbaum, Alexandra

    2010-07-01

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8m Gemini South telescope in 2011. GPI fields a 1500 channel AO system feeding an apodized pupil Lyot coronagraph, and an nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self-luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below one part in a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  14. Telescience testbed in human space physiology

    NASA Astrophysics Data System (ADS)

    Watanabe, Satoru; Seo, Hisao; Iwase, Satoshi; Tanaka, Masafumi; Kaneko, Sayumi; Mano, Tadaaki; Matsui, Nobuo; Foldager, Niels; Bondepetersen, Flemming; Yamashita, Masamichi; Shoji, Takatoshi; Sudoh, Hideo

    The present telescience testbed study was conducted to evaluate the feasibility of physiological experimentation under restricted conditions, such as simulated weightlessness induced by a water immersion facility, reduced laboratory capacity, delayed and desynchronized communication between investigator and operator, and the restriction that different kinds of experiments be performed by a single operator on a limited timeline. The three days of experiments were carried out following the same protocols. The operator was changed every day, with the same operator serving on the first and third days. The operators were both medical doctors but not all-round experts in physiological experimentation. The experimental objectives were: 1) ECG changes with changing water immersion levels, 2) blood pressure changes, 3) ultrasonic echocardiographic changes, 4) laser Doppler skin blood flowmetry in a finger, and 5) blood sampling to examine electrolytic and humoral changes. The effectiveness of the testbed experiment was assessed by evaluating the quality of the obtained data and how friendly the telescience operation was to investigators and operators.

  15. Near infrared missile warning testbed sensor

    NASA Astrophysics Data System (ADS)

    McDermott, D. J.; Johnson, R. S.; Montgomery, J. B.; Sanderson, R. B.; McCalmont, J. F.; Taylor, M. J.

    2008-04-01

    Multicolor discrimination is one of the most effective ways of improving the performance of infrared missile warning sensors, particularly for heavy clutter situations. A new tactical airborne multicolor missile warning testbed was developed and fielded as part of a continuing Air Force Research Laboratory (AFRL) initiative focusing on clutter and missile signature measurements for effective missile warning algorithms. The developed sensor testbed is a multi-camera system with 1004x1004 FPAs coupled with optimized spectral filters integrated with the optics; a reduced form factor microprocessor-based video data recording system operating at 48 Hz; and a real-time field programmable gate array processor for algorithm and video data processing capable of 800 billion multiply/accumulate operations per second. A detailed radiometric calibration procedure was developed to overcome severe photon-limited operating conditions due to the sub-nanometer bandwidth of the spectral filters. This configuration allows the collection and real-time processing of temporally correlated, radiometrically calibrated video data in multiple spectral bands. The testbed was utilized to collect false alarm source spectra and Man-Portable Air Defense System (MANPADS) signatures under a variety of atmospheric and solar illuminating conditions. Signatures of approximately 100 missiles have been recorded.

  16. Testbed for the development of intelligent robot control

    SciTech Connect

    Harrigan, R.W.

    1986-01-01

    The Sensor Driven Robot Systems Testbed has been constructed to provide a working environment to aid in the development of intelligent robot control software. The Testbed employs vision and force as the robot's means of interrogating its environment. The Testbed, which has been operational for approximately 24 months, consists of a PUMA-560 robot manipulator coupled to a 2-dimensional vision system and a force- and torque-sensing wrist. Recent work within the Testbed environment has led to a highly modularized control software concept with emphasis on detection and resolution of error situations. The objective of the Testbed is to develop intelligent robot control concepts incorporating planning and error recovery which are transportable to a wide variety of robot applications. This project is an ongoing, long-term development effort and, as such, this paper represents a status report of the development work.

  17. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    SciTech Connect

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be constructed. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korean Atomic Energy Research Institute of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following information will discuss I&C testbed lessons learned and the impact of these experiences on KAERI.

  18. Expediting Experiments across Testbeds with AnyBed: A Testbed-Independent Topology Configuration System and Its Tool Set

    NASA Astrophysics Data System (ADS)

    Suzuki, Mio; Hazeyama, Hiroaki; Miyamoto, Daisuke; Miwa, Shinsuke; Kadobayashi, Youki

    Building an experimental network within a testbed has been a tiresome process for experimenters, due to the complexity of physical resource assignment and configuration overhead. Moreover, the process could not be expedited across testbeds, because the syntax of a configuration file varies depending on the specific hardware and software. Re-configuring an experimental topology for each testbed wastes time; at worst, an experimenter may not be able to complete his/her experiments within the limited lease time of a testbed. In this paper, we propose AnyBed, an experimental network-building system. The conceptual idea of AnyBed is: “If experimental network topologies can be made portable across any kind of testbed, then building an experimental network on a testbed can be expedited while experiments are manipulated by each testbed's support tools”. To achieve this concept, AnyBed divides an experimental network configuration into logical and physical network topologies. By mapping these two topologies, AnyBed can build the intended logical network topology on any PC cluster. We have evaluated the AnyBed implementation using two distinct clusters. The evaluation result shows a BGP topology with 150 nodes can be constructed on a large-scale testbed in less than 113 seconds.
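    A minimal sketch of the two-layer idea described above: a logical topology (nodes and links) is mapped onto whatever physical hosts a testbed provides, yielding per-host configuration stubs. The data structures and the round-robin placement are illustrative assumptions, not AnyBed's actual formats or algorithms.

    # Toy logical-to-physical topology mapping in the spirit of AnyBed's
    # two-layer model; structures and placement policy are assumptions.
    from itertools import cycle

    logical_nodes = ["r1", "r2", "r3", "r4"]
    logical_links = [("r1", "r2"), ("r2", "r3"), ("r3", "r4"), ("r4", "r1")]
    physical_hosts = ["pc-a", "pc-b"]            # whatever the cluster provides

    # Assign logical nodes to physical hosts (simple round-robin placement).
    placement = dict(zip(logical_nodes, cycle(physical_hosts)))

    # Emit a per-host configuration stub listing the links each host must set up.
    config = {host: [] for host in physical_hosts}
    for a, b in logical_links:
        config[placement[a]].append(f"link {a} <-> {b} (peer host: {placement[b]})")

    for host, lines in config.items():
        print(host)
        for line in lines:
            print("  " + line)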

  19. The Magellan Final Report on Cloud Computing

    SciTech Connect

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid- range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing from performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  20. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates all base station computational resources into a cloud BBU pool. The interactions between RRHs and BBUs, and resource scheduling among BBUs in the cloud, have become more frequent and complex with the growth of system scale and user requirements. This increases the networking demand among RRHs and BBUs and drives the need for elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources effectively to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  1. Construction and Modeling of a Controls Testbed

    NASA Technical Reports Server (NTRS)

    Nagle, James C.; Homaifar, Abdollah; Nasser, Ahmed A.; Bikdash, Marwan

    1997-01-01

    This paper describes the construction and modeling of a control system testbed to be used for the comparison of various control methodologies. We specifically wish to test fuzzy logic control and compare the performance of various fuzzy controllers, including Hybrid Fuzzy-PID (HFPID) and Hierarchical Hybrid Fuzzy-PID (HHFPID), to other controllers including localized rate feedback, LQR/LTR, and H2/H∞. The control problem is that of vibration suppression in a thin plate with inputs coming from accelerometers and outputs going to piezoelectric actuators or 'patches'. A model based on experimental modal analysis of the plate is constructed and compared with an analytical model. The analytical model uses a boundary condition which is a mix of clamped and simply supported.

  2. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that the distribution of satellite imagery by means of the Internet is feasible, and the archiving of large data sets can be accomplished with low-cost storage systems allowing multiple users.

  3. The NULLTIMATE Testbed: A Progress Report

    NASA Astrophysics Data System (ADS)

    Gabor, P.

    2010-10-01

    Nulling interferometry has been suggested as the underlying principle for an instrument which could provide direct detection and spectroscopy of Earth-like exoplanets, including searches for potential biomarkers (Darwin/TPF-I). Several aspects of this method require further research and development. The NULLTIMATE testbed at the Institut d’Astrophysique Spatiale in Orsay, France, is a new instrument, built in late 2008. It is designed to test different achromatic phase shifters (focus crossing, field reversal, dielectric plates) at 300 K using various sources ranging from 2 to 10 μm, with special attention to stabilization (optical path difference and beam intensity balance). Its operational parameters (null depth and stability) were tested with monochromatic laser sources at 2.32 and 3.39 μm and with a supercontinuum source in the K band. This poster presents a progress report on its performance with a focus crossing achromatic phase shifter.

  4. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  5. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system based on the SPAR finite element code and the NICE system is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs used for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions which communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from a combination of the NICE command language with SPAR computational modules.

  6. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    SciTech Connect

    Maloney, T.M.; Voecks, G.E.

    1995-01-25

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate fully integrated regenerative fuel cell (RFC) systems for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway. © 1995 American Institute of Physics.

  7. Time-multiplexed open-path TDLAS spectrometer for dynamic, sampling-free, interstitial H2 18O and H2 16O vapor detection in ice clouds

    NASA Astrophysics Data System (ADS)

    Kühnreich, B.; Wagner, S.; Habig, J. C.; Möhler, O.; Saathoff, H.; Ebert, V.

    2015-04-01

    An advanced in situ diode laser hygrometer for simultaneous, sampling-free detection of interstitial H2 16O and H2 18O vapor was developed and tested in the aerosol interaction and dynamics in atmosphere (AIDA) cloud chamber during dynamic cloud formation processes. The spectrometer to measure isotope-resolved water vapor concentrations comprises two rapidly time-multiplexed DFB lasers near 1.4 and 2.7 µm and an open-path White cell with 227-m absorption path length and 4-m mirror separation. A dynamic water concentration range from 2.6 ppb to 87 ppm for H2 16O and 87 ppt to 3.6 ppm for H2 18O could be achieved and was used to enable a fast and direct detection of dynamic isotope ratio changes during ice cloud formation in the AIDA chamber at temperatures between 190 and 230 K. Relative changes in the H2 18O/H2 16O isotope ratio of 1 % could be detected and resolved with a signal-to-noise ratio of 7. This converts to an isotope ratio resolution limit of 0.15 % at 1-s time resolution.
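    A small worked example of the quantity being tracked: the isotope ratio R = [H2 18O]/[H2 16O] is conventionally reported as a delta value relative to a reference ratio, and a 1 % relative change in R near the reference corresponds to roughly a 10 per-mil shift in delta. The concentrations below are made up, and the VSMOW reference ratio is quoted only approximately.

    # Illustrative delta-notation calculation for an H2(18)O / H2(16)O ratio;
    # concentrations are assumed values and R_VSMOW is approximate.
    c_h2_16o = 50.0e-6        # mole fraction of H2(16)O (assumed, ~50 ppm)
    c_h2_18o = 0.10e-6        # mole fraction of H2(18)O (assumed)

    R_sample = c_h2_18o / c_h2_16o
    R_VSMOW = 2.0052e-3       # approximate 18O/16O ratio of VSMOW reference water

    delta_18o = (R_sample / R_VSMOW - 1.0) * 1000.0   # per mil
    print(f"R_sample = {R_sample:.4e}, delta-18O = {delta_18o:.1f} per mil")

    # A 1 % relative increase in R shifts delta by about 10 per mil here:
    delta_shifted = (1.01 * R_sample / R_VSMOW - 1.0) * 1000.0
    print(f"shift for +1 % in R: {delta_shifted - delta_18o:.1f} per mil")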

  8. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document is the viewgraph slides of the presentation.

  9. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments. PMID:26690175

  10. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a localization system based on cameras and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and, thus, provide insight into gas distribution mapping experiments. PMID:26690175

  11. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research of issues such as zero-g robot control, dual arm teleoperation, simulations, and hierarchical control using a high level programming language. The testbed will be used to investigate these high risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual arm teleoperation of a pair of 7 degree-of-freedom (DOF) manipulators, each with its own 6-DOF mini-master hand controller. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low level features. High speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real time dynamic simulation of the testbed operations, allowing a quick and safe means for testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  14. Development of Hardware-in-the-loop Microgrid Testbed

    SciTech Connect

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R; Liu, Guodong; Dowling, Kevin; Ollis, T Ben; Irminger, Philip; Xu, Yan; Dimitrovski, Aleksandar D

    2015-01-01

    A hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems is presented in this paper. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling of the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of functional testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  15. Development of a space-systems network testbed

    NASA Technical Reports Server (NTRS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    1988-01-01

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration has been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  16. Development of a space-systems network testbed

    NASA Astrophysics Data System (ADS)

    Lala, Jaynarayan; Alger, Linda; Adams, Stuart; Burkhardt, Laura; Nagle, Gail; Murray, Nicholas

    This paper describes a communications network testbed which has been designed to allow the development of architectures and algorithms that meet the functional requirements of future NASA communication systems. The central hardware components of the Network Testbed are programmable circuit switching communication nodes which can be adapted by software or firmware changes to customize the testbed to particular architectures and algorithms. Fault detection, isolation, and reconfiguration has been implemented in the Network with a hybrid approach which utilizes features of both centralized and distributed techniques to provide efficient handling of faults within the Network.

  17. Custom data support for the FAst-physics System Testbed and Research (FASTER) Project

    SciTech Connect

    Toto, T.; Jensen, M.; Vogelmann, A.; Wagener, R.; Liu, Y.; Lin, W.

    2010-03-15

    The multi-institution FAst-physics System Testbed and Research (FASTER) project, funded by the DOE Earth System Modeling program, aims to evaluate and improve the parameterizations of fast processes (those involving clouds, precipitation and aerosols) in global climate models, using a combination of numerical prediction models, single column models, cloud resolving models, large-eddy simulations, full global climate model output and ARM active and passive remote sensing and in-situ data. This poster presents the Custom Data Support effort for the FASTER project. The effort will provide tailored datasets, statistics, best estimates and quality control data, as needed and defined by FASTER participants, for use in evaluating and improving parameterizations of fast processes in GCMs. The data support will include custom gridding and averaging, for the model of interest, using high time resolution and pixel level data from continuous ARM observations and complementary datasets. In addition to the FASTER team, these datasets will be made available to the ARM Science Team. Initial efforts with respect to data product development, priorities, availability and distribution are summarized here with an emphasis on cloud, atmospheric state and aerosol properties as observed during the Spring 2000 Cloud IOP and the Spring 2003 Aerosol IOP at the ARM Southern Great Plains site.
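
    As a rough illustration of the custom gridding and averaging described above, the sketch below bins high-time-resolution observations onto a coarser model time grid; the 30-minute step, function name and synthetic data are illustrative assumptions, not FASTER products.

    ```python
    import numpy as np

    def average_to_model_grid(obs_time_s, obs_value, model_step_s=1800.0):
        """Average high-rate observations onto a coarser model time grid.

        obs_time_s   : 1-D array of observation times (seconds since start)
        obs_value    : 1-D array of observed values (may contain NaN for bad data)
        model_step_s : model time step (30 min here, purely illustrative)
        Returns (bin_centers, bin_means).
        """
        obs_time_s = np.asarray(obs_time_s, dtype=float)
        obs_value = np.asarray(obs_value, dtype=float)
        edges = np.arange(0.0, obs_time_s.max() + model_step_s, model_step_s)
        idx = np.digitize(obs_time_s, edges) - 1
        means = np.full(len(edges) - 1, np.nan)
        for k in range(len(edges) - 1):
            sel = obs_value[(idx == k) & np.isfinite(obs_value)]
            if sel.size:
                means[k] = sel.mean()
        centers = 0.5 * (edges[:-1] + edges[1:])
        return centers, means

    # Example: six hours of synthetic 1-s observations averaged to 30-min bins.
    t = np.arange(0, 6 * 3600.0)
    x = np.sin(2 * np.pi * t / 3600.0) + 0.1 * np.random.randn(t.size)
    centers, means = average_to_model_grid(t, x)
    print(means[:4])
    ```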

  18. Comparison of millimeter-wave cloud radar measurements for the Fall 1997 Cloud IOP

    SciTech Connect

    Sekelsky, S.M.; Li, L.; Galloway, J.; McIntosh, R.E.; Miller, M.A.; Clothiaux, E.E.; Haimov, S.; Mace, G.; Sassen, K.

    1998-05-01

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time operational 35-GHz CART Millimeter-wave Cloud Radar (MMCR), the University of Massachusetts (UMass) single-antenna 33-GHz/95-GHz Cloud Profiling Radar System (CPRS), the 95-GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air, the University of Utah 95-GHz radar and the dual-antenna Pennsylvania State University 94-GHz radar. In this paper the authors discuss several issues relevant to comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.
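
    Radar intercomparisons of this kind are usually expressed in dBZ; the sketch below shows the conversion and a simple relative-calibration offset between two co-located radars, with purely hypothetical reflectivity values (it is not the authors' analysis).

    ```python
    import numpy as np

    def to_dbz(z_linear_mm6_m3):
        """Convert radar reflectivity factor Z (mm^6 m^-3) to dBZ."""
        return 10.0 * np.log10(z_linear_mm6_m3)

    # Illustrative reflectivities from two co-located radars viewing the same cloud.
    z_radar_a = np.array([12.0, 30.0, 55.0, 140.0])   # mm^6 m^-3 (hypothetical)
    z_radar_b = np.array([10.0, 26.0, 47.0, 120.0])

    dbz_a, dbz_b = to_dbz(z_radar_a), to_dbz(z_radar_b)
    offset = np.mean(dbz_a - dbz_b)   # a simple estimate of relative calibration bias
    print(f"mean reflectivity offset: {offset:.2f} dB")
    ```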

  19. SMILES ice cloud products

    NASA Astrophysics Data System (ADS)

    Millán, L.; Read, W.; Kasai, Y.; Lambert, A.; Livesey, N.; Mendrok, J.; Sagawa, H.; Sano, T.; Shiotani, M.; Wu, D. L.

    2013-06-01

    Upper tropospheric water vapor and clouds play an important role in Earth's climate, but knowledge of them, in particular diurnal variation in deep convective clouds, is limited. An essential variable to understand them is cloud ice water content. The Japanese Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) on board the International Space Station (ISS) samples the atmosphere at different local times allowing the study of diurnal variability of atmospheric parameters. We describe a new ice cloud data set consisting of partial Ice Water Path and Ice Water Content. Preliminary comparisons with EOS-MLS, CloudSat-CPR and CALIOP-CALIPSO are presented. Then, the diurnal variation over land and over open ocean for partial ice water path is reported. Over land, a pronounced diurnal variation peaking strongly in the afternoon/early evening was found. Over the open ocean, little temporal dependence was encountered. This data set is publicly available for download in HDF5 format.

  20. Situational descriptions of behavioral procedures: the in situ testbed.

    PubMed Central

    Kemp, S M; Eckerman, D A

    2001-01-01

    We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's (1959) notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement and extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. PMID:11394484
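
    A cumulative record of the kind the testbed produces can be sketched from any response-time model; the toy fixed-ratio performer below (exponential inter-response times plus a post-reinforcement pause) is an illustrative assumption, not one of the published models under test.

    ```python
    import random

    def cumulative_record(n_responses=200, fixed_ratio=10, mean_irt=0.5, pause=3.0):
        """Generate (time, cumulative responses) pairs for a toy fixed-ratio performer.

        mean_irt : mean inter-response time within a ratio run (seconds)
        pause    : post-reinforcement pause added after every `fixed_ratio` responses
        """
        t, points = 0.0, [(0.0, 0)]
        for n in range(1, n_responses + 1):
            t += random.expovariate(1.0 / mean_irt)
            if n % fixed_ratio == 0:          # reinforcer delivered, then a pause
                t += pause
            points.append((t, n))
        return points

    record = cumulative_record()
    print(record[:5])    # could be plotted as a step function of responses vs. time
    ```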

  1. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  2. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  3. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  4. Energy Aware Clouds

    NASA Astrophysics Data System (ADS)

    Orgerie, Anne-Cécile; de Assunção, Marcos Dias; Lefèvre, Laurent

    Cloud infrastructures are increasingly becoming essential components for providing Internet services. By benefiting from economies of scale, Clouds can efficiently manage and offer a virtually unlimited number of resources and can minimize the costs incurred by organizations when providing Internet services. However, as Cloud providers often rely on large data centres to sustain their business and offer the resources that users need, the energy consumed by Cloud infrastructures has become a key environmental and economical concern. This chapter presents an overview of techniques that can improve the energy efficiency of Cloud infrastructures. We propose a framework termed as Green Open Cloud, which uses energy efficient solutions for virtualized environments; the framework is validated on a reference scenario.

  5. Ames life science telescience testbed evaluation

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  6. Further progress in watermark evaluation testbed (WET)

    NASA Astrophysics Data System (ADS)

    Kim, Hyung C.; Lin, Eugene T.; Guitart, Oriol; Delp, Edward J., III

    2005-03-01

    While Digital Watermarking has received much attention in recent years, it is still a relatively young technology. There are few accepted tools/metrics that can be used to evaluate the suitability of a watermarking technique for a specific application. This lack of a universally adopted set of metrics/methods has motivated us to develop a web-based digital watermark evaluation system called the Watermark Evaluation Testbed or WET. Several improvements have been made over the first version of WET. We implemented a batch mode with a queue that allows for user-submitted jobs. In addition to StirMark 3.1 as an attack module, we added attack modules based on StirMark 4.0. We also evaluate conditional entropy as a new image fidelity measure for different watermarking algorithms and different attacks. Finally, we show the results of curve fitting the Receiver Operating Characteristic (ROC) analysis data using Parzen window density estimation. The curve fits the data closely while having only two parameters to estimate.
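
    For readers unfamiliar with the technique, the sketch below illustrates Parzen (Gaussian kernel) density estimation of detector statistics and the ROC curve derived from it; the detector statistics and bandwidth are synthetic assumptions, and the code is not WET's implementation.

    ```python
    import numpy as np

    def parzen_pdf(samples, x, bandwidth):
        """Gaussian-kernel Parzen window estimate of a 1-D density at points x."""
        samples = np.asarray(samples)[:, None]
        x = np.asarray(x)[None, :]
        k = np.exp(-0.5 * ((x - samples) / bandwidth) ** 2)
        return k.mean(axis=0) / (bandwidth * np.sqrt(2 * np.pi))

    # Hypothetical detector statistics for unwatermarked (H0) and watermarked (H1) images.
    rng = np.random.default_rng(0)
    h0 = rng.normal(0.0, 1.0, 500)
    h1 = rng.normal(1.5, 1.2, 500)

    thresholds = np.linspace(-4, 6, 200)
    dx = thresholds[1] - thresholds[0]
    pdf0 = parzen_pdf(h0, thresholds, bandwidth=0.3)
    pdf1 = parzen_pdf(h1, thresholds, bandwidth=0.3)

    # ROC: probability of false alarm vs. probability of detection above each threshold.
    pfa = 1.0 - np.cumsum(pdf0) * dx
    pd = 1.0 - np.cumsum(pdf1) * dx
    print(list(zip(pfa[::50].round(3), pd[::50].round(3))))
    ```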

  7. Evaluation testbed for ATD performance prediction (ETAPP)

    NASA Astrophysics Data System (ADS)

    Ralph, Scott K.; Eaton, Ross; Snorrason, Magnús; Irvine, John; Vanstone, Steve

    2007-04-01

    Automatic target detection (ATD) systems process imagery to detect and locate targets in support of a variety of military missions. Accurate prediction of ATD performance would assist in system design and trade studies, collection management, and mission planning. A need exists for ATD performance prediction based exclusively on information available from the imagery and its associated metadata. We present a predictor based on image measures quantifying the intrinsic ATD difficulty on an image. The modeling effort consists of two phases: a learning phase, where image measures are computed for a set of test images, the ATD performance is measured, and a prediction model is developed; and a second phase to test and validate performance prediction. The learning phase produces a mapping, valid across various ATR algorithms, which is even applicable when no image truth is available (e.g., when evaluating denied area imagery). The testbed has plug-in capability to allow rapid evaluation of new ATR algorithms. The image measures employed in the model include statistics derived from a constant false alarm rate (CFAR) processor, the Power Spectrum Signature, and others. We present performance predictors for two trained ATD classifiers, one constructed using GENIE Pro, a tool developed at Los Alamos National Laboratory, and the other using eCognition, developed by Definiens (http://www.definiens.com/products). We present analyses of the two performance predictions, and compare the underlying prediction models. The paper concludes with a discussion of future research.

  8. Optical testbed for the LISA phasemeter

    NASA Astrophysics Data System (ADS)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system, also known as the phasemeter, was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon ApS. In particular, it provides the electronic readout of the main interferometer phases, besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three-signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.
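
    A minimal sketch of the three-signal idea, assuming the three beatnote phases form a null combination (phi12 + phi23 + phi31 = 0), is given below; the noise levels and signal model are illustrative choices, not testbed values.

    ```python
    import numpy as np

    # With beatnotes between lasers 1-2, 2-3 and 3-1, the true phases cancel in the
    # sum phi12 + phi23 + phi31, so the measured sum exposes readout errors only.

    rng = np.random.default_rng(1)
    n = 10_000
    phi1, phi2, phi3 = (np.cumsum(rng.normal(0, 1e-3, n)) for _ in range(3))  # laser drifts

    def readout_noise():
        return rng.normal(0, 1e-6, n)        # hypothetical phasemeter noise [rad]

    phi12 = (phi1 - phi2) + readout_noise()
    phi23 = (phi2 - phi3) + readout_noise()
    phi31 = (phi3 - phi1) + readout_noise()

    null = phi12 + phi23 + phi31     # common laser phase cancels; residual = readout errors
    print(f"null combination rms: {null.std():.2e} rad")
    ```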

  9. A Hydrometeorological Testbed For Western Water Issues

    NASA Astrophysics Data System (ADS)

    Ralph, F. M.; Reynolds, D.; Martner, B. E.; Kingsmill, D. E.; White, A. B.; Whitaker, J. S.

    2002-12-01

    Based on experience gained between 1997 and 2002 in a series of three West Coast experiments focused on improved prediction of precipitation in land-falling Pacific winter storms, NOAA is leading the creation of a regional Hydrometeorological Testbed (HMT). The goal of this effort is to advance both the understanding of fundamental physical processes influencing primarily winter-season precipitation (rain and snow) in mountainous regions, and to improve quantitative precipitation forecasting, main-stem river flood warnings and flash-flood warning lead time in such regions. The focus will be on processes spanning the weather-climate connection, from the mesoscale to tropical-extratropical connections that modulate regional short-term climate anomalies influencing precipitation. The geographic area covered by the initial HMT encompasses the flood-prone Russian River and Sacramento River watersheds in northern California. While these watersheds represent some of the greatest flood risks in the nation, the scientific and operational results developed there will have bearing on winter season hydrometeorological prediction in many other locations. These goals will be addressed through a joint effort between scientists, weather forecasters, hydrologists and forecast users that will define both the needs and methodologies to tackle this important problem. Annual field activities will begin in the winter of 2002/03, building on results from earlier studies in the region. Priorities and leveraging opportunities for wider participation in the winter 2003/04 season will be explored in upcoming planning meetings where broad input is encouraged.

  10. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  11. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field of view case. However, the accuracy of the clear air radiance signal depends upon the cloud height and optical property uniformity across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud top levels, the accuracy near the cloud top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used.
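
    The two-field-of-view cloud-clearing idea in option (1) can be illustrated with the classic N* formulation, sketched below with synthetic radiances; this is a generic textbook-style example, not the retrieval used in the presentation.

    ```python
    import numpy as np

    # Two adjacent FOVs are assumed to see the same clear-air and cloud radiances
    # but different cloud fractions a1 and a2; with N* = a1/a2 the clear-column
    # radiance can be reconstructed channel by channel.

    def cloud_clear(r1, r2, n_star):
        """Reconstruct clear-column radiance from two partly cloudy FOVs."""
        return (r1 - n_star * r2) / (1.0 - n_star)

    # Synthetic truth (arbitrary radiance units) for several channels.
    r_clear = np.array([95.0, 80.0, 60.0])
    r_cloud = np.array([40.0, 35.0, 30.0])
    a1, a2 = 0.2, 0.5                      # cloud fractions in the two FOVs

    r1 = (1 - a1) * r_clear + a1 * r_cloud
    r2 = (1 - a2) * r_clear + a2 * r_cloud

    n_star = a1 / a2                       # in practice estimated from window channels
    print(cloud_clear(r1, r2, n_star))     # recovers r_clear exactly for noiseless input
    ```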

  12. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, owing to the communication obstacles among these domains as the number of mobile Internet users grows. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes.
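
    The auxiliary-graph idea behind the RIP scheme can be illustrated by treating radio, optical and processing resources as weighted edges of a single graph and provisioning a service along the minimum-cost end-to-end path; the sketch below runs a plain Dijkstra search over a made-up topology and is only an illustration of the concept, not the authors' algorithm.

    ```python
    import heapq

    # Illustrative auxiliary graph: node names and edge costs are assumptions.
    graph = {
        "user":     [("rrh1", 1.0), ("rrh2", 1.5)],   # radio attachment options
        "rrh1":     [("olt", 2.0)],                   # fronthaul fiber links
        "rrh2":     [("olt", 1.0)],
        "olt":      [("bbu_pool", 0.5)],              # optical aggregation
        "bbu_pool": [("cloud_dc", 1.0)],              # processing resources
        "cloud_dc": [],
    }

    def min_cost_path(graph, src, dst):
        """Plain Dijkstra over the auxiliary graph; returns (cost, path)."""
        queue, seen = [(0.0, src, [src])], set()
        while queue:
            cost, node, path = heapq.heappop(queue)
            if node == dst:
                return cost, path
            if node in seen:
                continue
            seen.add(node)
            for nxt, w in graph[node]:
                if nxt not in seen:
                    heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
        return float("inf"), []

    print(min_cost_path(graph, "user", "cloud_dc"))
    ```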

  13. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, owing to the communication obstacles among these domains as the number of mobile Internet users grows. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  14. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network and processing unit cloud have been decoupled from each other, so that their resources are controlled independently. Traditional architectures cannot implement the resource optimization and scheduling needed for high-level service guarantees, owing to the communication obstacles among these domains as the number of mobile Internet users grows. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. The MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost and path provisioning latency, compared with other provisioning schemes. PMID:27465296

  15. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    The establishment of propfan technology readiness was determined and candidate drive systems for propfan application were identified. Candidate testbed aircraft were investigated for testbed aircraft suitability and four aircraft selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed and the Boeing KC-135A and the Gulfstream American Gulfstream II recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed and cost and schedule data for the entire testbed program were generated. The program total cost was estimated and a wind tunnel program cost and schedule is generated in support of the testbed program.

  16. NASA's Coastal and Ocean Airborne Science Testbed

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Hooker, S.; Myers, J.; Kudela, R. M.; Dunagan, S.; Soulage, M.; Ellis, T.; Clinton, N. E.; Lobitz, B.; Martin, K.; Zell, P.; Berthold, R. W.; Smith, C.; Andrew, D.; Gore, W.; Torres, J.

    2011-12-01

    The Coastal and Ocean Airborne Science Testbed (COAST) Project is a NASA Earth-science flight mission that will advance coastal ecosystems research by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. Teaming NASA Ames scientists and engineers with Biospherical Instruments, Inc. (San Diego) and UC Santa Cruz, the airborne COAST instrument suite combines a customized imaging spectrometer, sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data will be accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Based on optical detectors called microradiometers, the NASA Ocean Biology and Biogeochemistry Calibration and Validation (cal/val) Office team has deployed advanced commercial off-the-shelf instrumentation that provides in situ measurements of the apparent optical properties at the land/ocean boundary including optically shallow aquatic ecosystems (e.g., lakes, estuaries, coral reefs). A complementary microradiometer instrument package (Biospherical Instruments, Inc.), optimized for use above water, will be flown for the first time with the airborne instrument suite. Details of the October 2011 COAST airborne mission over Monterey Bay demonstrating this new airborne instrument suite capability will be presented, with associated preliminary data on coastal ocean color products, coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  17. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.; Patnaik, Surya N.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software. It was originally developed for deterministic calculation. It has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. Optimum solution including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure that corresponded to unity for reliability. Weight can be reduced to a small value for the most failure-prone design with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: MSC/Nastran code was the deterministic analysis tool, fast probabilistic integrator, or the FPI module of the NESSUS software, was the probabilistic calculator, and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.
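
    The inverted-S weight-versus-reliability behavior can be reproduced with a toy model, sketched below, in which the load is normally distributed and weight scales with the required capacity; this is an illustrative assumption, not CometBoards' formulation.

    ```python
    from statistics import NormalDist

    # Toy stochastic design illustration: the optimum weight grows with the normal
    # quantile of the target reliability, so weight is moderate at 50 % reliability
    # and diverges as reliability approaches one, as described above.

    def required_weight(reliability, mean_load=100.0, load_std=15.0, weight_per_unit=0.2):
        capacity = mean_load + load_std * NormalDist().inv_cdf(reliability)
        return weight_per_unit * capacity

    for r in (0.01, 0.10, 0.50, 0.90, 0.99, 0.999):
        print(f"reliability {r:6.3f} -> weight {required_weight(r):7.2f}")
    ```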

  18. X.509 Authentication/Authorization in FermiCloud

    SciTech Connect

    Kim, Hyunwoo; Timm, Steven

    2014-11-11

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.

  19. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Astrophysics Data System (ADS)

    Kortenkamp, David; Izygon, Michel; Lawler, Dennis; Schreckenghost, Debra; Bonasso, R. Peter; Wang, Lui; Kennedy, Kriss

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.

  20. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  1. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology system of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials that are located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal effects. To verify these measurements, the Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system where 6 lengths are measured, but only 5 are needed to determine the system. The agreement in the over-constrained length needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have only reached the goal for on-axis observations. We describe the testbed improvements that have been made since our initial results, and outline the future Kite changes that will add further effects that SIM faces in order to make the testbed more SIM-like.

  2. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel N.; Hottovy, Scott

    2016-06-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes -- open versus closed cells -- fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. With this new conceptual viewpoint, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions.
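
    A minimal sketch of the idealized picture, assuming column water follows a simple Ornstein-Uhlenbeck diffusion with a saturation threshold separating the regimes, is given below; the process choice and parameters are illustrative and not taken from the paper.

    ```python
    import numpy as np

    # Column water treated as a stochastic diffusion; time above the saturation
    # threshold is counted as the "cloudy" regime. All values are illustrative.

    rng = np.random.default_rng(2)
    dt, n = 60.0, 50_000                  # 60-s steps
    tau, mean_q, sigma = 6 * 3600.0, 1.0, 0.3
    q_sat = 1.2                           # saturation threshold (nondimensional)

    q = np.empty(n)
    q[0] = mean_q
    for i in range(1, n):
        q[i] = q[i-1] + (mean_q - q[i-1]) * dt / tau \
               + sigma * np.sqrt(2 * dt / tau) * rng.normal()

    cloud_fraction = np.mean(q > q_sat)
    print(f"fraction of time in the cloudy regime: {cloud_fraction:.2f}")
    ```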

  3. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  4. Laser Metrology in the Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    An, Xin; Marx, D.; Goullioud, Renaud; Zhao, Feng

    2004-01-01

    The Space Interferometer Mission (SIM), scheduled for launch in 2009, is a space-born visible light stellar interferometer capable of micro-arcsecond-level astrometry. The Micro-Arcsecond Metrology testbed (MAM) is the ground-based testbed that incorporates all the functionalities of SIM minus the telescope, for mission-enabling technology development and verification. MAM employs a laser heterodyne metrology system using the Sub-Aperture Vertex-to-Vertex (SAVV) concept. In this paper, we describe the development and modification of the SAVV metrology launchers and the metrology instrument electronics, precision alignments and pointing control, locating cyclic error sources in the MAM testbed and methods to mitigate the cyclic errors, as well as the performance under the MAM performance metrics.

  5. Experimental Test-Bed for Intelligent Passive Array Research

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Torres, Miguel; David, Sunil; Isom, Adam; Cotto, Jose; Sharaiha, Samer

    2004-01-01

    This document describes the test-bed designed for the investigation of passive direction finding, recognition, and classification of speech and sound sources using sensor arrays. The test-bed forms the experimental basis of the Intelligent Small-Scale Spatial Direction Finder (ISS-SDF) project, aimed at furthering digital signal processing and intelligent sensor capabilities of sensor array technology in applications such as rocket engine diagnostics, sensor health prognostics, and structural anomaly detection. This form of intelligent sensor technology has potential for significant impact on NASA exploration, earth science and propulsion test capabilities. The test-bed consists of microphone arrays, power and signal distribution modules, web-based data acquisition, wireless Ethernet, modeling, simulation and visualization software tools. The Acoustic Sensor Array Modeler I (ASAM I) is used for studying steering capabilities of acoustic arrays and testing DSP techniques. Spatial sound distribution visualization is modeled using the Acoustic Sphere Analysis and Visualization (ASAV-I) tool.
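
    As an illustration of the passive direction-finding problem the test-bed supports, the sketch below steers a uniform linear microphone array over candidate angles for a narrowband tone and picks the angle of maximum summed power; the geometry, signal and noise are synthetic assumptions, not ISS-SDF data or algorithms.

    ```python
    import numpy as np

    c, fs, f0 = 343.0, 16_000.0, 1_000.0      # sound speed [m/s], sample rate, tone [Hz]
    n_mics, spacing = 8, 0.05                 # 8 mics, 5-cm spacing (hypothetical)
    true_angle = np.deg2rad(25.0)

    t = np.arange(0, 0.1, 1 / fs)
    mic_x = np.arange(n_mics) * spacing
    delays = mic_x * np.sin(true_angle) / c   # plane-wave arrival delays per mic
    rng = np.random.default_rng(3)
    signals = np.array([np.cos(2 * np.pi * f0 * (t - d)) for d in delays])
    signals += 0.1 * rng.normal(size=signals.shape)

    # Complex amplitude of each channel at f0 (single-bin DFT).
    phasors = signals @ np.exp(-2j * np.pi * f0 * t)

    # Phase-steered (delay-and-sum) power scan over candidate arrival angles.
    scan = np.deg2rad(np.linspace(-90, 90, 361))
    steer = np.exp(2j * np.pi * f0 * np.outer(np.sin(scan), mic_x) / c)
    power = np.abs(steer @ phasors) ** 2
    print(f"estimated arrival angle: {np.rad2deg(scan[np.argmax(power)]):.1f} deg")
    ```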

  6. Progress on an external occulter testbed at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Kim, Yunjong; Sirbu, Dan; Galvin, Michael; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-01-01

    An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we have designed and built a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. The occulter testbed uses a 78 m optical propagation distance to realize the flight Fresnel numbers. We will use an etched silicon mask as the occulter. The occulter is illuminated by a diverging laser beam to reduce the aberrations from the optics before the occulter. Here, we present first-light results of a sample design operating at a flight Fresnel number and the mechanical design of the testbed. We compare the experimental results with simulations that predict the ultimate contrast performance.
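
    The scaling that makes a laboratory occulter equivalent to a flight occulter is the Fresnel number N = r^2/(lambda z); the sketch below works through the arithmetic with hypothetical flight numbers, with only the 78 m lab propagation distance taken from the abstract.

    ```python
    # Fresnel-number scaling sketch; flight-like values are illustrative assumptions,
    # not the testbed's actual design numbers.

    def fresnel_number(radius_m, wavelength_m, distance_m):
        return radius_m ** 2 / (wavelength_m * distance_m)

    # Hypothetical flight geometry: 25-m occulter, 550-nm light, 50,000-km separation.
    n_flight = fresnel_number(radius_m=25.0, wavelength_m=550e-9, distance_m=5.0e7)
    print(f"flight Fresnel number ~ {n_flight:.1f}")

    # Lab occulter radius needed to match it over 78 m at a 632.8-nm laser wavelength.
    lab_distance, lab_wavelength = 78.0, 632.8e-9
    lab_radius = (n_flight * lab_wavelength * lab_distance) ** 0.5
    print(f"equivalent lab occulter radius ~ {lab_radius * 1000:.1f} mm")
    ```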

  7. Design optimization of the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Salama, M.; Wette, M.; Chu, Cheng-Chih

    1993-01-01

    Increasingly complex spacecraft will benefit from integrated design and optimization of structural, optical, and control subsystems. Integrated design optimization will allow designers to make tradeoffs in objectives and constraints across these subsystems. The location, number, and types of passive and active devices distributed along the structure can have a dramatic impact on overall system performance. In addition, the manner in which structural mass is distributed can also serve as an effective mechanism for attenuating disturbance transmission between source and sensitive system components. This paper presents recent experience using optimization tools that have been developed for addressing some of these issues on a challenging testbed design problem. This particular testbed is one of a series of testbeds at the Jet Propulsion Laboratory under the sponsorship of the NASA Control Structure Interaction (CSI) Program to demonstrate nanometer level optical pathlength control on a flexible truss structure that emulates a spaceborne interferometer.

  8. A Testbed for Deploying Distributed State Estimation in Power Grid

    SciTech Connect

    Jin, Shuangshuang; Chen, Yousu; Rice, Mark J.; Liu, Yan; Gorton, Ian

    2012-07-22

    With the increasing demand, scale and data information of power systems, fast distributed applications are becoming more important in power system operation and control. This paper proposes a testbed for evaluating power system distributed applications, considering data exchange among distributed areas. A high-performance computing (HPC) version of distributed state estimation is implemented and used as a distributed application example. The IEEE 118-bus system is used to deploy the parallel distributed state estimation, and the MeDICi middleware is used for data communication. The performance of the testbed demonstrates its capability to evaluate parallel distributed state estimation by leveraging the HPC paradigm. This testbed can also be applied to evaluate other distributed applications.
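
    At its core, each area's state estimator repeats a weighted least-squares solve; the sketch below shows that calculation for a tiny linearized (DC) example, which is an illustrative assumption rather than the IEEE 118-bus case or the paper's parallel implementation.

    ```python
    import numpy as np

    # Weighted least squares: x_hat = (H^T W H)^{-1} H^T W z for z = H x + noise.
    H = np.array([[1.0, -1.0,  0.0],     # e.g. line flow measurements (DC model)
                  [0.0,  1.0, -1.0],
                  [1.0,  0.0, -1.0],
                  [1.0,  0.0,  0.0]])    # plus one direct angle measurement
    x_true = np.array([0.10, 0.05, 0.00])  # bus voltage angles [rad]

    rng = np.random.default_rng(4)
    sigma = np.array([0.01, 0.01, 0.01, 0.002])
    z = H @ x_true + rng.normal(0, sigma)

    W = np.diag(1.0 / sigma ** 2)
    x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)
    print("estimated state:", x_hat.round(4))
    ```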

  9. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was undertaken, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of a three-volume set, all of which contain the results of the TTPP, contains the integrated results. Background on the program is provided, along with highlights of the program results. The various testbed experiments and the programmatic approach are summarized. The results are summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline. Then the results are integrated across each discipline, summarizing the lessons learned overall.

  10. High Contrast Imaging Testbed for the Terrestrial Planet Finder Coronagraph

    NASA Technical Reports Server (NTRS)

    Lowman, Andrew E.; Trauger, John T.; Gordon, Brian; Green, Joseph J.; Moody, Dwight; Niessner, Albert F.; Shi, Fang

    2004-01-01

    The Terrestrial Planet Finder (TPF) mission is planning to launch a visible coronagraphic space telescope in 2014. To achieve TPF science goals, the coronagraph must have extreme levels of wavefront correction (less than 1 Angstrom rms over controllable spatial frequencies) and stability to get the necessary suppression of diffracted starlight (approximately 10^-10 contrast at an angular separation of approximately 4 λ/D). TPF Coronagraph's primary platform for experimentation is the High Contrast Imaging Testbed, which will provide laboratory validation of key technologies as well as demonstration of a flight-traceable approach to implementation. Precision wavefront control in the testbed is provided by a high actuator density deformable mirror. Diffracted light control is achieved through use of occulting or apodizing masks and stops. Contrast measurements will establish the technical feasibility of TPF requirements, while model and error budget validation will demonstrate implementation viability. This paper describes the current testbed design, development approach, and recent experimental results.

  11. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a near completely automated, end-to-end, satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a very difficult and timely one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration) it is important to reflect back upon experiences and to collect the knowledge that has been gained so that improvements can be made to the existing system. It is also believed that the experiences are of value to the others in the robotics community. Therefore, the primary objective here will be to use the Telerobot Testbed as a case study to identify real problems and technological gaps which exist in the areas of robotics and in particular systems integration. Such problems have surely hindered the development of what could be reasonably called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  12. UltraSciencenet: High- Performance Network Research Test-Bed

    SciTech Connect

    Rao, Nageswara S; Wing, William R; Poole, Stephen W; Hicks, Susan Elaine; DeNap, Frank A; Carter, Steven M; Wu, Qishi

    2009-04-01

    The high-performance networking requirements for next generation large-scale applications belong to two broad classes: (a) high bandwidths, typically multiples of 10Gbps, to support bulk data transfers, and (b) stable bandwidths, typically at much lower bandwidths, to support computational steering, remote visualization, and remote control of instrumentation. Current Internet technologies, however, are severely limited in meeting these demands because such bulk bandwidths are available only in the backbone, and stable control channels are hard to realize over shared connections. The UltraScience Net (USN) facilitates the development of such technologies by providing dynamic, cross-country dedicated 10Gbps channels for large data transfers, and 150 Mbps channels for interactive and control operations. Contributions of the USN project are two-fold: (a) Infrastructure Technologies for Network Experimental Facility: USN developed and/or demonstrated a number of infrastructure technologies needed for a national-scale network experimental facility. Compared to Internet, USN's data-plane is different in that it can be partitioned into isolated layer-1 or layer-2 connections, and its control-plane is different in the ability of users and applications to setup and tear down channels as needed. Its design required several new components including a Virtual Private Network infrastructure, a bandwidth and channel scheduler, and a dynamic signaling daemon. The control-plane employs a centralized scheduler to compute the channel allocations and a signaling daemon to generate configuration signals to switches. In a nutshell, USN demonstrated the ability to build and operate a stable national-scale switched network. (b) Structured Network Research Experiments: A number of network research experiments have been conducted on USN that cannot be easily supported over existing network facilities, including test-beds and production networks. It settled an open matter by demonstrating

  13. Telescience Testbed Pilot Project - Evaluation environment for Space Station operations

    NASA Technical Reports Server (NTRS)

    Wiskerchen, Michael J.; Leiner, Barry M.

    1988-01-01

    The objectives of the Telescience Testbed Pilot Program (TTPP) are discussed. The purpose of the TTPP, which involves 15 universities in cooperation with various NASA centers, is to demonstrate the utility of a user-oriented rapid prototyping testbed approach to developing and refining science requirements and validation concepts and approaches for the information systems of the Space Station era and beyond. It is maintained that the TTPP provides an excellent environment, with low programmatic schedule and budget risk, for testing and evaluating new operations concepts and technologies.

  14. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential to significantly enhance scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set containing the results of the TTPP is the executive summary.

  15. The Wide-Field Imaging Interferometry Testbed: Recent Progress

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.

    2010-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) at NASA's Goddard Space Flight Center was designed to demonstrate the practicality and application of techniques for wide-field spatial-spectral ("double Fourier") interferometry. WIIT is an automated system, and it is now producing substantial amounts of high-quality data from its state-of-the-art operating environment, Goddard's Advanced Interferometry and Metrology Lab. In this paper, we discuss the characterization and operation of the testbed and present the most recent results. We also outline future research directions. A companion paper within this conference discusses the development of new wide-field double Fourier data analysis algorithms.

  16. The Living With a Star Space Environment Testbed Program

    NASA Technical Reports Server (NTRS)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET program will infuse new technologies into space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  17. Testbeds for Assessing Critical Scenarios in Power Control Systems

    NASA Astrophysics Data System (ADS)

    Dondossola, Giovanna; Deconinck, Geert; Garrone, Fabrizio; Beitollahi, Hakem

    The paper presents a set of control system scenarios implemented in two testbeds developed in the context of the European project CRUTIAL - CRitical UTility InfrastructurAL Resilience. The selected scenarios refer to power control systems, encompassing the information and communication security of SCADA systems for grid teleoperation, the impact of attacks on inter-operator communications under power emergency conditions, and the impact of intentional faults on secondary and tertiary control in power grids with distributed generators. Two testbeds have been developed for assessing the effects of the attacks and for prototyping resilient architectures.

  18. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  19. Testbed model and data assimilation for ARM. Progress report No. 3, 1 September 1992--30 April 1993

    SciTech Connect

    Louis, J.F.

    1993-04-28

    The ultimate objectives of this research are to further develop the ALFA (AER Local Forecast and Assimilation) model, originally designed at AER for local weather prediction, and to apply it to several related purposes in connection with the Atmospheric Radiation Measurement (ARM) program: (a) to provide a testbed that simulates a global climate model in order to facilitate the development and testing of new cloud parametrizations and radiation models; (b) to assimilate the ARM data continuously at the scale of a climate model, using the adjoint method, thus providing the initial conditions and verification data for testing parametrizations; and (c) to study the sensitivity of a radiation scheme to cloud parameters, again using the adjoint method, thus demonstrating the usefulness of the testbed model. The data assimilation uses a variational technique that minimizes the difference between the model results and the observations during the analysis period. The adjoint model is used to compute the gradient of a measure of the model errors with respect to nudging terms that are added to the equations to force the model output closer to the data.
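
    As a concrete illustration of the adjoint-based variational nudging described above, the following sketch (Python/NumPy) minimizes a misfit-plus-penalty cost over nudging terms added to a toy one-variable linear model; the model, damping rate, penalty weight, and synthetic observations are assumptions for illustration only and have no connection to the ALFA code.

        import numpy as np

        # Toy forward model x[k+1] = M*x[k] + dt*u[k], where u are the nudging terms.
        a, dt, N, lam = 0.5, 0.1, 50, 1e-2
        M = 1.0 - a * dt

        rng = np.random.default_rng(0)
        y = np.sin(0.2 * np.arange(1, N + 1)) + 0.05 * rng.standard_normal(N)   # synthetic observations

        def forward(u, x0=0.0):
            x = np.empty(N + 1)
            x[0] = x0
            for k in range(N):
                x[k + 1] = M * x[k] + dt * u[k]
            return x

        def cost_and_gradient(u):
            x = forward(u)
            misfit = x[1:] - y
            J = 0.5 * np.sum(misfit ** 2) + 0.5 * lam * np.sum(u ** 2)
            p = np.zeros(N + 1)               # adjoint (backward) sweep gives dJ/du exactly
            p[N] = misfit[N - 1]
            for k in range(N - 1, 0, -1):
                p[k] = misfit[k - 1] + M * p[k + 1]
            grad = dt * p[1:] + lam * u
            return J, grad

        u = np.zeros(N)
        for _ in range(300):                  # plain gradient descent on the nudging terms
            J, grad = cost_and_gradient(u)
            u -= 0.2 * grad
        print(J)                              # the misfit shrinks as nudging pulls the model toward y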

  20. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing, and communication capabilities provides interesting possibilities in a growing number of problems and applications, including domotics (domestic robotics), environmental monitoring, and intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for the evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces, and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  1. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

    An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near-infrared spectroscopy (NIRS) and electroencephalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measurement and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that supports multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures that are directly and indirectly observable. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography. PMID:22438333

  2. An integrated testbed for cooperative perception with heterogeneous mobile and static sensors.

    PubMed

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing, and communication capabilities provides interesting possibilities in a growing number of problems and applications, including domotics (domestic robotics), environmental monitoring, and intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for the evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces, and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  3. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  4. Smart Antenna UKM Testbed for Digital Beamforming System

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Misran, Norbahiah; Yatim, Baharudin

    2009-12-01

    A new design of a smart antenna testbed developed at UKM for digital beamforming is proposed. The smart antenna UKM testbed was developed on a modular basis, employing two novel designs: an L-probe-fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna uses the novel LIEH microstrip patch element arranged into a uniform linear array. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm independently, allowing the smart antenna system to be developed and tested in parallel and thus reducing the design time. The DBS was developed using a high-performance floating-point DSP board and a 4-channel RF front-end receiver developed in-house; an interface board connects the ADC board to the RF front-end receiver. A four-element receiving array testbed operating at 1.88-2.22 GHz was constructed, and digital beamforming on this testbed was successfully demonstrated.
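
    The following Python sketch illustrates narrowband digital beamforming for a four-element uniform linear array of the kind described above; the half-wavelength spacing, steering angle, and snapshot model are illustrative assumptions, not the UKM hardware parameters.

        import numpy as np

        n_elem, d_over_lambda = 4, 0.5           # 4-element ULA, half-wavelength spacing
        theta_sig = np.deg2rad(20.0)             # direction of the desired signal

        def steering_vector(theta):
            k = np.arange(n_elem)
            return np.exp(-2j * np.pi * d_over_lambda * k * np.sin(theta))

        # Simulated baseband snapshots: desired signal from 20 degrees plus receiver noise.
        rng = np.random.default_rng(0)
        n_snap = 1000
        s = (rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)) / np.sqrt(2)
        noise = 0.1 * (rng.standard_normal((n_elem, n_snap))
                       + 1j * rng.standard_normal((n_elem, n_snap)))
        x = np.outer(steering_vector(theta_sig), s) + noise

        # Conventional (delay-and-sum) beamformer steered to 20 degrees.
        w = steering_vector(theta_sig) / n_elem
        y = w.conj() @ x                          # beamformed output
        print(np.mean(np.abs(y) ** 2))            # the desired signal adds coherently across elements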

  5. A Portable MIMO Testbed and Selected Channel Measurements

    NASA Astrophysics Data System (ADS)

    Goud, Paul, Jr.; Hang, Robert; Truhachev, Dmitri; Schlegel, Christian

    2006-12-01

    A portable multiple-input multiple-output (MIMO) testbed that is based on field programmable gate arrays (FPGAs) and that operates in the 902-928 MHz industrial, scientific, and medical (ISM) band has been developed by the High Capacity Digital Communications (HCDC) Laboratory at the University of Alberta. We present a description of the HCDC testbed along with MIMO channel capacities derived from measurements taken with the testbed at three special locations: a narrow corridor, an athletics field surrounded by a metal fence, and a parkade. These locations are special because the channel capacities differ from what is expected for a typical indoor or outdoor channel. For two of the cases, a ray-tracing analysis has been performed, and the simulated channel capacity values closely match the values calculated from the measured data. A ray-tracing analysis, however, requires accurate geometrical measurements and sophisticated modeling for each specific location. A MIMO testbed is ideal for quickly obtaining accurate channel capacity information.
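
    As background, channel capacity can be computed from each measured channel matrix using the standard equal-power MIMO formula. The sketch below (Python/NumPy) shows that computation with a random 4x4 matrix standing in for a measured H; the array size and SNR are chosen purely for illustration and are not the HCDC testbed values.

        import numpy as np

        def mimo_capacity_bps_hz(H, snr_linear):
            """Capacity of one channel realisation H (n_rx x n_tx) with equal power
            per transmit antenna: C = log2 det(I + (snr/n_tx) * H H^H)."""
            n_rx, n_tx = H.shape
            G = (snr_linear / n_tx) * (H @ H.conj().T)
            return np.real(np.log2(np.linalg.det(np.eye(n_rx) + G)))

        rng = np.random.default_rng(0)
        H_meas = (rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))) / np.sqrt(2)
        print(mimo_capacity_bps_hz(H_meas, snr_linear=10 ** (20 / 10)))   # bits/s/Hz at 20 dB SNR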

  6. Extending the Information Commons: From Instructional Testbed to Internet2

    ERIC Educational Resources Information Center

    Beagle, Donald

    2002-01-01

    The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

  7. Operation Duties on the F-15B Research Testbed

    NASA Technical Reports Server (NTRS)

    Truong, Samson S.

    2010-01-01

    This presentation describes what I did this past summer during my co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  8. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for the human-centered design, development, and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design, and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods, and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  9. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical-network and application-stratum resources, which allows services to be accommodated in optical networks. This study extends that work to consider the multi-dimensional optimization of radio, optical, and BBU processing resources in the 5G era. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical, and BBU resources to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy-traffic-load scenario is also quantitatively evaluated on the MSRO architecture in terms of resource occupation rate and path provisioning latency, and compared with another provisioning scheme. PMID:27137302

  10. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the Next Generation Space Telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing the final wavefronts obtained from different tests. The Developmental Cryogenic Active Telescope Testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryo-capable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments, cryogenic low-hold-power actuators, or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design will be discussed.

  11. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and the collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties, and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices that can query the ground truth for underlying properties and deliver a simulated measurement to a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server, respectively), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source message-queuing system). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
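
    A synthetic sensor microservice of the kind described can be sketched in a few lines of Flask. The example below is a minimal illustration under assumptions (the /measurement route, the grade field, and the noise level are all hypothetical), not the authors' implementation.

        import random
        from flask import Flask, jsonify, request

        app = Flask(__name__)

        def ground_truth(x, y, z):
            # Stand-in for an implicit geological model: a smooth grade field in ppm.
            return 100.0 + 5.0 * x - 2.0 * y + 0.5 * z

        @app.route("/measurement")
        def measurement():
            x = float(request.args.get("x", 0.0))
            y = float(request.args.get("y", 0.0))
            z = float(request.args.get("z", 0.0))
            noisy = random.gauss(ground_truth(x, y, z), 3.0)   # simulated measurement error
            return jsonify({
                "observedProperty": "grade_ppm",               # loosely modelled on O&M terms
                "result": noisy,
                "location": {"x": x, "y": y, "z": z},
            })

        if __name__ == "__main__":
            app.run(port=5000)   # a production deployment would sit behind Gunicorn instead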

  12. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    Stratus clouds are common in the Arctic during the summer months and help modulate the Arctic climate. (Image of August 23, 2000; additional formats available at JPL.)

  13. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  14. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are useful technologies for addressing trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has been simulated in a .NET environment and is evaluated and analyzed in three ways: security analysis, simulation, and the BLP confidentiality model. PMID:24701149

  15. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM security challenges as they adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are useful technologies for addressing trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has been simulated in a .NET environment and is evaluated and analyzed in three ways: security analysis, simulation, and the BLP confidentiality model. PMID:24701149

  16. Openings

    PubMed Central

    Selwyn, Peter A.

    2015-01-01

    Reviewing his clinic patient schedule for the day, a physician reflects on the history of a young woman he has been caring for over the past 9 years. What starts out as a routine visit then turns into a unique opening for communication and connection. A chance glimpse out the window of the exam room leads to a deeper meditation on parenthood, survival, and healing, not only for the patient but also for the physician. How many missed opportunities have we all had, without even realizing it, to allow this kind of fleeting but profound opening? PMID:26195687

  17. Openings.

    PubMed

    Selwyn, Peter A

    2015-01-01

    Reviewing his clinic patient schedule for the day, a physician reflects on the history of a young woman he has been caring for over the past 9 years. What starts out as a routine visit then turns into a unique opening for communication and connection. A chance glimpse out the window of the exam room leads to a deeper meditation on parenthood, survival, and healing, not only for the patient but also for the physician. How many missed opportunities have we all had, without even realizing it, to allow this kind of fleeting but profound opening? PMID:26195687

  18. Climate zones for maritime clouds

    SciTech Connect

    White, A.B.; Ruffieux, D.; Fairall, C.W.

    1995-04-01

    In this paper we use a commercially available lidar ceilometer to investigate how the basic structure of marine boundary-layer clouds varies for four different marine climate regimes. We obtained most of the data used in this analysis from ship-based ceilometer measurements recorded during several different atmospheric and oceanographic field programs conducted in the Atlantic and Pacific oceans. For comparison, we show the results obtained at a mid-latitude continental location and at an ice camp on the Arctic ice shelf. For each analyzed case, we use an extended time series to generate meaningful cloud base and cloud fraction statistics. The Vaisala CT 12K ceilometer uses a GaAs diode laser to produce short (150 ns), high-intensity pulses of infrared radiation (904 nm wavelength). The return signals from a large number of consecutive pulses are coherently summed to boost the signal-to-noise ratio. Each resulting 30-s profile of backscattered power (15-m resolution) is analyzed to detect cloud layers using a specified cloud detection limit. In addition to measurements of cloud base, the ceilometer can also provide information on cloud fraction using a time series of the "cloud" or "no cloud" status reported in the 30-s data.
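
    The threshold-based cloud detection and cloud-fraction statistics described above can be sketched as follows (Python/NumPy); the detection limit, gate spacing, and synthetic backscatter data are assumptions for illustration, not the CT 12K processing.

        import numpy as np

        def cloud_base_heights(profiles, detection_limit, gate_m=15.0):
            """profiles: (n_profiles, n_gates) backscattered power; returns the base height
            in metres per profile, or NaN when no gate exceeds the detection limit."""
            bases = np.full(profiles.shape[0], np.nan)
            for i, prof in enumerate(profiles):
                hits = np.nonzero(prof > detection_limit)[0]
                if hits.size:
                    bases[i] = hits[0] * gate_m          # lowest gate above threshold
            return bases

        def cloud_fraction(bases):
            """Fraction of 30-s profiles reporting 'cloud' rather than 'no cloud'."""
            return np.mean(~np.isnan(bases))

        # Usage with synthetic data: 2880 profiles (one day), 500 gates of 15 m.
        rng = np.random.default_rng(0)
        power = rng.exponential(1.0, size=(2880, 500))
        power[:1440, 40:60] += 20.0                      # a cloud layer near 600-900 m for half the day
        bases = cloud_base_heights(power, detection_limit=10.0)
        print(cloud_fraction(bases))                     # roughly 0.5 for this synthetic day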

  19. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential to significantly enhance scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the three-volume set containing the results of the TTPP presents summaries of the experiments. One such experiment involved the evaluation of the current Internet for file and image transfer between SIRTF instrument teams; the main issue addressed was current network response times.

  20. A single-axis testbed for slewing control experiments

    NASA Technical Reports Server (NTRS)

    Hamilton, Jonathan; Lee, Gordon K. F.; Juang, Jer-Nan

    1990-01-01

    A simple single-axis testbed is described, and initial experimental results are presented to illustrate collocated and noncollocated control for this structure. The testbed is made up of a pair of single-axis flexible beams attached to a DC servo motor. An optical encoder and strain gauges provide hub and beam position information, respectively. The system is driven by an IBM PC with a motor controller; a programmable digital filter processes position-error information through user-selected gains and pole-zero configurations. A 25-kHz data acquisition system provides the necessary interface between the processor and the motor. The control approaches currently being investigated include collocated PD control and noncollocated phase compensation.
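
    As a simple illustration of the collocated PD control mentioned above, the sketch below simulates a rigid-hub approximation of such a testbed slewing to a commanded angle; the inertia, gains, and loop rate are assumed values, not parameters of the actual hardware.

        import numpy as np

        J = 0.02            # hub plus beam inertia, kg*m^2 (assumed)
        Kp, Kd = 4.0, 0.4   # PD gains acting on the hub encoder signal (collocated)
        dt, T = 1e-3, 2.0   # 1 kHz control loop, 2 s slew

        theta, omega = 0.0, 0.0
        theta_ref = np.deg2rad(30.0)     # commanded slew angle

        for _ in range(int(T / dt)):
            torque = Kp * (theta_ref - theta) - Kd * omega   # collocated PD law
            omega += (torque / J) * dt                       # rigid-body dynamics
            theta += omega * dt

        print(np.rad2deg(theta))   # settles near the 30-degree command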

  1. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  2. Development of a FDIR Validation Test-Bed

    NASA Astrophysics Data System (ADS)

    Andersson, Jan; Cederman, Daniel; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2014-08-01

    This paper describes work being performed by Aeroflex Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow concepts, strategies, mechanisms, and tools related to FDIR to be tested. The resulting facility will have the capability to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing, the activity is in the preliminary design phase.

  3. FDIR Validation Test-Bed Development and Results

    NASA Astrophysics Data System (ADS)

    Karlsson, Alexander; Sakthivel, Anandhavel; Aberg, Martin; Andersson, Jan; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2015-09-01

    This paper describes work being performed by Cobham Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow concepts, strategies, mechanisms, and tools related to FDIR to be tested. The resulting facility will have the capability to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing, the activity is in the validation phase.

  4. Amplitude variations on the Extreme Adaptive Optics testbed

    SciTech Connect

    Evans, J; Thomas, S; Dillon, D; Gavel, D; Phillion, D; Macintosh, B

    2007-08-14

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. At the Laboratory for Adaptive Optics on the Extreme Adaptive Optics (ExAO) testbed, we have already demonstrated wavefront control of better than 1 nm rms within controllable spatial frequencies. Corresponding contrast measurements, however, are limited by amplitude variations, including those introduced by the micro-electro-mechanical-systems (MEMS) deformable mirror. Results from experimental measurements and wave-optics simulations of amplitude variations on the ExAO testbed are presented. We find systematic intensity variations of about 2% rms, and intensity variations with the MEMS of about 6%. Some error is introduced by phase and amplitude mixing because the MEMS is not conjugate to the pupil, but independent measurements of MEMS reflectivity suggest that some error is also introduced by small non-uniformities in the reflectivity.

  5. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  6. High energy laser testbed for accurate beam pointing control

    NASA Astrophysics Data System (ADS)

    Kim, Dojong; Kim, Jae Jun; Frist, Duane; Nagashima, Masaki; Agrawal, Brij

    2010-02-01

    Precision laser beam pointing is a key technology in High Energy Laser systems. In this paper, a laboratory High Energy Laser testbed developed at the Naval Postgraduate School is introduced. System identification is performed and a mathematical model is constructed to estimate system performance. New beam pointing control algorithms are designed based on this mathematical model. It is shown in both computer simulation and experiment that the adaptive filter algorithm can improve the pointing performance of the system.
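
    The abstract does not specify the adaptive filter, but a least-mean-squares (LMS) update is a common choice for this kind of jitter rejection. The sketch below (Python/NumPy) is an assumed illustration only: a sinusoidal jitter disturbance is cancelled by an adaptive FIR filter driven by a correlated reference signal; the tap count, step size, and disturbance model are not values from the NPS testbed.

        import numpy as np

        rng = np.random.default_rng(1)
        n, taps, mu = 5000, 8, 0.05
        t = np.arange(n) * 1e-3
        jitter = 0.5 * np.sin(2 * np.pi * 60 * t) + 0.05 * rng.standard_normal(n)

        w = np.zeros(taps)        # adaptive FIR weights driving the fast-steering mirror command
        x_hist = np.zeros(taps)   # recent samples of the reference (jitter sensor) signal
        err = np.zeros(n)

        for k in range(n):
            x_hist = np.roll(x_hist, 1)
            x_hist[0] = jitter[k]             # reference input correlated with the disturbance
            correction = w @ x_hist           # mirror command
            err[k] = jitter[k] - correction   # residual beam position error on the detector
            w += mu * err[k] * x_hist         # LMS weight update

        print(np.std(err[:500]), np.std(err[-500:]))   # residual jitter shrinks after adaptation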

  7. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.

  8. System integration of a Telerobotic Demonstration System (TDS) testbed

    NASA Technical Reports Server (NTRS)

    Myers, John K.

    1987-01-01

    The concept for and status of a telerobotic demonstration system testbed that integrates teleoperation and robotics are described. The components of the telerobotic system are described, and the ongoing projects are discussed. The system can be divided into two sections: the autonomous subsystems, and the additional interface and support subsystems, including teleoperations. The workings of each subsystem by itself and how the subsystems integrate into a complete system are discussed.

  9. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios. PMID:21833092

  10. The Photovoltaic Engineering Testbed: Design options and trade-offs

    NASA Astrophysics Data System (ADS)

    Landis, Geoffrey A.; Sexton, Andrew; Abramczyk, Richard; Francz, Joseph; Johnson, D. B.; Yang, Liu; Minjares, Daniel; Myers, James

    2000-01-01

    The Photovoltaic Engineering Testbed (PET) is a space-exposure test facility that will fly on the International Space Station to calibrate, test, and qualify advanced solar cell types in the space environment. The purpose is to reduce the cost of validating new technologies and bringing them to spaceflight readiness by measuring their performance in the space environment. This paper reviews the engineering options considered for flying PET on the International Space Station and presents the current status of development.