Science.gov

Sample records for open cloud testbed

  1. Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus

    NASA Astrophysics Data System (ADS)

    Baun, Christian; Kunze, Marcel

    Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus™ research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
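The benchmark methodology is not detailed in the abstract; as a rough sketch of how such an I/O throughput measurement could be structured, the following uses an in-memory buffer as a stand-in for a put-object call against Walrus (the S3-compatible store of Eucalyptus) or Amazon S3. The function and payload size here are illustrative assumptions, not the paper's setup.

```python
import io
import time

def measure_throughput(write_fn, payload: bytes) -> float:
    """Time a single write of `payload` and return throughput in MB/s."""
    start = time.perf_counter()
    write_fn(payload)
    elapsed = time.perf_counter() - start
    return len(payload) / (1024 * 1024) / max(elapsed, 1e-9)

# Stand-in target: an in-memory buffer. In a real benchmark this call
# would wrap an S3 client put-object request to Walrus or Amazon S3.
buf = io.BytesIO()
rate = measure_throughput(buf.write, b"x" * (8 * 1024 * 1024))  # 8 MiB payload
print(f"{rate:.1f} MB/s")
```

In practice one would repeat the measurement over a range of payload sizes and report the distribution rather than a single run.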

  2. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    SciTech Connect

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W.

    1996-04-01

    Boundary layer clouds - stratus and fair-weather cumulus - are closely coupled to the surface; the coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period to consider a case where low-level stratus were observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  3. Clouds over Open Ocean

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The heavy concentration of these cirrocumulus and nimbostratus clouds over open ocean (location unknown) indicates that a heavy downpour of rain is occurring on the Earth's surface below. Towering anvils, seen rising high above the base cloud cover and casting long shadows, also indicate high winds and possible tornado activity.

  4. Clouds over Open Ocean

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The heavy concentration of these cirrocumulus and nimbostratus clouds over open ocean (location unknown) indicates that a heavy downpour of rain is occurring on the Earth's surface below. Towering anvils, rising above the base cloud cover, also indicate high winds and possible tornado activity.

  5. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports on performance and reliability.
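The abstract does not give the job-descriptor format or toolset commands; purely as an illustration of the idea of injecting synthetic "VO-like" jobs and archiving their success/failure statistics, a minimal sketch (all names and the schema are hypothetical):

```python
import random
from dataclasses import dataclass, field

@dataclass
class SyntheticJob:
    """A 'VO-like' test job descriptor (hypothetical schema, for illustration)."""
    workflow: str                                 # e.g. "cpu-burn", "stage-in-out"
    services: list = field(default_factory=list)  # CE/SE services exercised
    status: str = "defined"

def run_validation(jobs, fail_rate=0.1, seed=42):
    """Emulate submitting jobs and archiving success/failure counts."""
    rng = random.Random(seed)
    stats = {"finished": 0, "failed": 0}
    for job in jobs:
        job.status = "failed" if rng.random() < fail_rate else "finished"
        stats[job.status] += 1
    return stats

jobs = [SyntheticJob("cpu-burn", ["CE"]), SyntheticJob("stage-in-out", ["CE", "SE"])]
print(run_validation(jobs))
```

In the real system, submission goes through PanDA pilots to testbed sites and the statistics are aggregated by a central service rather than in-process.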

  6. Clouds and Radiation Testbed Data Environment: Site data system and experiment center

    SciTech Connect

    Melton, R.B.; Bobrowski, S.F.; Campbell, A.P.; Corbet, J.M.; Edwards, D.M.; Kanciruk, P.; Tichler, J.L. (Brookhaven National Lab)

    1992-01-01

    The Department of Energy (DOE) has initiated the Atmospheric Radiation Measurement (ARM) program as a research effort to reduce the uncertainties found in general circulation and other models due to the effects of clouds and solar radiation (DOE 1990, Patrinos, et al. 1990). This program will provide an experimental testbed for the study of important atmospheric effects, particularly cloud and radiative processes, and testing of parameterizations of the processes for use in atmospheric models. The design of the testbed, known as the Clouds and Radiation Testbed (CART), calls for five long-term field data collection sites as well as a mobile set of instrumentation to be used in short-term field campaigns. The first of the sites is expected to begin operation in April of 1992. Within the ARM Program, an experiment has been defined as the prospective test of a model, i.e., the test of a model's predictive capability. An experiment is specified by identifying the model or models to be tested, the model input requirements, the measurements needed for comparison to model outputs, and the measurements needed to diagnose model performance. The identification of required measurements includes the specification of data fusion or other techniques to be used in converting the basic instrument observations into the required set of measurements. The CART Data Environment (CDE) is the element of the testbed which acquires the basic observations from the instruments and processes them to meet the measurement requirements of ARM experiments.

  7. Clouds and Radiation Testbed Data Environment: Site data system and experiment center

    SciTech Connect

    Melton, R.B.; Bobrowski, S.F.; Campbell, A.P.; Corbet, J.M.; Edwards, D.M.; Kanciruk, P.; Tichler, J.L.

    1992-01-01

    The Department of Energy (DOE) has initiated the Atmospheric Radiation Measurement (ARM) program as a research effort to reduce the uncertainties found in general circulation and other models due to the effects of clouds and solar radiation (DOE 1990, Patrinos, et al. 1990). This program will provide an experimental testbed for the study of important atmospheric effects, particularly cloud and radiative processes, and testing of parameterizations of the processes for use in atmospheric models. The design of the testbed, known as the Clouds and Radiation Testbed (CART), calls for five long-term field data collection sites as well as a mobile set of instrumentation to be used in short-term field campaigns. The first of the sites is expected to begin operation in April of 1992. Within the ARM Program, an experiment has been defined as the prospective test of a model, i.e., the test of a model's predictive capability. An experiment is specified by identifying the model or models to be tested, the model input requirements, the measurements needed for comparison to model outputs, and the measurements needed to diagnose model performance. The identification of required measurements includes the specification of data fusion or other techniques to be used in converting the basic instrument observations into the required set of measurements. The CART Data Environment (CDE) is the element of the testbed which acquires the basic observations from the instruments and processes them to meet the measurement requirements of ARM experiments.

  8. Clouds over Open Ocean

    NASA Technical Reports Server (NTRS)

    1981-01-01

    These cirrocumulus clouds, photographed over open ocean (location unknown), seem to be in a state of agitation, as in the early stages of storm development. The window frame at the top of this scene obscures almost half of the image, and the panel seen in the middle of the scene is a corner of one of the payload bay closure doors.

  9. Open Ocean, Sunglint and Clouds

    NASA Technical Reports Server (NTRS)

    1990-01-01

    This interesting and well-composed view of open ocean, sunglint and clouds (location unknown) illustrates the reflective nature of water. Smooth water surfaces are highly reflective, like a mirror, but rough water surfaces with many choppy waves have very low reflectivity and appear as mostly dark patches. The combination of low sun angle, clouds, shadows and sea states produces an attractive composition of contrasts.

  10. Site/Systems Operations, Maintenance and Facilities Management of the Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site

    SciTech Connect

    Wu, Susan

    2005-08-01

    This contract covered the site/systems operations, maintenance, and facilities management of the DOE Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) Site.

  11. The GridEcon Platform: A Business Scenario Testbed for Commercial Cloud Services

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn; Guo, Li; Fleming, Alan; Courcoubetis, Costas

    Within this paper, we present the GridEcon Platform, a testbed for designing and evaluating economics-aware services in a commercial Cloud computing setting. The Platform is based on the idea that the exact working of such services is difficult to predict in the context of a market and, therefore, an environment for evaluating its behavior in an emulated market is needed. To identify the components of the GridEcon Platform, a number of economics-aware services and their interactions have been envisioned. The two most important components of the platform are the Marketplace and the Workflow Engine. The Workflow Engine allows the simple composition of a market environment by describing the service interactions between economics-aware services. The Marketplace allows trading goods using different market mechanisms. The capabilities of these components of the GridEcon Platform in conjunction with the economics-aware services are described in this paper in detail. The validation of an implemented market mechanism and a capacity planning service using the GridEcon Platform also demonstrated the usefulness of the GridEcon Platform.
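The abstract does not specify which market mechanisms the GridEcon Marketplace implements; as one illustrative example of such a mechanism, here is a minimal order-matching sketch (the matching and midpoint clearing rule are assumptions for illustration, not GridEcon's design):

```python
def match_orders(bids, asks):
    """Match buy and sell orders given as (price, quantity) tuples.
    Bids are considered highest-price-first, asks lowest-price-first;
    each trade clears at the midpoint of the crossing prices."""
    bids = sorted(bids, reverse=True)
    asks = sorted(asks)
    trades = []
    while bids and asks and bids[0][0] >= asks[0][0]:
        bp, bq = bids[0]
        ap, aq = asks[0]
        q = min(bq, aq)
        trades.append(((bp + ap) / 2, q))  # (clearing price, quantity)
        bids[0] = (bp, bq - q)
        asks[0] = (ap, aq - q)
        if bids[0][1] == 0:
            bids.pop(0)
        if asks[0][1] == 0:
            asks.pop(0)
    return trades

print(match_orders([(10, 5), (8, 5)], [(7, 4), (9, 6)]))
```

A testbed like the GridEcon Platform would let such a mechanism be swapped out and evaluated against emulated market participants.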

  12. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-12-31

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities: obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered as boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  13. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-01-01

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities: obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered as boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  14. Scaling the CERN OpenStack cloud

    NASA Astrophysics Data System (ADS)

    Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.

    2015-12-01

    CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.

  15. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  16. CanOpen on RASTA: The Integration of the CanOpen IP Core in the Avionics Testbed

    NASA Astrophysics Data System (ADS)

    Furano, Gianluca; Guettache, Farid; Magistrati, Giorgio; Tiotto, Gabriele; Ortega, Carlos Urbina; Valverde, Alberto

    2013-08-01

    This paper presents the work done within the ESA ESTEC Data Systems Division, targeting the integration of the CANopen IP Core with the existing Reference Architecture Test-bed for Avionics (RASTA). RASTA is the reference testbed system of the ESA Avionics Lab, designed to integrate the main elements of a typical Data Handling system. It aims at simulating a scenario where a Mission Control Center communicates with on-board computers and systems through a TM/TC link, thus providing data management through qualified processors and interfaces such as Leon2 core processors, CAN bus controllers, MIL-STD-1553 and SpaceWire. This activity aims at the extension of RASTA with two boards equipped with the HurriCANe controller, acting as CANopen slaves. CANopen software modules have been ported to the RASTA system I/O boards equipped with the Gaisler GR-CAN controller and act as masters communicating with the CCIPC boards. CANopen serves as the upper application layer for CAN-based systems, is defined within the CAN-in-Automation standard, and can be regarded as the definitive standard for the implementation of CAN-based system solutions. The development and integration of CCIPC, performed by SITAEL S.p.A., is the first application that aims to bring the CANopen standard to space applications. The definition of CANopen within the European Cooperation for Space Standardization (ECSS) is under development.

  17. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data-intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  18. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolated mode from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A red line in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. 
This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  19. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and easy-to-access characteristics. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory and I/O of virtual machines created and managed by the different solutions, (2) OpenNebula has the fastest internal network while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies, (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula, and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.

  20. Analytical study of the effects of the Low-Level Jet on moisture convergence and vertical motion fields at the Southern Great Plains Cloud and Radiation Testbed site

    SciTech Connect

    Bian, X.; Zhong, S.; Whiteman, C.D.; Stage, S.A.

    1996-04-01

    The Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) is located in a region that is strongly affected by a prominent meteorological phenomenon: the Great Plains Low-Level Jet (LLJ). Observations have shown that the LLJ plays a vital role in spring and summertime cloud formation and precipitation over the Great Plains. An improved understanding of the LLJ characteristics and its impact on the environment is necessary for addressing the fundamental issue of developing and testing radiative transfer and cloud parameterization schemes for general circulation models (GCMs) using data from the SGP CART site. A climatological analysis of the summertime LLJ over the SGP has been carried out using hourly observations from the National Oceanic and Atmospheric Administration (NOAA) Wind Profiler Demonstration Network and from the ARM June 1993 Intensive Observation Period (IOP). The hourly data provide enhanced temporal and spatial resolution relative to earlier studies, which used 6- and 12-hourly rawinsonde observations at fewer stations.

  1. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-01

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com.

  2. Implementation of Raman lidar for profiling of atmospheric water vapor and aerosols at the Southern Great Plains Cloud and Radiation Testbed Site

    SciTech Connect

    Goldsmith, J.E.M.; Bisson, S.E.; Blair, F.H.; Whiteman, D.N.; Melfi, S.H.

    1995-04-01

    There are clearly identified scientific requirements for continuous profiling of atmospheric water vapor at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site. Research conducted at several laboratories, including our own collaboration in a previous Instrument Development Project for the Atmospheric Radiation Measurement (ARM) Program, has demonstrated the suitability of Raman lidar for providing measurements that are an excellent match to those requirements. We are currently building a rugged Raman lidar system that will reside permanently at the CART site and that is computer-automated to reduce the requirements for operator interaction. In addition to the design goal of profiling water vapor through most of the troposphere during nighttime and through the boundary layer during daytime, the lidar is intended to provide quantitative characterizations of aerosols and clouds, including depolarization measurements for particle phase studies. Raman lidar systems detect selected species by monitoring the wavelength-shifted molecular return produced by Raman scattering from the chosen molecule or molecules. For water-vapor measurements, the nitrogen Raman signal is observed simultaneously with the water-vapor Raman signal; proper ratioing of the signals yields the water-vapor mixing ratio. Similarly, when the backscatter signal at the laser wavelength (which contains contributions from both Rayleigh and aerosol scattering) is also recorded simultaneously, the ratio of the backscatter signal to the nitrogen Raman signal yields a quantitative measurement of the aerosol scattering ratio. A variety of aerosol and cloud parameters can be derived from this measurement. In aerosol-free regions of the atmosphere, temperature profiles can be derived from the density measurements obtained from the nitrogen Raman signal.
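The signal ratioing described above is straightforward to express; the following is a toy sketch, assuming background-subtracted signal profiles and hypothetical calibration constants (in practice the water-vapor calibration is fixed by comparison with independent measurements such as radiosondes):

```python
def mixing_ratio(s_h2o, s_n2, cal=1.0):
    """Water-vapor mixing ratio profile: ratio of the water-vapor Raman
    signal to the nitrogen Raman signal, scaled by a calibration
    constant `cal` (hypothetical value here)."""
    return [cal * w / n for w, n in zip(s_h2o, s_n2)]

def scattering_ratio(s_elastic, s_n2, cal=1.0):
    """Aerosol scattering ratio: elastic backscatter signal (Rayleigh +
    aerosol contributions) ratioed to the nitrogen Raman signal."""
    return [cal * e / n for e, n in zip(s_elastic, s_n2)]

# Toy three-range-gate profiles in arbitrary units:
print(mixing_ratio([10.0, 6.0, 3.0], [100.0, 80.0, 60.0]))
print(scattering_ratio([150.0, 96.0, 66.0], [100.0, 80.0, 60.0]))
```

The point of the ratio is that instrument and atmospheric transmission factors common to both channels largely cancel, which is why the nitrogen Raman signal serves as the reference in both quantities.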

  3. A comparison of radiometric fluxes influenced by parameterization cirrus clouds with observed fluxes at the Southern Great Plains (SGP) cloud and radiation testbed (CART) site

    SciTech Connect

    Mace, G.G.; Ackerman, T.P.; George, A.T.

    1996-04-01

    The data from the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site is a valuable resource. We have developed an operational data processing and analysis methodology that allows us to examine continuously the influence of clouds on the radiation field and to test new and existing cloud and radiation parameterizations.

  4. Precipitation-generated oscillations in open cellular cloud fields.

    PubMed

    Feingold, Graham; Koren, Ilan; Wang, Hailong; Xue, Huiwen; Brewer, Wm Alan

    2010-08-12

    Cloud fields adopt many different patterns that can have a profound effect on the amount of sunlight reflected back to space, with important implications for the Earth's climate. These cloud patterns can be observed in satellite images of the Earth and often exhibit distinct cell-like structures associated with organized convection at scales of tens of kilometres. Recent evidence has shown that atmospheric aerosol particles-through their influence on precipitation formation-help to determine whether cloud fields take on closed (more reflective) or open (less reflective) cellular patterns. The physical mechanisms controlling the formation and evolution of these cells, however, are still poorly understood, limiting our ability to simulate realistically the effects of clouds on global reflectance. Here we use satellite imagery and numerical models to show how precipitating clouds produce an open cellular cloud pattern that oscillates between different, weakly stable states. The oscillations are a result of precipitation causing downward motion and outflow from clouds that were previously positively buoyant. The evaporating precipitation drives air down to the Earth's surface, where it diverges and collides with the outflows of neighbouring precipitating cells. These colliding outflows form surface convergence zones and new cloud formation. In turn, the newly formed clouds produce precipitation and new colliding outflow patterns that are displaced from the previous ones. As successive cycles of this kind unfold, convergence zones alternate with divergence zones and new cloud patterns emerge to replace old ones. The result is an oscillating, self-organized system with a characteristic cell size and precipitation frequency.

  5. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-01

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy to use and cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com. PMID:23305951

  6. Optical interferometer testbed

    NASA Technical Reports Server (NTRS)

    Blackwood, Gary H.

    1991-01-01

    Viewgraphs on optical interferometer testbed presented at the MIT Space Research Engineering Center 3rd Annual Symposium are included. Topics covered include: space-based optical interferometer; optical metrology; sensors and actuators; real time control hardware; controlled structures technology (CST) design methodology; identification for MIMO control; FEM/ID correlation for the naked truss; disturbance modeling; disturbance source implementation; structure design: passive damping; low authority control; active isolation of lightweight mirrors on flexible structures; open loop transfer function of mirror; and global/high authority control.

  7. Fast Physics Testbed for the FASTER Project

    SciTech Connect

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate the development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  8. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships among members of Norovirus. PMID:23671843
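To illustrate the "grouped based on sequence similarity" idea in miniature, one can cluster short ORF fragments by pairwise identity. This is a toy sketch, not the proposed Hadoop service; the genotype names and 12-nt fragments below are invented for illustration.

```python
# Toy illustration of grouping ORF sequences by pairwise identity,
# the kind of similarity measure that underlies phylogenetic grouping.
from itertools import combinations

def identity(a, b):
    """Fraction of matching positions between two equal-length sequences."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def group_by_similarity(seqs, threshold=0.8):
    """Single-linkage clustering: merge groups linked by >= threshold identity."""
    groups = [{name} for name in seqs]
    for (n1, s1), (n2, s2) in combinations(seqs.items(), 2):
        if identity(s1, s2) >= threshold:
            g1 = next(g for g in groups if n1 in g)
            g2 = next(g for g in groups if n2 in g)
            if g1 is not g2:
                g1 |= g2
                groups.remove(g2)
    return groups

orfs = {  # hypothetical 12-nt ORF fragments, invented for this sketch
    "GI.1": "ATGGCGTCTAAC",
    "GI.2": "ATGGCGTCGAAC",
    "GII.4": "ATGAAATGGTTC",
    "GII.6": "ATGAAATGCTTC",
}
clusters = group_by_similarity(orfs)
```

A real pipeline would replace the identity measure with a proper multiple alignment and distance model before tree building; the clustering step here only shows why pairwise similarity alone already separates the two invented genogroups.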

  9. Open reading frame phylogenetic analysis on the cloud.

    PubMed

    Hung, Che-Lun; Lin, Chun-Yuan

    2013-01-01

Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships among members of Norovirus. PMID:23671843

  10. Cloud-Based Model Calibration Using OpenStudio: Preprint

    SciTech Connect

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Compute Cloud (EC2) service. This paper highlights the model parameterizations (measures) used for calibration, but the same multi-node computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
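The per-billing-period comparison that drives calibration can be sketched in a few lines. This is a hedged illustration, not OpenStudio code: the NMBE and CV(RMSE) statistics are standard calibrated-simulation metrics (e.g. from ASHRAE Guideline 14), used here as plausible stand-ins for the paper's "difference by billing period and fuel type" calculation, and all monthly values are invented.

```python
# Sketch: monthly utility-bill energy vs. simulated energy for one fuel type,
# summarized with NMBE (bias) and CV(RMSE) (scatter). Numbers are invented.
import math

actual    = [920, 850, 700, 560, 480, 610, 780, 810, 640, 520, 680, 890]  # kWh/month
simulated = [980, 830, 740, 530, 450, 640, 760, 850, 600, 540, 700, 860]  # kWh/month

n = len(actual)
mean_a = sum(actual) / n
residuals = [s - a for a, s in zip(actual, simulated)]

nmbe = 100 * sum(residuals) / (n * mean_a)                            # % bias
cvrmse = 100 * math.sqrt(sum(r * r for r in residuals) / n) / mean_a  # % scatter
```

A calibration loop would perturb model parameters (measures), re-simulate, and keep the parameter set that drives both statistics toward zero.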

  11. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
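The impulsive-encounter energetics described above can be put on an order-of-magnitude footing with Spitzer's distant-tide formula, dE = (4/3) G² Mc² M ⟨r²⟩ / (V² b⁴) — noting that the study deliberately goes beyond this, since it does not invoke the tidal approximation. All parameter values below are invented, Rosette-like placeholders, not numbers from the Blitz, Stark, and Long catalog, and ⟨r²⟩ is crudely approximated by the square of a half-mass radius.

```python
# Order-of-magnitude sketch of impulsive tidal heating of a cluster by one
# clump passage (tidal approximation; the study above goes beyond this).
G = 4.301e-3            # pc (km/s)^2 / Msun

def impulsive_heating(m_clump, m_cluster, r2_mean, v_rel, b):
    """Energy gain (Msun km^2/s^2) from one impulsive, distant encounter."""
    return (4.0 / 3.0) * G**2 * m_clump**2 * m_cluster * r2_mean / (v_rel**2 * b**4)

m_clump   = 1.0e3       # Msun, invented clump mass
m_cluster = 500.0       # Msun, invented cluster mass
r_half    = 2.5         # pc; <r^2> approximated as r_half^2
v_rel     = 3.0         # km/s, invented encounter speed
b         = 5.0         # pc, invented impact parameter

dE = impulsive_heating(m_clump, m_cluster, r_half**2, v_rel, b)
# binding-energy scale for comparison: |E| ~ G M^2 / r_half
e_bind = G * m_cluster**2 / r_half
```

The b⁻⁴ dependence makes the result dominated by the closest encounters, and the ⟨r²⟩ factor reflects the abstract's point that disruption is most sensitive to cluster radius.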

  12. Earth Limb and Clouds over Open Ocean, Location Unknown

    NASA Technical Reports Server (NTRS)

    1981-01-01

A northern hemisphere tropical storm over open ocean (location unknown) can be seen forming, identifiable by the loosely defined cyclonic spiral gyre within the cumulonimbus clouds. The storm can be readily identified as being in the northern hemisphere by the counter-clockwise rotation of the gyre. Because of the Coriolis effect induced by the Earth's rotation, all northern hemisphere cyclonic circulations rotate in a counter-clockwise spiral and all those in the southern hemisphere rotate in a clockwise spiral.

  13. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science), effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  14. Earth Limb and Hurricane Clouds over Open Ocean, Location Unknown

    NASA Technical Reports Server (NTRS)

    1982-01-01

A northern hemisphere tropical storm or hurricane over open ocean (location unknown) can be seen in the early stages of forming, identifiable by the loosely defined cyclonic spiral gyre and eye within the cumulonimbus clouds. The storm can be readily identified as being in the northern hemisphere by the counter-clockwise rotation of the gyre. Because of the Coriolis effect induced by the Earth's rotation, all northern hemisphere cyclonic circulations rotate in a counter-clockwise spiral and all those in the southern hemisphere rotate in a clockwise spiral.

  15. Earth Limb and Hurricane Clouds over Open Ocean, Location Unknown

    NASA Technical Reports Server (NTRS)

    1985-01-01

A northern hemisphere tropical storm or hurricane over open ocean (location unknown) can be seen in the early stages of forming, identifiable by the loosely defined cyclonic spiral gyre and eye within the cumulonimbus clouds. The storm can be readily identified as being in the northern hemisphere by the counter-clockwise rotation of the gyre. Because of the Coriolis effect induced by the Earth's rotation, all northern hemisphere cyclonic circulations rotate in a counter-clockwise spiral and all those in the southern hemisphere rotate in a clockwise spiral.

  16. Cooling Earth's temperature by seeding marine stratocumulus clouds for increasing cloud cover by closing open cells

    NASA Astrophysics Data System (ADS)

    Daniel, R.

    2008-12-01

The transition from open to closed cellular convection in marine stratocumulus is very sensitive to small concentrations of cloud condensation nuclei (CCN) aerosols. Addition of small amounts of CCN (about 100 cm-3) to the marine boundary layer (MBL) can close the open cells and thereby increase the cloud cover from about 40% to nearly 100%, with negative radiative forcing exceeding 100 W m-2. We show satellite measurements that demonstrate this sensitivity through the inadvertent experiments of old and diluted ship tracks. With the methodology suggested by Salter and Latham for spraying sub-micron sea water drops that serve as CCN, it is possible to close a sufficiently large area of open cells to achieve the negative radiative forcing necessary to balance the positive forcing of greenhouse gases. We show calculations of the feasibility of such an undertaking, and suggest that this is an economically feasible method with the least potential risks, compared with seeding marine stratocumulus to enhance their albedo or with seeding the stratosphere with bright or dark aerosols. Global circulation models coupled with the ocean and the ice are necessary to calculate the impact and the possible side effects.
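The cloud-cover/albedo arithmetic behind a forcing of this magnitude can be sketched in a few lines. Every number below (mean insolation, scene albedos, converted area fraction) is an illustrative assumption of mine, not a value from the abstract.

```python
# Back-of-envelope shortwave forcing from closing open cells:
#   dF = -S * df * (albedo_closed - albedo_open)
# All numeric values are illustrative assumptions.
s_mean        = 500.0   # W m-2, assumed mean daytime insolation over the region
albedo_open   = 0.15    # assumed scene albedo, open-cell field (~40% cloud cover)
albedo_closed = 0.45    # assumed scene albedo, closed-cell field (~100% cover)
area_fraction = 0.8     # assumed fraction of the region converted to closed cells

forcing = -s_mean * area_fraction * (albedo_closed - albedo_open)  # W m-2; negative = cooling
```

With these assumed values the local forcing is about -120 W m-2, the same order as the abstract's figure; the quantitative case of course rests on the coupled-model calculations the authors call for.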

  17. Open Source Software Reuse in the Airborne Cloud Computing Environment

    NASA Astrophysics Data System (ADS)

    Khudikyan, S. E.; Hart, A. F.; Hardman, S.; Freeborn, D.; Davoodi, F.; Resneck, G.; Mattmann, C. A.; Crichton, D. J.

    2012-12-01

Earth science airborne missions play an important role in helping humans understand our climate. A challenge for airborne campaigns, in contrast to larger NASA missions, is that their relatively modest budgets do not permit the ground-up development of data management tools. These smaller missions generally consist of scientists whose primary focus is on the algorithmic and scientific aspects of the mission, which often leaves data management software and systems to be addressed as an afterthought. The Airborne Cloud Computing Environment (ACCE), developed by the Jet Propulsion Laboratory (JPL) to support the Earth Science Airborne Program, is a reusable, multi-mission data system environment for NASA airborne missions. ACCE provides missions with a cloud-enabled platform for managing their data. The platform consists of a comprehensive set of robust data management capabilities that cover everything from data ingestion and archiving through algorithmic processing to data delivery. Missions interact with this system programmatically as well as via browser-based user interfaces. The core components of ACCE are largely based on Apache Object Oriented Data Technology (OODT), an open source information integration framework at the Apache Software Foundation (ASF). Apache OODT is designed around a component-based architecture that allows for the selective combination of components to create highly configurable data management systems. The diverse and growing community that currently contributes to Apache OODT fosters ongoing growth and maturation of the software. ACCE's key objective is to reduce the cost and risks associated with developing data management systems for airborne missions. Software reuse plays a prominent role in mitigating these problems. By providing a reusable platform based on open source software, ACCE enables airborne missions to allocate more resources to their scientific goals, thereby opening the doors to increased scientific discovery.

  18. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. The program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of a free, cloud-based application built on open-source technologies, called Flag the Technology Cloud (FTTCloud), that allows agricultural stakeholders to color code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible for wider usability on devices of different sizes: smartphones, tablets, desktops, and laptops.
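One plausible way a browser-based GIS tool like this could represent a color-coded field is as a GeoJSON feature carrying the trait and a display color. This is a hedged sketch of the data shape only, not FTTCloud's actual code, and the trait-to-color mapping below is hypothetical, not the program's official flag palette.

```python
# Sketch: a farm field as a GeoJSON Feature with a herbicide-trait property
# and a display color. The color mapping is hypothetical.
import json

TRAIT_COLORS = {          # hypothetical mapping, for illustration only
    "glyphosate-tolerant": "#ff0000",
    "glufosinate-tolerant": "#00ff00",
    "conventional": "#ffffff",
}

def field_feature(name, trait, ring):
    """Build a GeoJSON Feature for a farm field with its trait color."""
    return {
        "type": "Feature",
        "properties": {"name": name, "trait": trait,
                       "color": TRAIT_COLORS[trait]},
        "geometry": {"type": "Polygon", "coordinates": [ring]},
    }

field = field_feature(
    "north-40", "glyphosate-tolerant",
    [[-92.0, 34.7], [-92.0, 34.71], [-91.99, 34.71], [-91.99, 34.7], [-92.0, 34.7]],
)
geojson = json.dumps(field)
```

Because GeoJSON is an open standard, any mapping client (on a phone, tablet, or desktop) could render the same feature, which fits the multi-device goal stated above.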

  19. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle to the adoption of cloud computing by healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and has been described as the "Kerberos of the cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds can attain a security level equivalent to that of the traditional computing model.
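The core of an OpenID Connect login, once the token is obtained, is validating the ID token's standard claims (iss, aud, exp are defined in OIDC Core). The sketch below is illustrative only: the issuer and client id are hypothetical, the token is hand-built and unsigned, and a real DI gateway must verify the JWT signature (e.g. against the provider's published keys) before trusting any claim.

```python
# Minimal sketch of the ID-token claim checks a DI gateway might perform
# after an OpenID Connect login. The token is a hand-built, UNSIGNED example.
import base64
import json
import time

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

now = int(time.time())
payload = {                                      # standard OIDC Core claims
    "iss": "https://idp.example-hospital.org",   # hypothetical issuer
    "sub": "radiologist-42",                     # hypothetical subject
    "aud": "di-r-client",                        # hypothetical client id
    "exp": now + 300,
    "iat": now,
}
id_token = b64url(b'{"alg":"none"}') + "." + b64url(json.dumps(payload).encode()) + "."

def check_claims(token, issuer, client_id):
    """Decode the payload segment and apply the iss/aud/exp checks."""
    seg = token.split(".")[1]
    seg += "=" * (-len(seg) % 4)                 # restore base64url padding
    claims = json.loads(base64.urlsafe_b64decode(seg))
    return (claims["iss"] == issuer
            and claims["aud"] == client_id
            and claims["exp"] > time.time())

ok = check_claims(id_token, "https://idp.example-hospital.org", "di-r-client")
```

Rejecting a token whose issuer or audience does not match is what prevents a token minted for one PACS client from being replayed against the DI-r.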

  20. Aspects of the quality of data from the Southern Great Plains (SGP) cloud and radiation testbed (CART) site broadband radiation sensors

    SciTech Connect

    Splitt, M.E.; Wesely, M.L.

    1996-04-01

A systematic evaluation of the performance of broadband radiometers at the Cloud and Radiation Testbed (CART) site is needed to estimate the uncertainties of the irradiance observations. Here, net radiation observed with the net radiometer in the energy balance Bowen ratio (EBBR) station at the central facility is compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. In addition, data obtained from the central facility pyranometers, pyrgeometers, and pyrheliometers are examined for April 1994, when intensive operations periods were being carried out. The data used in this study are from central facility radiometers in a solar and infrared observation station, an EBBR station, the so-called `BSRN` set of upward-pointing radiometers, and a set of radiometers pointed down at the 25-m level of a 60-m tower.
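The closure comparison described above reduces to the component-sum identity Rn = (SW↓ - SW↑) + (LW↓ - LW↑). The sketch below only illustrates that bookkeeping; all irradiance values are invented, not CART measurements.

```python
# Sketch: net radiation measured directly (EBBR net radiometer) vs. the sum
# of component irradiances. All values are invented for illustration.
sw_down = 650.0   # W m-2, pyranometer (shortwave down)
sw_up   = 130.0   # W m-2, downward-pointing pyranometer (shortwave up)
lw_down = 330.0   # W m-2, pyrgeometer (longwave down)
lw_up   = 420.0   # W m-2, downward-pointing pyrgeometer (longwave up)

rn_components = (sw_down - sw_up) + (lw_down - lw_up)  # component-sum net radiation

rn_ebbr = 445.0   # W m-2, invented net-radiometer reading
residual = rn_ebbr - rn_components  # closure error to be characterized
```

Accumulating such residuals over many clear- and cloudy-sky periods is what lets the uncertainty of each instrument class be estimated.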

  1. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  2. Modeling Aerosol-Cloud Interactions in Marine Open- and Closed-Cell Stratocumulus

    NASA Astrophysics Data System (ADS)

    Wang, H.; Feingold, G.

    2008-12-01

Satellite imagery shows the recurrence of striking cellular structures exhibiting both closed- and open-cell patterns in marine stratocumulus fields. The open-cell region has much lower cloud albedo than closed cells. Beyond that, previous observational and modeling studies have suggested that open- and closed-cell regions differ in many other respects, such as the concentration of cloud condensation nuclei (CCN), cloud droplet number and size, precipitation efficiency, and cloud dynamics. In this work, aerosol-cloud interactions and dynamical feedbacks are investigated within a large eddy simulation (LES) modeling framework to study the activation, cloud scavenging, mixing, and transport of CCN in the open- and closed-cell boundary layer and near the open/closed-cell boundaries. The model domain size of 120 km by 60 km is large enough to represent mesoscale organizations that are associated with different cellular structures and that are promoted by CCN perturbations from ship emissions. Simulation results show that depletion of CCN by collision and coalescence in clouds is critical to the formation of precipitation and the open-cell structure in a stratocumulus deck. Once the open cellular structure has formed in a clean environment, even a substantial increase of CCN transported from a neighboring polluted environment or from ship emissions does not close it during the 12-hour simulation, owing to the lack of dynamical and moisture support in the open-cell cloud-free region. However, the contaminated open cells are unable to self-sustain as a result of the shutoff of precipitation. This points to the critical role of precipitation-triggered circulations in maintaining an open-cellular structure.

  3. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, poised to become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while deployment of DI-r and PACS to private or community clouds should provide security levels equivalent to the traditional computing model. PMID:27340682

  4. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to the adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth, is an emerging representational state transfer-based federated identity solution. It is one of the most widely adopted open standards, poised to become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as Web-based and mobile clients, in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while deployment of DI-r and PACS to private or community clouds should provide security levels equivalent to the traditional computing model.

  5. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

Presently, most existing software is desktop-based and designed to work on a single computer, which imposes major limitations: limited processing and storage capacity, restricted accessibility and availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards, and prototype code. It provides a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available at all times, accessible from everywhere, scalable, runs in a distributed computing environment, creates a real-time multi-user collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  6. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  7. MIT's interferometer CST testbed

    NASA Astrophysics Data System (ADS)

    Hyde, Tupper; Kim, Ed; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-12-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  8. INFN Tier-1 Testbed Facility

    NASA Astrophysics Data System (ADS)

    Gregori, Daniele; Cavalli, Alessandro; dell'Agnello, Luca; Dal Pra, Stefano; Prosperini, Andrea; Ricci, Pierpaolo; Ronchieri, Elisabetta; Sapunenko, Vladimir

    2012-12-01

INFN-CNAF, located in Bologna, is the Information Technology Center of the National Institute of Nuclear Physics (INFN). In the framework of the Worldwide LHC Computing Grid, INFN-CNAF is one of the eleven worldwide Tier-1 centers that store and reprocess Large Hadron Collider (LHC) data. The Italian Tier-1 provides the storage resources (i.e., disk space for short-term needs and tapes for long-term needs) and computing power that the LHC scientific community needs for data processing and analysis. Furthermore, the INFN Tier-1 houses computing resources for other particle physics experiments, like CDF at Fermilab and SuperB at Frascati, as well as for astroparticle and space physics experiments. The computing center is a very complex infrastructure: the hardware layer includes the network, storage and farming areas, while the software layer includes open source and proprietary software. Software updates and new hardware additions can unexpectedly degrade the production activity of the center; therefore a testbed facility has been set up in order to reproduce and certify the various layers of the Tier-1. In this article we describe the testbed and the checks performed.

  9. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  10. Use of AVHRR-derived spectral reflectances to estimate surface albedo across the Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Qiu, J.; Gao, W.

    1997-03-01

Substantial variations in surface albedo across a large area cause difficulty in estimating regional net solar radiation and atmospheric absorption of shortwave radiation when only ground point measurements of surface albedo are used to represent the whole area. Information on spatial variations and site-wide averages of surface albedo, which vary with the underlying surface type and conditions and the solar zenith angle, is important for studies of clouds and atmospheric radiation over a large surface area. In this study, a bidirectional reflectance model was inverted to retrieve surface properties such as leaf area index, and the bidirectional reflectance distribution was then calculated with the same radiation model. The albedo was obtained by converting the narrowband reflectance to broadband reflectance and integrating over the upper hemisphere.
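The final step described above (converting narrowband to broadband reflectance and integrating over the upper hemisphere) can be sketched numerically. The conversion coefficients and the Lambertian test reflectance below are illustrative assumptions, not values from the study:

```python
import math

def narrow_to_broadband(r_vis, r_nir, a0=0.0, a1=0.5, a2=0.5):
    """Linear narrowband-to-broadband conversion; the coefficients
    a0..a2 are placeholders, not the regression used in the study."""
    return a0 + a1 * r_vis + a2 * r_nir

def hemispheric_albedo(brdf, n_theta=90, n_phi=180):
    """Midpoint-rule integration of a bidirectional reflectance factor
    over the upper hemisphere, weighted by cos(theta)sin(theta);
    that weight integrates to pi, hence the normalisation."""
    d_theta = (math.pi / 2) / n_theta
    d_phi = (2 * math.pi) / n_phi
    total = 0.0
    for i in range(n_theta):
        theta = (i + 0.5) * d_theta
        w = math.cos(theta) * math.sin(theta) * d_theta * d_phi
        for j in range(n_phi):
            phi = (j + 0.5) * d_phi
            total += brdf(theta, phi) * w
    return total / math.pi

# Sanity check: for a Lambertian surface the albedo equals the reflectance.
broadband = narrow_to_broadband(0.15, 0.25)
albedo = hemispheric_albedo(lambda theta, phi: broadband)
```

For an anisotropic surface, `brdf` would instead come from the bidirectional reflectance model evaluated at each viewing angle.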

  11. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
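The inference described above (reading the true polarity of an open field line from the suprathermal electron flow relative to the magnetic field, and flagging locally inverted fields) reduces to a pair of sign tests. This is a minimal sketch of that logic with made-up vectors, not the authors' analysis code:

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def true_polarity(strahl_dir, b_field):
    """Suprathermal (strahl) electrons stream away from the Sun along the
    field line, so their flow relative to B reveals the field's true
    polarity even where the local field is folded back on itself."""
    return 'away' if dot(strahl_dir, b_field) > 0 else 'toward'

def locally_inverted(b_field, radial_outward, polarity):
    """A field is locally inverted when the sign of its radial component
    disagrees with the true polarity inferred from the electrons."""
    sunward = dot(b_field, radial_outward) < 0
    return (polarity == 'away') == sunward

# Example: B has a sunward radial component and the strahl is
# roughly anti-parallel to B -> true polarity is toward the Sun,
# and the field is not locally inverted.
B = (-1.0, 0.2, 0.0)       # mostly sunward
r_hat = (1.0, 0.0, 0.0)    # radially outward unit vector
strahl = (1.0, -0.2, 0.0)  # anti-parallel to B
pol = true_polarity(strahl, B)
inv = locally_inverted(B, r_hat, pol)
```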

  12. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

The continuation of the EOSDIS testbed ('Testbed') has evolved from a multi-task, X-Windows-driven system into a fully functional stand-alone data archive and distribution center accessible by all types of users and computers via the World Wide Web. Throughout the past months, the Testbed has evolved into a completely new system. The current system is now accessible through Netscape, Mosaic, and all other clients that can reach the World Wide Web. On October 1, 1995 we will open to the public, and we expect that the statistics of the type of user, where they are located, and what they are looking for will change drastically. The most important change in the Testbed has been the Web interface. This interface will allow more users access to the system and walk them through the data types with more ease than before. All of the callbacks are written in such a way that icons can be used to easily move around in the program's interface. The homepage offers the user the opportunity to get more information about each satellite data type and also information on free programs. These programs are grouped into categories for the types of computers that the programs are compiled for, along with information on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which takes them to the world map and allows them to select an area that they would like coverage on by simply clicking that area of the map. This creates a new map where other similar choices can be made to get the latitude and longitude of the region the satellite data will cover. Once a selection has been made, the search parameters page will appear to be filled out. Afterwards, the browse image will be called for once the search is completed and the images for viewing can be selected. 
There are several other option pages

  13. Establishment of an NWP testbed using ARM data

    SciTech Connect

    O'Connor, E.; Liu, Y.; Hogan, R.

    2010-03-15

    The aim of the FAst-physics System TEstbed and Research (FASTER) project is to evaluate and improve the parameterizations of fast physics (involving clouds, precipitation, aerosol) in numerical models using ARM measurements. One objective within FASTER is to evaluate model representations of fast physics with long-term continuous cloud observations by use of an 'NWP testbed'. This approach was successful in the European Cloudnet project. NWP model data (NCEP, ECMWF, etc.) is routinely output at ARM sites, and model evaluation can potentially be achieved in quasi-real time. In this poster, we will outline our progress in the development of the NWP testbed and discuss the successful integration of ARM algorithms, such as ARSCL, with algorithms and lessons learned from Cloudnet. Preliminary results will be presented of the evaluation of the ECMWF, NCEP, and UK Met Office models over the SGP site using this approach.
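Model evaluation against long-term cloud observations of the Cloudnet kind typically reduces cloud occurrence to a contingency table and a skill score. As an illustration (the choice of score here is ours, not necessarily the project's), the equitable threat score on binary cloud occurrence:

```python
def equitable_threat_score(model, obs):
    """Contingency-table skill score for binary events (1 = cloud
    present); corrects the threat score for hits expected by chance."""
    hits = sum(1 for m, o in zip(model, obs) if m and o)
    false_alarms = sum(1 for m, o in zip(model, obs) if m and not o)
    misses = sum(1 for m, o in zip(model, obs) if not m and o)
    n = len(model)
    hits_random = (hits + false_alarms) * (hits + misses) / n
    denom = hits + false_alarms + misses - hits_random
    return (hits - hits_random) / denom if denom else 1.0

# Toy hourly cloud-occurrence series: model vs. observed
model = [1, 1, 0, 0, 1, 0, 1, 0]
obs   = [1, 0, 0, 0, 1, 1, 1, 0]
ets = equitable_threat_score(model, obs)
```

In practice the model profiles are first mapped onto the instrument grid so that model and observations see comparable quantities.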

  14. High speed quantum communication testbed

    NASA Astrophysics Data System (ADS)

    Williams, Carl J.; Tang, Xiao; Heikkero, Mikko; Rouzaud, Julie; Lu, Richang; Goedecke, Andreas; Migdall, Alan L.; Mink, Alan; Nakassis, Anastase; Pibida, Leticia S.; Wen, Jesse; Hagley, Edward; Clark, Charles W.

    2002-12-01

    We describe the status of the NIST Quantum Communication Testbed (QCT) facility. QCT is a facility for exploring quantum communication in an environment similar to that projected for early commercial implementations: quantum cryptographic key exchange on a gigabit/second free-space optical (FSO) channel. Its purpose is to provide an open platform for testing and validating performance in the application, network, and physical layers of quantum communications systems. The channel uses modified commercial FSO equipment to link two buildings on the Gaithersburg, MD campus of the National Institute of Standards and Technology (NIST), separated by approximately 600 meters. At the time of writing, QCT is under construction; it will eventually be made available to the research community as a user facility. This paper presents the basic design considerations underlying QCT, and reports the status of the project.

  15. Point Cloud Visualization in AN Open Source 3d Globe

    NASA Astrophysics Data System (ADS)

    De La Calle, M.; Gómez-Deck, D.; Koehler, O.; Pulido, F.

    2011-09-01

During the last years the usage of 3D applications in GIS has become more popular. Since the appearance of Google Earth, users are familiar with 3D environments. On the other hand, nowadays computers with 3D acceleration are common, broadband access is widespread, and the public information that can be used in GIS clients able to use data from the Internet is constantly increasing. There are currently several libraries suitable for this kind of application. Based on these facts, and using libraries that are already developed and connected to our own developments, we are working on the implementation of a real 3D GIS with analysis capabilities. Since such a 3D GIS can be very interesting for tasks like rendering and analysing LiDAR or laser scanner point clouds, special attention is given to optimal handling of very large data sets. Glob3 will be a multidimensional GIS in which 3D point clouds can be explored and analysed, even if they consist of several million points. The latest addition to our visualization libraries is the development of a point cloud server that works regardless of the cloud's size. The server receives and processes requests from a 3D client (for example glob3, but it could be any other, such as one based on WebGL) and delivers the data in the form of pre-processed tiles, depending on the required level of detail.
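A server that delivers pre-processed tiles at a requested level of detail can be sketched with a simple quadtree indexing scheme. The tiling layout and distance-based LOD policy below are illustrative assumptions, not glob3's actual algorithm:

```python
import math

def tile_for(x, y, level, extent=1000.0):
    """Index of the quadtree tile containing (x, y) at a given LOD
    level; each level splits the extent into 2**level tiles per axis."""
    n = 2 ** level
    size = extent / n
    return int(x // size), int(y // size), level

def level_for_distance(distance, max_level=10, full_detail_at=10.0):
    """Crude LOD policy (an assumption): drop one level of detail
    each time the viewer distance doubles beyond the near threshold."""
    if distance <= full_detail_at:
        return max_level
    drop = int(math.log2(distance / full_detail_at))
    return max(0, max_level - drop)

# A client 160 units away asks for the tile covering point (512, 256):
lvl = level_for_distance(160.0)
tile = tile_for(512.0, 256.0, lvl)
```

A real server would pre-process the cloud once into such tiles on disk, so each request is a lookup rather than a traversal of millions of points.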

  16. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.

  17. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  18. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  19. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand, including public, private and hybrid clouds. This approach has required the development of an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution. 

  20. Marshall Avionics Testbed System (MAST)

    NASA Technical Reports Server (NTRS)

    Smith, Wayne D.

    1989-01-01

    Work accomplished in the summer of 1989 in association with the NASA/ASEE Summer Faculty Research Fellowship Program at Marshall Space Flight Center is summarized. The project was aimed at developing detailed specifications for the Marshall Avionics System Testbed (MAST). This activity was to include the definition of the testbed requirements and the development of specifications for a set of standard network nodes for connecting the testbed to a variety of networks. The project was also to include developing a timetable for the design, implementation, programming and testing of the testbed. Specifications of both hardware and software components for the system were to be included.

  1. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having an expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor, and modification of existing engine and propeller controls is adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  2. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, flexible executive, low level control and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  3. LISA Optical Bench Testbed

    NASA Astrophysics Data System (ADS)

    Lieser, M.; d'Arcio, L.; Barke, S.; Bogenstahl, J.; Diekmann, C.; Diepholz, I.; Fitzsimons, E. D.; Gerberding, O.; Henning, J.-S.; Hewitson, M.; Hey, F. G.; Hogenhuis, H.; Killow, C. J.; Lucarelli, S.; Nikolov, S.; Perreur-Lloyd, M.; Pijnenburg, J.; Robertson, D. I.; Sohmer, A.; Taylor, A.; Tröbs, M.; Ward, H.; Weise, D.; Heinzel, G.; Danzmann, K.

    2013-01-01

The optical bench (OB) is a part of the LISA spacecraft, situated between the telescope and the test mass. For measuring the inter-spacecraft distances there are several interferometers on the OB. The elegant breadboard of the OB for LISA is being developed for the European Space Agency (ESA) by EADS Astrium, TNO Science & Industry, the University of Glasgow and the Albert Einstein Institute (AEI); the performance tests will then be done at the AEI. Here we present the testbed that will be used for the performance tests, with a focus on the thermal environment and the laser infrastructure.

  4. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

Aviation related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The Testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).
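Generating a realistic communications load for many aircraft amounts to merging periodic message streams into one time-ordered schedule that a driver then replays in real time. The sketch below uses illustrative message periods and stagger, not the testbed's actual rates:

```python
def build_schedule(n_aircraft=160, duration_s=10.0,
                   cpdlc_period=4.0, adsb_period=1.0):
    """Merge periodic CPDLC and ADS-B message events for all aircraft
    into one time-ordered list of (time, aircraft, kind) tuples.
    The periods and the per-aircraft start stagger are illustrative."""
    events = []
    for ac in range(n_aircraft):
        for kind, period in (('CPDLC', cpdlc_period), ('ADS-B', adsb_period)):
            t = (ac % 10) * 0.01  # stagger start times slightly
            while t < duration_s:
                events.append((t, ac, kind))
                t += period
    events.sort()
    return events

# Small example: two aircraft over four seconds
sched = build_schedule(n_aircraft=2, duration_s=4.0)
```

A replay loop would then sleep until each event's timestamp and emit the corresponding message on the emulated datalink.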

  5. Enabling Open Cloud Markets Through WS-Agreement Extensions

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn

    Research into computing resource markets has mainly considered the question of which market mechanisms provide a fair resource allocation. However, while developing such markets, the definition of the unit of trade (i.e. the definition of resource) has not been given much attention. In this paper, we analyze the requirements for tradable resource goods. Based on the results, we suggest a detailed goods definition, which is easy to understand, can be used with many market mechanisms, and addresses the needs of a Cloud resource market. The goods definition captures the complete system resource, including hardware specifications, software specifications, the terms of use, and a pricing function. To demonstrate the usefulness of such a standardized goods definition, we demonstrate its application in the form of a WS-Agreement template for a number of market mechanisms for commodity system resources.

  6. Building a Parallel Cloud Storage System using OpenStack’s Swift Object Store and Transformative Parallel I/O

    SciTech Connect

    Burns, Andrew J.; Lora, Kaleb D.; Martinez, Esteban; Shorter, Martel L.

    2012-07-30

Our project consists of bleeding-edge research into replacing traditional storage archives with a parallel, cloud-based storage solution built on OpenStack's Swift Object Store. We benchmarked Swift for write speed and scalability. Our project is unique because Swift is typically used for reads, whereas we are mostly concerned with write speeds. Cloud storage is a viable archive solution because: (1) container management for larger parallel archives might ease the migration workload; (2) many tools that are written for cloud storage could be utilized for a local archive; and (3) current large cloud storage practices in industry could be utilized to manage a scalable archive solution.
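A write-speed benchmark of this kind boils down to timing many concurrent object PUTs and reporting throughput. The sketch below substitutes an in-memory store for a real Swift container (against Swift, `put` would be an HTTP PUT, e.g. via python-swiftclient); the object count and sizes are illustrative:

```python
import time
from concurrent.futures import ThreadPoolExecutor

class InMemoryObjectStore:
    """Stand-in for a Swift container, so the timing harness runs
    anywhere; only the put() call would change for a real benchmark."""
    def __init__(self):
        self.objects = {}

    def put(self, name, data):
        self.objects[name] = data

def benchmark_writes(store, n_objects=64, size=4096, workers=8):
    """Time n_objects concurrent writes and return bytes/second."""
    payload = b'x' * size
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for i in range(n_objects):
            pool.submit(store.put, f'obj-{i:05d}', payload)
    # leaving the 'with' block waits for all submitted writes
    elapsed = time.perf_counter() - start
    return (n_objects * size) / elapsed

store = InMemoryObjectStore()
rate = benchmark_writes(store)
```

Scalability is then probed by sweeping `workers` and `n_objects` and watching how the rate curve flattens.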

  7. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  8. Optical Network Testbeds Workshop

    SciTech Connect

    Joe Mambretti

    2007-06-01

This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. 
Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  9. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

The main objective of the Holodeck Testbed is to create a cost effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed is using the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  10. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

    The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation of flight test to a ground test unit with contaminant dosing to determine applicability of ground testing.

  11. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  12. Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula

    NASA Astrophysics Data System (ADS)

    Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.

    2015-12-01

Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.
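Timely allocation and release of resources can be sketched as a control-loop step that sizes the VM pool to the pending job queue. The policy and its parameters below are an illustration, not the strategies evaluated in the paper:

```python
def scale_decision(queued_jobs, running_vms, jobs_per_vm=4,
                   min_vms=0, max_vms=20):
    """One step of a naive elasticity policy: size the VM pool to the
    queue, within pool limits. Returns the change in VM count
    (positive: start VMs, negative: shut VMs down)."""
    wanted = -(-queued_jobs // jobs_per_vm)  # ceiling division
    wanted = max(min_vms, min(max_vms, wanted))
    return wanted - running_vms

# 10 queued jobs, 1 VM running -> start 2 more (3 VMs cover 12 jobs)
delta = scale_decision(10, 1)
```

In a saturated centre like the one described, `max_vms` would itself shrink and grow as the dominant Tier-2 application claims or releases cores.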

  13. PROOF on the Cloud for ALICE using PoD and OpenNebula

    NASA Astrophysics Data System (ADS)

    Berzano, D.; Bagnasco, S.; Brunetti, R.; Lusso, S.

    2012-06-01

    In order to optimize the use and management of computing centres, their conversion to cloud facilities is becoming increasingly popular. In a medium to large cloud facility, many different virtual clusters may compete for the same resources: unused resources can be freed either by turning off idle virtual machines, or by lowering the resources assigned to a virtual machine at runtime. PROOF, a ROOT-based parallel and interactive analysis framework, is officially endorsed in the computing model of the ALICE experiment as complementary to the Grid, and it has become very popular over the last three years. The locality of PROOF-based analysis facilities forces system administrators to scavenge resources, yet the chaotic nature of user analysis tasks renders them unstable and inconsistently used, making PROOF a typical use-case for HPC cloud computing. Currently, PoD dynamically and easily provides a PROOF-enabled cluster by submitting agents to a job scheduler. Unfortunately, a Tier-2 cannot comfortably share the same queue between interactive and batch jobs, due to the very large average time to completion of the latter: an elastic cloud approach would enable interactive virtual machines to temporarily subtract resources from the batch ones, without a noticeable impact on them. In this work we describe our setup of a dynamic PROOF-based cloud analysis facility based on PoD and OpenNebula, orchestrated by a simple and lightweight control daemon that makes virtualization transparent for the user.

  14. Uav-Based Photogrammetric Point Clouds - Tree STEM Mapping in Open Stands in Comparison to Terrestrial Laser Scanner Point Clouds

    NASA Astrophysics Data System (ADS)

    Fritz, A.; Kattenborn, T.; Koch, B.

    2013-08-01

    In both ecology and forestry, there is a high demand for structural information of forest stands. Forest structures, due to their heterogeneity and density, are often difficult to assess. Hence, a variety of technologies are being applied to account for this "difficult to come by" information. Common techniques are aerial images or ground- and airborne-Lidar. In the present study we evaluate the potential use of unmanned aerial vehicles (UAVs) as a platform for tree stem detection in open stands. A flight campaign over a test site near Freiburg, Germany covering a target area of 120 × 75 [m2] was conducted. The dominant tree species of the site is oak (Quercus robur) with almost no understory growth. Over 1000 images with a tilt angle of 45° were shot. The flight pattern applied consisted of two antipodal staggered flight routes at a height of 55 [m] above the ground. We used a Panasonic G3 consumer camera equipped with a 14-42 [mm] standard lens and a 16.6 megapixel sensor. The data collection took place in leaf-off state in April 2013. The area was prepared with artificial ground control points for transformation of the structure-from-motion (SFM) point cloud into real-world coordinates. After processing, the results were compared with a terrestrial laser scanner (TLS) point cloud of the same area. In the 0.9 [ha] test area, 102 individual trees above 7 [cm] diameter at breast height were located in the TLS cloud. We chose the software CMVS/PMVS-2 since its algorithms are developed with a focus on dense reconstruction. The processing chain for the UAV-acquired images consists of six steps: a. cleaning the data: removing blurry, under- or overexposed and off-site images; b. applying the SIFT operator [Lowe, 2004]; c. image matching; d. bundle adjustment; e. clustering; and f. dense reconstruction. In total, 73 stems were considered as reconstructed and located within one meter of the reference trees. In general, stems were far less accurate and complete than

  15. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to advance in exploring space frontiers, technology must also advance. The need for faster data recovery and data processing is crucial; in this, the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification are required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs test anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high speed and high density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4 Mbit memory chips totaling 4 Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. This series of high-altitude flights, up to 70,000 feet, was effective at yielding the conditions under which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective; orders of magnitude more memory became essential. Therefore, a project which could test more memory within a given time was created. The goal of this project was not only to test more memory within a given time, but also to have a system with a faster processing speed which used fewer peripherals. This paper will describe procedures used to build an
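
    The pattern-monitoring loop described above (write a known pattern, periodically read it back, log any bit flips) can be sketched in miniature. The pattern value, memory size and injected address below are illustrative; the actual EBS exercised 4 Mbit SRAM parts in hardware.

```python
# Minimal sketch of the pattern-monitor idea behind an SEE memory test:
# write a known pattern, periodically read it back, and log any bit flips.
# (Illustrative only; pattern, size and addresses are invented.)

PATTERN = 0xA5  # alternating-bit test pattern

def scan_for_upsets(memory, pattern=PATTERN):
    """Return (address, flipped_bits) for every word that no longer matches
    the pattern, then rewrite the pattern (scrub the error)."""
    upsets = []
    for addr, word in enumerate(memory):
        if word != pattern:
            upsets.append((addr, word ^ pattern))  # XOR isolates flipped bits
            memory[addr] = pattern                 # scrub
    return upsets

mem = [PATTERN] * 1024       # stand-in for the SRAM under test
mem[17] ^= 0x04              # inject a single-bit upset at address 17
print(scan_for_upsets(mem))  # -> [(17, 4)]
print(scan_for_upsets(mem))  # -> [] after scrubbing
```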

  16. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  17. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  18. Adaptive Signal Processing Testbed

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1991-09-01

    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages, including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.
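
    Tone excision of the kind the ASPT was built to develop is classically addressed with an LMS adaptive filter. The sketch below is illustrative only: tap count, step size, frequencies and signal levels are invented and do not represent the ASPT's actual algorithms.

```python
# Sketch of narrowband-tone excision with an LMS adaptive filter, the class
# of algorithm the ASPT was built to develop and analyse. All parameters
# (tap count, step size, frequencies) are invented for illustration.
import numpy as np

def lms_excise(x, ref, taps=8, mu=0.05):
    """Adaptively subtract the component of x correlated with ref (the
    interfering tone); the error signal approximates the desired signal."""
    w = np.zeros(taps)
    out = np.zeros_like(x)
    for n in range(taps, len(x)):
        u = ref[n - taps:n][::-1]   # reference tap vector (most recent first)
        y = w @ u                   # current estimate of the tone
        out[n] = x[n] - y           # excised output = LMS error signal
        w += mu * out[n] * u        # LMS weight update
    return out

rng = np.random.default_rng(0)
n = np.arange(4000)
tone = np.sin(2 * np.pi * 0.1 * n)            # interfering tone
signal = 0.1 * rng.standard_normal(len(n))    # stand-in for the wideband signal
x = signal + tone
res = lms_excise(x, np.sin(2 * np.pi * 0.1 * n + 0.3))  # phase-shifted reference
print(np.var(x[2000:]), np.var(res[2000:]))   # tone power is largely removed
```

    Because an FIR filter driven by a sinusoidal reference can synthesize any amplitude and phase at that frequency, the adapted weights converge to cancel the tone while leaving the uncorrelated wideband signal in the residual.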

  19. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type for quantifying and characterizing geospatial processes. Visualization of these data, due to overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made ubiquitously available the interactivity and visualization capabilities that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.
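
    Delivering very large clouds to a browser generally requires thinning them first. A minimal voxel-grid subsample might look like the following; this is illustrative only and is not plas.io's actual processing pipeline.

```python
# Minimal voxel-grid thinning sketch: browser-based viewers benefit from
# decimating very large clouds before transfer. Illustrative only; this is
# not plas.io's actual processing pipeline.
import numpy as np

def voxel_thin(points, cell=1.0):
    """Keep the first point seen in each occupied (cell x cell x cell) voxel."""
    keys = np.floor(points / cell).astype(int)       # voxel index per point
    _, idx = np.unique(keys, axis=0, return_index=True)
    return points[np.sort(idx)]                      # preserve original order

pts = np.array([[0.1, 0.2, 0.3], [0.4, 0.1, 0.2], [5.0, 5.0, 5.0]])
print(voxel_thin(pts, cell=1.0))   # first two points share a voxel -> 2 kept
```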

  20. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  2. Young open clusters in the Milky Way and Small Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Martayan, C.

    2010-01-01

    NGC 6611, Trumpler 14, Trumpler 15, Trumpler 16, and Collinder 232 are very young open clusters located in the Eagle Nebula and Carina star-formation regions of the Milky Way; NGC 346 lies in the Small Magellanic Cloud. With different instrumentation and techniques, it has been possible to detect and classify new Herbig Ae/Be and classical Be stars and to provide new tests of, and comparisons with, models of the Be star phenomenon. Special (He-strong) stars in these star-formation regions are also discussed.

  3. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  4. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making an AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This paper describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.

  5. High-contrast imaging testbed

    SciTech Connect

    Baker, K; Silva, D; Poyneer, L; Macintosh, B; Bauman, B; Palmer, D; Remington, T; Delgadillo-Lariz, M

    2008-01-23

    Several high-contrast imaging systems are currently under construction to enable the detection of extra-solar planets. In order for these systems to achieve their objectives, however, there is considerable developmental work and testing which must take place. Given the need to perform these tests, a spatially-filtered Shack-Hartmann adaptive optics system has been assembled to evaluate new algorithms and hardware configurations which will be implemented in these future high-contrast imaging systems. In this article, construction and phase measurements of a membrane 'woofer' mirror are presented. In addition, results from closed-loop operation of the assembled testbed with static phase plates are presented. The testbed is currently being upgraded to enable operation at speeds approaching 500 Hz and to enable studies of the interactions between the woofer and tweeter deformable mirrors.
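
    The closed-loop operation with static phase plates mentioned above reduces, in idealized form, to an integrator control law. The sketch below uses invented gains, dimensions and plate values; it is not the testbed's actual controller.

```python
# Toy closed-loop sketch of adaptive-optics control: an integrator drives the
# deformable-mirror command to null the measured residual. Gain, dimensions
# and the static phase plate values are invented for illustration.
import numpy as np

def run_closed_loop(aberration, gain=0.5, steps=20):
    """Integrator law: command += gain * residual. Returns per-step RMS error."""
    dm_command = np.zeros_like(aberration)
    rms = []
    for _ in range(steps):
        residual = aberration - dm_command   # what the wavefront sensor sees
        dm_command += gain * residual        # integrator update
        rms.append(float(np.sqrt(np.mean(residual ** 2))))
    return rms

static_plate = np.array([0.8, -0.3, 0.5, -0.6])  # stand-in static phase plate
history = run_closed_loop(static_plate)
print(history[0], history[-1])   # residual decays by (1 - gain) each step
```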

  6. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  7. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, the future technologies of wave front sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate the active optical devices such as fast steering mirror, deformable mirror, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as the initial results from the testbed hardware integrations and tests.

  8. The NASA/OAST telerobot testbed architecture

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.; Zimmerman, W. F.; Dolinsky, S.

    1989-01-01

    Through phased development as a laboratory-based research testbed, the NASA/OAST Telerobot Testbed provides an environment for system test and demonstration of the technology which will usefully complement, significantly enhance, or even replace manned space activities. By integrating advanced sensing, robotic manipulation and intelligent control under human-interactive supervision, the Testbed will ultimately demonstrate execution of a variety of generic tasks suggestive of space assembly, maintenance, repair, and telescience. The Testbed system features a hierarchical layered control structure compatible with the incorporation of evolving technologies as they become available. The Testbed system is physically implemented in a computing architecture which allows for ease of integration of these technologies while preserving the flexibility for test of a variety of man-machine modes. The development currently in progress on the functional and implementation architectures of the NASA/OAST Testbed and capabilities planned for the coming years are presented.

  9. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high-level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure, including its function, commands, data interface, and use, is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis; companion documents are the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  10. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System (OBIS) was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.

  11. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance inflight medical care and medical education, NASA has established and conducted several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence both the delivery of health care in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. 
As NASA continues to meet the challenges of space flight, the

  12. Control design for the SERC experimental testbeds

    NASA Technical Reports Server (NTRS)

    Jacques, Robert; Blackwood, Gary; Macmartin, Douglas G.; How, Jonathan; Anderson, Eric

    1992-01-01

    Viewgraphs on control design for the Space Engineering Research Center experimental testbeds are presented. Topics covered include: SISO control design and results; sensor and actuator location; model identification; control design; experimental results; preliminary LAC experimental results; active vibration isolation problem statement; base flexibility coupling into isolation feedback loop; cantilever beam testbed; and closed loop results.

  13. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  14. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). The FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  15. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user community focused with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently the theme for ISGC 2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First the title - while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Secondly the programming - ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF that highlighted the relevance of standards for the research community. ISGC with its focus on applications and operational aspects complemented well with OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  16. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan from 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user-community focused, with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. By linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently, the theme for ISGC 2011 was the opportunities that better-integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title: while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Second, the programming: ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF who highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  17. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET), to fly on the International Space Station and test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  18. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous work we defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note we further develop that application and design the details of implementing such a model. First we present the environment of our application by describing the thermal management problem and an abstraction, called TESTBED, that assigns a specific function to each module in the architecture and defines the nature of the interfaces between each pair of blocks.

  19. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent with the changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge which could be key in validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean in comparison with the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not been previously presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was, as expected, the increased heat fluxes and induced turbulence from the warm ocean surface. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  20. Experiments Program for NASA's Space Communications Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard

    2012-01-01

    NASA developed a testbed for communications and navigation that was launched to the International Space Station in 2012. The testbed promotes new software defined radio (SDR) technologies and addresses associated operational concepts for space-based SDRs, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. The experiments program consists of a mix of in-house and external experiments from partners in industry, academia, and government. The experiments will investigate key challenges in communications, networking, and global positioning system navigation both on the ground and on orbit. This presentation will discuss some of the key opportunities and challenges for the testbed experiments program.

  1. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  2. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  3. STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.

    PubMed

    Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5-10 hours to process a full exome sequence and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  4. STORMSeq: An Open-Source, User-Friendly Pipeline for Processing Personal Genomics Data in the Cloud

    PubMed Central

    Karczewski, Konrad J.; Fernald, Guy Haskin; Martin, Alicia R.; Snyder, Michael; Tatonetti, Nicholas P.; Dudley, Joel T.

    2014-01-01

    The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and 5–10 hours to process a full exome sequence and $30 and 3–8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2. PMID:24454756

  5. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Grid-wide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain-specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two-month span in the fall of 2002, over 1 million official CMS GEANT-based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.

  6. Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal; Rizzo, Maxime; Thompson, Patrick

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept study and a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC would provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It would detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched into a heliocentric, Earth-trailing drift-away orbit, with a 5-year mission lifetime. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables starlight suppression in broadband light from 480-960 nm. To demonstrate the VNC approach and advance its technology readiness we have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  7. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    SciTech Connect

    Sun, C.H.; Thorne, L.R.

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the "local" turbulence is small compared with the advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
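    Taylor's hypothesis as described above can be sketched numerically: under frozen turbulence, a temporal lag τ in a zenith time series maps to a spatial lag r = Uτ, where U is the mean wind speed. The following is a minimal illustration; the time series, sampling interval, and wind speed are hypothetical, not CART data:

    ```python
    import numpy as np

    def spatial_covariance_from_time_series(x, dt, wind_speed):
        """Map the temporal autocovariance of a zenith time series onto
        spatial lags via Taylor's frozen-turbulence hypothesis (r = U * tau).

        x          : 1-D array, e.g. cloud liquid water path vs. time
        dt         : sampling interval in seconds
        wind_speed : mean advective wind speed in m/s

        Returns (spatial_lags_m, autocovariance).
        """
        x = np.asarray(x, dtype=float)
        anom = x - x.mean()
        n = len(anom)
        # biased autocovariance at non-negative temporal lags
        acov = np.correlate(anom, anom, mode="full")[n - 1:] / n
        lags_s = np.arange(n) * dt
        return wind_speed * lags_s, acov

    # hypothetical example: 1-minute zenith samples advected at 10 m/s,
    # so a 60 s temporal lag corresponds to a 600 m spatial lag
    t = np.arange(0, 3600, 60)
    series = np.sin(2 * np.pi * t / 1800) \
        + 0.1 * np.random.default_rng(0).standard_normal(t.size)
    r, cov = spatial_covariance_from_time_series(series, dt=60.0, wind_speed=10.0)
    ```

    The hypothesis is only invoked, not tested, here; the study's point is precisely to check when this temporal-to-spatial mapping is valid for broken cloud fields.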

  8. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994, M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others. Models for attenuation and structural damage

  9. OpenTopography: Addressing Big Data Challenges Using Cloud Computing, HPC, and Data Analytics

    NASA Astrophysics Data System (ADS)

    Crosby, C. J.; Nandigam, V.; Phan, M.; Youn, C.; Baru, C.; Arrowsmith, R.

    2014-12-01

    OpenTopography (OT) is a geoinformatics-based data facility initiated in 2009 for democratizing access to high-resolution topographic data, derived products, and tools. Hosted at the San Diego Supercomputer Center (SDSC), OT utilizes cyberinfrastructure, including large-scale data management, high-performance computing, and service-oriented architectures to provide efficient Web-based access to large, high-resolution topographic datasets. OT collocates data with processing tools to enable users to quickly access custom data and derived products for their application. OT's ongoing R&D efforts aim to solve emerging technical challenges associated with exponential growth in data and higher-order data products, as well as in the user base. Optimization of data management strategies can be informed by a comprehensive set of OT user access metrics that allows us to better understand usage patterns with respect to the data. By analyzing the spatiotemporal access patterns within the datasets, we can map areas of the data archive that are highly active (hot) versus the ones that are rarely accessed (cold). This enables us to architect a tiered storage environment consisting of high-performance disk storage (SSD) for the hot areas and less expensive, slower disk for the cold ones, thereby optimizing price-to-performance. From a compute perspective, OT is looking at cloud-based solutions such as the Microsoft Azure platform to handle sudden increases in load. An OT virtual machine image in Microsoft's VM Depot can be invoked and deployed quickly in response to increased system demand. OT has also integrated SDSC HPC systems like the Gordon supercomputer into our infrastructure tier to enable compute-intensive workloads like parallel computation of hydrologic routing on high-resolution topography. This capability also allows OT to scale to HPC resources during high loads to meet user demand and provide more efficient processing. With a growing user base and maturing scientific user
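    The hot/cold tiering idea described above can be sketched as a simple ranking of archive tiles by access count. The tile identifiers, the hot fraction, and the two-tier layout below are illustrative assumptions, not OT's actual implementation:

    ```python
    from collections import Counter

    def assign_storage_tiers(access_log, hot_fraction=0.2):
        """Rank dataset tiles by access count and mark the most-accessed
        fraction 'hot' (fast SSD tier); the rest go to slower disk.

        access_log : iterable of tile IDs, one entry per recorded access
        Returns a dict mapping tile ID -> 'ssd' or 'hdd'.
        """
        counts = Counter(access_log)
        ranked = [tile for tile, _ in counts.most_common()]
        n_hot = max(1, int(len(ranked) * hot_fraction))
        hot = set(ranked[:n_hot])
        return {tile: ("ssd" if tile in hot else "hdd") for tile in counts}

    # hypothetical access log: tileA dominates, so it lands on the SSD tier
    log = ["tileA"] * 50 + ["tileB"] * 3 + ["tileC"] * 2 + ["tileD"]
    tiers = assign_storage_tiers(log, hot_fraction=0.25)
    ```

    A production system would of course weight recency and spatial locality as well as raw counts; this only illustrates the price-to-performance trade described in the abstract.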

  10. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project, provides the path to integrate IaaS resources and PaaS platforms to provide SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO DataCloud architecture, describes the methodology followed to assure the integration of the requirements from different research communities, including examples like ENES, LifeWatch or EMSO, and how they will build their solutions using different INDIGO components.

  11. Reconstruction of passive open-path FTIR ambient spectra using meteorological measurements and its application for detection of aerosol cloud drift.

    PubMed

    Kira, Oz; Dubowski, Yael; Linker, Raphael

    2015-07-27

    Remote sensing of atmospheric aerosols is of great importance to public and environmental health. This research promotes a simple way of detecting an aerosol cloud using a passive Open Path FTIR (OP-FTIR) system, without utilizing radiative transfer models and without relying on an artificial light source. Meteorological measurements (temperature, relative humidity and solar irradiance), and chemometric methods (multiple linear regression and artificial neural networks) together with previous cloud-free OP-FTIR measurements were used to estimate the ambient spectrum in real time. The cloud detection process included a statistical comparison between the estimated cloud-free signal and the measured OP-FTIR signal. During the study we were able to successfully detect several aerosol clouds (water spray) in controlled conditions as well as during agricultural pesticide spraying in an orchard.
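    The detection scheme described, estimating the expected cloud-free signal from meteorological inputs and flagging large departures, can be sketched with ordinary least squares. The predictors, the band-integrated signal, and the 3σ threshold below are illustrative assumptions, not the authors' exact chemometric models:

    ```python
    import numpy as np

    def fit_clear_sky_model(met, signal):
        """Fit a multiple linear regression mapping meteorological inputs
        (e.g. temperature, RH, solar irradiance) to a cloud-free band signal.

        met    : (n_samples, n_features) predictor matrix
        signal : (n_samples,) cloud-free signal from past measurements
        Returns (coefficients incl. intercept, residual standard deviation).
        """
        X = np.column_stack([np.ones(len(met)), met])  # prepend intercept
        coef, *_ = np.linalg.lstsq(X, signal, rcond=None)
        resid = signal - X @ coef
        return coef, resid.std(ddof=X.shape[1])

    def cloud_flag(met_now, signal_now, coef, sigma, k=3.0):
        """Flag an aerosol cloud when the measured signal departs from the
        predicted cloud-free signal by more than k standard deviations."""
        x = np.concatenate([[1.0], np.atleast_1d(met_now)])
        return abs(signal_now - float(x @ coef)) > k * sigma

    # synthetic demonstration with a known linear clear-sky relationship
    rng = np.random.default_rng(1)
    met = rng.uniform([280, 20, 100], [310, 90, 900], size=(200, 3))
    clear = 0.5 * met[:, 0] - 0.1 * met[:, 1] + 0.02 * met[:, 2] \
        + rng.normal(0, 0.5, 200)
    coef, sigma = fit_clear_sky_model(met, clear)
    ```

    The paper also uses artificial neural networks for the same estimation step; the statistical comparison against the measured OP-FTIR signal is the same in either case.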

  12. A Kenyan Cloud School. Massive Open Online & Ongoing Courses for Blended and Lifelong Learning

    ERIC Educational Resources Information Center

    Jobe, William

    2013-01-01

    This research describes the predicted outcomes of a Kenyan Cloud School (KCS), which is a MOOC that contains all courses taught at the secondary school level in Kenya. This MOOC will consist of online, ongoing subjects in both English and Kiswahili. The KCS subjects offer self-testing and peer assessment to maximize scalability, and digital badges…

  13. A satellite observation test bed for cloud parameterization development

    NASA Astrophysics Data System (ADS)

    Lebsock, M. D.; Suselj, K.

    2015-12-01

    We present an observational test-bed of cloud and precipitation properties derived from CloudSat, CALIPSO, and the A-Train. The focus of the test-bed is on marine boundary layer clouds, including stratocumulus and cumulus and the transition between these cloud regimes. Test-bed properties include the cloud cover and three-dimensional cloud fraction along with the cloud water path and precipitation water content, and associated radiative fluxes. We also include the subgrid-scale distribution of cloud, precipitation, and radiative quantities, which must be diagnosed by a model parameterization. The test-bed further includes meteorological variables from the Modern Era Retrospective-analysis for Research and Applications (MERRA). MERRA variables provide the initialization and forcing datasets to run a parameterization in Single Column Model (SCM) mode. We show comparisons of an Eddy-Diffusivity/Mass-Flux (EDMF) parameterization coupled to microphysics and macrophysics packages run in SCM mode with observed clouds. Comparisons are performed regionally in areas of climatological subsidence as well as stratified by dynamical and thermodynamical variables. Comparisons demonstrate the ability of the EDMF model to capture the observed transitions between subtropical stratocumulus and cumulus cloud regimes.

  14. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  15. The design and implementation of the LLNL gigabit testbed

    SciTech Connect

    Garcia, D.

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit Testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of, and the need for, the testbed; the tests that are performed in the testbed; and the tools used to implement those tests.

  16. COLUMBUS as Engineering Testbed for Communications and Multimedia Equipment

    NASA Astrophysics Data System (ADS)

    Bank, C.; Anspach von Broecker, G. O.; Kolloge, H.-G.; Richters, M.; Rauer, D.; Urban, G.; Canovai, G.; Oesterle, E.

    2002-01-01

    The paper presents ongoing activities to prepare COLUMBUS for communications and multimedia technology experiments. For this purpose, Astrium SI, Bremen, has studied several options for how best to combine the given system architecture with flexible and state-of-the-art interface avionics and software. These activities have been conducted in coordination with, and partially under contract of, DLR and ESA/ESTEC. Moreover, Astrium SI has realized three testbeds for multimedia software and hardware testing under its own funding. The experimental core avionics unit, about a half double rack, establishes the core of a new multi-user experiment facility for this type of investigation onboard COLUMBUS, which shall be available to all users of COLUMBUS. It allows for the connection of 2nd-generation payload, that is, payload requiring broadband data transfer and near-real-time access by the Principal Investigator on ground, to test highly interactive and near-real-time payload operation. The facility is also foreseen to test new equipment to provide the astronauts onboard the ISS/COLUMBUS with bi-directional hi-fi voice and video connectivity to ground, private voice communications and e-mail, and a multimedia workstation for ops training and recreation. Connection to an appropriate Wide Area Network (WAN) on Earth is possible. The facility will include a broadband data transmission front-end terminal, which is mounted externally on the COLUMBUS module. This equipment provides high flexibility due to the completely transparent transmit and receive chains, the steerable multi-frequency antenna system and its own thermal and power control and distribution. The equipment is monitored and controlled via the COLUMBUS internal facility. It combines several new hardware items, which are newly developed for the next generation of broadband communication satellites, and operates in Ka-band with the experimental ESA data relay satellite ARTEMIS. The equipment is also TDRSS compatible; the open loop

  17. Distributed computing testbed for a remote experimental environment

    SciTech Connect

    Butner, D.N.; Casper, T.A.; Howard, B.C.; Henline, P.A.; Davis, S.L.; Barnes, D.; Greenwood, D.E.

    1995-09-18

    Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility.

  18. Contrast analysis and stability on the ExAO testbed

    SciTech Connect

    Evans, J; Thomas, S; Gavel, D; Dillon, D; Macintosh, B

    2008-06-10

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. The Laboratory for Adaptive Optics at UC Santa Cruz is investigating limits to high-contrast imaging in support of the Gemini Planet Imager. Previous contrast measurements were made with a simple single-opening prolate-spheroid shaped pupil that produced a limited region of high contrast, particularly when wavefront errors were corrected with the 1024-actuator Boston Micromachines MEMS deformable mirror currently in use on the testbed. A more sophisticated shaped pupil is now being used that has a much larger region of interest, facilitating a better understanding of high-contrast measurements. In particular we examine the effect of heat sources in the testbed on PSF stability. We find that rms image motion scales as 0.02 λ/D per watt when the heat source is near the pupil plane. As a result, heat sources of greater than 5 watts should be avoided near pupil planes for GPI. The safest place to introduce heat is near a focal plane. Heat can also affect the standard deviation of the high-contrast region, but in the final instrument other sources of error should be more significant.
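    The reported scaling can be applied directly: at 0.02 λ/D of rms image motion per watt near a pupil plane, the 5 W guideline corresponds to 0.1 λ/D. A trivial sketch of that arithmetic:

    ```python
    def rms_image_motion(power_watts, scale=0.02):
        """Rms PSF motion in units of lambda/D, using the empirical scaling
        reported for heat sources near a pupil plane (0.02 lambda/D per watt)."""
        return scale * power_watts

    # the 5 W guideline corresponds to 0.1 lambda/D of rms image motion
    motion = rms_image_motion(5.0)
    ```

    Note the scaling applies only to pupil-plane heat sources; sources near a focal plane were found to be far more benign.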

  19. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses students'…

  20. Flight Projects Office Information Systems Testbed (FIST)

    NASA Technical Reports Server (NTRS)

    Liggett, Patricia

    1991-01-01

    Viewgraphs on the Flight Projects Office Information Systems Testbed (FIST) are presented. The goal is to perform technology evaluation and prototyping of information systems to support SFOC and JPL flight projects in order to reduce risk in the development of operational data systems for such projects.

  1. Experiences with the Bay Area Gigabit Network Testbed

    NASA Technical Reports Server (NTRS)

    Johnson, Marjory J.

    1995-01-01

    The Bay Area Gigabit Network Testbed (BAGNet) is a high-performance ATM (155 Mbps) testbed located within the San Francisco Bay Area in northern California. BAGNet is a metropolitan-area network, spanning an area of approximately 50 square miles. There are fifteen sites participating in the testbed, with up to four hosts per site. Although BAGNet is an applications-oriented testbed, much of our effort has been directed towards getting the testbed running and understanding the factors that impact performance of an ATM network. We present some of our experiences in this paper.

  2. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  3. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, databases and applications. Network usage and demands are growing at a very fast rate, and to meet current requirements there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because the decision-making processes for switching and routing are distributed and collocated on the same devices. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate these challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate the network-management-related issues. A private cloud prototype testbed was set up to implement SDN on the OpenStack platform in order to test and evaluate network performance under various configurations.

  4. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed and to install some or all of the SPARSPAK package in it. A suite of subroutines to extract from the database the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.
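    As a minimal illustration of why exploiting sparsity matters in such solvers (not SPARSPAK's actual envelope/ordering machinery), the Thomas algorithm solves a tridiagonal "stiffness" system in O(n) time and storage instead of the O(n³) of a dense factorization:

    ```python
    def solve_tridiagonal(a, b, c, d):
        """Solve a tridiagonal system by the Thomas algorithm.
        a: sub-diagonal (a[0] unused), b: main diagonal,
        c: super-diagonal (last entry unused), d: right-hand side."""
        n = len(b)
        cp, dp = [0.0] * n, [0.0] * n
        cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
        for i in range(1, n):                      # forward elimination
            m = b[i] - a[i] * cp[i - 1]
            cp[i] = c[i] / m if i < n - 1 else 0.0
            dp[i] = (d[i] - a[i] * dp[i - 1]) / m
        x = [0.0] * n
        x[-1] = dp[-1]
        for i in range(n - 2, -1, -1):             # back substitution
            x[i] = dp[i] - cp[i] * x[i + 1]
        return x

    # Toy 1-D "stiffness" system K u = f with K = tridiag(-1, 2, -1), f = ones:
    n = 5
    u = solve_tridiagonal([0.0] + [-1.0] * (n - 1), [2.0] * n,
                          [-1.0] * (n - 1) + [0.0], [1.0] * n)
    ```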

  5. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already accessible through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information, are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages, and we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  6. DEVELOPMENT OF A FACILITY MONITORING TESTBED

    SciTech Connect

    A. M. MIELKE; C. M. BOYLE; ET AL

    2001-06-01

    The Advanced Surveillance Technology (AST) project at Los Alamos National Laboratory (LANL), funded by the Nonproliferation Research and Engineering Group (NN-20) of the National Nuclear Security Administration (NNSA), is fielding a facility monitoring application testbed at the National High Magnetic Field Laboratory-Pulsed Field Laboratory (NHMFL-PFL). This application is designed to utilize continuous remote monitoring technology to provide an additional layer of personnel safety assurance and equipment fault prediction capability in the laboratory. Various off-the-shelf surveillance sensor technologies are evaluated. In this testbed environment, several of the deployed monitoring sensors have detected transient precursor equipment-fault events. Additionally, the prototype remote monitoring system employs specialized video state recognition software to determine whether the operations occurring within the facility are acceptable, given the observed equipment status. By integrating the Guardian reasoning system developed at LANL, anomalous facility events trigger alarms alerting personnel to the likelihood of an equipment failure or unsafe operation.

  7. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibrations control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  8. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  9. Observed and simulated temperature dependence of the liquid water path of low clouds

    SciTech Connect

    Del Genio, A.D.; Wolf, A.B.

    1996-04-01

    Data being acquired at the Atmospheric Radiation Measurement (ARM) Southern great Plains (SGP) Cloud and Radiation Testbed (CART) site can be used to examine the factors determining the temperature dependence of cloud optical thickness. We focus on cloud liquid water and physical thickness variations which can be derived from existing ARM measurements.

  10. Variable Dynamic Testbed Vehicle: Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Lee, A. Y.; Le, N. T.; Marriott, A. T.

    1997-01-01

    The Variable Dynamic Testbed Vehicle (VDTV) concept has been proposed as a tool to evaluate collision avoidance systems and to perform driving-related human factors research. The goal of this study is to analytically investigate to what extent a VDTV with adjustable front and rear anti-roll bar stiffnesses, programmable damping rates, and four-wheel-steering can emulate the lateral dynamics of a broad range of passenger vehicles.
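    The abstract does not give the VDTV's equations; a common starting point for such lateral-dynamics studies is the linear two-degree-of-freedom "bicycle" model. A minimal sketch follows, with illustrative parameter values (mass, inertia, geometry, and cornering stiffnesses are assumptions, not VDTV data):

    ```python
    import math

    def bicycle_model_step(v_y, r, delta, dt, m=1500.0, Iz=2500.0,
                           a=1.2, b=1.6, Caf=80000.0, Car=80000.0, u=25.0):
        """One Euler step of the 2-DOF linear bicycle model.
        v_y: lateral velocity (m/s), r: yaw rate (rad/s),
        delta: front steer angle (rad), u: constant forward speed (m/s)."""
        alpha_f = delta - (v_y + a * r) / u      # front tire slip angle
        alpha_r = -(v_y - b * r) / u             # rear tire slip angle
        Fyf, Fyr = Caf * alpha_f, Car * alpha_r  # linear tire lateral forces
        v_y_dot = (Fyf + Fyr) / m - u * r
        r_dot = (a * Fyf - b * Fyr) / Iz
        return v_y + v_y_dot * dt, r + r_dot * dt

    # Hold a 1-degree step steer and integrate toward steady-state yaw rate:
    v_y, r = 0.0, 0.0
    for _ in range(5000):
        v_y, r = bicycle_model_step(v_y, r, math.radians(1.0), 0.001)
    ```

    Emulating a different passenger vehicle then amounts to retuning these parameters (and, on the VDTV, damping and anti-roll stiffness) until the response matches the target.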

  11. Commissioning Results on the JWST Testbed Telescope

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Acton, D. Scott

    2006-01-01

    The one-meter 18 segment JWST Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate commissioning operations for the JWST Observatory. Eight different commissioning activities were tested on the TBT: telescope focus sweep, segment ID and Search, image array, global alignment, image stacking, coarse phasing, fine phasing, and multi-field phasing. This paper describes recent commissioning results from experiments performed on the TBT.

  12. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD).Primary research goals of the Institute revolve around the integration of science and exploration to gain knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC1A regolith simulant resources into the creation of a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment.SSERVI's goals include supporting planetary researchers within NASA, other government agencies; private sector and hardware developers; competitors in focused prize design competitions; and academic sector researchers.SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to help to understand the basic effects of continued long-term exposure to a simulated analog test environment.The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin consisting of a 4 meter by 4 meter area, including dust mitigation and safety oversight.Facility hardware and environment testing scenarios could include, Lunar surface mobility, Dust exposure and mitigation, Regolith handling and excavation, Solar-like illumination, Lunar surface compaction profile, Lofted dust, Mechanical properties of lunar regolith, Surface features (i.e. 
grades and rocks)Numerous benefits vary from easy access to a controlled analog regolith simulant testbed, and

  13. THE HERSCHEL INVENTORY OF THE AGENTS OF GALAXY EVOLUTION IN THE MAGELLANIC CLOUDS, A HERSCHEL OPEN TIME KEY PROGRAM

    SciTech Connect

    Meixner, M.; Roman-Duval, J.; Seale, J.; Gordon, K.; Beck, T.; Boyer, M. L.; Panuzzo, P.; Hony, S.; Sauvage, M.; Okumura, K.; Chanial, P.; Babler, B.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Carlson, L. R.; Clayton, G. C.; and others

    2013-09-15

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 µm with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 µm band is the most sensitive and the source catalogs for this band have ~25,000 objects for the LMC and ~5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  14. The HERschel Inventory of the Agents of Galaxy Evolution in the Magellanic Clouds, a HERschel Open Time Key Program

    NASA Technical Reports Server (NTRS)

    Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.; Misselt, K.; Okumura, K.; Chanial, P.; Beck, T.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Boyer, M. L.; Carlson, L. R.; Clayton, G. C.; Chen, C.-H. R.; Cormier, D.; Fukui, Y.; Galametz, M.; Galliano, F.

    2013-01-01

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  15. The Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Hines, Braden; Bell, Charles; Shen, Tsae-Pyng; Bloemhof, Eric; Zhao, Feng; Regehr, Martin; Holmes, Howard; Irigoyen, Robert; Neat, Gregory

    2003-01-01

    The Micro-Arcsecond Metrology (MAM) testbed is a ground-based system of optical and electronic equipment for testing components, systems, and engineering concepts for the Space Interferometer Mission (SIM) and similar future missions, in which optical interferometers will be operated in outer space. In addition, the MAM testbed is of interest in its own right as a highly precise metrological system. The designs of the SIM interferometer and the MAM testbed reflect a requirement to measure both the position of the starlight central fringe and the change in the internal optical path of the interferometer with sufficient spatial resolution to generate astrometric data with angular resolution at the microarcsecond level. The internal path is to be measured by use of a small metrological laser beam of 1,319-nm wavelength, whereas the position of the starlight fringe is to be estimated by use of a charge-coupled-device (CCD) image detector sampling a large concentric annular beam. For the SIM to succeed, the optical path length determined from the interferometer fringes must be tracked by the metrological subsystem to within tens of picometers, through all operational motions of an interferometer delay line and siderostats. The purpose of the experiments performed on the MAM testbed is to demonstrate this agreement in a large-scale simulation that includes a substantial portion of the system in the planned configuration for operation in outer space. A major challenge in this endeavor is to align the metrological beam with the starlight beam in order to maintain consistency between the metrological and starlight subsystems at the system level. The MAM testbed includes an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrological sensors. The aforementioned subsystems are installed in a large vacuum chamber in order to suppress atmospheric and thermal disturbances. The MAM is divided into two
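    The picometer-level requirement can be illustrated by converting heterodyne metrology phase to internal path change at the quoted 1,319 nm laser wavelength (a generic phase-to-path sketch, not the SIM/MAM processing chain):

    ```python
    import math

    WAVELENGTH_NM = 1319.0  # SIM/MAM metrology laser wavelength

    def phase_to_opd_pm(phase_rad):
        """Convert a measured metrology phase change (radians) to an
        optical-path-difference change in picometers:
        OPD = (phi / 2*pi) * lambda."""
        return phase_rad / (2.0 * math.pi) * WAVELENGTH_NM * 1e3  # nm -> pm

    # One milliradian of phase resolution corresponds to ~210 pm of path,
    # so tens-of-picometer tracking demands sub-milliradian phase metrology.
    print(phase_to_opd_pm(1e-3))
    ```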

  16. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2009-05-01

    OPTRA is developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill.

  17. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US, through the NSF-funded cyberinfrastructure, and the European Union, through the ICT Calls, aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge".1 The Grid (Foster, 2002; Foster, Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  18. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other factors among various modeling studies (e.g. grid configuration, meteorology, emission rates) are different as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  19. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder by using commercial routers and modems. Since many NASA assets are in low Earth orbits (LEOs), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating to the Earth via a geosynchronous (GEO) communications satellite. Research results from using this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.
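    The propagation delay such a testbed must emulate follows directly from the slant range to the relay; a minimal sketch (the quoted ranges are illustrative of LEO-to-GEO geometry, not ISINT settings):

    ```python
    C = 299_792_458.0  # speed of light in vacuum, m/s

    def one_way_delay_ms(slant_range_km):
        """One-way free-space propagation delay in milliseconds
        for a given slant range in kilometers."""
        return slant_range_km * 1e3 / C * 1e3

    # A LEO asset relaying through GEO sees a slant range of very roughly
    # 36,000-44,000 km as geometry changes, so the emulated one-way delay
    # varies by a few tens of milliseconds over a pass:
    print(one_way_delay_ms(36000.0), one_way_delay_ms(44000.0))
    ```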

  20. ISS Update: ISTAR -- International Space Station Testbed for Analog Research

    NASA Video Gallery

    NASA Public Affairs Officer Kelly Humphries interviews Sandra Fletcher, EVA Systems Flight Controller. They discuss the International Space Station Testbed for Analog Research (ISTAR) activity that...

  1. From Particles and Point Clouds to Voxel Models: High Resolution Modeling of Dynamic Landscapes in Open Source GIS

    NASA Astrophysics Data System (ADS)

    Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.

    2012-12-01

    Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on a particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability. Figure 1. Isosurfaces representing the evolution of the shoreline and a z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
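    The point-to-raster step underlying such workflows can be sketched as mean-elevation binning of an (x, y, z) point cloud into square cells (GRASS modules do this natively; the function below is an illustrative stand-in, not the GRASS implementation):

    ```python
    from collections import defaultdict

    def points_to_raster(points, cell_size):
        """Bin (x, y, z) points into square cells and return a dict mapping
        (col, row) -> mean elevation: a minimal point-cloud-to-DEM step."""
        sums = defaultdict(lambda: [0.0, 0])
        for x, y, z in points:
            key = (int(x // cell_size), int(y // cell_size))
            s = sums[key]
            s[0] += z   # accumulate elevation
            s[1] += 1   # count points in cell
        return {key: s[0] / s[1] for key, s in sums.items()}

    # Three points, 1 m cells: two fall in cell (0, 0), one in cell (1, 0).
    pts = [(0.2, 0.3, 1.0), (0.8, 0.1, 3.0), (1.5, 0.5, 2.0)]
    dem = points_to_raster(pts, 1.0)
    ```

    Repeating this per survey epoch yields the series of raster layers (and, stacked, the voxel model) that the analysis tools operate on.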

  2. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Engel, James R.; Vaillancourt, Robert; Todd, Lori; Mottus, Kathleen

    2008-04-01

    OPTRA and University of North Carolina are developing an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach will be considered as a candidate referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize progress to date and overall system performance projections based on the instrument, spectroscopy, and tomographic reconstruction accuracy. We then present a preliminary optical design of the I-OP-FTIR.

  3. Imaging open-path Fourier transform infrared spectrometer for 3D cloud profiling

    NASA Astrophysics Data System (ADS)

    Rentz Dupuis, Julia; Mansur, David J.; Vaillancourt, Robert; Carlson, David; Evans, Thomas; Schundler, Elizabeth; Todd, Lori; Mottus, Kathleen

    2010-04-01

    OPTRA has developed an imaging open-path Fourier transform infrared (I-OP-FTIR) spectrometer for 3D profiling of chemical and biological agent simulant plumes released into test ranges and chambers. An array of I-OP-FTIR instruments positioned around the perimeter of the test site, in concert with advanced spectroscopic algorithms, enables real time tomographic reconstruction of the plume. The approach is intended as a referee measurement for test ranges and chambers. This Small Business Technology Transfer (STTR) effort combines the instrumentation and spectroscopic capabilities of OPTRA, Inc. with the computed tomographic expertise of the University of North Carolina, Chapel Hill. In this paper, we summarize the design and build and detail system characterization and test of a prototype I-OP-FTIR instrument. System characterization includes radiometric performance and spectral resolution. Results from a series of tomographic reconstructions of sulfur hexafluoride plumes in a laboratory setting are also presented.
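    Tomographic reconstruction from path-integrated measurements like these can be illustrated with the classic algebraic reconstruction technique (ART, i.e. Kaczmarz iteration); the toy ray geometry below is an assumption for illustration, not the project's algorithm:

    ```python
    def kaczmarz(A, b, iters=500):
        """Algebraic reconstruction technique: cycle through the ray-sum
        equations A[i] . x = b[i], projecting the current estimate onto
        each hyperplane in turn."""
        n = len(A[0])
        x = [0.0] * n
        for _ in range(iters):
            for row, bi in zip(A, b):
                dot = sum(r * xi for r, xi in zip(row, x))
                norm2 = sum(r * r for r in row)
                if norm2 == 0.0:
                    continue
                step = (bi - dot) / norm2
                x = [xi + step * r for xi, r in zip(x, row)]
        return x

    # Toy 2x2 concentration grid observed by four "rays" (two rows, two
    # columns); b holds the path-integrated concentrations along each ray.
    A = [[1, 1, 0, 0], [0, 0, 1, 1], [1, 0, 1, 0], [0, 1, 0, 1]]
    b = [3.0, 7.0, 4.0, 6.0]
    x = kaczmarz(A, b)   # converges to a grid consistent with all four rays
    ```

    Real plume reconstruction replaces the 0/1 ray matrix with path lengths through each cell and a much finer grid, but the projection structure is the same.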

  4. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
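    BluePyOpt wraps existing optimisation tools rather than exposing the loop below; as a rough, dependency-free illustration of the evolutionary approach the abstract describes (the fitness function and all parameter values here are ours, not BluePyOpt's API):

    ```python
    import random

    def evolve(fitness, bounds, pop_size=20, generations=50, seed=0):
        """Minimal elitist evolutionary search: keep the best half of the
        population, refill with Gaussian-mutated copies. fitness is minimised."""
        rng = random.Random(seed)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
               for _ in range(pop_size)]
        for _ in range(generations):
            pop.sort(key=fitness)
            parents = pop[: pop_size // 2]
            children = [[min(max(g + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                         for g, (lo, hi) in zip(p, bounds)]
                        for p in parents]
            pop = parents + children
        return min(pop, key=fitness)

    # Illustrative "model": fit two parameters to a quadratic error surface
    # with minimum at (1, -2); a stand-in for an electrophysiology objective.
    best = evolve(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                  bounds=[(-5.0, 5.0), (-5.0, 5.0)])
    ```

    In practice the expensive part is the fitness evaluation (running the neuron model against experimental constraints), which is why distributing it over clusters and cloud infrastructure matters.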

  6. Wavefront Control Testbed (WCT) Experiment Results

    NASA Technical Reports Server (NTRS)

    Burns, Laura A.; Basinger, Scott A.; Campion, Scott D.; Faust, Jessica A.; Feinberg, Lee D.; Hayden, William L.; Lowman, Andrew E.; Ohara, Catherine M.; Petrone, Peter P., III

    2004-01-01

    The Wavefront Control Testbed (WCT) was created to develop and test wavefront sensing and control algorithms and software for the segmented James Webb Space Telescope (JWST). Last year, we changed the system configuration from three sparse aperture segments to a filled aperture with three pie shaped segments. With this upgrade we have performed experiments on fine phasing with line-of-sight and segment-to-segment jitter; dispersed fringe visibility and grism angle; high dynamic range tilt sensing; coarse phasing with large aberrations; and sampled sub-aperture testing. This paper reviews the results of these experiments.

  7. Dynamic federation of grid and cloud storage

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

    The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble and the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities of smoothly evolving the current computing models and of supporting new scientific computing projects that rely on massive distribution of data and that would appreciate systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.
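The geography-aware replica selection mentioned in the abstract can be illustrated with a minimal sketch. The endpoint URLs, coordinates, and health flags below are hypothetical; the real Dynafed redirector performs this selection internally before issuing an HTTP redirect.

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) pairs in degrees."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 2 * 6371 * math.asin(math.sqrt(h))

def pick_replica(client_pos, replicas):
    """Redirect the client to the geographically closest live endpoint."""
    live = [r for r in replicas if r["alive"]]
    return min(live, key=lambda r: haversine_km(client_pos, r["pos"]))["url"]

# Hypothetical federation of three endpoints (one currently down).
replicas = [
    {"url": "https://cern-dpm.example/f", "pos": (46.2, 6.1),   "alive": True},
    {"url": "https://s3-us.example/f",    "pos": (39.0, -77.5), "alive": True},
    {"url": "https://ral.example/f",      "pos": (51.6, -1.3),  "alive": False},
]
# A client near Lyon should be sent to the closest live endpoint.
choice = pick_replica((45.8, 4.8), replicas)
```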

  8. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatically actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
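The timing-based prognostics idea can be illustrated with a minimal sketch that fits a degradation trend to valve opening times and extrapolates it to a failure threshold. The data, the linear-trend assumption, and the 600 ms threshold are invented for illustration; the paper's actual approach is model-based rather than a bare line fit.

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def remaining_cycles(cycles, open_times, limit):
    """Extrapolate the opening-time trend to the failure threshold."""
    slope, intercept = fit_line(cycles, open_times)
    if slope <= 0:
        return float("inf")            # no degradation trend observed
    end_of_life = (limit - intercept) / slope
    return end_of_life - cycles[-1]

# Hypothetical data: opening time grows ~2 ms per cycle; limit is 600 ms.
cycles = [0, 10, 20, 30, 40]
open_times = [400, 420, 441, 459, 480]    # ms
rul = remaining_cycles(cycles, open_times, limit=600)
```

With these numbers the fitted trend reaches the threshold roughly 60 cycles past the last observation, which is the remaining-useful-life estimate.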

  9. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  10. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  11. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial-off-the-shelf components including airframe, autopilot, and a small turbine engine to provide a low cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine powered platform (J-FLiC) builds on the successes using the relatively smaller, slower and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches with the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007 and those results will be included, if possible.

  12. Telescience testbed in human space physiology

    NASA Astrophysics Data System (ADS)

    Watanabe, Satoru; Seo, Hisao; Iwase, Satoshi; Tanaka, Masafumi; Kaneko, Sayumi; Mano, Tadaaki; Matsui, Nobuo; Foldager, Niels; Bondepetersen, Flemming; Yamashita, Masamichi; Shoji, Takatoshi; Sudoh, Hideo

    The present telescience testbed study was conducted to evaluate the feasibility of physiological experimentation under restricted conditions: simulated weightlessness induced in a water immersion facility, a reduced capacity of laboratory facilities, delayed and desynchronized communication between investigator and operator, and the restriction that several kinds of experiments be performed by a single operator on a limited timeline. The three days of experiments were carried out following the same protocols. The operator was changed each day, with the same operator serving on the first and third days. Both operators were medical doctors but not all-round experts in physiological experimentation. The experimental objectives were: 1) ECG changes with changing water immersion levels, 2) blood pressure changes, 3) ultrasonic echocardiographic changes, 4) laser Doppler skin blood flowmetry in a finger, and 5) blood sampling to examine electrolytic and humoral changes in blood. The effectiveness of the testbed experiment was assessed by evaluating the quality of the obtained data and estimating how friendly telescience operation was to investigators and operators.

  13. Sparse aperture mask wavefront sensor testbed results

    NASA Astrophysics Data System (ADS)

    Subedi, Hari; Zimmerman, Neil T.; Kasdin, N. Jeremy; Riggs, A. J. E.

    2016-07-01

    Coronagraphic exoplanet detection at very high contrast requires the estimation and control of low-order wavefront aberrations. At the Princeton High Contrast Imaging Lab (PHCIL), we are working on a new technique that integrates a sparse-aperture mask (SAM) with a shaped pupil coronagraph (SPC) to make precise estimates of these low-order aberrations. We collect the starlight rejected from the coronagraphic image plane and interfere it using the sparse-aperture mask at the relay pupil to estimate the low-order aberrations. In our previous work we numerically demonstrated the efficacy of the technique and proposed a method to sense and control these differential aberrations in broadband light. We also presented early testbed results in which the SAM was used to sense pointing errors. In this paper, we briefly overview the SAM wavefront sensor technique, explain the design of the completed testbed, and report experimental estimation results for the dominant low-order aberrations such as tip/tilt, astigmatism, and focus.

  14. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    SciTech Connect

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility's critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korea Atomic Energy Research Institute (KAERI) of the Republic of Korea to solicit the experience of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following discussion presents I&C testbed lessons learned and the impact of these experiences for KAERI.

  15. Testbed for the development of intelligent robot control

    SciTech Connect

    Harrigan, R.W.

    1986-01-01

    The Sensor Driven Robot Systems Testbed has been constructed to provide a working environment to aid in the development of intelligent robot control software. The Testbed employs vision and force as the robot's means of interrogating its environment. The Testbed, which has been operational for approximately 24 months, consists of a PUMA-560 robot manipulator coupled to a 2-dimensional vision system and a force and torque sensing wrist. Recent work within the Testbed environment has led to a highly modularized control software concept with emphasis on detection and resolution of error situations. The objective of the Testbed is to develop intelligent robot control concepts incorporating planning and error recovery which are transportable to a wide variety of robot applications. This is an ongoing, long-term development project and, as such, this paper represents a status report of the development work.

  16. Expediting Experiments across Testbeds with AnyBed: A Testbed-Independent Topology Configuration System and Its Tool Set

    NASA Astrophysics Data System (ADS)

    Suzuki, Mio; Hazeyama, Hiroaki; Miyamoto, Daisuke; Miwa, Shinsuke; Kadobayashi, Youki

    Building an experimental network within a testbed has been a tiresome process for experimenters, due to the complexity of physical resource assignment and the configuration overhead. Moreover, the process could not be expedited across testbeds, because the syntax of a configuration file varies depending on specific hardware and software. Re-configuring an experimental topology for each testbed wastes time; at worst, an experimenter cannot complete his/her experiments within the limited lease time of a testbed. In this paper, we propose AnyBed, an experimental network-building system. The conceptual idea of AnyBed is: "if experimental network topologies can be made portable across any kind of testbed, then building an experimental network on a testbed can be expedited, while experiments are manipulated with each testbed's support tools". To achieve this concept, AnyBed divides an experimental network configuration into logical and physical network topologies. By mapping between these two topologies, AnyBed can build the intended logical network topology on any PC cluster. We have evaluated the AnyBed implementation using two distinct clusters. The evaluation result shows a BGP topology with 150 nodes can be constructed on a large-scale testbed in less than 113 seconds.
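The logical-to-physical separation at the heart of AnyBed can be sketched as follows. The round-robin placement, the node names, and the per-host link-ownership convention are illustrative assumptions for this example, not AnyBed's actual algorithm or configuration format.

```python
def map_topology(logical_links, hosts):
    """Assign each logical node to a physical host (round-robin here)
    and emit per-host link setup lists, independent of testbed syntax."""
    nodes = sorted({n for link in logical_links for n in link})
    placement = {n: hosts[i % len(hosts)] for i, n in enumerate(nodes)}
    config = {h: [] for h in hosts}
    for a, b in logical_links:
        # Convention for the sketch: the host owning `a` configures the link.
        config[placement[a]].append((a, b))
    return placement, config

# Hypothetical 4-node ring topology mapped onto 2 physical cluster machines.
links = [("r1", "r2"), ("r2", "r3"), ("r3", "r4"), ("r4", "r1")]
placement, config = map_topology(links, ["pc1", "pc2"])
```

The same logical topology can be re-mapped onto a different host list without touching the link definitions, which is the portability property the abstract argues for.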

  17. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation. PMID:23230155
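The map-style fan-out such a platform performs across cloud VMs can be sketched at laptop scale with a worker pool. The token-counting `annotate` stand-in is invented for the example and is not GATE's API; a real pipeline would run a full annotator per document.

```python
from concurrent.futures import ThreadPoolExecutor
from collections import Counter

def annotate(doc_id, text):
    """Stand-in for an NLP pipeline: per-document token counts."""
    return doc_id, Counter(text.lower().split())

def process_corpus(corpus, workers=4):
    """Fan documents out over a worker pool and gather the results --
    the same pattern a cloud platform applies across virtual machines."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(lambda kv: annotate(*kv), corpus.items()))

corpus = {"d1": "the cat sat", "d2": "the dog ran the yard"}
counts = process_corpus(corpus)
```

At cloud scale the pool becomes a fleet of VMs and the gather step a storage service, but the load-balancing idea is the same.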

  18. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some results from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

    With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere, and the freeze-up followed as the warm surface lost heat to the atmosphere.
While the surface type had a pronounced impact on boundary

  19. A numerical testbed for the characterization and optimization of aerosol remote sensing

    NASA Astrophysics Data System (ADS)

    Wang, J.; Xu, X.; Ding, S.; Zeng, J.; Spurr, R. J.; Liu, X.; Chance, K.; Holben, B. N.; Dubovik, O.; Mishchenko, M. I.

    2013-12-01

    Remote sensing of aerosols from satellite and ground-based platforms provides key datasets to help understand the effect of airborne particulates on air quality, visibility, surface temperature, clouds, and precipitation. However, global measurements of aerosol parameters have only been generated in the last decade or so, with the advent of dedicated low-earth-orbit sun-synchronous satellite sensors such as those of NASA's Earth Observation System (EOS). Many EOS sensors are now past their design lifetimes. Meanwhile, a number of aerosol-related satellite missions are planned for the future, and several of these will have measurements of polarization. A common question often arises: How can a sensor be optimally configured (in terms of spectral wavelength ranges, viewing angles, and measurement quantities such as radiance and polarization) to best fulfill the scientific requirements within the mission's budget constraints? To address these kinds of questions in a cost-effective manner, a numerical testbed for aerosol remote sensing is an important requirement. This testbed is a tool that can generate an objective assessment of aerosol information content anticipated from any (planned or real) instrument configuration. Here, we present a numerical testbed that combines inverse optimal estimation theory with a forward model containing linearized particle scattering and radiative transfer code. Specifically, the testbed comprises the following components: (1) a linearized vector radiative transfer model that computes the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering parameters at each layer and over the column; (2) linearized Mie and T-matrix electromagnetic scattering codes to compute the macroscopic aerosol single scattering optical properties and their sensitivities with respect to refractive index, size, and shape; (3) a linearized land surface model that uses the Lambertian, Ross-Thick, and Li
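The information-content assessment such a testbed performs rests on optimal estimation theory, where the averaging kernel measures how much the measurement (versus the prior) constrains a retrieved parameter. In the single-parameter scalar case it reduces to a simple ratio, sketched below; all numeric values are illustrative.

```python
def dofs(jacobian_k, noise_var, prior_var):
    """Degrees of freedom for signal for one scalar parameter.

    Scalar form of the averaging kernel
    A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K:
    gain = k^2 / s_e, and A = gain / (gain + 1 / s_a).
    """
    gain = jacobian_k ** 2 / noise_var
    return gain / (gain + 1.0 / prior_var)

# A strongly sensitive, low-noise channel pins the parameter down (A -> 1)...
strong = dofs(jacobian_k=5.0, noise_var=0.01, prior_var=1.0)
# ...while a weak channel leaves the retrieval prior-dominated (A -> 0).
weak = dofs(jacobian_k=0.05, noise_var=1.0, prior_var=1.0)
```

Summing such quantities over all retrieved parameters (the trace of the full averaging-kernel matrix) is a standard way to compare candidate instrument configurations.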

  20. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  1. The JPL Phase B interferometer testbed

    NASA Technical Reports Server (NTRS)

    Eldred, Daniel B.; Oneal, Mike

    1993-01-01

    Future NASA missions with large optical systems will require alignment stability at the nanometer level. However, design studies indicate that vibration resulting from on-board disturbances can cause jitter at levels three to four orders of magnitude greater than this. Feasibility studies have shown that a combination of three distinct control layers will be required for these missions, including disturbance isolation, active and passive structural vibration suppression, and active optical pathlength compensation. The CSI technology challenge is to develop these design and control approaches that can reduce vibrations in the optical train by a factor of 1000 to 10,000. The focus of the paper is on describing the Phase B Testbed structure and facility, as the experimental results are included in other papers presented at this same conference.

  2. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions which communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research on computational methods and exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from combining the NICE command language with SPAR computational modules.

  3. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operation was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that distribution of satellite imagery over the Internet is feasible, and that archiving of large data sets can be accomplished with low-cost storage systems serving multiple users.

  5. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research of issues such as zero-g robot control, dual arm teleoperation, simulations, and hierarchical control using a high level programming language. The testbed will be used to investigate these high risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual arm teleoperation of a pair of 7 degree-of-freedom (DOF) manipulators, each with their own 6-DOF mini-master hand controllers. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low level features. High speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real time dynamic simulation of the testbed operations, allowing a quick and safe means for testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, along with highlights of future developments.

  6. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space explorations missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document consists of the viewgraph slides of the presentation.
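The kind of propagation-delay and loss emulation applied to pass-through traffic can be sketched as follows. This is a simplified stand-in, not MACHETE or QualNet code; the packet format and the deterministic drop-every-Nth loss model are assumptions made for the example.

```python
C_KM_S = 299_792.458            # speed of light, km/s

def propagation_delay_s(distance_km):
    """One-way light-time delay over a free-space link."""
    return distance_km / C_KM_S

def deliver(packets, distance_km, loss_rate=0.0):
    """Tag each (t_send, payload) packet with its arrival time,
    dropping a deterministic fraction to mimic data loss."""
    delay = propagation_delay_s(distance_km)
    out = []
    for i, (t_send, payload) in enumerate(packets):
        if loss_rate and i % int(1 / loss_rate) == 0:
            continue                # dropped packet
        out.append((t_send + delay, payload))
    return out

# Earth-Moon one-way delay over ~384,400 km is about 1.28 s.
moon_delay = propagation_delay_s(384_400)
arrivals = deliver([(0.0, "cmd"), (1.0, "telemetry")], 384_400)
```

A real emulator would additionally model queuing, link outages from orbital geometry, and protocol behavior, which is where tools like QualNet come in.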

  7. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of ground truth gas dispersion and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify and thus provide insight into gas distribution mapping experiments. PMID:26690175
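One simple way to turn sparse sensor-array readings into a continuous ground-truth concentration map is inverse-distance weighting. This sketch is purely illustrative and is not the authors' actual mapping method; the sensor positions and ppm readings are invented.

```python
def idw(sensors, query, power=2):
    """Inverse-distance-weighted estimate of gas concentration at `query`
    from sparse (x, y, reading) sensor samples."""
    num = den = 0.0
    for x, y, c in sensors:
        d2 = (x - query[0]) ** 2 + (y - query[1]) ** 2
        if d2 == 0:
            return c                 # query point sits exactly on a sensor
        w = 1.0 / d2 ** (power / 2)
        num += w * c
        den += w
    return num / den

# Hypothetical readings (ppm) from four sensors of a grid like the LGSA,
# laid out within a 3 m x 6 m area.
sensors = [(0, 0, 10.0), (3, 0, 2.0), (0, 6, 4.0), (3, 6, 1.0)]
estimate = idw(sensors, (1.0, 1.0))
```

Evaluating `idw` on a grid of query points, once per sampling instant, yields the kind of time-dependent concentration map the testbed provides for verification.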

  10. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    SciTech Connect

    Maloney, T.M.; Voecks, G.E.

    1995-01-25

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete, and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway. © 1995 American Institute of Physics.

  11. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    NASA Astrophysics Data System (ADS)

    Maloney, Thomas M.; Prokopius, Paul R.; Voecks, Gerald E.

    1995-01-01

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and testing of two key RFC system components, namely the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete, and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway.

  12. Development of Hardware-in-the-loop Microgrid Testbed

    SciTech Connect

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R; Liu, Guodong; Dowling, Kevin; Ollis, T Ben; Irminger, Philip; Xu, Yan; Dimitrovski, Aleksandar D

    2015-01-01

    A hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems is presented in this paper. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of functional testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  13. The Magellan Final Report on Cloud Computing

    SciTech Connect

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs for the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, spanning performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact on various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO), were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  14. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators which aggregates all base stations' computational resources into a cloud BBU pool. The interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, has become more frequent and complex with the growth of system scale and user requirements. This increases the networking demand among RRHs and BBUs and calls for elastic optical fiber switching and networking. In such a network, multiple resource strata of radio, optical spectrum, and BBU processing are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.

  15. Custom data support for the FAst-physics System Testbed and Research (FASTER) Project

    SciTech Connect

    Toto, T.; Jensen, M.; Vogelmann, A.; Wagener, R.; Liu, Y.; Lin, W.

    2010-03-15

    The multi-institution FAst-physics System Testbed and Research (FASTER) project, funded by the DOE Earth System Modeling program, aims to evaluate and improve the parameterizations of fast processes (those involving clouds, precipitation and aerosols) in global climate models, using a combination of numerical prediction models, single column models, cloud resolving models, large-eddy simulations, full global climate model output and ARM active and passive remote sensing and in-situ data. This poster presents the Custom Data Support effort for the FASTER project. The effort will provide tailored datasets, statistics, best estimates and quality control data, as needed and defined by FASTER participants, for use in evaluating and improving parameterizations of fast processes in GCMs. The data support will include custom gridding and averaging, for the model of interest, using high time resolution and pixel level data from continuous ARM observations and complementary datasets. In addition to the FASTER team, these datasets will be made available to the ARM Science Team. Initial efforts with respect to data product development, priorities, availability and distribution are summarized here with an emphasis on cloud, atmospheric state and aerosol properties as observed during the Spring 2000 Cloud IOP and the Spring 2003 Aerosol IOP at the ARM Southern Great Plains site.

  16. Performance of the optical communication adaptive optics testbed

    NASA Technical Reports Server (NTRS)

    Troy, Mitchell; Roberts, Jennifer; Guiwits, Steve; Azevedo, Steve; Bikkannavar, Siddarayappa; Brack, Gary; Garkanian, Vachik; Palmer, Dean; Platt, Benjamin; Truong, Tuan; Wilson, Keith; Wallace, Kent

    2005-01-01

    We describe the current performance of an adaptive optics testbed for optical communication. This adaptive optics system allows for simulation of night and day-time observing on a 1 meter telescope with a 97 actuator deformable mirror.

  17. Situational descriptions of behavioral procedures: the in situ testbed.

    PubMed Central

    Kemp, S M; Eckerman, D A

    2001-01-01

    We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's (1959) notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement and extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. PMID:11394484
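    The testbed idea, feeding a model's response stream through a contingency of reinforcement and producing a cumulative record, can be sketched minimally for a fixed-ratio schedule (hypothetical function names; Mechner-style notation is not reproduced here):

```python
def fixed_ratio_schedule(ratio):
    """Return a contingency function that reinforces every `ratio`-th response."""
    state = {"count": 0}
    def contingency(response):
        if not response:
            return False
        state["count"] += 1
        if state["count"] >= ratio:
            state["count"] = 0
            return True  # reinforcer delivered
        return False
    return contingency

def cumulative_record(responses, contingency):
    """Cumulative response count per time step, plus reinforcement marks,
    i.e. the data behind a cumulative-record plot."""
    total, record, reinforcers = 0, [], []
    for t, r in enumerate(responses):
        total += int(bool(r))
        record.append(total)
        if contingency(r):
            reinforcers.append(t)
    return record, reinforcers
```

    Plotting `record` against time, with ticks at the `reinforcers` indices, gives the cumulative record that would be compared against a pigeon's.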

  18. CT-directed robotic biopsy testbed: motivation and concept

    NASA Astrophysics Data System (ADS)

    Cleary, Kevin R.; Stoianovici, Dan S.; Glossop, Neil D.; Gary, Kevin A.; Onda, Sumiyo; Cody, Richard; Lindisch, David; Stanimir, Alexandru; Mazilu, Dumitru; Patriciu, Alexandru; Watson, Vance; Levy, Elliot

    2001-05-01

    As a demonstration platform, we are developing a robotic biopsy testbed incorporating a mobile CT scanner, a small needle driver robot, and an optical localizer. This testbed will be used to compare robotically assisted biopsy to the current manual technique, and allow us to investigate software architectures for integrating multiple medical devices. This is a collaboration between engineers and physicians from three universities and a commercial vendor. In this paper we describe the CT-directed biopsy technique, review some other biopsy systems including passive and semi-autonomous devices, describe our testbed components, and present our software architecture. This testbed is a first step in developing the image-guided, robotically assisted, physician-directed biopsy systems of the future.

  19. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset, including its form, contents, and organization, is presented.

  1. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  2. Time-multiplexed open-path TDLAS spectrometer for dynamic, sampling-free, interstitial H2 18O and H2 16O vapor detection in ice clouds

    NASA Astrophysics Data System (ADS)

    Kühnreich, B.; Wagner, S.; Habig, J. C.; Möhler, O.; Saathoff, H.; Ebert, V.

    2015-04-01

    An advanced in situ diode laser hygrometer for simultaneous, sampling-free detection of interstitial H2 16O and H2 18O vapor was developed and tested in the Aerosol Interaction and Dynamics in the Atmosphere (AIDA) cloud chamber during dynamic cloud formation processes. The spectrometer to measure isotope-resolved water vapor concentrations comprises two rapidly time-multiplexed DFB lasers near 1.4 and 2.7 µm and an open-path White cell with 227-m absorption path length and 4-m mirror separation. A dynamic water concentration range from 2.6 ppb to 87 ppm for H2 16O and 87 ppt to 3.6 ppm for H2 18O could be achieved and was used to enable a fast and direct detection of dynamic isotope ratio changes during ice cloud formation in the AIDA chamber at temperatures between 190 and 230 K. Relative changes in the H2 18O/H2 16O isotope ratio of 1 % could be detected and resolved with a signal-to-noise ratio of 7. This converts to an isotope ratio resolution limit of 0.15 % at 1-s time resolution.
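    The reported 1 % change in the H2 18O/H2 16O ratio is a relative change against a reference ratio, which can be sketched as (hypothetical helper names; both concentrations in the same units):

```python
def isotope_ratio(h2_18o, h2_16o):
    """Ratio of heavy to light water vapor (same units for both inputs)."""
    return h2_18o / h2_16o

def relative_ratio_change_percent(r, r_ref):
    """Relative change of the isotope ratio versus a reference, in percent."""
    return 100.0 * (r / r_ref - 1.0)
```

    For example, raising the H2 18O concentration by 1 % at fixed H2 16O yields a 1 % relative ratio change, at the edge of what the abstract says the instrument resolves with SNR 7.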

  3. Comparison of millimeter-wave cloud radar measurements for the Fall 1997 Cloud IOP

    SciTech Connect

    Sekelsky, S.M.; Li, L.; Galloway, J.; McIntosh, R.E.; Miller, M.A.; Clothiaux, E.E.; Haimov, S.; Mace, G.; Sassen, K.

    1998-05-01

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time operational 35 GHz CART Millimeter-wave Cloud Radar (MMCR), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air, the University of Utah 95 GHz radar, and the dual-antenna Pennsylvania State University 94 GHz radar. In this paper the authors discuss several issues relevant to the comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.

  4. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  5. Optical testbed for the LISA phasemeter

    NASA Astrophysics Data System (ADS)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system, aka the phasemeter, was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark and the Danish industry partner Axcon Aps. In particular, it provides the electronic readout of the main interferometer phases, besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging and data transfer. Besides LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three-signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can be individually replaced with flight-qualified hardware in this setup.

  6. Micro-Pixel Image Position Sensing Testbed

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; Shao, Michael; Zhai, Chengxing; Erlig, Hernan; Wang, Xu; Goullioud, Renaud

    2011-01-01

    The search for Earth-mass planets in the habitable zones of nearby Sun-like stars is an important goal of astrophysics. This search is not feasible with the current slate of astronomical instruments. We propose a new concept for microarcsecond astrometry which uses a simplified instrument and hence promises to be low cost. The concept employs a telescope with only a primary, laser metrology applied to the focal plane array, and new algorithms for measuring image position and displacement on the focal plane. The required level of accuracy in both the metrology and image position sensing is at a few micro-pixels. We have begun a detailed investigation of the feasibility of our approach using simulations and a micro-pixel image position sensing testbed called MCT. So far we have been able to demonstrate that the pixel-to-pixel distances in a focal plane can be measured with a precision of 20 micro-pixels and image-to-image distances with a precision of 30 micro-pixels. We have also shown using simulations that our image position algorithm can achieve accuracy of 4 micro-pixels in the presence of lambda/20 wavefront errors.
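    Micro-pixel image position sensing reduces to measuring image positions and image-to-image displacements on the focal plane; an intensity-weighted centroid is the simplest such estimator (a sketch only — the testbed's actual algorithm reaching micro-pixel accuracy is more sophisticated than a plain centroid):

```python
import numpy as np

def image_centroid(img):
    """Intensity-weighted centroid (x, y) of an image on the focal plane array."""
    img = np.asarray(img, dtype=float)
    total = img.sum()
    ys, xs = np.mgrid[0:img.shape[0], 0:img.shape[1]]
    return (xs * img).sum() / total, (ys * img).sum() / total

def image_displacement(img_a, img_b):
    """Image-to-image displacement (dx, dy) between two frames."""
    ax, ay = image_centroid(img_a)
    bx, by = image_centroid(img_b)
    return bx - ax, by - ay
```

    With laser metrology of pixel-to-pixel distances folded in, estimators of this kind are what the testbed pushes to the few-micro-pixel level.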

  7. Ames life science telescience testbed evaluation

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  8. Further progress in watermark evaluation testbed (WET)

    NASA Astrophysics Data System (ADS)

    Kim, Hyung C.; Lin, Eugene T.; Guitart, Oriol; Delp, Edward J., III

    2005-03-01

    While Digital Watermarking has received much attention in recent years, it is still a relatively young technology. There are few accepted tools/metrics that can be used to evaluate the suitability of a watermarking technique for a specific application. This lack of a universally adopted set of metrics/methods has motivated us to develop a web-based digital watermark evaluation system called the Watermark Evaluation Testbed, or WET. Several improvements have been made over the first version of WET. We implemented a batch mode with a queue that allows for user-submitted jobs. In addition to StirMark 3.1 as an attack module, we added attack modules based on StirMark 4.0. As a new image fidelity measure, we evaluate conditional entropy for different watermarking algorithms and different attacks. Also, we show the results of curve fitting the Receiver Operating Characteristic (ROC) analysis data using Parzen window density estimation. The curve fits the data closely while having only two parameters to estimate.
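    The two analysis ingredients named above, a Parzen-window density estimate and an empirical ROC curve over detector scores, can be sketched as follows (hypothetical function names; WET's actual implementation is not described in the abstract):

```python
import numpy as np

def parzen_pdf(samples, x, h):
    """Gaussian Parzen-window density estimate at query points x,
    with bandwidth h, from 1-D samples."""
    samples = np.asarray(samples, dtype=float)[:, None]
    x = np.asarray(x, dtype=float)[None, :]
    k = np.exp(-0.5 * ((x - samples) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return k.mean(axis=0)

def roc_curve(scores_present, scores_absent, thresholds):
    """Empirical (Pfa, Pd) pairs: detector scores with the watermark
    present vs. absent, swept over decision thresholds."""
    present = np.asarray(scores_present, dtype=float)
    absent = np.asarray(scores_absent, dtype=float)
    pd = np.array([(present >= t).mean() for t in thresholds])
    pfa = np.array([(absent >= t).mean() for t in thresholds])
    return pfa, pd
```

    Fitting a two-parameter curve to the (Pfa, Pd) points, as the abstract describes, then smooths the empirical ROC.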

  9. SMILES ice cloud products

    NASA Astrophysics Data System (ADS)

    Millán, L.; Read, W.; Kasai, Y.; Lambert, A.; Livesey, N.; Mendrok, J.; Sagawa, H.; Sano, T.; Shiotani, M.; Wu, D. L.

    2013-06-01

    Upper tropospheric water vapor and clouds play an important role in Earth's climate, but knowledge of them, in particular diurnal variation in deep convective clouds, is limited. An essential variable to understand them is cloud ice water content. The Japanese Superconducting Submillimeter-Wave Limb-Emission Sounder (SMILES) on board the International Space Station (ISS) samples the atmosphere at different local times allowing the study of diurnal variability of atmospheric parameters. We describe a new ice cloud data set consisting of partial Ice Water Path and Ice Water Content. Preliminary comparisons with EOS-MLS, CloudSat-CPR and CALIOP-CALIPSO are presented. Then, the diurnal variation over land and over open ocean for partial ice water path is reported. Over land, a pronounced diurnal variation peaking strongly in the afternoon/early evening was found. Over the open ocean, little temporal dependence was encountered. This data set is publicly available for download in HDF5 format.

  10. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

    To fulfil JINR commitments in different national and international projects related to the use of modern information technologies such as cloud and grid computing, and to provide a modern tool for JINR users' scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover increasing user needs in capacity, availability and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.

  11. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    Requirements for the establishment of propfan technology readiness were determined, and candidate drive systems for propfan application were identified. Candidate testbed aircraft were investigated for suitability, and four aircraft were selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed, and the Boeing KC-135A and the Gulfstream American Gulfstream II were recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed, and cost and schedule data for the entire testbed program were generated. The program total cost was estimated, and a wind tunnel program cost and schedule were generated in support of the testbed program.

  12. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.; Patnaik, Surya N.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is a multidisciplinary design optimization software. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of the structure, is also obtained as a function of reliability. Weight versus reliability traced out an inverted-S-shaped graph. The center of the graph corresponded to a 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, which corresponds to unity for reliability. Weight can be reduced to a small value for the most failure-prone design with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool; the fast probabilistic integrator, or FPI module, of the NESSUS software was the probabilistic calculator; and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.
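    The inverted-S weight-versus-reliability relationship can be illustrated with a toy model in which weight scales a mean margin against normally distributed scatter (all names and parameters here are hypothetical illustrations, not the CometBoards/NESSUS formulation):

```python
from statistics import NormalDist

def required_weight(reliability, w50=100.0, sigma_frac=0.1, w_min=5.0):
    """Toy weight-vs-reliability curve: weight grows with the standard-normal
    quantile of the target reliability, so it diverges as reliability -> 1
    and shrinks toward a floor as reliability -> 0. w50 is the weight at
    50 percent reliability (one failure in two samples)."""
    z = NormalDist().inv_cdf(reliability)
    return max(w_min, w50 * (1.0 + sigma_frac * z))
```

    Sweeping `reliability` over (0, 1) reproduces the qualitative behavior the abstract describes: flat near the 50 percent center, steep toward either extreme.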

  13. NASA's Coastal and Ocean Airborne Science Testbed

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Hooker, S.; Myers, J.; Kudela, R. M.; Dunagan, S.; Soulage, M.; Ellis, T.; Clinton, N. E.; Lobitz, B.; Martin, K.; Zell, P.; Berthold, R. W.; Smith, C.; Andrew, D.; Gore, W.; Torres, J.

    2011-12-01

    The Coastal and Ocean Airborne Science Testbed (COAST) Project is a NASA Earth-science flight mission that will advance coastal ecosystems research by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. Teaming NASA Ames scientists and engineers with Biospherical Instruments, Inc. (San Diego) and UC Santa Cruz, the airborne COAST instrument suite combines a customized imaging spectrometer, sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data will be accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Based on optical detectors called microradiometers, the NASA Ocean Biology and Biogeochemistry Calibration and Validation (cal/val) Office team has deployed advanced commercial off-the-shelf instrumentation that provides in situ measurements of the apparent optical properties at the land/ocean boundary including optically shallow aquatic ecosystems (e.g., lakes, estuaries, coral reefs). A complementary microradiometer instrument package (Biospherical Instruments, Inc.), optimized for use above water, will be flown for the first time with the airborne instrument suite. Details of the October 2011 COAST airborne mission over Monterey Bay demonstrating this new airborne instrument suite capability will be presented, with associated preliminary data on coastal ocean color products, coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  14. Energy Aware Clouds

    NASA Astrophysics Data System (ADS)

    Orgerie, Anne-Cécile; de Assunção, Marcos Dias; Lefèvre, Laurent

    Cloud infrastructures are increasingly becoming essential components for providing Internet services. By benefiting from economies of scale, Clouds can efficiently manage and offer a virtually unlimited number of resources and can minimize the costs incurred by organizations when providing Internet services. However, as Cloud providers often rely on large data centres to sustain their business and offer the resources that users need, the energy consumed by Cloud infrastructures has become a key environmental and economic concern. This chapter presents an overview of techniques that can improve the energy efficiency of Cloud infrastructures. We propose a framework termed Green Open Cloud, which uses energy-efficient solutions for virtualized environments; the framework is validated on a reference scenario.

  15. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
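
The scenario-scripting, fault-injection, and logging/replay capabilities described above can be sketched in miniature. This is a hypothetical illustration only: the component names, fault labels, and scheduling interface below are invented, not the JSC testbed's actual API.

```python
import heapq

class FaultInjector:
    """Minimal scripted fault injection for a simulated habitat (illustrative)."""
    def __init__(self):
        self._events = []  # min-heap of (time, component, fault)

    def schedule(self, t, component, fault):
        heapq.heappush(self._events, (t, component, fault))

    def due(self, t):
        """Pop and return all (component, fault) pairs scheduled at or before t."""
        faults = []
        while self._events and self._events[0][0] <= t:
            _, component, fault = heapq.heappop(self._events)
            faults.append((component, fault))
        return faults

# Scripted scenario: O2 generator degrades at t=5, power bus trips at t=8.
injector = FaultInjector()
injector.schedule(5, "o2_generator", "efficiency_degraded")
injector.schedule(8, "power_bus_A", "trip")

state = {"o2_generator": "nominal", "power_bus_A": "nominal"}
log = []  # per-tick snapshots, which is what enables replaying a scenario
for t in range(10):
    for component, fault in injector.due(t):
        state[component] = fault      # inject the fault into the simulation
    log.append((t, dict(state)))      # log the state for later replay
```

The log of time-stamped state snapshots is the piece that supports the "logging, replaying and re-initializing" of mission scenarios mentioned in the abstract.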

  16. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  17. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology system of the Space Interferometry Mission (SIM). The External Metrology System tracks the fiducials located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal deformations. The Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system in which 6 lengths are measured but only 5 are needed to determine the system. The agreement in the over-constrained length needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have reached the goal only for on-axis observations. We describe the testbed improvements made since our initial results, and outline future Kite changes that will add further effects that SIM faces, in order to make the testbed more SIM-like.
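
The over-constraint check at the heart of Kite (6 measured lengths, 5 degrees of freedom) can be illustrated with a toy closure test. The abstract does not give Kite's actual geometry, so an invented linear redundancy relation stands in for it here: five gauge lengths determine the system, and a sixth, redundant gauge must satisfy m6 = m1 + m2 in a perfect system. Its closure residual is the "agreement" that must stay within 140 pm (Wide-Angle) or 8 pm (Narrow-Angle).

```python
def closure_residual(lengths):
    """lengths: six measured gauge readings in metres (invented geometry)."""
    m1, m2, m3, m4, m5, m6 = lengths
    return m6 - (m1 + m2)   # residual of the assumed redundancy relation

# Simulated readings with ~50 pm of error placed on the redundant gauge.
readings = [0.300, 0.400, 0.250, 0.310, 0.280, 0.700 + 50e-12]
res = closure_residual(readings)
wide_angle_goal = 140e-12    # 140 pm
narrow_angle_goal = 8e-12    # 8 pm
```

In this synthetic case the 50 pm residual would pass the Wide-Angle requirement but fail the much tighter Narrow-Angle one, mirroring the status reported in the abstract.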

  18. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    PubMed Central

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, this decoupled architecture cannot optimize and schedule resources across domains to provide high-level service guarantees. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated against other provisioning schemes, in terms of resource utilization, path blocking probability, network cost and path provisioning latency, demonstrating the efficiency of the MDRI architecture. PMID:27465296
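
As a rough illustration of the auxiliary-graph idea behind an integrated provisioning scheme (the paper's actual graph construction, layers, and edge weights are not reproduced here), radio, processing and optical resources can be modeled as layers of one graph, and a service provisioned along a minimum-cost path that crosses all layers. All node names and costs below are invented.

```python
import heapq

# Invented auxiliary graph: remote radio heads (RRH) -> baseband pool (BBU)
# -> optical cross-connects (OXC) -> data-centre cloud (DC).
edges = {
    "RRH1":     [("BBU_pool", 2.0)],
    "RRH2":     [("BBU_pool", 3.5)],
    "BBU_pool": [("OXC_A", 1.0)],
    "OXC_A":    [("OXC_B", 1.5), ("DC", 4.0)],
    "OXC_B":    [("DC", 1.0)],
    "DC":       [],
}

def min_cost_path(src, dst):
    """Dijkstra over the auxiliary graph; returns (cost, path)."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in edges[node]:
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

cost, path = min_cost_path("RRH1", "DC")
```

Because all three resource dimensions live in one graph, a single shortest-path computation yields a jointly optimized end-to-end provisioning decision rather than three independent per-domain ones.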

  19. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, this decoupled architecture cannot optimize and schedule resources across domains to provide high-level service guarantees. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated against other provisioning schemes, in terms of resource utilization, path blocking probability, network cost and path provisioning latency, demonstrating the efficiency of the MDRI architecture.

  20. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-01-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, this decoupled architecture cannot optimize and schedule resources across domains to provide high-level service guarantees. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated against other provisioning schemes, in terms of resource utilization, path blocking probability, network cost and path provisioning latency, demonstrating the efficiency of the MDRI architecture. PMID:27465296

  1. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network.

    PubMed

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-28

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network and the processing-unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, this decoupled architecture cannot optimize and schedule resources across domains to provide high-level service guarantees. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load is also quantitatively evaluated against other provisioning schemes, in terms of resource utilization, path blocking probability, network cost and path provisioning latency, demonstrating the efficiency of the MDRI architecture.

  2. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may introduce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud-contaminated radiances: (1) cloud clearing using spatially adjacent cloud-contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear-air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field-of-view case. However, the accuracy of the clear-air radiance signal depends upon the uniformity of cloud height and optical properties across the two fields of view used in the cloud-clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud-top levels, the accuracy near the cloud top being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud-top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model, given a physically correct cloud radiative transfer model, using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud-contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used.
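
The first of the three approaches, adjacent-pair cloud clearing, can be illustrated with the classic two-field-of-view formulation: both fields of view share the same clear radiance R_clr and cloud radiance R_cld but have different cloud fractions N1 < N2, so the clear-air radiance can be recovered from the pair. The numbers below are synthetic, chosen only to demonstrate the algebra.

```python
def cloud_clear(r1, r2, n_star):
    """Two-FOV cloud clearing.
    Model: R_i = (1 - N_i) * R_clr + N_i * R_cld for i = 1, 2.
    With N* = N1 / N2, the clear radiance is
    R_clr = (R1 - N* * R2) / (1 - N*)."""
    return (r1 - n_star * r2) / (1.0 - n_star)

# Synthetic check: build the two observed radiances from known values.
R_clr, R_cld = 100.0, 40.0      # arbitrary radiance units
N1, N2 = 0.2, 0.5               # cloud fractions in the two fields of view
R1 = (1 - N1) * R_clr + N1 * R_cld
R2 = (1 - N2) * R_clr + N2 * R_cld
recovered = cloud_clear(R1, R2, N1 / N2)
```

The recovery is exact only when the cloud height and optical properties are identical across the two fields of view, which is precisely the uniformity caveat raised in the abstract.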

  3. A Testbed for Deploying Distributed State Estimation in Power Grid

    SciTech Connect

    Jin, Shuangshuang; Chen, Yousu; Rice, Mark J.; Liu, Yan; Gorton, Ian

    2012-07-22

    With the increasing demand, scale and data volume of power systems, fast distributed applications are becoming more important in power system operation and control. This paper proposes a testbed for evaluating power system distributed applications, considering data exchange among distributed areas. A high-performance computing (HPC) version of distributed state estimation is implemented and used as a distributed application example. The IEEE 118-bus system is used to deploy the parallel distributed state estimation, and the MeDICi middleware is used for data communication. The performance of the testbed demonstrates its capability to evaluate parallel distributed state estimation by leveraging the HPC paradigm. This testbed can also be applied to evaluate other distributed applications.
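
For readers unfamiliar with the application being distributed here, the core of power-system state estimation is a weighted least squares fit. The sketch below is a linearized two-bus toy, nothing like the full IEEE 118-bus distributed implementation: given measurements z = H x + e, the estimate is x_hat = (H^T W H)^(-1) H^T W z.

```python
# Toy "DC" state estimation: state x = two bus voltage angles.
H = [[1.0, 0.0],    # direct angle measurement at bus 1
     [0.0, 1.0],    # direct angle measurement at bus 2
     [1.0, -1.0]]   # line-flow measurement, proportional to angle difference
W = [1.0, 1.0, 2.0]            # diagonal weights (inverse measurement variances)
x_true = [0.10, 0.05]
z = [h[0] * x_true[0] + h[1] * x_true[1] for h in H]   # noiseless measurements

# Normal equations A x = b with A = H^T W H and b = H^T W z.
A = [[sum(W[k] * H[k][i] * H[k][j] for k in range(3)) for j in range(2)]
     for i in range(2)]
b = [sum(W[k] * H[k][i] * z[k] for k in range(3)) for i in range(2)]

# Solve the 2x2 system by Cramer's rule.
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
x_hat = [(b[0] * A[1][1] - b[1] * A[0][1]) / det,
         (A[0][0] * b[1] - A[1][0] * b[0]) / det]
```

In a distributed formulation each area solves such a problem for its own buses and exchanges boundary values with neighbouring areas, which is the data-exchange pattern the testbed is built to evaluate.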

  4. Technology developments integrating a space network communications testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low-cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enables its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive considerations of different aspects of space missions.

  5. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of a three-volume set, all of which contain the results of the TTPP, presents the integrated results. Background on the program and highlights of its results are provided, and the various testbed experiments and the programmatic approach are summarized. The results are first summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline, and then integrated across disciplines, summarizing the lessons learned overall.

  6. Laser Metrology in the Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    An, Xin; Marx, D.; Goullioud, Renaud; Zhao, Feng

    2004-01-01

    The Space Interferometer Mission (SIM), scheduled for launch in 2009, is a space-borne visible-light stellar interferometer capable of micro-arcsecond-level astrometry. The Micro-Arcsecond Metrology testbed (MAM) is the ground-based testbed that incorporates all the functionalities of SIM minus the telescope, for mission-enabling technology development and verification. MAM employs a laser heterodyne metrology system using the Sub-Aperture Vertex-to-Vertex (SAVV) concept. In this paper, we describe the development and modification of the SAVV metrology launchers and the metrology instrument electronics, precision alignments and pointing control, locating cyclic error sources in the MAM testbed and methods to mitigate the cyclic errors, as well as the performance under the MAM performance metrics.

  7. An Experimental Testbed for Evaluation of Trust and Reputation Systems

    NASA Astrophysics Data System (ADS)

    Kerr, Reid; Cohen, Robin

    To date, trust and reputation systems have often been evaluated using methods of their designers’ own devising. Recently, we demonstrated that a number of noteworthy trust and reputation systems could be readily defeated, revealing limitations in their original evaluations. Efforts in the trust and reputation community to develop a testbed have yielded a successful competition platform, ART. This testbed, however, is less suited to general experimentation and evaluation of individual trust and reputation technologies. In this paper, we propose an experimentation and evaluation testbed based directly on that used in our investigations into security vulnerabilities in trust and reputation systems for marketplaces. We demonstrate the advantages of this design, towards the development of more thorough, objective evaluations of trust and reputation systems.

  8. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a nearly completely automated, end-to-end, satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a very difficult and time-consuming one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration), it is important to reflect upon these experiences and to collect the knowledge that has been gained so that improvements can be made to the existing system. It is also believed that these experiences are of value to others in the robotics community. Therefore, the primary objective here will be to use the Telerobot Testbed as a case study to identify real problems and technological gaps which exist in the areas of robotics and, in particular, systems integration. Such problems have surely hindered the development of what could be reasonably called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  9. The Wide-Field Imaging Interferometry Testbed: Recent Progress

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.

    2010-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) at NASA's Goddard Space Flight Center was designed to demonstrate the practicality and application of techniques for wide-field spatial-spectral ("double Fourier") interferometry. WIIT is an automated system, and it is now producing substantial amounts of high-quality data from its state-of-the-art operating environment, Goddard's Advanced Interferometry and Metrology Lab. In this paper, we discuss the characterization and operation of the testbed and present the most recent results. We also outline future research directions. A companion paper within this conference discusses the development of new wide-field double Fourier data analysis algorithms.

  10. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of a three-volume set, all of which contain the results of the TTPP, is the executive summary.

  11. The Living With a Star Space Environment Testbed Program

    NASA Technical Reports Server (NTRS)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living with a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET program will infuse new technologies into the space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  13. X.509 Authentication/Authorization in FermiCloud

    SciTech Connect

    Kim, Hyunwoo; Timm, Steven

    2014-11-11

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.

  14. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel N.; Hottovy, Scott

    2016-06-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes -- open versus closed cells -- fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. With this new conceptual viewpoint, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions.
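
As a hedged illustration only (this is not the authors' model): the "stochastic diffusion" viewpoint can be mimicked with a generic Ornstein-Uhlenbeck process standing in for a humidity-like variable q, with an invented threshold q_c separating "open-cell" (q < q_c) from "closed-cell" (q >= q_c) states. All parameter values below are arbitrary.

```python
import math
import random

random.seed(1)
theta, mu, sigma, dt = 1.0, 0.5, 0.3, 0.01   # relaxation rate, mean, noise, step
q, q_c = mu, 0.5                             # start at the mean; threshold = mean
n_open = n_closed = 0
for _ in range(100_000):
    # Euler-Maruyama step of dq = theta*(mu - q)*dt + sigma*dW
    q += theta * (mu - q) * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
    if q < q_c:
        n_open += 1
    else:
        n_closed += 1
frac_open = n_open / (n_open + n_closed)     # time fraction in the open-cell state
```

With the threshold placed at the process mean, the trajectory wanders back and forth across q_c and spends roughly half its time in each regime; moving q_c relative to mu biases the system toward one regime, which is the flavour of behaviour a phase-transition description organizes.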

  15. A programmable laboratory testbed in support of evaluation of functional brain activation and connectivity.

    PubMed

    Barbour, Randall L; Graber, Harry L; Xu, Yong; Pei, Yaling; Schmitz, Christoph H; Pfeil, Douglas S; Tyagi, Anandita; Andronica, Randy; Lee, Daniel C; Barbour, San-Lian S; Nichols, J David; Pflieger, Mark E

    2012-03-01

    An important determinant of the value of quantitative neuroimaging studies is the reliability of the derived information, which is a function of the data collection conditions. Near infrared spectroscopy (NIRS) and electroencephalography are independent sensing domains that are well suited to explore principal elements of the brain's response to neuroactivation, and whose integration supports development of compact, even wearable, systems suitable for use in open environments. In an effort to maximize the translatability and utility of such resources, we have established an experimental laboratory testbed that supports measurement and analysis of simulated macroscopic bioelectric and hemodynamic responses of the brain. Principal elements of the testbed include 1) a programmable anthropomorphic head phantom containing a multisignal source array embedded within a matrix that approximates the background optical and bioelectric properties of the brain, 2) integrated translatable headgear that supports multimodal studies, and 3) an integrated data analysis environment that supports anatomically based mapping of experiment-derived measures, both directly observable and not. Here, we present a description of system components and fabrication, an overview of the analysis environment, and findings from a representative study that document the ability to experimentally validate effective connectivity models based on NIRS tomography.

  16. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting from totally distributed to centralized approaches. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  17. Interferometric adaptive optics testbed for laser pointing, wave-front control and phasing.

    PubMed

    Baker, K L; Homoelle, D; Utternback, E; Stappaerts, E A; Siders, C W; Barty, C P J

    2009-09-14

    Implementing the capability to perform fast ignition experiments, as well as radiography experiments, on the National Ignition Facility (NIF) places stringent requirements on the control of each beam's pointing, intra-beam phasing and overall wave-front quality. In this article, experimental results are presented from an interferometric adaptive optics testbed that was designed and built to test the capability of such a system to control phasing, pointing and higher-order beam aberrations. These measurements included quantification of the reduction in Strehl ratio incurred when using the MEMS device to correct for pointing errors in the system. The interferometric adaptive optics system achieved a Strehl ratio of 0.83 when correcting for a piston and tip/tilt error between two adjacent rectangular apertures, the geometry expected for the National Ignition Facility. The system also achieved a Strehl ratio of 0.66 when used to correct for a phase-plate aberration of similar magnitude to that expected from simulations of the ARC beam line. All of these corrections included measuring both the upstream and downstream aberrations in the testbed and applying the sum of these two measurements in open loop to the MEMS deformable mirror.
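
As a side calculation (not the testbed's measurement method), the quoted Strehl ratios can be translated into RMS wave-front error through the Marechal approximation S ~ exp(-(2*pi*sigma/lambda)^2), which is valid for small aberrations. The 1053 nm wavelength is an assumption here, being the Nd:glass fundamental used at NIF.

```python
import math

def strehl(sigma, lam):
    """Marechal estimate of Strehl ratio for RMS wave-front error sigma."""
    return math.exp(-(2.0 * math.pi * sigma / lam) ** 2)

def rms_error_for_strehl(s, lam):
    """Invert the Marechal formula: RMS wave-front error implied by Strehl s."""
    return lam * math.sqrt(-math.log(s)) / (2.0 * math.pi)

lam = 1053e-9                                  # assumed wavelength (m)
sigma_083 = rms_error_for_strehl(0.83, lam)    # error behind the S = 0.83 result
sigma_066 = rms_error_for_strehl(0.66, lam)    # error behind the S = 0.66 result
```

Under these assumptions a Strehl of 0.83 corresponds to roughly lambda/15 of residual RMS wave-front error, giving a feel for how tight the reported corrections are.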

  18. Interferometric adaptive optics testbed for laser pointing, wave-front control and phasing.

    PubMed

    Baker, K L; Homoelle, D; Utternback, E; Stappaerts, E A; Siders, C W; Barty, C P J

    2009-09-14

    Implementing the capability to perform fast ignition experiments, as well as radiography experiments, on the National Ignition Facility (NIF) places stringent requirements on the control of each beam's pointing, intra-beam phasing and overall wave-front quality. In this article, experimental results are presented from an interferometric adaptive optics testbed that was designed and built to test the capability of such a system to control phasing, pointing and higher-order beam aberrations. These measurements included quantification of the reduction in Strehl ratio incurred when using the MEMS device to correct for pointing errors in the system. The interferometric adaptive optics system achieved a Strehl ratio of 0.83 when correcting for a piston and tip/tilt error between two adjacent rectangular apertures, the geometry expected for the National Ignition Facility. The system also achieved a Strehl ratio of 0.66 when used to correct for a phase-plate aberration of similar magnitude to that expected from simulations of the ARC beam line. All of these corrections included measuring both the upstream and downstream aberrations in the testbed and applying the sum of these two measurements in open loop to the MEMS deformable mirror. PMID:19770884

  19. Delft testbed interferometer: layout design and research goals

    NASA Astrophysics Data System (ADS)

    van Brug, Hedser H.; van den Dool, Teun; Gielesen, Wim; Giesen, Peter; Oostdijck, Bastiaan; d'Arcio, Luigi

    2003-02-01

    The Delft Testbed Interferometer (DTI) will be presented. The main purpose for the DTI is to demonstrate the feasibility of homothetic mapping, both fixed and under scanning conditions. The driving design issues behind the DTI will be presented together with a list of experiments to be conducted with the DTI system in the field of wide field imaging.

  20. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  1. Operation Duties on the F-15B Research Testbed

    NASA Technical Reports Server (NTRS)

    Truong, Samson S.

    2010-01-01

    This presentation entails what I have done this past summer for my Co-op tour in the Operations Engineering Branch. Activities included supporting the F-15B Research Testbed, supporting the incoming F-15D models, design work, and other operations engineering duties.

  2. Smart Antenna UKM Testbed for Digital Beamforming System

    NASA Astrophysics Data System (ADS)

    Islam, Mohammad Tariqul; Misran, Norbahiah; Yatim, Baharudin

    2009-12-01

    A new design of smart antenna testbed developed at UKM for digital beamforming purposes is proposed. The smart antenna UKM testbed is developed based on a modular design employing two novel designs: an L-probe fed inverted hybrid E-H (LIEH) array antenna and a software-reconfigurable digital beamforming system (DBS). The antenna is based on the novel LIEH microstrip patch element design arranged into a [InlineEquation not available: see fulltext.] uniform linear array antenna. The modular concept of the system provides the capability to test the antenna hardware, beamforming unit, and beamforming algorithm independently, allowing the smart antenna system to be developed and tested in parallel and hence reducing the design time. The DBS was developed using a high-performance [InlineEquation not available: see fulltext.] floating-point DSP board and a 4-channel RF front-end receiver developed in-house. An interface board is designed to connect the ADC board with the RF front-end receiver. A four-element receiving array testbed at 1.88-2.22 GHz is constructed, and digital beamforming on this testbed is successfully demonstrated.
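Digital beamforming on a small uniform linear array like this one reduces, in the narrowband case, to applying per-element phase weights and summing; a minimal sketch, with the half-wavelength spacing and steering angle as illustrative assumptions rather than the testbed's actual parameters:

```python
import cmath
import math

def steering_vector(n_elements: int, d_over_lambda: float, theta_deg: float):
    """Narrowband steering vector for a uniform linear array."""
    theta = math.radians(theta_deg)
    return [cmath.exp(-2j * math.pi * d_over_lambda * n * math.sin(theta))
            for n in range(n_elements)]

def beamform(snapshots, weights):
    """Weighted (conjugate) sum of one complex snapshot per element."""
    return sum(w.conjugate() * x for w, x in zip(weights, snapshots))

# Steer a 4-element half-wavelength array toward 20 degrees: a noise-free
# signal arriving from 20 degrees is summed coherently (gain close to 4).
w = steering_vector(4, 0.5, 20.0)
x = steering_vector(4, 0.5, 20.0)
gain = abs(beamform(x, w))
```

A DSP-board implementation applies the same weights to digitized I/Q samples from each RF front-end channel.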

  3. Extending the Information Commons: From Instructional Testbed to Internet2

    ERIC Educational Resources Information Center

    Beagle, Donald

    2002-01-01

    The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

  4. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the next generation space telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing final obtained wavefronts from different tests. The developmental cryogenic active telescope testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryocapable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments; cryogenic, low-hold-power actuators; or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design will be discussed.

  5. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which allows services to be accommodated in optical networks. This study extends that work to consider the optimization of multiple resource dimensions, namely radio, optical and BBU processing, in the 5G age. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical and BBU resources effectively to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated, in terms of resource occupation rate and path provisioning latency, against another provisioning scheme. PMID:27137302
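The two evaluation metrics named in the abstract, resource occupation rate and path provisioning latency, can be computed from a provisioning log; a toy sketch with an assumed log format (not the paper's actual instrumentation):

```python
def occupation_rate(in_use: int, total: int) -> float:
    """Fraction of resource units (e.g. BBUs or wavelengths) allocated."""
    return in_use / total

def mean_provisioning_latency(events) -> float:
    """Average (completion - request) time over provisioning events.

    `events` is an assumed list of (request_time, completion_time) pairs,
    both in the same time unit (e.g. milliseconds).
    """
    latencies = [done - req for req, done in events]
    return sum(latencies) / len(latencies)

# 30 of 120 units busy, two paths provisioned in 12 ms and 16 ms:
rate = occupation_rate(30, 120)                              # 0.25
latency = mean_provisioning_latency([(0.0, 12.0), (5.0, 21.0)])  # 14.0
```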


  7. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  8. Development of structural health monitoring systems for railroad bridge testbeds

    NASA Astrophysics Data System (ADS)

    Park, Hyun-Jun; Min, Jiyoung; Yun, Chung-Bang; Shin, Min-Ho; Kim, Yong-Su; Park, Su-Yeol

    2011-04-01

    Recently, a challenging project has been carried out to construct a national network for safety management and monitoring of civil infrastructures in Korea. As a part of the project, structural health monitoring (SHM) systems have been established on railroad bridges employing various types of sensors such as accelerometers, optical fiber sensors, and piezoelectric sensors. This paper presents the current status of the railroad bridge health monitoring testbeds. Emerging sensors and monitoring technologies are under investigation: local damage detection using PZT-based electro-mechanical impedances; vibration-based global monitoring using accelerations and FBG-based dynamic strains; and wireless sensor data acquisition systems. The monitoring systems provide real-time measurements under train-transit and environmental loadings, and can be remotely accessed and controlled via the web. Long-term behaviors of the railroad bridge testbeds are investigated, and guidelines for safety management are to be established by combining numerical analysis and signal processing of the measured data.
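PZT electro-mechanical impedance methods commonly score local damage with a root-mean-square-deviation (RMSD) index between a baseline and a current impedance signature; a hedged sketch of that common index (the abstract does not specify which metric this testbed uses):

```python
import math

def rmsd_damage_index(baseline, current) -> float:
    """RMSD (percent) between two impedance signatures sampled at the
    same frequencies; 0 for an unchanged signature, larger with damage."""
    if len(baseline) != len(current):
        raise ValueError("signatures must be the same length")
    num = sum((c - b) ** 2 for b, c in zip(baseline, current))
    den = sum(b ** 2 for b in baseline)
    return 100.0 * math.sqrt(num / den)

# Illustrative signatures (real ones span many frequency points):
healthy = [100.0, 102.0, 98.0, 101.0]
damaged = [100.0, 110.0, 95.0, 104.0]
index = rmsd_damage_index(healthy, damaged)  # a few percent
```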

  9. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth science, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of the three-volume set containing the results of the TTPP, presents summaries of the experiments. This experiment involves the evaluation of the current Internet for file and image transfer between SIRTF instrument teams. The main issue addressed was current network response times.

  10. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  11. Amplitude variations on the Extreme Adaptive Optics testbed

    SciTech Connect

    Evans, J; Thomas, S; Dillon, D; Gavel, D; Phillion, D; Macintosh, B

    2007-08-14

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. At the Laboratory for Adaptive Optics on the Extreme Adaptive Optics testbed, we have already demonstrated wavefront control of better than 1 nm rms within controllable spatial frequencies. Corresponding contrast measurements, however, are limited by amplitude variations, including those introduced by the micro-electrical-mechanical-systems (MEMS) deformable mirror. Results from experimental measurements and wave-optics simulations of amplitude variations on the ExAO testbed are presented. We find systematic intensity variations of about 2% rms, rising to 6% with the MEMS in the beam. Some errors are introduced by phase and amplitude mixing because the MEMS is not conjugate to the pupil, but independent measurements of MEMS reflectivity suggest that some error is introduced by small non-uniformities in the reflectivity.
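The 2% and 6% figures are fractional RMS intensity variations over the pupil; the statistic itself is simple to compute (the sample values below are illustrative, not testbed data):

```python
import math

def rms_intensity_variation(intensities) -> float:
    """Fractional RMS variation of pupil-plane intensity samples,
    normalized by the mean intensity."""
    mean = sum(intensities) / len(intensities)
    var = sum((i - mean) ** 2 for i in intensities) / len(intensities)
    return math.sqrt(var) / mean

samples = [1.00, 1.02, 0.98, 1.01, 0.99]
variation = rms_intensity_variation(samples)  # about 0.014, i.e. ~1.4% rms
```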

  12. FDIR Validation Test-Bed Development and Results

    NASA Astrophysics Data System (ADS)

    Karlsson, Alexander; Sakthivel, Anandhavel; Aberg, Martin; Andersson, Jan; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2015-09-01

    This paper describes work being performed by Cobham Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow testing of concepts, strategy mechanisms and tools related to FDIR. The resulting facility will have the capabilities to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing the activity is in the validation phase.

  13. Development of a FDIR Validation Test-Bed

    NASA Astrophysics Data System (ADS)

    Andersson, Jan; Cederman, Daniel; Habinc, Sandi; Dellandrea, Brice; Nodet, Jean-Christian; Guettache, Farid; Furano, Gianluca

    2014-08-01

    This paper describes work being performed by Aeroflex Gaisler and Thales Alenia Space France for the European Space Agency to develop an extension of the existing avionics system testbed facility in ESTEC's Avionics Lab. The work is funded by the European Space Agency under contract 4000109928/13/NL/AK. The resulting FDIR (Fault Detection, Isolation and Recovery) testbed will allow testing of concepts, strategy mechanisms and tools related to FDIR. The resulting facility will have the capabilities to support nominal and off-nominal test cases and to support tools for post-test and post-simulation analysis. Ultimately, the purpose of the output of this activity is to provide a tool for assessment and validation at laboratory level. This paper describes an on-going development; at the time of writing the activity is in the preliminary design phase.

  14. Optical modeling of the wide-field imaging interferometry testbed

    NASA Astrophysics Data System (ADS)

    Thompson, Anita K.; Martino, Anthony J.; Rinehart, Stephen A.; Leisawitz, David T.; Leviton, Douglas B.; Frey, Bradley J.

    2006-06-01

    The technique of wide-field imaging for optical/IR interferometers for missions like the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Terrestrial Planet Finder (TPF-I)/DARWIN has been demonstrated through the Wide-field Imaging Interferometry Testbed (WIIT). In this paper, we present an optical model of the WIIT testbed built with the commercially available optical modeling and analysis software FRED. Interferometric results for some simple source targets are presented for a model with ideal surfaces and compared with theoretical closed-form solutions. Measured surface deformation data for all mirror surfaces, in the form of Zernike coefficients, are then added to the optical model, and its results for the same simple source targets are compared with laboratory test data. We discuss the sources of error and approximations in the current FRED optical model. Future plans to refine the optical model are also discussed.
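Surface deformation expressed as Zernike coefficients can be turned back into a surface map by summing the polynomial basis; a minimal sketch covering only the first four terms in a Noll-style normalization (conventions vary, and FRED's may differ):

```python
import math

def zernike_surface(rho: float, theta: float, coeffs) -> float:
    """Surface sag at polar pupil coordinates (rho in [0, 1], theta) from
    four Zernike coefficients: piston, x tilt, y tilt, defocus."""
    basis = [
        1.0,                                       # piston
        2.0 * rho * math.cos(theta),               # tilt x
        2.0 * rho * math.sin(theta),               # tilt y
        math.sqrt(3.0) * (2.0 * rho ** 2 - 1.0),   # defocus
    ]
    return sum(c * b for c, b in zip(coeffs, basis))

# Pure defocus: center of the pupil sags opposite to the edge.
center = zernike_surface(0.0, 0.0, [0.0, 0.0, 0.0, 1.0])
edge = zernike_surface(1.0, 0.0, [0.0, 0.0, 0.0, 1.0])
```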

  15. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.
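Instruments on the testbed's private Ethernet are typically driven with SCPI text commands over a TCP socket; a hedged sketch of that pattern (the host, port and command are hypothetical, though port 5025 is a common raw-SCPI convention):

```python
import socket

def scpi_query(host: str, port: int, command: str, timeout: float = 5.0) -> str:
    """Send one SCPI text command over TCP and return the reply line."""
    with socket.create_connection((host, port), timeout=timeout) as sock:
        sock.sendall(command.encode("ascii") + b"\n")
        reply = sock.makefile("r", encoding="ascii").readline()
    return reply.strip()

# Hypothetical usage against a remote test instrument:
# scpi_query("192.168.1.20", 5025, "*IDN?")
```

GPIB instruments are reached the same way through an Ethernet-to-GPIB gateway, which is what keeps the two PCs the sole connection to the outside world.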

  16. Delft testbed interferometer: a homothetic mapping test setup

    NASA Astrophysics Data System (ADS)

    van Brug, Hedser; Oostdijck, Bastiaan; van den Dool, Teun; Giesen, Peter; Gielesen, Wim

    2004-02-01

    The Delft Testbed Interferometer (DTI) will be presented. The basics of homothetic mapping will be explained together with the method of fulfilling the requirements as chosen in the DTI setup. The optical layout incorporates a novel tracking concept enabling the use of homothetic mapping in real telescope systems for observations on the sky. The requirements for homothetic mapping and the choices made in the DTI setup will be discussed. Finally the planned experiments will be discussed.

  17. System integration of a Telerobotic Demonstration System (TDS) testbed

    NASA Technical Reports Server (NTRS)

    Myers, John K.

    1987-01-01

    The concept for and status of a telerobotic demonstration system testbed that integrates teleoperation and robotics is described. The components of the telerobotic system are described and the ongoing projects are discussed. The system can be divided into two sections: the autonomous subsystems, and the additional interface and support subsystems including teleoperations. The workings of each subsystem by itself and how the subsystems integrate into a complete system is discussed.

  18. Development and experimentation of an eye/brain/task testbed

    NASA Technical Reports Server (NTRS)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationship of operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in space and defense, paraplegics, and the monitoring of boring screens (nuclear power plants, air defense, etc.).

  19. Planning and reasoning in the JPL telerobot testbed

    NASA Technical Reports Server (NTRS)

    Peters, Stephen; Mittman, David; Collins, Carol; Omeara, Jacquie; Rokey, Mark

    1990-01-01

    The Telerobot Interactive Planning System was developed to serve as the highest autonomous-control level of the Telerobot Testbed. A recent prototype is described which integrates an operator interface for supervisory control, a task planner supporting disassembly and re-assembly operations, and a spatial planner for collision-free manipulator motion through the workspace. Each of these components is described in detail. Descriptions of the technical problem, approach, and lessons learned are included.

  20. Visible Nulling Coronagraphy Testbed Development for Exoplanet Detection

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-01-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few that works with filled, segmented, and sparse or diluted-aperture telescope systems, and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones at sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2*lambda/D, ultimately culminating in spectrally broadband (>20%) high-contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified to a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. The optical configuration, laboratory results, critical technologies, and the null sensing and control approach are discussed.
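Contrast milestones of this order imply very tight phase and amplitude control. A standard small-error approximation for the leakage of a two-beam nuller (a textbook relation, not taken from this paper) is N ≈ (Δφ/2)² + (δa/2)²:

```python
def null_depth(phase_error_rad: float, amplitude_mismatch: float) -> float:
    """Small-error null leakage of a two-beam nulling interferometer:
    N ~ (dphi/2)^2 + (da/2)^2, for phase error dphi (radians) and
    fractional amplitude mismatch da between the two beams."""
    return (phase_error_rad / 2.0) ** 2 + (amplitude_mismatch / 2.0) ** 2

# Reaching a 1e-9 null with no amplitude mismatch requires holding the
# differential phase near 6e-5 rad:
n = null_depth(6.3e-5, 0.0)  # close to 1e-9
```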

  1. The Mini-Mast CSI testbed: Lessons learned

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-01-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.

  2. Cyber security analysis testbed : combining real, emulation, and simulation.

    SciTech Connect

    Villamarin, Charles H.; Eldridge, John M.; Van Leeuwen, Brian P.; Urias, Vincent E.

    2010-07-01

    Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed novel methods to combine these evaluation platforms into a hybrid testbed that combines real, emulated, and simulated components. The combination of real, emulated, and simulated components enables the analysis of security features and components of a networked information system. When performing cyber security analysis on a system of interest, it is critical to realistically represent the subject security components in high fidelity. In some experiments, the security component may be the actual hardware and software with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single, unified computing platform. This provides an 'experiment-in-a-box' capability. The result is rapidly-produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.

  3. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult to conduct, if not entirely impossible, and collection of data is laborious and expensive - so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and deliver a simulated measurement to a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server, respectively), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source queuing library). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
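The core of a synthetic sensor is a query against the ground-truth model plus simulated measurement error; a minimal sketch of that logic (the ground-truth function, field names and noise level are illustrative, and the abstract's actual services wrap equivalent logic in RESTful Flask endpoints with JSON-LD encodings):

```python
import json
import random

def synthetic_sensor_reading(ground_truth, x, y, noise_sd=0.05, rng=None):
    """Sample a ground-truth model at (x, y) and return a noisy
    'measurement' serialized as a JSON document."""
    rng = rng or random.Random()
    true_value = ground_truth(x, y)
    measured = true_value + rng.gauss(0.0, noise_sd)
    return json.dumps({"x": x, "y": y, "value": measured, "noise_sd": noise_sd})

# Hypothetical ground truth: ore grade increasing to the east.
grade = lambda x, y: 1.5 + 0.01 * x
reading = synthetic_sensor_reading(grade, 100.0, 0.0, rng=random.Random(42))
```

Because the noise model is explicit, the same sensor can be rerun with different error levels to ask how much a better instrument would actually improve the downstream model.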

  4. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  5. Thin Clouds

    Atmospheric Science Data Center

    2013-04-18

    ... their delicate appearance, thin, feathery clouds of ice crystals called cirrus may contribute to global warming. Some scientists ... July 9, 2002 - Thin, feathery clouds of ice crystals over the Caribbean Sea. project:  MISR ...

  6. Development of Liquid Propulsion Systems Testbed at MSFC

    NASA Technical Reports Server (NTRS)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, cis-lunar and deep space, the need to create more cost-effective techniques of propulsion system design, manufacturing and test is imperative in the current budget-constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high-performance liquid propulsion systems necessary to deliver crew and cargo to the farther reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA-developed elements, but it can also be used to test articles developed by other government agencies, industry or academia. A joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  7. Openings

    PubMed Central

    Selwyn, Peter A.

    2015-01-01

    Reviewing his clinic patient schedule for the day, a physician reflects on the history of a young woman he has been caring for over the past 9 years. What starts out as a routine visit then turns into a unique opening for communication and connection. A chance glimpse out the window of the exam room leads to a deeper meditation on parenthood, survival, and healing, not only for the patient but also for the physician. How many missed opportunities have we all had, without even realizing it, to allow this kind of fleeting but profound opening? PMID:26195687

  8. Ice clouds over Fairbanks, Alaska

    NASA Astrophysics Data System (ADS)

    Kayetha, Vinay Kumar

    Arctic clouds have long been recognized as one of the key elements modulating the global climate system. They have gained much interest in recent years because the availability of new continuous datasets is opening doors to explore cloud and aerosol properties as never before. This is particularly important in light of current climate change studies that predict changing weather scenarios around the world. This research investigates the occurrence and properties of a few types of ice clouds over the Arctic region with datasets available through the Arctic Facility for Atmospheric Remote Sensing (AFARS; 64.86° N, 147.84° W). This study focuses exclusively on ice clouds that form in the upper levels (cirrus clouds) and midlevels of the troposphere, and that are transparent to laser pulses (visible optical depth, tau < 3.0 -- 4.0). Cirrus clouds are ice-dominated clouds that form in the upper levels of the troposphere and are relatively thin, such that their visual appearance ranges from bluish to gray. Mid-level ice clouds are those clouds primarily composed of ice crystals forming in the midlevels of the troposphere. It is hypothesized that, unlike the basic midlevel cloud type (altostratus), other varieties of midlevel ice clouds exist at times over the Arctic region. The midlevel ice clouds studied here are also transparent to laser pulses and sometimes appear as a family of cirrus clouds to a surface observer. Because of their intermediate heights of occurrence in the troposphere, they could have microphysical properties and radiative effects that are distinct from those associated with upper-level ice clouds in the troposphere. A ground-based lidar dataset with visual observations for identifying cloud types, collected at AFARS over eight years, is used to investigate this hypothesis. Cloud types over AFARS have been identified by a surface observer (Professor Kenneth Sassen) using established characteristic traits. Essential macrophysical

  9. Sensor Networking Testbed with IEEE 1451 Compatibility and Network Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Gurkan, Deniz; Yuan, X.; Benhaddou, D.; Figueroa, F.; Morris, Jonathan

    2007-01-01

    Design and implementation of a testbed for testing and verifying IEEE 1451-compatible sensor systems with network performance monitoring is of significant importance. Measurement of performance parameters, together with implementation of decision support systems, will enhance the understanding of sensor systems with plug-and-play capabilities. The paper will present the design aspects of such a testbed environment under development at the University of Houston in collaboration with NASA Stennis Space Center - SSST (Smart Sensor System Testbed).

  10. The computational structural mechanics testbed generic structural-element processor manual

    NASA Technical Reports Server (NTRS)

    Stanley, Gary M.; Nour-Omid, Shahram

    1990-01-01

    The usage and development of structural finite element processors based on the CSM Testbed's Generic Element Processor (GEP) template is documented. By convention, such processors have names of the form ESi, where i is an integer. This manual is therefore intended for both Testbed users who wish to invoke ES processors during the course of a structural analysis, and Testbed developers who wish to construct new element processors (or modify existing ones).

  11. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity and Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are three ways used to evaluate and analyze the proposed model. PMID:24701149

  12. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

    Cloud computing is a new generation of technology designed to provide for commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity and Access Management (IAM). Companies encounter IAM security challenges as they adopt more technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has been simulated in a .NET environment. Security analysis, simulation, and the BLP confidentiality model are three ways used to evaluate and analyze the proposed model. PMID:24701149

  14. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Technical Reports Server (NTRS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-01-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  15. Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test S

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test Sets and Networks Integration Management Office Testing for the Tracking and Data Relay Satellite System

  16. The AppScale Cloud Platform

    PubMed Central

    Krintz, Chandra

    2013-01-01

    AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721

  17. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    PubMed Central

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155
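    As a hedged sketch of how a survey response might be marshalled for DynamoDB, the snippet below builds an item in DynamoDB's low-level attribute-value format; the field names and item layout are invented for illustration and may differ from the actual Survalytics schema.

```python
import json
from datetime import datetime, timezone

def to_dynamodb_item(user_id, question_id, answer):
    """Encode one survey response using DynamoDB's low-level
    attribute-value representation ({"S": ...} marks a string attribute)."""
    return {
        "user_id": {"S": user_id},
        "timestamp": {"S": datetime.now(timezone.utc).isoformat()},
        "question_id": {"S": question_id},
        # Arbitrary answers (numbers, lists, free text) are stored as JSON text
        "answer": {"S": json.dumps(answer)},
    }

item = to_dynamodb_item("subject-042", "glucose_am_mgdl", 105)
```

    In a real deployment this dict would be passed to a DynamoDB `PutItem` call; here it only illustrates the shape of the payload.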

  18. Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.

    PubMed

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-06-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.

  19. The Wide-Field Imaging Interferometry Testbed: Recent Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen

    2006-01-01

    We present recent results from the Wide-Field Imaging Interferometry Testbed (WIIT). The data acquired with WIIT are "double Fourier" data, including both spatial and spectral information within each data cube. We have been working with these data and have started to develop algorithms, implementations, and techniques for reducing them. Such algorithms and tools are of great importance for a number of proposed future missions, including the Space Infrared Interferometric Telescope (SPIRIT), the Submillimeter Probe of the Evolution of Cosmic Structure (SPECS), and the Terrestrial Planet Finder Interferometer (TPF-I)/Darwin. Recent results are discussed and future study directions are described.

  20. The Advanced Orbiting Systems Testbed Program: Results to date

    NASA Technical Reports Server (NTRS)

    Otranto, John F.; Newsome, Penny A.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Recommendations for Packet Telemetry (PT) and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's (GSFC's) AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations.

  1. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    Dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO Reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high-quality field measurements (e.g., ARM data).

  2. Performance of the PARCS Testbed Cesium Fountain Frequency Standard

    NASA Technical Reports Server (NTRS)

    Enzer, Daphna G.; Klipstein, William M.

    2004-01-01

    A cesium fountain frequency standard has been developed as a ground testbed for the PARCS (Primary Atomic Reference Clock in Space) experiment, an experiment intended to fly on the International Space Station. We report on the performance of the fountain and describe some of the implementations motivated in large part by flight considerations, but of relevance for ground fountains. In particular, we report on a new technique for delivering cooling and trapping laser beams to the atom collection region, in which a given beam is recirculated three times effectively providing much more optical power than traditional configurations. Allan deviations down to 10 have been achieved with this method.
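    The Allan deviation quoted for the fountain is the standard time-domain stability measure for such clocks. A minimal sketch of its non-overlapping estimator from fractional-frequency samples follows; this is illustrative only, not the PARCS analysis code.

```python
import math

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation of fractional-frequency data y
    at averaging factor m (tau = m * tau0):
        sigma_y^2(tau) = (1/2) * < (ybar_{k+1} - ybar_k)^2 >
    """
    # Average the raw samples into contiguous bins of length m
    n = len(y) // m
    ybar = [sum(y[i * m:(i + 1) * m]) / m for i in range(n)]
    # Mean-squared first difference of the bin averages, halved
    diffs = [(ybar[k + 1] - ybar[k]) ** 2 for k in range(n - 1)]
    return math.sqrt(sum(diffs) / (2 * len(diffs)))
```

    For a real fountain, `y` would be the fractional frequency offsets measured against a reference maser, and the deviation would be evaluated over a range of `m` to trace out the stability curve.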

  3. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require differs from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data, and extensive software must be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also included ground software, which is a key element for both commanding the payload and displaying data created by the payload. The verification of

  4. The Living With a Star Program Space Environment Testbed

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  5. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  6. The CSM testbed matrix processors internal logic and dataflow descriptions

    NASA Technical Reports Server (NTRS)

    Regelbrugge, Marc E.; Wright, Mary A.

    1988-01-01

    This report constitutes the final report for subtask 1 of Task 5 of NASA Contract NAS1-18444, Computational Structural Mechanics (CSM) Research. This report contains a detailed description of the coded workings of selected CSM Testbed matrix processors (i.e., TOPO, K, INV, SSOL) and of the arithmetic utility processor AUS. These processors and the current sparse matrix data structures are studied and documented. Items examined include: details of the data structures, interdependence of data structures, data-blocking logic in the data structures, processor data flow and architecture, and processor algorithmic logic flow.

  7. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview is provided of research into a low-cost hypersonic flight test capability intended to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major flight-demonstrator X-planes. The major objectives included: developing an air-launched missile-booster research testbed; accurately delivering research payloads through programmable guidance to hypersonic test conditions; low cost; a high flight rate (a minimum of two flights per year); and utilizing surplus air-launched missiles and NASA aircraft.

  8. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    NASA Technical Reports Server (NTRS)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault-tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug and play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently
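    The execute-diagnose-replan cycle described above can be sketched abstractly. The following is a hypothetical illustration of the integration pattern, not Intelliface/ADAPT code; all function names, step names, and resource names are invented.

```python
def control_loop(plan, execute, diagnose, replan, resources):
    """Execute plan steps one at a time; after each step, run the
    diagnoser on the observations. If a fault is isolated, remove the
    failed resource and ask the planner for a plan that works around it."""
    while plan:
        step = plan.pop(0)
        observations = execute(step)
        fault = diagnose(observations)
        if fault is not None:
            # Fault isolated: the named resource is no longer available
            resources = {r: a for r, a in resources.items() if r != fault}
            plan = replan(plan, resources)
    return resources
```

    In the actual testbed, `execute` would drive ADAPT hardware, `diagnose` would be HyDE, and `replan` would be an ANML-based planner; here they are just pluggable callables, which is the point of the plug-and-play architecture.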

  9. The Living With a Star Space Environment Testbed Experiments

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  10. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... Pilot Program, 73 FR 76,002 (Dec. 15, 2008). \\3\\ The final Phase I test plan and additional information... National Telecommunications and Information Administration Spectrum Sharing Innovation Test-Bed Pilot... conduct in Phase II/III of the Spectrum Sharing Innovation Test-Bed pilot program to assess...

  11. PORT: A Testbed Paradigm for On-line Digital Archive Development.

    ERIC Educational Resources Information Center

    Keeler, Mary; Kloesel, Christian

    1997-01-01

    Discusses the Peirce On-line Resource Testbed (PORT), a digital archive of primary data. Highlights include knowledge processing testbeds for digital resource development; Peirce's pragmatism in operation; PORT and knowledge processing; obstacles to archive access; and PORT as a paradigm for critical control in knowledge processing. (AEF)

  12. Development of a flexible test-bed for robotics, telemanipulation and servicing research

    NASA Technical Reports Server (NTRS)

    Davies, Barry F.

    1989-01-01

    The development of a flexible operation test-bed, based around a commercially available ASEA industrial robot is described. The test-bed was designed to investigate fundamental human factors issues concerned with the unique problems of robotic manipulation in the hostile environment of Space.

  13. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.
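    As a simplified illustration of this kind of scoring (the metric names and definitions below are assumptions for the sketch, not the actual ADAPT benchmarking metric suite), a diagnostic algorithm's reports can be compared against the injected ground-truth faults:

```python
def score_diagnoses(runs):
    """Score a diagnostic algorithm over fault-injection runs.
    Each run is a dict with 'injected' (true fault name or None),
    'reported' (diagnosed fault name or None), and injection/detection times."""
    faulty = sum(1 for r in runs if r["injected"])
    detected = sum(1 for r in runs if r["injected"] and r["reported"])
    false_alarms = sum(1 for r in runs if not r["injected"] and r["reported"])
    correct = sum(1 for r in runs
                  if r["injected"] and r["reported"] == r["injected"])
    latencies = [r["t_detect"] - r["t_inject"] for r in runs
                 if r["injected"] and r["reported"]]
    return {
        "detection_rate": detected / faulty if faulty else 0.0,
        "false_alarm_count": false_alarms,
        "isolation_accuracy": correct / detected if detected else 0.0,
        "mean_detection_latency":
            sum(latencies) / len(latencies) if latencies else None,
    }
```

    Running ten algorithms over a shared set of such fault-injection runs, as the paper describes, then reduces to comparing these per-algorithm score dicts.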

  14. A Battery Certification Testbed for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  15. The Hyperion Project: Partnership for an Advanced Technology Cluster Testbed

    SciTech Connect

    Seager, M; Leininger, M

    2008-04-28

    The Hyperion project offers a unique opportunity to participate in a community-driven testing and development resource at a scale beyond what can be accomplished by one entity alone. Hyperion is a new strategic technology partnership intended to support member-driven development and testing at scale. This partnership will allow commodity clusters to scale up to meet the growing demands of customers' multi-core petascale simulation environments. Hyperion will tightly couple the outstanding research and development capabilities of Lawrence Livermore National Laboratory with leading technology companies, including Cisco, Data Direct Networks, Dell, Intel, LSI, Mellanox, Qlogic, RedHat, SuperMicro and Sun. The end goal of this project is to revolutionize cluster computing in fundamental ways by providing the critical software and hardware components for a highly scalable simulation environment. This environment will include support for high-performance networking, parallel file systems, the operating system, and cluster management. This goal will be achieved by building a scalable technology cluster testbed that will be fully dedicated to the partners and provide: (1) a scalable development, testing and benchmarking environment for critical enabling Linux cluster technologies; (2) an evaluation testbed for new hardware and software technologies; and (3) a vehicle for forming long-term collaborations.

  16. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.
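The IQ modulator balance service mentioned above can be illustrated with a minimal correction sketch. The imbalance model (a gain and phase error applied to the Q branch) and the function name are assumptions for illustration, not the STRS flight code.

```python
import math

def correct_iq(i_sample, q_sample, gain, phase_rad):
    """Undo a modeled IQ imbalance in which the ideal pair (I, Q) is
    distorted to I' = I, Q' = gain * (I*sin(phase) + Q*cos(phase)).
    Returns the corrected (I, Q) pair."""
    q_corr = q_sample / (gain * math.cos(phase_rad)) - i_sample * math.tan(phase_rad)
    return i_sample, q_corr

# Distort an ideal tone sample with 10% gain and 0.05 rad phase error,
# then recover it:
i0, q0 = math.cos(0.5), math.sin(0.5)
q_bad = 1.1 * (i0 * math.sin(0.05) + q0 * math.cos(0.05))
print(correct_iq(i0, q_bad, 1.1, 0.05))
```

In practice the gain and phase estimates would come from a calibration measurement; here they are assumed known.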

  17. Development of optical packet and circuit integrated ring network testbed.

    PubMed

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client-interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation without equipment adjustments for the frequent polarization-rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate < 1×10⁻⁴) operation was achieved with optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated.
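The error-free criterion above amounts to a simple ratio of errored frames to transmitted frames; a minimal sketch, with hypothetical frame counts:

```python
def frame_error_rate(total_frames, errored_frames):
    """Observed frame error rate over a test run; the error-free
    criterion quoted in the abstract corresponds to FER < 1e-4."""
    if total_frames <= 0:
        raise ValueError("need at least one frame")
    return errored_frames / total_frames

# Hypothetical run: 42 errored frames out of one million transmitted.
fer = frame_error_rate(1_000_000, 42)
print(fer < 1e-4)  # True
```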

  18. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  19. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

    Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight, and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of high-resolution aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines, and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, inertial measurement unit, 3-axis electronic compass, and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere, and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces; as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
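Vegetation indices such as the NDVI mentioned above follow directly from the calibrated red and NIR channels; a minimal per-pixel sketch (function name and sample reflectances are illustrative):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel, computed
    from calibrated NIR and red reflectances."""
    denom = nir + red
    if denom == 0.0:
        return 0.0  # avoid division by zero on dark pixels
    return (nir - red) / denom

# Healthy vegetation reflects strongly in NIR and absorbs red, so the
# index approaches 1; bare soil or water sits near or below 0.
print(ndvi(0.5, 0.08))
```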

  20. Off-road perception testbed vehicle design and evaluation

    NASA Astrophysics Data System (ADS)

    Spofford, John R.; Herron, Jennifer B.; Anhalt, David J.; Morgenthaler, Matthew K.; DeHerrera, Clinton

    2003-09-01

    Off-road robotics efforts such as DARPA's PerceptOR program have motivated the development of testbed vehicles capable of sustained operation in a variety of terrain and environments. This paper describes the retrofitting of a minimally-modified ATV chassis into such a testbed which has been used by multiple programs for autonomous mobility development and sensor characterization. Modular mechanical interfaces for sensors and equipment enclosures enabled integration of multiple payload configurations. The electric power subsystem was capable of short-term operation on batteries with refueled generation for continuous operation. Processing subsystems were mounted in sealed, shock-dampened enclosures with heat exchangers for internal cooling to protect against external dust and moisture. The computational architecture was divided into a real-time vehicle control layer and an expandable high level processing and perception layer. The navigation subsystem integrated real time kinematic GPS with a three-axis IMU for accurate vehicle localization and sensor registration. The vehicle software system was based on the MarsScape architecture developed under DARPA's MARS program. Vehicle mobility software capabilities included route planning, waypoint navigation, teleoperation, and obstacle detection and avoidance. The paper describes the vehicle design in detail and summarizes its performance during field testing.

  1. Extreme Adaptive Optics Testbed: Results and Future Work

    SciTech Connect

    Evans, J W; Sommargren, G; Poyneer, L; Macintosh, B; Severson, S; Dillon, D; Sheinis, A; Palmer, D; Kasdin, J; Olivier, S

    2004-07-15

    'Extreme' adaptive optics systems are optimized for ultra-high-contrast applications, such as ground-based extrasolar planet detection. The Extreme Adaptive Optics Testbed at UC Santa Cruz is being used to investigate and develop technologies for high-contrast imaging, especially wavefront control. A simple optical design allows us to minimize wavefront error and maximize the experimentally achievable contrast before progressing to a more complex set-up. A phase shifting diffraction interferometer is used to measure wavefront errors with sub-nm precision and accuracy. We have demonstrated RMS wavefront errors of <1.3 nm and a contrast of >10⁻⁷ over a substantial region using a shaped pupil. Current work includes the installation and characterization of a 1024-actuator Micro-Electro-Mechanical-Systems (MEMS) deformable mirror, manufactured by Boston Micro-Machines, which will be used for wavefront control. In our initial experiments we can flatten the deformable mirror to 1.8-nm RMS wavefront error within a control radius of 5-13 cycles per aperture. Ultimately this testbed will be used to test all aspects of the system architecture for an extrasolar planet-finding AO system.

  2. Airborne Subscale Transport Aircraft Research Testbed: Aircraft Model Development

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Langford, William M.; Hill, Jeffrey S.

    2005-01-01

    The Airborne Subscale Transport Aircraft Research (AirSTAR) testbed being developed at NASA Langley Research Center is an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. An integral part of that testbed is a 5.5% dynamically scaled, generic transport aircraft. This remotely piloted vehicle (RPV) is powered by twin turbine engines and includes a collection of sensors, actuators, navigation, and telemetry systems. The downlink for the plane includes over 70 data channels, plus video, at rates up to 250 Hz. Uplink commands for aircraft control include over 30 data channels. The dynamic scaling requirement, which includes dimensional, weight, inertial, actuator, and data rate scaling, presents distinctive challenges in both the mechanical and electrical design of the aircraft. Discussion of these requirements and their implications on the development of the aircraft along with risk mitigation strategies and training exercises are included here. Also described are the first training (non-research) flights of the airframe. Additional papers address the development of a mobile operations station and an emulation and integration laboratory.

  3. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    National Aeronautics and Space Administration Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically raising the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations and gathering high-quality research data suitable for verifying the technologies, and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  4. Dealing with clouds from space-based ultraspectral IR observations

    NASA Astrophysics Data System (ADS)

    Zhou, D.; Smith, W.; Liu, X.; Larar, A.; Mango, S.; Huang, H.-L.

    Hyperspectral infrared sounders with nadir observations are limited by cloud cover. It is critical to detect the clouds in satellite measurements and to accurately retrieve the atmospheric and surface parameters with cloud-contaminated measurements. An inversion scheme has been developed dealing with cloudy as well as cloud-free radiances observed with ultraspectral infrared sounders to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to iteratively improve the background state defined by an eigenvector-regression retrieval; the solution is iterated in order to account for non-linearity in the 1-d variational solution. NPOESS Airborne Sounder Testbed - Interferometer (NAST-I) retrievals are compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL). This work was motivated by the need to obtain solutions for atmospheric soundings from infrared radiances observed for every individual field of view, regardless of cloud cover, from future ultraspectral geostationary satellite sounding instruments such as the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Hyperspectral Environmental Suite (HES). However, this retrieval approach can also be applied to the ultraspectral sounding instruments to fly on polar satellites such

  5. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; Pauls, T.

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry, using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  6. NASA/Goddard Space Flight Center's testbed for CCSDS compatible systems

    NASA Technical Reports Server (NTRS)

    Carper, Richard D.

    1993-01-01

    A testbed for flight and ground systems compatible with the Consultative Committee for Space Data Systems (CCSDS) Recommendations has been developed at NASA's Goddard Space Flight Center. The subsystems of an end-to-end CCSDS based data system are being developed. All return link CCSDS telemetry services (except Internet) and both versions of the CCSDS frame formats are being implemented. In key areas of uncertainty, multiple design approaches are being performed. In addition, key flight-qualifiable hardware components, such as Reed-Solomon encoders, are being developed to complement the testbed element development. The testbed and its capabilities are described. The method of dissemination of the testbed results are given, as are plans to make the testbed capabilities available to outside users. Plans for the development of standardized conformance and compatibility tests are provided.

  7. High performance testbed for four-beam infrared interferometric nulling and exoplanet detection.

    PubMed

    Martin, Stefan; Booth, Andrew; Liewer, Kurt; Raouf, Nasrat; Loya, Frank; Tang, Hong

    2012-06-10

    Technology development for a space-based infrared nulling interferometer capable of earthlike exoplanet detection and characterization started in earnest in the last 10 years. At the Jet Propulsion Laboratory, the planet detection testbed was developed to demonstrate the principal components of the beam combiner train for a high performance four-beam nulling interferometer. Early in the development of the testbed, the importance of "instability noise" for nulling interferometer sensitivity was recognized, and the four-beam testbed would produce this noise, allowing investigation of methods for mitigating this noise source. The testbed contains the required features of a four-beam combiner for a space interferometer and performs at a level matching that needed for the space mission. This paper describes in detail the design, functions, and controls of the testbed.

  9. Cloud Control

    ERIC Educational Resources Information Center

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  10. Cloud Cover

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  11. Cloud Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell; Einaud, Franco (Technical Monitor)

    2001-01-01

    Numerical cloud models have been developed and applied extensively to study cloud-scale and mesoscale processes during the past four decades. The distinctive aspect of these cloud models is their ability to treat explicitly (or resolve) cloud-scale dynamics. This requires the cloud models to be formulated from the non-hydrostatic equations of motion that explicitly include the vertical acceleration terms, since the vertical and horizontal scales of convection are similar. Such models are also necessary in order to allow gravity waves, such as those triggered by clouds, to be resolved explicitly. In contrast, the hydrostatic approximation, usually applied in global or regional models, does not allow these gravity waves to be resolved explicitly. In addition, the availability of exponentially increasing computer capabilities has resulted in time integrations increasing from hours to days, domain grid boxes (points) increasing from fewer than 2,000 to more than 2,500,000 grid points with 500 to 1000 m resolution, and 3-D models becoming increasingly prevalent. The cloud-resolving model is now at a stage where it can provide reasonably accurate statistical information on the sub-grid, cloud-resolving processes poorly parameterized in climate models and numerical prediction models.
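The hydrostatic/non-hydrostatic distinction drawn above can be stated through the vertical momentum equation (standard textbook form, not taken from the abstract):

```latex
% Vertical momentum in a non-hydrostatic (cloud-resolving) model:
\frac{Dw}{Dt} = -\frac{1}{\rho}\frac{\partial p}{\partial z} - g
% The hydrostatic approximation neglects the acceleration Dw/Dt,
% reducing this to the diagnostic balance:
\frac{\partial p}{\partial z} = -\rho g
```

Retaining $Dw/Dt$ is what lets cloud models resolve convection and the gravity waves it triggers.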

  12. Cloud Control

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    Your learning curriculum needs a new technological platform, but you don't have the expertise or IT equipment to pull it off in-house. The answer is a learning system that exists online, "in the cloud," where learners can access it anywhere, anytime. For trainers, cloud-based coursework often means greater ease of instruction resulting in greater…

  13. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    ...  Stratus clouds are common in the Arctic during the summer months, and are important modulators of ... from MISR's two most obliquely forward-viewing cameras. The cold, stable air causes the clouds to persist in stratified layers, and this ...

  14. COMPARISON OF MILLIMETER-WAVE CLOUD RADAR MEASUREMENTS FOR THE FALL 1997 CLOUD IOP

    SciTech Connect

    SEKELSKY,S.M.; LI,L.; GALLOWAY,J.; MCINTOSH,R.E.; MILLER,M.A.; CLOTHIAUX,E.E.; HAIMOV,S.; MACE,G.; SASSEN,K.

    1998-03-23

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time-operation 35 GHz CART Millimeter-wave Cloud Radar (MMCR) (Moran, 1997), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS) (Sekelsky, 1996), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air (Galloway, 1996), the University of Utah 95 GHz radar, and the dual-antenna Pennsylvania State University 94 GHz radar (Clothiaux, 1995). In this paper the authors discuss several issues relevant to the comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.

  15. SPHERES: Design of a Formation Flying Testbed for ISS

    NASA Astrophysics Data System (ADS)

    Sell, S. W.; Chen, S. E.

    2002-01-01

    The SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) payload is an innovative formation-flying spacecraft testbed currently being developed for use internally aboard the International Space Station (ISS). The purpose of the testbed is to provide a cost-effective, long duration, replenishable, and easily reconfigurable platform with representative dynamics for the development and validation of metrology, formation flying, and autonomy algorithms. The testbed components consist of three 8-inch diameter free-flying "satellites," five ultrasound beacons, and an ISS laptop workstation. Each satellite is self-contained with on-board battery power, cold-gas propulsion (CO2), and processing systems. Satellites use two packs of eight standard AA batteries for approximately 90 minutes of lifetime while beacons last the duration of the mission powered by a single AA battery. The propulsion system uses pressurized carbon dioxide gas, stored in replaceable tanks, distributed through an adjustable regulator and associated tubing to twelve thrusters located on the faces of the satellites. A Texas Instruments C6701 DSP handles control algorithm data while an FPGA manages all sensor data, timing, and communication processes on the satellite. All three satellites communicate with each other and with the controlling laptop via a wireless RF link. Five ultrasound beacons, located around a predetermined work area, transmit ultrasound signals that are received by each satellite. The system effectively acts as a pseudo-GPS system, allowing the satellites to determine position and attitude and to navigate within the test arena. The payload hardware is predominantly Commercial Off The Shelf (COTS) products with the exception of custom electronics boards, selected propulsion system adaptors, and beacon and satellite structural elements. Operationally, SPHERES will run in short duration test sessions with approximately two weeks between each session.
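The pseudo-GPS scheme described above, in which a satellite fixes its position from ranges to known ultrasound beacons, can be sketched as a linearized trilateration. The solver below is a generic stdlib illustration under assumed exact range measurements, not the SPHERES flight algorithm.

```python
def trilaterate(beacons, ranges):
    """Estimate a 3-D position from known beacon positions and measured
    ranges (e.g. ultrasound time-of-flight times the speed of sound).
    Linearizes |x - b_k|^2 = r_k^2 against the first beacon and solves
    the least-squares normal equations with Gaussian elimination."""
    b0, r0 = beacons[0], ranges[0]
    A, c = [], []
    for bk, rk in zip(beacons[1:], ranges[1:]):
        A.append([2.0 * (bk[j] - b0[j]) for j in range(3)])
        c.append(sum(v * v for v in bk) - sum(v * v for v in b0)
                 + r0 * r0 - rk * rk)
    # Normal equations (A^T A) x = A^T c for the 3 unknown coordinates.
    M = [[sum(row[i] * row[j] for row in A) for j in range(3)] for i in range(3)]
    d = [sum(row[i] * ci for row, ci in zip(A, c)) for i in range(3)]
    # Gaussian elimination with partial pivoting on the 3x3 system.
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        d[col], d[piv] = d[piv], d[col]
        for row in range(col + 1, 3):
            f = M[row][col] / M[col][col]
            for k in range(col, 3):
                M[row][k] -= f * M[col][k]
            d[row] -= f * d[col]
    # Back substitution.
    x = [0.0, 0.0, 0.0]
    for row in (2, 1, 0):
        x[row] = (d[row] - sum(M[row][k] * x[k] for k in range(row + 1, 3))) / M[row][row]
    return x
```

With five beacons, as in the SPHERES arena, four linearized equations over-determine the three coordinates, which is why a least-squares solve is used here.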

  16. Space Station technology testbed: 2010 deep space transport

    NASA Technical Reports Server (NTRS)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight crewmembers or more. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  18. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.

  19. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high latency communications links to experiment how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. Tested open source IPsec software did not meet all the requirements. Software changes were suggested to meet requirements.
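The low-bandwidth, high-latency link simulation described above is commonly done on Linux with tc/netem; a hedged sketch that only builds the command strings (the interface name and link parameters are hypothetical, and the thesis's actual tooling is not specified):

```python
def netem_commands(dev, delay_ms, rate_kbit, loss_pct=0.0):
    """Build Linux tc/netem commands imposing space-link-like constraints
    (high latency, low bandwidth, optional loss) on one interface.
    All parameter values here are illustrative assumptions."""
    reset = f"tc qdisc del dev {dev} root"  # clear any existing qdisc
    apply_cmd = (f"tc qdisc add dev {dev} root netem "
                 f"delay {delay_ms}ms loss {loss_pct}% rate {rate_kbit}kbit")
    return [reset, apply_cmd]

# Roughly GEO-relay one-way latency with a narrow return link:
for line in netem_commands("eth1", delay_ms=280, rate_kbit=256):
    print(line)
```

Running the IPsec test cases under several such profiles would show how tunnel setup and rekeying behave as latency grows and bandwidth shrinks.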

  20. Simulation to Flight Test for a UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  1. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; Grendreau, Keith C.

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.

  2. SIM Interferometer Testbed (SCDU) Status and Recent Results

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; An, Xin; Goullioud, Renaud; Shao, Michael; Shen, Tsae-Pyng; Wehmeier, Udo J.; Weilert, Mark A.; Wang, Xu; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    SIM Lite is a space-borne stellar interferometer capable of searching for Earth-size planets in the habitable zones of nearby stars. This search will require measurement of astrometric angles with sub micro-arcsecond accuracy and optical pathlength differences to 1 picometer by the end of the five-year mission. One of the most significant technical risks in achieving this level of accuracy is from systematic errors that arise from spectral differences between candidate stars and nearby reference stars. The Spectral Calibration Development Unit (SCDU), in operation since 2007, has been used to explore this effect and demonstrate performance meeting SIM goals. In this paper we present the status of this testbed and recent results.

  3. Experimental validation of docking and capture using space robotics testbeds

    NASA Astrophysics Data System (ADS)

    Spofford, John; Schmitz, Eric; Hoff, William

    This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.

  4. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John; Schmitz, Eric; Hoff, William

    1991-01-01

    This presentation describes the application of robotic and computer vision systems to validate docking and capture operations for space cargo transfer vehicles. Three applications are discussed: (1) air bearing systems in two dimensions that yield high quality free-flying, flexible, and contact dynamics; (2) validation of docking mechanisms with misalignment and target dynamics; and (3) computer vision technology for target location and real-time tracking. All the testbeds are supported by a network of engineering workstations for dynamic and controls analyses. Dynamic simulation of multibody rigid and elastic systems are performed with the TREETOPS code. MATRIXx/System-Build and PRO-MATLAB/Simulab are the tools for control design and analysis using classical and modern techniques such as H-infinity and LQG/LTR. SANDY is a general design tool to optimize numerically a multivariable robust compensator with a user-defined structure. Mathematica and Macsyma are used to derive symbolically dynamic and kinematic equations.

  5. MIT-KSC space life sciences telescience testbed

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A Telescience Life Sciences Testbed is being developed. The first phase of this effort consisted of defining the experiments to be performed, investigating the various possible means of communication between KSC and MIT, and developing software and hardware support. The experiments chosen were two vestibular sled experiments: a study of ocular torsion produced by Y axis linear acceleration, based on the Spacelab D-1 072 Vestibular Experiment performed pre- and post-flight at KSC; and an optokinetic nystagmus (OKN)/linear acceleration interaction experiment. These two experiments were meant to simulate actual experiments that might be performed on the Space Station and to be representative of space life sciences experiments in general in their use of crew time and communications resources.

  6. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

    The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on concepts related to airports with Super Dense Operations, where new airport runway configurations (e.g. parallel runways), sequencing, merging, and spacing are some of the concepts considered. We model and simulate a complementary airborne and ground system for AM&S to increase the efficiency and capacity of these high-density terminal areas. From a ground-systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation System (ACES) software to model and simulate AM&S concepts and algorithms.

  7. Intelligent Elements for the ISHM Testbed and Prototypes (ITP) Project

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Park, Han; Schwabacher, Mark; Watson, Michael; Mackey, Ryan; Fijany, Amir; Trevino, Luis; Weir, John

    2005-01-01

    Deep-space manned missions will require advanced automated health assessment capabilities. Requirements such as in-space assembly, long dormant periods and limited accessibility during flight, present significant challenges that should be addressed through Integrated System Health Management (ISHM). The ISHM approach will provide safety and reliability coverage for a complete system over its entire life cycle by determining and integrating health status and performance information from the subsystem and component levels. This paper will focus on the potential advanced diagnostic elements that will provide intelligent assessment of the subsystem health and the planned implementation of these elements in the ISHM Testbed and Prototypes (ITP) Project under the NASA Exploration Systems Research and Technology program.

  8. Easy and hard testbeds for real-time search algorithms

    SciTech Connect

    Koenig, S.; Simmons, R.G.

    1996-12-31

    Although researchers have studied which factors influence the behavior of traditional search algorithms, currently not much is known about how domain properties influence the performance of real-time search algorithms. In this paper we demonstrate, both theoretically and experimentally, that Eulerian state spaces (a superset of undirected state spaces) are very easy for some existing real-time search algorithms to solve: even real-time search algorithms that can be intractable, in general, are efficient for Eulerian state spaces. Because traditional real-time search testbeds (such as the eight puzzle and gridworlds) are Eulerian, they cannot be used to distinguish between efficient and inefficient real-time search algorithms. It follows that one has to use non-Eulerian domains to demonstrate the general superiority of a given algorithm. To this end, we present two classes of hard-to-search state spaces and demonstrate the performance of various real-time search algorithms on them.
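    The Eulerian property the abstract relies on is easy to state: in a directed state space, every state has equal in-degree and out-degree (undirected spaces satisfy this trivially, since each undirected edge contributes one edge in each direction). A minimal sketch of the degree check, not taken from the paper; a full test would also verify strong connectivity:

```python
from collections import defaultdict

def is_eulerian(edges):
    """Check the degree condition for a directed state space to be Eulerian:
    every state has equal in-degree and out-degree.  (Connectivity is assumed
    here; a complete check would also require strong connectivity.)"""
    indeg, outdeg = defaultdict(int), defaultdict(int)
    for u, v in edges:
        outdeg[u] += 1
        indeg[v] += 1
    nodes = set(indeg) | set(outdeg)
    return all(indeg[n] == outdeg[n] for n in nodes)

# An undirected state space becomes Eulerian when each undirected edge is
# replaced by two directed edges, one in each direction:
undirected = [(0, 1), (1, 2), (2, 0)]
directed = undirected + [(v, u) for u, v in undirected]
print(is_eulerian(directed))  # → True
```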

  9. The computational structural mechanics testbed architecture. Volume 1: The language

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the first of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP, and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 1 presents the basic elements of the CLAMP language and is intended for all users.

  10. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  11. The computational structural mechanics testbed architecture. Volume 3: The interface

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the third of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 3 describes the CLIP-Processor interface and related topics. It is intended only for processor developers.

  12. Experimental Testbed for the Study of Hydrodynamic Issues in Supernovae

    SciTech Connect

    Robey, H F; Kane, J O; Remington, B A; Drake, R P; Hurricane, O A; Louis, H; Wallace, R J; Knauer, J; Keiter, P; Arnett, D

    2000-10-09

    More than a decade after the explosion of SN 1987A, unresolved discrepancies still remain in attempts to numerically simulate the mixing processes initiated by the passage of a very strong shock through the layered structure of the progenitor star. Numerically computed velocities of the radioactive ⁵⁶Ni and ⁵⁶Co, produced by shock-induced explosive burning within the silicon layer for example, are still more than 50% too low as compared with the measured velocities. In order to resolve such discrepancies between observation and simulation, an experimental testbed has been designed on the Omega Laser for the study of hydrodynamic issues of importance to supernovae (SNe). In this paper, we present results from a series of scaled laboratory experiments designed to isolate and explore several issues in the hydrodynamics of SN explosions. The results of the experiments are compared with numerical simulations and are generally found to be in reasonable agreement.

  13. An Overview of Research Activity at the Launch Systems Testbed

    NASA Technical Reports Server (NTRS)

    Vu, Bruce; Kandula, Max

    2003-01-01

    This paper summarizes the acoustic testing and analysis activities at the Launch Systems Testbed (LST) of Kennedy Space Center (KSC). A major goal is to develop passive methods of mitigation of sound from rocket exhaust jets with ducted systems devoid of traditional water injection. Current testing efforts are concerned with the launch-induced vibroacoustic behavior of scaled exhaust jets. Numerical simulations are also developed to study the sound propagation from supersonic jets in free air and through enclosed ducts. Scaling laws accounting for the effects of important parameters such as jet Mach number, jet velocity, and jet temperature on the far-field noise are investigated in order to deduce the full-scale environment from small-scale tests.

  14. Modular, Rapid Propellant Loading System/Cryogenic Testbed

    NASA Technical Reports Server (NTRS)

    Hatfield, Walter, Sr.; Jumper, Kevin

    2012-01-01

    The Cryogenic Test Laboratory (CTL) at Kennedy Space Center (KSC) has designed, fabricated, and installed a modular, rapid propellant-loading system to simulate rapid loading of a launch-vehicle composite or standard cryogenic tank. The system will also function as a cryogenic testbed for testing and validating cryogenic innovations and ground support equipment (GSE) components. The modular skid-mounted system is capable of flow rates of liquid nitrogen from 1 to 900 gpm (≈3.8 to 3,400 L/min), of pressures from ambient to 225 psig (≈1.5 MPa), and of temperatures to -320 °F (≈-195 °C). The system can be easily validated to flow liquid oxygen at a different location, and could be easily scaled to any particular vehicle interface requirements.

  15. Telescience testbed: Operational support functions for biomedical experiments

    NASA Astrophysics Data System (ADS)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed was conducted to study the methodology of space biomedicine with simulated constraints imposed on space experiments. An experimental subject selected for this testbedding was an elaborate surgery of animals and electrophysiological measurements conducted by an operator onboard. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively by telecommunication links. Reliability analysis was applied to all layers of experimentation, including design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  16. Photovoltaic Engineering Testbed Designed for Calibrating Photovoltaic Devices in Space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2002-01-01

    Accurate prediction of the performance of solar arrays in space requires that the cells be tested in comparison with a space-flown standard. Recognizing that improvements in future solar cell technology will require an ever-increasing fidelity of standards, the Photovoltaics and Space Environment Branch at the NASA Glenn Research Center, in collaboration with the Ohio Aerospace Institute, designed a prototype facility to allow routine calibration, measurement, and qualification of solar cells on the International Space Station, and then the return of the cells to Earth for laboratory use. For solar cell testing, the Photovoltaic Engineering Testbed (PET) site provides a true air-mass-zero (AM0) solar spectrum. This allows solar cells to be accurately calibrated using the full spectrum of the Sun.

  17. Telescience testbed: operational support functions for biomedical experiments.

    PubMed

    Yamashita, M; Watanabe, S; Shoji, T; Clarke, A H; Suzuki, H; Yanagihara, D

    1992-07-01

    A telescience testbed was conducted to study the methodology of space biomedicine with simulated constraints imposed on space experiments. An experimental subject selected for this testbedding was an elaborate surgery of animals and electrophysiological measurements conducted by an operator onboard. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively by telecommunication links. Reliability analysis was applied to all layers of experimentation, including design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of science. The feasibility of robotics was examined for supportive functions to reduce the workload of the onboard operator.

  18. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Mike; Munson, Mike; Teate, George

    2006-01-01

    A new testbed for hypersonic flight research is proposed. Known as the Phoenix air-launched small missile (ALSM) flight testbed, it was conceived to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of two unique and very capable flight assets: the United States Navy Phoenix AIM-54 long-range, guided air-to-air missile and the NASA Dryden F-15B testbed airplane. The U.S. Navy retirement of the Phoenix AIM-54 missiles from fleet operation has presented an excellent opportunity for converting this valuable flight asset into a new flight testbed. This cost-effective new platform will fill an existing gap in the test and evaluation of current and future hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform. When launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will be valuable for the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite small-payload air-launched space boosters.

  19. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    NASA Astrophysics Data System (ADS)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained from technologies such as LiDAR, SoNAR, SAR, SfM, and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to (x,y,z) data to calculate a suite of (spatial and spectral domain) statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification, on very large point clouds with great computational efficiency. Using tools such as these, it may be possible to detect geomorphic change in surfaces which have undergone minimal elevation difference, for example deflation surfaces which have coarsened but undergone no net elevation change, or surfaces which have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with multibeam echosounder, and topographic data collected with LiDAR.
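    The windowed-statistics idea behind both toolkits can be illustrated with a toy sketch (hypothetical code, not the TopCAT or PySESA API): bin raw (x, y, z) points onto a regular grid and report per-cell point count, mean elevation, and roughness as the standard deviation of z.

```python
import math
from collections import defaultdict

def gridded_roughness(points, cell):
    """Decimate an (x, y, z) point cloud onto a regular grid, reporting
    per-cell point count, mean elevation, and roughness (std. dev. of z)."""
    cells = defaultdict(list)
    for x, y, z in points:
        # assign each point to the grid cell containing it
        cells[(math.floor(x / cell), math.floor(y / cell))].append(z)
    out = {}
    for key, zs in cells.items():
        mean = sum(zs) / len(zs)
        var = sum((z - mean) ** 2 for z in zs) / len(zs)
        out[key] = (len(zs), mean, math.sqrt(var))
    return out

pts = [(0.1, 0.1, 1.0), (0.2, 0.3, 3.0), (1.5, 0.2, 2.0)]
print(gridded_roughness(pts, cell=1.0)[(0, 0)])  # → (2, 2.0, 1.0)
```

    Production tools add the spectral-domain statistics and the engineering needed to stream billions of points, but the decimation step itself is exactly this kind of windowed reduction.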

  20. Testbeds for Wind Resource Characterization: Needs and Potential Facilities

    NASA Astrophysics Data System (ADS)

    Shaw, W. J.; Berg, L. K.; Rishel, J. P.; Flaherty, J. E.

    2008-12-01

    With the emergence of wind as a significant source of alternative energy, it is becoming increasingly clear that some problems associated with the installation and operation of wind plants arise because of continuing gaps in our knowledge of fundamental physical processes in the lower atmospheric boundary layer. Over the years, a number of well-designed intensive field campaigns have yielded significant insight into boundary layer structure and turbulence under targeted conditions. However, to be able to usefully simulate the atmosphere for applications of wind power, it is important to evaluate the resulting parameterizations under a realistic spectrum of atmospheric conditions. To do this, facilities - testbeds - are required that operate continually over long periods. Such facilities could also be used, among other things, to establish long-term statistics of mean wind and low-level shear, to explore the representativeness of shorter-period (e.g. one year) statistics, to explore techniques for extrapolating wind statistics in space, and to serve as host infrastructure for boundary layer campaigns targeted to wind energy applications. During the last half of the 20th century, a number of tall instrumented towers were installed at locations around the United States for studies of atmospheric dispersion and other purposes. Many of these are no longer in service, but some have operated continuously for decades and continue to collect calibrated wind and temperature information from multiple heights extending to hub height or higher for many current operational wind turbines. This talk will review the status of tall towers in the U.S. that could anchor testbeds for research related to wind power production and will use data from the 120-m meteorological tower on the Hanford Site in southeastern Washington State to illustrate the kind of information that is available.

  1. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle, and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line with control algorithms are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved, independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10^8, 10^9, and 10^10, respectively, at inner working angles approaching 2λ/D. We discuss the optics, lab results, technologies, and null control, and show evidence that these milestones have been achieved.
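    The conservation argument above can be made concrete with an idealized two-output nuller model (a textbook sketch, not the VNT's actual sensing code): for residual phase error φ, the bright and dark channels receive cos²(φ/2) and sin²(φ/2) of the incident light, so their sum is constant regardless of the nuller state.

```python
import math

def nuller_outputs(n_photons, phase_err):
    """Idealized two-output nulling interferometer: incident photons split
    between the bright and dark channels as cos^2 and sin^2 of half the
    residual phase error."""
    bright = n_photons * math.cos(phase_err / 2) ** 2
    dark = n_photons * math.sin(phase_err / 2) ** 2
    return bright, dark

# The bright + dark sum is conserved for any phase state, which is what lets
# the instrument sense its own errors from the two channels simultaneously.
for phi in (0.0, 0.01, 0.5):
    bright, dark = nuller_outputs(1e6, phi)
    assert abs((bright + dark) - 1e6) < 1e-6
```

    In this idealization a perfect null (φ = 0) sends every photon to the bright channel; any photons appearing in the dark channel are a direct error signal.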

  2. Design, performance, and operational characteristics of an FDDI testbed

    SciTech Connect

    Testi, N.; Gossage, S.A.; Ralph, W.D.

    1991-01-01

    Sandia National Laboratories has recently completed a major upgrade of its Central Computing Network (CCN), which is a large heterogeneous network providing scientific supercomputing, file storage, output services, and remote access to network resources. The new network, called the Secure Supercomputing Network (SSN), is based on the HYPERchannel-100 technology platform and is primarily a UNIX-TCP/IP network environment. A migration from the HYPERchannel-100 hardware platform presently in use on the SSN to the Network Systems Corporation (NSC) Fiber Distributed Data Interface (FDDI) hardware platform is currently being considered for several reasons. First, Sandia supports the movement from proprietary hardware and software solutions to standardized solutions. Second, the inherent robustness of the FDDI standard would provide the reliability which is critical to production computing at the Labs. Finally, the potential for tuning the ring for specific applications and connection schemes is an advantage over HYPERchannel. In order to evaluate the NSC FDDI technology platform, an FDDI testbed has been constructed, consisting of two independent FDDI rings. The rings are connected via T3 (44.736 Megabits/second) and T1 (1.544 Megabits/second) link Data Exchange Units (DXUs), and an FE649 FDDI/FDDI router. In addition to a variety of NSC FDDI DXUs and associated host computers, several other vendors' FDDI products are also present on the testbed. Test data on fault isolation and recovery mechanisms, performance, IP routing (within and between rings), IP packet labeling, monitor capabilities, and interoperability will be presented. The design and implementation of the underlying fiber optic physical plant will also be discussed.

  3. Demo III: Department of Defense testbed for unmanned ground mobility

    NASA Astrophysics Data System (ADS)

    Shoemaker, Chuck M.; Bornstein, Jonathan A.; Myers, Scott D.; Brendle, Bruce E., Jr.

    1999-07-01

    Robotics has been identified by numerous recent Department of Defense (DOD) studies as a key enabling technology for future military operational concepts. The Demo III Program is a multiyear effort encompassing technology development and demonstration on testbed platforms, together with modeling, simulation, and experimentation directed toward optimization of operational concepts to employ this technology. The primary program focus is the advancement of capabilities for autonomous mobility through unstructured environments, concentrating on both perception and intelligent control technology. The scout mission will provide the military operational context for demonstration of this technology, although a significant emphasis is being placed upon both hardware and software modularity to permit rapid extension to other military missions. The Experimental Unmanned Vehicle (XUV) is a small (approximately 1150 kg, V-22 transportable) technology testbed vehicle designed for experimentation with multiple military operational concepts. Currently under development, the XUV is scheduled for roll-out in Summer 1999, with an initial troop experimentation to be conducted in September 1999. Though small and relatively lightweight, modeling has shown the chassis capable of automotive mobility comparable to the current Army lightweight high-mobility, multipurpose, wheeled vehicle (HMMWV). The XUV design couples multisensor perception with intelligent control to permit autonomous cross-country navigation at speeds of up to 32 kph during daylight and 16 kph during hours of darkness. A small, lightweight, highly capable user interface will permit intuitive control of the XUV by troops from current-generation tactical vehicles. When it concludes in 2002, Demo III will provide the military with both the technology and the initial experience required to develop and field the first generation of semi-autonomous tactical ground vehicles for combat, combat support, and logistics applications.

  4. The Wide-Field Imaging Interferometry Testbed (WIIT): Recent Progress and Results

    NASA Technical Reports Server (NTRS)

    Rinehart, Stephen A.; Frey, Bradley J.; Leisawitz, David T.; Lyon, Richard G.; Maher, Stephen F.; Martino, Anthony J.

    2008-01-01

    Continued research with the Wide-Field Imaging Interferometry Testbed (WIIT) has achieved several important milestones. We have moved WIIT into the Advanced Interferometry and Metrology (AIM) Laboratory at Goddard, and have characterized the testbed in this well-controlled environment. The system is now completely automated and we are in the process of acquiring large data sets for analysis. In this paper, we discuss these new developments and outline our future research directions. The WIIT testbed, combined with new data analysis techniques and algorithms, provides a demonstration of the technique of wide-field interferometric imaging, a powerful tool for future space-borne interferometers.

  5. Complex Clouds

    Atmospheric Science Data Center

    2013-04-16

    Multi-layer Clouds Over the South Indian Ocean: the image shows a noticeable cyclonic circulation over the Southern Indian Ocean, to the north of Enderby Land, East Antarctica.

  6. Noctilucent clouds

    NASA Astrophysics Data System (ADS)

    Gadsden, M.

    An assessment of spacecraft, sounding rocket and ground level observational data on the noctilucent clouds which appear during summertime, at high latitudes, near the top of the mesosphere shows that these data are not sufficiently unambiguous and clear to permit conclusions as to the nature of the clouds. Although they seem to be ice particles nucleated at very low temperatures and pressures by either meteoric smoke or atmospheric ions, the very existence of the clouds poses the problem of how so much water vapor could be present at such a great height. An attempt is made to predict the microscopic behavior of the cloud particles through consideration of the relative importance of radiometer effects, radiation balance, Brownian movement, electric polarization, and the influence of Coulomb attraction on the growth of large clustered ions.

  7. Observational evidence linking precipitation and mesoscale cloud fraction in the southeast Pacific

    NASA Astrophysics Data System (ADS)

    Rapp, Anita D.

    2016-07-01

    Precipitation has been hypothesized to play an important role in the transition of low clouds from closed to open cell cumulus in regions of large-scale subsidence. A synthesis of A-Train satellite measurements is used to examine the relationship between precipitation and mesoscale cloud fraction across a transition region in the southeastern Pacific. Low cloud pixels are identified in 4 years of CloudSat/CALIPSO observations, and the along-track mean cloud fraction within 2.5-500 km of each cloud is calculated. Results show that cloud fraction decreases more rapidly in areas surrounding precipitating clouds than around nonprecipitating clouds. The closed to open cell transition region appears especially sensitive, with the surrounding mesoscale cloud fraction decreasing 30% faster in the presence of precipitation compared to nonprecipitating clouds. There is also dependence on precipitation rate and cloud liquid water path (LWP), with higher rain rates or lower LWP showing larger decreases in surrounding cloud fraction.
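
    The surrounding-cloud-fraction statistic described above can be sketched in a few lines. The toy example below (the synthetic mask, the chosen pixel subsets, and the 1.1 km footprint are all hypothetical, not the paper's actual data handling) computes the mean along-track cloud fraction within a given distance of a set of reference pixels, as one might do separately for precipitating and nonprecipitating clouds:

```python
import numpy as np

def mean_cloud_fraction_around(cloud_mask, centers, max_dist_km, footprint_km=1.1):
    """Mean along-track cloud fraction within +/- max_dist_km of each center pixel.

    cloud_mask : 1-D boolean array of low-cloud flags along the granule track.
    centers    : indices of reference cloud pixels (e.g. precipitating clouds).
    footprint_km is an assumed along-track pixel size (illustrative value).
    """
    half = int(round(max_dist_km / footprint_km))
    fracs = []
    for i in centers:
        lo, hi = max(0, i - half), min(len(cloud_mask), i + half + 1)
        window = np.delete(cloud_mask[lo:hi], i - lo)  # exclude the center pixel
        if window.size:
            fracs.append(window.mean())
    return float(np.mean(fracs)) if fracs else np.nan

# Toy comparison: cloud fraction surrounding "precipitating" vs "dry" pixels.
rng = np.random.default_rng(0)
mask = rng.random(5000) < 0.4                      # synthetic low-cloud mask
precip = np.flatnonzero(mask)[::50]                # hypothetical precipitating subset
dry = np.flatnonzero(mask)[25::50]                 # hypothetical nonprecipitating subset
f_precip = mean_cloud_fraction_around(mask, precip, 100.0)
f_dry = mean_cloud_fraction_around(mask, dry, 100.0)
```

    In the real analysis the two subsets would come from the CloudSat precipitation flag rather than index striding, and distances would be accumulated in bins (2.5-500 km) rather than a single window.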

  8. Astronomy In The Cloud: Using Mapreduce For Image Coaddition

    NASA Astrophysics Data System (ADS)

    Wiley, Keith; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-01-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving object tracking. Since such studies require the highest quality data, methods such as image coaddition, i.e., registration, stacking, and mosaicing, will be critical to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, e.g., asteroids, or transient objects, e.g., supernovae, these datastreams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance. This work is funded by
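
    The coaddition-as-MapReduce idea can be illustrated without Hadoop itself: a map stage keys each image by sky tile, a shuffle groups images by key, and a reduce stage stacks each group. The sketch below is a minimal pure-Python emulation under simplifying assumptions (each record already names its tile, and coaddition is a plain mean stack with no registration or resampling, unlike the paper's pipeline):

```python
from collections import defaultdict
import numpy as np

def map_phase(records):
    """Map: emit (sky_tile_id, pixel_array) for every input image.
    Here each record already names its tile -- a simplification; a real
    mapper would determine which tiles an image overlaps."""
    for tile_id, image in records:
        yield tile_id, image

def reduce_phase(pairs):
    """Shuffle + reduce: group images by tile key, then coadd (mean-stack)
    each group. Real coaddition would also register and resample each
    image onto the tile's pixel grid before stacking."""
    groups = defaultdict(list)
    for tile_id, image in pairs:          # shuffle: group by key
        groups[tile_id].append(image)
    return {t: np.mean(imgs, axis=0) for t, imgs in groups.items()}  # reduce

# Toy run: two overlapping exposures of tile "t0", one exposure of tile "t1".
records = [
    ("t0", np.array([[1.0, 3.0], [5.0, 7.0]])),
    ("t0", np.array([[3.0, 5.0], [7.0, 9.0]])),
    ("t1", np.array([[2.0, 2.0], [2.0, 2.0]])),
]
coadds = reduce_phase(map_phase(records))
```

    The mean stack is where the signal-to-noise gain of coaddition comes from; Hadoop's contribution is scheduling many such reduces in parallel next to the data.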

  9. Research on private cloud computing based on analysis on typical opensource platform: a case study with Eucalyptus and Wavemaker

    NASA Astrophysics Data System (ADS)

    Yu, Xiaoyuan; Yuan, Jian; Chen, Shi

    2013-03-01

    Cloud computing is one of the most popular topics in the IT industry and is recently being adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, a private cloud can be implemented within a private network and delivers some of the benefits of cloud computing while avoiding some of its pitfalls. This paper makes a comparison of typical open source platforms through which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also perform a performance estimation of cloud platform services and develop prototype software delivered as cloud services.

  10. Independent Technology Assessment within the Federation of Earth Science Information Partners (ESIP) Testbed

    NASA Astrophysics Data System (ADS)

    Burgess, A. B.; Robinson, E.; Graybeal, J.

    2015-12-01

    The Federation of Earth Science Information Partners (ESIP) is a community of science, data and information technology practitioners. ESIP's mission is to support the networking and data dissemination needs of our members and the global community. We do this by linking the functional sectors of education, observation, research and application with the ultimate use of Earth science. Amongst the services provided to ESIP members is the Testbed: a collaborative forum for the development of technology standards, services, protocols and best practices. ESIP has partnered with the NASA Advanced Information Systems Technology (AIST) program to integrate independent assessment of Technology Readiness Level (TRL) into the ESIP Testbed. In this presentation we will 1) demonstrate TRL assessment in the ESIP Testbed using three AIST projects, 2) discuss challenges and insights into creating an independent validation/verification framework and 3) outline the versatility of the ESIP Testbed as applied to other technology projects.

  11. Preliminary Design of a Galactic Cosmic Ray Shielding Materials Testbed for the International Space Station

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen; Sechkar, Edward A.; Panko, Scott R.

    2012-01-01

    The preliminary design of a testbed to evaluate the effectiveness of galactic cosmic ray (GCR) shielding materials, the MISSE Radiation Shielding Testbed (MRSMAT) is presented. The intent is to mount the testbed on the Materials International Space Station Experiment-X (MISSE-X) which is to be mounted on the International Space Station (ISS) in 2016. A key feature is the ability to simultaneously test nine samples, including standards, which are 5.25 cm thick. This thickness will enable most samples to have an areal density greater than 5 g/sq cm. It features a novel and compact GCR telescope which will be able to distinguish which cosmic rays have penetrated which shielding material, and will be able to evaluate the dose transmitted through the shield. The testbed could play a pivotal role in the development and qualification of new cosmic ray shielding technologies.
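
    The 5 g/sq cm figure follows from areal density = bulk density x thickness, so a 5.25 cm sample clears it whenever its density exceeds about 0.95 g/cu cm. A quick check, using approximate handbook densities (illustrative values, not materials from the MRSMAT design):

```python
def areal_density(density_g_cm3, thickness_cm):
    """Areal density (g/cm^2) = bulk density (g/cm^3) x thickness (cm)."""
    return density_g_cm3 * thickness_cm

THICKNESS_CM = 5.25  # sample thickness from the testbed design

# Nominal bulk densities (g/cm^3) -- approximate handbook values.
materials = {"polyethylene": 0.95, "water-equivalent": 1.00, "aluminum": 2.70}
for name, rho in materials.items():
    print(f"{name}: {areal_density(rho, THICKNESS_CM):.2f} g/cm^2")

# Minimum bulk density needed to reach 5 g/cm^2 at this thickness:
rho_min = 5.0 / THICKNESS_CM   # ~0.95 g/cm^3
```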

  12. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

    A full-scale, two-story, two-by-one bay, steel braced-frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system to generate large floor accelerations or large inter-story drifts, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and installed nonstructural systems were investigated.

  13. Carrier Plus: A Sensor Payload for Living With a Star Space Environment Testbed (LWS/SET)

    NASA Technical Reports Server (NTRS)

    Marshall, Cheryl; Moss, Steven; Howard, Regan; LaBel, Kenneth; Grycewicz, Tom; Barth, Janet; Brewer, Dana

    2003-01-01

    The paper discusses the following: 1. Living with a Star (LWS) program: space environment testbed (SET); natural space environment. 2. Carrier plus: goals and benefits. 3. On-orbit sensor measurements. 4. Carrier plus architecture. 5. Participation in carrier plus.

  14. Development and Experiments of a Test-Bed for Wheel-Soil Interaction of Lunar Rover

    NASA Astrophysics Data System (ADS)

    Tao, Jianguo; Ding, Liang; Quan, Qiquan; Gao, Haibo

    2012-07-01

    Wheel-soil interaction of a lunar exploration rover plays a critical role in rover mechanical design, control and simulation. To present and validate effective terramechanics models, as well as to evaluate rover wheel performance, a wheel-soil interaction test-bed was developed. The test-bed can control the wheel rolling or steering movement at different slippage rates and different speeds, and, through a variety of sensors, acquires measurements of the mechanics of wheel-soil interaction such as drawbar pull, side force, wheel sinkage displacement and steering torque. In this paper, some characteristics of the test-bed are described, and some experimental work on rigid rover wheel design and wheel-soil interaction modeling by means of this test-bed is summarized. Experimental results show that the test-bed can accurately and efficiently test wheel-soil interaction for various wheels and loose soil types.
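
    The slippage rate such a test-bed holds fixed is conventionally defined from the wheel's angular speed and the carriage (forward) speed. A minimal sketch of this relationship, with illustrative numbers rather than the test-bed's actual parameters:

```python
def slip_ratio(v_carriage, omega, radius):
    """Driving slip ratio s = 1 - v/(r*omega): the quantity a wheel-soil
    test-bed holds fixed by driving the wheel and the soil-bin carriage
    at coordinated speeds. s = 0 is pure rolling; s -> 1 means the wheel
    spins in place."""
    v_wheel = radius * omega
    if v_wheel == 0:
        raise ValueError("wheel is not rotating")
    return 1.0 - v_carriage / v_wheel

def carriage_speed_for_slip(s, omega, radius):
    """Invert the definition: the carriage speed that realizes a target slip."""
    return (1.0 - s) * radius * omega

# Example: a 0.15 m radius wheel at 1.0 rad/s, commanded to 20% slip.
v = carriage_speed_for_slip(0.2, 1.0, 0.15)   # carriage speed in m/s
```

    Drawbar pull and sinkage are then recorded as functions of this commanded slip, which is how slip-dependent terramechanics models are fitted.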

  15. Telescience testbed examination aboard Japanese Experiment Module (JEM): life and material science experiments.

    PubMed

    Matsumoto, K; Fujimori, Y; Shimizu, M; Usami, R; Kusunose, T; Kimura, H; Ohyama, M; Ishikura, S; Nishida, H; Negishi, N; Kawabata, S

    1992-07-01

    A telescience ground testbed experiment was conducted by the National Space Development Agency of Japan (NASDA) at the Tsukuba Space Center in March 1991. The objectives of the ground testbed experiment were to extract scientists' requirements for a communication method, to evaluate the influence of transmission delay and capacity on experiment operations, and to evaluate performance and functions of the system for the testbed experiment. The microscopic operations experiment, the image furnace experiment and the onboard training experiment were selected as typical ground testbed experiments. In these experiments, motion video transmission at 320 kbps was acceptable for observing the experiments and communicating between the principal investigator and the payload specialist. In the microscopic operations experiment, motion video transmission at 1.5 Mbps or more was required for detailed observation. A 4-second transmission delay (roundtrip) was allowable for mutual communication.

  16. Overview of New Cloud Optical Properties in Air Force Weather Worldwide Merged Cloud Analysis

    NASA Astrophysics Data System (ADS)

    Nobis, T. E.; Conner, M. D.

    2013-12-01

    Air Force Weather (AFW) has documented requirements for real-time cloud analysis to support DoD missions around the world. To meet these needs, AFW utilizes the Cloud Depiction and Forecast System (CDFS) II system to develop an hourly cloud analysis. The system creates cloud masks at pixel level from 16 different satellite sources, diagnoses cloud layers, reconciles the pixel level data to a regular grid by instrument class, and optimally merges the various instrument classes to create a final multi-satellite analysis. In January 2013, Northrop Grumman Corp. delivered a new CDFS II baseline which included the addition of new Cloud Optical Property (COP) variables, developed by Atmospheric and Environmental Research Inc (AER), in the analysis. The new variables include phase (ice/water), optical depth, ice/water path, and particle size. In addition, the COP schemes have radically changed the derivation of cloud properties like cloud top height and thickness. The Northrop-developed CDFS II Test Bed was used to examine and characterize the behavior of these new variables in order to understand how the variables are performing, especially between instrument classes. Understanding this behavior allows performance tuning and uncertainty estimation which will assist users seeking to reason with the data and will be necessary for use in model development and climatology development. This presentation will provide a basic overview of the CDFS II produced COP variables and show results from experiments conducted on the CDFS II Testbed. Results will include a basic comparison of COP derived using different instrument classes as well as comparison between pixel level and derived gridded products with an eye towards better characterization of uncertainty.

  17. A high-resolution, four-band SAR testbed with real-time image formation

    SciTech Connect

    Walker, B.; Sander, G.; Thompson, M.; Burns, B.; Fellerhoff, R.; Dubbert, D.

    1996-03-01

    This paper describes the Twin-Otter SAR Testbed developed at Sandia National Laboratories. This SAR is a flexible, adaptable testbed capable of operation on four frequency bands: Ka, Ku, X, and VHF/UHF bands. The SAR features real-time image formation at fine resolution in spotlight and stripmap modes. High-quality images are formed in real time using the overlapped subaperture (OSA) image-formation and phase gradient autofocus (PGA) algorithms.
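
    The phase gradient autofocus idea can be sketched compactly: center-shift the brightest scatterer in each range bin, estimate the azimuth phase-error gradient from the lag-1 correlation across range bins, and integrate. The following is a simplified, noiseless single iteration (the windowing and iteration control of the full PGA algorithm are omitted), demonstrated on synthetic point targets blurred by a known quadratic phase error:

```python
import numpy as np

def pga_iteration(img):
    """One simplified phase-gradient-autofocus step.

    img: complex image, rows = range bins, cols = azimuth samples, sharing
    a common azimuth phase error."""
    # 1. Center-shift: roll the brightest scatterer of each range bin to column 0.
    shifted = np.stack([np.roll(row, -np.argmax(np.abs(row))) for row in img])
    # 2. Return to the azimuth-frequency domain.
    G = np.fft.fft(shifted, axis=1)
    # 3. Gradient estimate: angle of the lag-1 correlation summed over range bins.
    grad = np.angle(np.sum(G[:, 1:] * np.conj(G[:, :-1]), axis=0))
    # 4. Integrate the gradient to recover the phase error (constant term lost).
    return np.concatenate([[0.0], np.cumsum(grad)])

def detrend(phi):
    """Remove constant + linear terms; they only shift the image, not blur it."""
    k = np.arange(len(phi))
    a, b = np.polyfit(k, phi, 1)
    return phi - (a * k + b)

# Demo: isolated point targets blurred by a known quadratic phase error.
n, n_rng = 128, 8
rng = np.random.default_rng(1)
k = np.arange(n)
phi_true = 4.0 * (2 * np.pi) * ((k - n / 2) / n) ** 2     # quadratic defocus
img = []
for _ in range(n_rng):
    g = np.zeros(n, complex)
    g[rng.integers(n)] = 1.0                              # one point target per bin
    img.append(np.fft.ifft(np.fft.fft(g) * np.exp(1j * phi_true)))
phi_est = pga_iteration(np.array(img))
err = np.max(np.abs(detrend(phi_est) - detrend(phi_true)))
```

    In this idealized case the estimate matches the applied error up to an irrelevant constant-plus-linear term; the real-time implementation iterates this step with windowing until the residual error converges.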

  18. Validation of the CERTS Microgrid Concept: The CEC/CERTS Microgrid Testbed

    SciTech Connect

    Nichols, David K.; Stevens, John; Lasseter, Robert H.; Eto, Joseph H.

    2006-06-01

    The development of test plans to validate the CERTS Microgrid concept is discussed, including the status of a testbed. Increased application of Distributed Energy Resources on the distribution system has the potential to improve performance, lower operational costs and create value. Microgrids have the potential to deliver these high-value benefits. This presentation will focus on operational characteristics of the CERTS microgrid, the partners in the project and the status of the CEC/CERTS microgrid testbed. Index Terms: Distributed Generation, Distributed Resource, Islanding, Microgrid, Microturbine.

  19. Versatile simulation testbed for rotorcraft speech I/O system design

    NASA Technical Reports Server (NTRS)

    Simpson, Carol A.

    1986-01-01

    A versatile simulation testbed for the design of a rotorcraft speech I/O system is described in detail. The testbed will be used to evaluate alternative implementations of synthesized speech displays and speech recognition controls for the next generation of Army helicopters including the LHX. The message delivery logic is discussed as well as the message structure, the speech recognizer command structure and features, feedback from the recognizer, and random access to controls via speech command.

  20. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  1. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1x10^-8 contrast over broad bandwidth (10%) on both shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the achieved contrasts on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version that was used to generate the high contrast dark hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: deformable mirror (DM) misregistration, and testbed optical wavefront error (WFE) together with the DM setting for correcting this WFE. These analyses suggested that a high contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
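
    The "measured Jacobian" diagnostic amounts to differencing the focal-plane field as each DM actuator is poked, then comparing the result against the model's control Jacobian. A toy sketch (the linear forward model and the misregistration used here are hypothetical stand-ins, not the SPC testbed's actual optics):

```python
import numpy as np

def measured_jacobian(forward, u0, poke=1e-2):
    """Estimate the control Jacobian dE/du by central differences:
    poke each actuator up and down and difference the measured field."""
    e0 = forward(u0)
    J = np.zeros((e0.size, u0.size), dtype=complex)
    for j in range(u0.size):
        du = np.zeros_like(u0)
        du[j] = poke
        J[:, j] = (forward(u0 + du) - forward(u0 - du)) / (2 * poke)
    return J

# Toy "testbed": the field responds linearly through a true influence matrix,
# while the control model uses a misregistered, mis-scaled copy of it.
rng = np.random.default_rng(2)
J_true = rng.normal(size=(20, 6)) + 1j * rng.normal(size=(20, 6))
J_model = np.roll(J_true, 1, axis=1) * 0.95        # hypothetical misregistration
u0 = np.zeros(6)
J_meas = measured_jacobian(lambda u: J_true @ u, u0)

# Relative Jacobian error -- the kind of gap the testbed diagnosis quantifies.
rel_err = np.linalg.norm(J_meas - J_model) / np.linalg.norm(J_meas)
```

    On a real coronagraph the "forward" call is a pairwise-probe field estimate rather than a direct measurement, but the differencing logic is the same.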

  2. Aerosol-cloud interactions in ship tracks using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chun; Christensen, Matthew W.; Diner, David J.; Garay, Michael J.

    2015-04-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds for different cloud types and environmental conditions. By taking advantage of the high spatial resolution multiangle observations available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of the cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were utilized. Under open cell cloud structure the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+38%), cloud top height (+13%), and cloud albedo (+49%) for open cell clouds, whereas for closed cell clouds, little change in cloud properties was observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed was used to derive cloud top divergence. Statistically averaging the results from the identified plume segments to reduce random noise, we found evidence of cloud top divergence in the ship-polluted clouds, whereas the nearby unpolluted clouds showed cloud top convergence, providing observational evidence of a change in local mesoscale circulation associated with enhanced aerosols. Furthermore, open cell polluted clouds revealed stronger cloud top divergence as compared to closed cell clouds, consistent with different dynamical mechanisms driving their responses. 
These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling.
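
    Cloud top divergence from gridded motion vectors reduces to du/dx + dv/dy evaluated by finite differences. A minimal sketch on a synthetic field with known divergence (the grid spacing and field values are illustrative only, not MISR retrievals):

```python
import numpy as np

def horizontal_divergence(u, v, dx, dy):
    """Cloud-top divergence du/dx + dv/dy from gridded motion vectors (m/s),
    using centered finite differences; dx, dy are grid spacings in metres.
    u varies along axis 1 (x), v along axis 0 (y)."""
    dudx = np.gradient(u, dx, axis=1)
    dvdy = np.gradient(v, dy, axis=0)
    return dudx + dvdy

# Toy field with known divergence: u = a*x, v = b*y  ->  div = a + b.
nx = ny = 32
dx = dy = 1000.0                        # 1 km grid (illustrative)
x = np.arange(nx) * dx
y = np.arange(ny) * dy
X, Y = np.meshgrid(x, y)                # Y varies along axis 0, X along axis 1
a, b = 2e-5, 1e-5                       # 1/s
div = horizontal_divergence(a * X, b * Y, dx, dy)
```

    In practice the noisy per-plume divergence fields are averaged over many identified ship-track segments, as the study does, before the sign of the divergence becomes statistically meaningful.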

  3. Designing an autonomous helicopter testbed: From conception through implementation

    NASA Astrophysics Data System (ADS)

    Garcia, Richard D.

    Miniature Unmanned Aerial Vehicles (UAVs) are currently being researched for a wide range of tasks, including search and rescue, surveillance, reconnaissance, traffic monitoring, fire detection, pipe and electrical line inspection, and border patrol to name only a few of the application domains. Although small/miniature UAVs, including both Vertical Takeoff and Landing (VTOL) vehicles and small helicopters, have shown great potential in both civilian and military domains, including research and development, integration, prototyping, and field testing, these unmanned systems/vehicles are limited to only a handful of university labs. For VTOL type aircraft the number is less than fifteen worldwide! This lack of development is due to both the extensive time and cost required to design, integrate and test a fully operational prototype as well as the shortcomings of published materials to fully describe how to design and build a "complete" and "operational" prototype system. This dissertation overcomes existing barriers and limitations by describing and presenting in great detail every technical aspect of designing and integrating a small UAV helicopter including the on-board navigation controller, capable of fully autonomous takeoff, waypoint navigation, and landing. The presented research goes beyond previous works by designing the system as a testbed vehicle. This design aims to provide a general framework that will not only allow researchers the ability to supplement the system with new technologies but will also allow researchers to add innovation to the vehicle itself. Examples include modification or replacement of controllers, updated filtering and fusion techniques, addition or replacement of sensors, vision algorithms, Operating Systems (OS) changes or replacements, and platform modification or replacement. 
This is supported by the testbed's design to not only adhere to the technology it currently utilizes but to be general enough to adhere to a multitude of

  4. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464
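
    The closed-loop feasibility claim can be illustrated with the kind of discrete controller a mounted smartphone might execute at its sensor sampling rate. The sketch below runs a simple PID loop against a simulated first-order motor test-bed; all gains and plant constants are illustrative, not values from the study:

```python
class PID:
    """Discrete PID controller -- the kind of loop a mounted smartphone
    could run at its camera/IMU sampling rate (gains here are illustrative)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt                 # integral action
        deriv = (err - self.prev_err) / self.dt        # backward difference
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Simulated first-order motor test-bed: tau * dw/dt = -w + K*u
dt, tau, K = 0.01, 0.5, 2.0
pid = PID(kp=2.0, ki=4.0, kd=0.0, dt=dt)
w = 0.0                                  # motor speed (rad/s)
for _ in range(2000):                    # 20 s of closed-loop operation
    u = pid.update(10.0, w)              # track a 10 rad/s setpoint
    w += dt * (-w + K * u) / tau         # explicit Euler plant step
```

    The integral term drives the steady-state error to zero; the stability challenges the paper discusses arise when sensing latency and variable frame rates are added to a loop like this one.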

  5. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Michael T.; Munson, Michael J.; Teate, George F.

    2007-01-01

    The Phoenix Air-Launched Small Missile (ALSM) flight testbed was conceived and is proposed to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of the United States Navy Phoenix AIM-54 (Hughes Aircraft Company, now Raytheon Company, Waltham, Massachusetts) long-range, guided air-to-air missile and the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (Edwards, California) F-15B (McDonnell Douglas, now the Boeing Company, Chicago, Illinois) testbed airplane. The retirement of the Phoenix AIM-54 missiles from fleet operation has presented an opportunity for converting this flight asset into a new flight testbed. This cost-effective new platform will fill the gap in the test and evaluation of hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform; when launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will assist the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite-small-payload air-launched space boosters.

  8. Neptune's clouds

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The bright cirrus-like clouds of Neptune change rapidly, often forming and dissipating over periods of several to tens of hours. In this sequence Voyager 2 observed cloud evolution in the region around the Great Dark Spot (GDS). The surprisingly rapid changes that occur in the hours separating each panel show that in this region Neptune's weather is perhaps as dynamic and variable as that of the Earth. However, the scale is immense by our standards -- the Earth and the GDS are of similar size -- and in Neptune's frigid atmosphere, where temperatures are as low as 55 degrees Kelvin (-360 F), the cirrus clouds are composed of frozen methane rather than Earth's crystals of water ice. The Voyager Mission is conducted by JPL for NASA's Office of Space Science and Applications.

  9. CLOUD CHEMISTRY.

    SciTech Connect

    SCHWARTZ,S.E.

    2001-03-01

    Clouds present substantial concentrations of liquid-phase water, which can potentially serve as a medium for dissolution and reaction of atmospheric gases. The important precursors of acid deposition, SO{sub 2} and the nitrogen oxides NO and NO{sub 2}, are only sparingly soluble in clouds without further oxidation to sulfuric and nitric acids. In the case of SO{sub 2}, aqueous-phase reactions with hydrogen peroxide and, to a lesser extent, ozone are identified as important processes leading to this oxidation, and methods have been described by which to evaluate the rates of these reactions. The limited solubility of the nitrogen oxides precludes significant aqueous-phase reaction of these species, but gas-phase reactions in clouds can be important, especially at night.
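
    The "sparingly soluble" point for SO{sub 2} can be made quantitative with the effective Henry's law constant for total dissolved S(IV). The sketch below uses approximate 298 K literature values for the constants (treat them as illustrative):

```python
def effective_henry_SIV(pH, H=1.23, Ka1=1.3e-2, Ka2=6.6e-8):
    """Effective Henry's law constant (M/atm) for total dissolved S(IV)
    = SO2.H2O + HSO3- + SO3^2-. H is the physical Henry constant (M/atm);
    Ka1, Ka2 are the first and second dissociation constants (M).
    All values are approximate 298 K figures."""
    h = 10.0 ** (-pH)   # hydrogen ion concentration, M
    return H * (1.0 + Ka1 / h + Ka1 * Ka2 / h ** 2)

# At cloud-water pH ~4.5, dissociation enhances solubility several
# hundred-fold, yet dissolved S(IV) stays small at ppb-level SO2.
H_eff = effective_henry_SIV(4.5)
p_SO2 = 1e-9                     # 1 ppb expressed as partial pressure in atm
c_SIV = H_eff * p_SO2            # equilibrium dissolved S(IV), mol/L
```

    Because the effective solubility falls as acid accumulates and pH drops, significant sulfate production requires the aqueous-phase oxidants the abstract names, chiefly hydrogen peroxide.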

  10. Gathering clouds.

    PubMed

    Conde, Crystal

    2012-01-01

    Many physicians are finding their heads in a "cloud" as they ponder adopting or upgrading an electronic health record (EHR). That doesn't mean they're not in touch with reality. It means they now can choose new web-based systems, also known as cloud-based EHRs, that allow them to pay a monthly subscription fee to access an EHR rather than purchase it. They don't have to buy an expensive server with its associated hardware and software; a computer with an Internet connection will do. PMID:22714732

  12. Our World: Cool Clouds

    NASA Video Gallery

    Learn how clouds are formed and watch an experiment to make a cloud using liquid nitrogen. Find out how scientists classify clouds according to their altitude and how clouds reflect and absorb ligh...

  13. 76 FR 13984 - Cloud Computing Forum & Workshop III

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... National Institute of Standards and Technology Cloud Computing Forum & Workshop III AGENCY: National... announces the Cloud Computing Forum & Workshop III to be held on April 7 and 8, 2011. The event will include... tactical Cloud Computing program, including progress on the NIST efforts to advance open standards...

  14. NN-SITE: A remote monitoring testbed facility

    SciTech Connect

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-08-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following will discuss the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) will illuminate the role that this testbed facility can perform in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide.

  15. F-15B transonic flight research testbed aircraft in flight

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA's Dryden Flight Research Center, Edwards, California, is flying a modified McDonnell-Douglas F-15B aircraft as a testbed for a variety of transonic flight experiments. The two-seat aircraft, bearing NASA tail number 836, is shown during a recent flight over the high desert carrying a Dryden-designed Flight Test Fixture (FTF) upon which aerodynamic experiments are mounted. The FTF is a heavily instrumented fin-like structure which is mounted on the F-15B's underbelly in place of the standard external fuel tank. Since being acquired by NASA in 1993, the aircraft has been modified to include video recording, telemetry and data recording capabilities. The twin-engine aircraft flew several flights recently in support of an experiment to determine the precise location of sonic shockwave development as air passes over an airfoil. The F-15B is currently being prepared for the Boundary Layer Heat Experiment, which will explore the potential drag reduction from heating the turbulent portion of the air that passes over the fuselage of a large aircraft.

  16. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.

    2008-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres of planets around A and F stars, and observe the inner spatial structure and colors of Spitzer-selected debris disks. EPIC would be launched to a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high-order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  17. Finite Element Modeling of the NASA Langley Aluminum Testbed Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Pritchard, Joselyn I.; Buehrle, Ralph D.; Pappa, Richard S.

    2002-01-01

    The NASA Langley Aluminum Testbed Cylinder (ATC) was designed to serve as a universal structure for evaluating structural acoustic codes, modeling techniques and optimization methods used in the prediction of aircraft interior noise. Finite element models were developed for the components of the ATC based on the geometric, structural and material properties of the physical test structure. Numerically predicted modal frequencies for the longitudinal stringer, ring frame and dome component models, and six assembled ATC configurations were compared with experimental modal survey data. The finite element models were updated and refined, using physical parameters, to increase correlation with the measured modal data. Excellent agreement, within an average 1.5% to 2.9%, was obtained between the predicted and measured modal frequencies of the stringer, frame and dome components. The predictions for the modal frequencies of the assembled component Configurations I through V were within an average 2.9% and 9.1%. Finite element modal analyses were performed for comparison with 3 psi and 6 psi internal pressurization conditions in Configuration VI. The modal frequencies were predicted by applying differential stiffness to the elements with pressure loading and creating reduced matrices for beam elements with offsets inside external superelements. The average disagreement between the measured and predicted differences for the 0 psi and 6 psi internal pressure conditions was less than 0.5%. Comparably good agreement was obtained for the differences between the 0 psi and 3 psi measured and predicted internal pressure conditions.
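
    The percent-agreement figures quoted above are averages of per-mode frequency errors between prediction and measurement. A minimal sketch of that metric, using made-up frequency values rather than the ATC survey data, is:

```python
# Average percent disagreement between predicted and measured modal
# frequencies. The frequency values are hypothetical illustrations.

def avg_percent_error(predicted, measured):
    errors = [abs(p - m) / m * 100.0 for p, m in zip(predicted, measured)]
    return sum(errors) / len(errors)

measured  = [112.0, 145.5, 203.2]   # Hz, hypothetical modal survey data
predicted = [113.6, 147.8, 207.0]   # Hz, hypothetical FEM predictions

print(round(avg_percent_error(predicted, measured), 2))
```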

  18. Development of a Testbed for Distributed Satellite Command and Control

    NASA Astrophysics Data System (ADS)

    Zetocha, Paul; Brito, Margarita

    2002-01-01

    At the Air Force Research Laboratory's Space Vehicles Directorate we are investigating and developing architectures for commanding and controlling a cluster of cooperating satellites through prototype development for the TechSat-21 program. The objective of this paper is to describe a distributed satellite testbed that is currently under development and to summarize near term prototypes being implemented for cluster command and control. To design, develop, and test our architecture we are using eight PowerPC 750 VME-based single board computers, representing eight satellites. Each of these computers is hosting the OSE(TM) real-time operating system from Enea Systems. At the core of our on-board cluster manager is ObjectAgent. ObjectAgent is an agent-based object-oriented framework for flight systems, which is particularly suitable for distributed applications. In order to handle communication with the ground as well as to assist with the cluster management we are using the Spacecraft Command Language (SCL). SCL is also at the centerpiece of our ground control station and handles cluster commanding, telemetry decommutation, state-of-health monitoring, and Fault Detection, Isolation, and Resolution (FDIR). For planning and scheduling activities we are currently using ASPEN from NASA/JPL. This paper will describe each of the above components in detail and then present the prototypes being implemented.

  19. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Astrophysics Data System (ADS)

    Elges, Michael R.; Ashworth, Barry R.

    When the priorities of manually controlled resources are mixed with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert-system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and their adherence to cooperation toward a common goal is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information which may have an effect on the desired outcome. Second, the fault manager element, FRAMES, must be informed of any change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for the selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules adhering to the user-modified resource base.

  20. NASA'S Coastal and Ocean Airborne Science Testbed (COAST): Early Results

    NASA Astrophysics Data System (ADS)

    Guild, L. S.; Dungan, J. L.; Edwards, M.; Russell, P. B.; Morrow, J. H.; Kudela, R. M.; Myers, J. S.; Livingston, J.; Lobitz, B.; Torres-Perez, J.

    2012-12-01

    The NASA Coastal and Ocean Airborne Science Testbed (COAST) project advances coastal ecosystems research and ocean color calibration and validation capability by providing a unique airborne payload optimized for remote sensing in the optically complex coastal zone. The COAST instrument suite combines a customized imaging spectrometer, sunphotometer system, and a new bio-optical radiometer package to obtain ocean/coastal/atmosphere data simultaneously in flight for the first time. The imaging spectrometer (Headwall) is optimized in the blue region of the spectrum to emphasize remote sensing of marine and freshwater ecosystems. Simultaneous measurements supporting empirical atmospheric correction of image data are accomplished using the Ames Airborne Tracking Sunphotometer (AATS-14). Coastal Airborne In situ Radiometers (C-AIR, Biospherical Instruments, Inc.), developed for COAST for airborne campaigns from field-deployed microradiometer instrumentation, provide measurements of apparent optical properties at the land/ocean boundary, including optically shallow aquatic ecosystems. Ship-based measurements allowed validation of airborne measurements. Radiative transfer modeling on in-water measurements from the HyperPro and Compact-Optical Profiling System (C-OPS, the in-water companion to C-AIR) profiling systems allows for comparison of airborne and in-situ water-leaving radiance measurements. Results of the October 2011 Monterey Bay COAST mission include preliminary data on coastal ocean color products, coincident spatial and temporal data on aerosol optical depth and water vapor column content, as well as derived exact water-leaving radiances.

  1. Biosensing Test-Bed Using Electrochemically Deposited Reduced Graphene Oxide.

    PubMed

    Bhardwaj, Sheetal K; Yadav, Premlata; Ghosh, Subhasis; Basu, Tinku; Mahapatro, Ajit K

    2016-09-21

    The development of an efficient test-bed for biosensors requires stable surfaces, capable of interacting with the functional groups present in bioentities. This work demonstrates the formation of highly stable electrochemically reduced graphene oxide (ERGO) thin films reproducibly on indium tin oxide (ITO)-coated glass substrates using a reliable technique through 60 s chronoamperometric reduction of a colloidal suspension maintained at neutral pH containing graphene oxide in deionized water. Structural optimization and biocompatible interactions of the resulting closely packed and uniformly distributed ERGO flakes on ITO surfaces (ERGO/ITO) are characterized using various microscopic and spectroscopic tools. Lipase enzyme is immobilized on the ERGO surface in the presence of 1-ethyl-3-[3-(dimethylamino)propyl]carbodiimide and N-hydroxysuccinimide for the detection of triglyceride in a tributyrin (TBN) solution. The ERGO/ITO surfaces prepared using the current technique indicate the noticeable detection of TBN, a source of triglycerides, at a sensitivity of 37 pA mg dL(-1) cm(-2) in the linear range from 50 to 300 mg dL(-1) with a response time of 12 s. The low apparent Michaelis-Menten constant of 0.28 mM suggests high enzyme affinity to TBN. The currently developed fast, simple, highly reproducible, and reliable technique for the formation of an ERGO electrode could be routinely utilized as a test bed for the detection of clinically active bioentities. PMID:27509332
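
    The apparent Michaelis-Menten constant quoted above is the substrate concentration at which the enzyme-electrode response reaches half its saturation value. A minimal sketch of that response curve, taking only K_m = 0.28 mM from the abstract (the saturation current I_max below is a hypothetical placeholder):

```python
# Michaelis-Menten response of an enzyme electrode.
# Only k_m = 0.28 mM comes from the abstract; i_max is hypothetical.

def mm_response(substrate_mM, i_max, k_m=0.28):
    """Current I = I_max * S / (K_m + S)."""
    return i_max * substrate_mM / (k_m + substrate_mM)

# At S = K_m the response is exactly half of I_max, the usual reading of K_m.
half = mm_response(0.28, i_max=100.0)
print(half)  # 50.0
```

A lower K_m shifts the half-saturation point to lower substrate concentrations, which is why it is read as higher enzyme affinity.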

  2. Expanded Owens Valley Solar Array (EOVSA) Testbed and Prototype

    NASA Astrophysics Data System (ADS)

    Gary, Dale E.; Nita, G. M.; Sane, N.

    2012-05-01

    NJIT is engaged in constructing a new solar-dedicated radio array, the Expanded Owens Valley Solar Array (EOVSA), which is slated for completion in late 2013. An initial 3-antenna array, the EOVSA Subsystem Testbed (EST), is now in operation from 1-9 GHz based on three of the old OVSA antennas, to test certain design elements of the new array. We describe this instrument and show some results from recent solar flares observed with it. We also describe plans for an upcoming prototype of EOVSA, which will use three antennas of the new design over the full 1-18 GHz signal chain of the entirely new system. The EOVSA prototype will be in operation by late 2012. Highlights of the new design are ability to cover the entire 1-18 GHz in less than 1 s, simultaneous dual polarization, and improved sensitivity and stability. We discuss what can be expected from the prototype, and how it will compare with the full 13-antenna EOVSA. This work was supported by NSF grants AGS-0961867 and AST-0908344, and NASA grant NNX11AB49G to New Jersey Institute of Technology.

  3. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
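
    The modularization idea described above (per-component models combined into a system figure) can be sketched with generic textbook combinators; these series/parallel forms and failure rates are illustrative stand-ins, not REST's actual RML semantics.

```python
# Per-module reliabilities combined into a system reliability.
# Exponential failure model and series/parallel composition are generic
# textbook forms; the failure rates are hypothetical.
import math

def module_reliability(failure_rate, hours):
    """Exponential model: R(t) = exp(-lambda * t)."""
    return math.exp(-failure_rate * hours)

def series(*rs):
    """All modules must survive."""
    out = 1.0
    for r in rs:
        out *= r
    return out

def parallel(*rs):
    """The assembly fails only if every redundant module fails."""
    out = 1.0
    for r in rs:
        out *= (1.0 - r)
    return 1.0 - out

cpu = module_reliability(1e-5, 1000)          # hypothetical component
bus = module_reliability(2e-6, 1000)          # hypothetical component
redundant_sensor = parallel(0.95, 0.95)       # two redundant units
system = series(cpu, bus, redundant_sensor)   # total system reliability
```

Testing each module separately and then composing them is exactly what makes the one-to-one mapping to a graphical system description workable.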

  4. An integrated dexterous robotic testbed for space applications

    NASA Technical Reports Server (NTRS)

    Li, Larry C.; Nguyen, Hai; Sauer, Edward

    1992-01-01

    An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview is presented of the system hardware and software configurations, and implementation is discussed of subsystem functions.

  5. Articulated navigation testbed (ANT): an example of adaptable intrinsic mobility

    NASA Astrophysics Data System (ADS)

    Brosinsky, Chris A.; Hanna, Doug M.; Penzes, Steven G.

    2000-07-01

    An important but oft-overlooked aspect of any robotic system is the synergistic benefit of designing the chassis to have high intrinsic mobility that complements, rather than limits, its system capabilities. This novel concept continues to be investigated by the Defence Research Establishment Suffield (DRES) with the Articulated Navigation Testbed (ANT) Unmanned Ground Vehicle (UGV). The ANT demonstrates high mobility through the combination of articulated steering and a hybrid locomotion scheme which utilizes individually powered wheels on the edge of rigid legs; legs which are capable of approximately 450 degrees of rotation. The configuration can be minimally configured as a 4x4 and modularly expanded to 6x6, 8x8, and so on. This enhanced mobility configuration permits pose control and novel maneuvers such as stepping, bridging, crawling, etc. Resultant mobility improvements, particularly in unstructured and off-road environments, will reduce the resolution with which the UGV sensor systems must perceive their surroundings and decrease the computational requirements of the UGV's perception systems for successful semi-autonomous or autonomous terrain negotiation. This paper reviews critical vehicle developments leading up to the ANT concept, describes the basis for its configuration and speculates on the impact of the intrinsic mobility concept for UGV effectiveness.

  6. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    SciTech Connect

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
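
    The "algorithmically-expressed verification test" idea above can be sketched as a reference implementation plus a tolerance check that any candidate implementation must pass, independent of language or algorithm. This toy matrix-vector kernel is ours for illustration, not one of the actual TORCH kernels.

```python
# A kernel definition (dense matrix-vector product) and a verifier that
# accepts any implementation whose answer matches within tolerance.

def reference_matvec(A, x):
    """Straightforward reference implementation of y = A x."""
    return [sum(a_ij * x_j for a_ij, x_j in zip(row, x)) for row in A]

def verify(candidate, A, x, tol=1e-9):
    """True if the candidate's answer agrees with the reference."""
    ref = candidate_out = None
    ref = reference_matvec(A, x)
    candidate_out = candidate(A, x)
    return all(abs(r - g) <= tol for r, g in zip(ref, candidate_out))

# Any co-designed solution is acceptable if it passes verify():
ok = verify(reference_matvec, [[1.0, 2.0], [3.0, 4.0]], [5.0, 6.0])
```

Decoupling the problem definition from the implementation is what lets the same test judge a C, MATLAB, or hardware-accelerated solution alike.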

  7. A Test-Bed Configuration: Toward an Autonomous System

    NASA Astrophysics Data System (ADS)

    Ocaña, F.; Castillo, M.; Uranga, E.; Ponz, J. D.; TBT Consortium

    2015-09-01

    In the context of the Space Situational Awareness (SSA) program of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme, and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. In order to fulfill all the security requirements for the TBT project, the use of an autonomous emergency system (AES) is foreseen to monitor the control system. The AES will monitor remotely the health of the observing system and the internal and external environment. It will incorporate both autonomous and interactive actuators to force the protection of the system (i.e., emergency dome close-out).
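
    The autonomous protective behavior described above reduces to a watchdog loop: poll health and environment readings, and trigger the close-out actuator when any limit is exceeded. The sensor names and thresholds in this sketch are hypothetical, not the TBT project's actual monitored quantities.

```python
# Minimal emergency-system sketch: limit checks plus an autonomous
# protective action. Sensor names and limits are hypothetical.

LIMITS = {"wind_kmh": 60.0, "humidity_pct": 85.0, "cpu_temp_c": 80.0}

def check(readings, limits=LIMITS):
    """Return the names of all violated limits."""
    return [k for k, lim in limits.items() if readings.get(k, 0.0) > lim]

def aes_step(readings, close_dome):
    """One monitoring cycle: any violation forces dome close-out."""
    violations = check(readings)
    if violations:
        close_dome()  # autonomous protective actuator
    return violations
```

An interactive actuator would simply be a second path into close_dome() driven by an operator rather than by check().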

  8. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  9. A satellite orbital testbed for SATCOM using mobile robots

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Lu, Wenjie; Wang, Zhonghai; Jia, Bin; Wang, Gang; Wang, Tao; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2016-05-01

    This paper develops and evaluates a satellite orbital testbed (SOT) for satellite communications (SATCOM). SOT can emulate a 3D satellite orbit using omni-wheeled robots and a robotic arm. The 3D motion of the satellite is partitioned into movement in the equatorial plane and up-down motion in the vertical plane. The former is emulated by omni-wheeled robots, while the up-down motion is performed by a stepper-motor-controlled ball along a rod (the robotic arm), which is attached to the robot. The emulated satellite positions feed a measurement model, whose results are used to perform multiple space object tracking. The tracking results then drive maneuver detection and collision alerting. Satellite maneuver commands are translated into robot and robotic-arm commands. In SATCOM, the effects of jamming depend on the range and angles of the satellite transponder relative to the jamming satellite. We extend the SOT to include USRP transceivers. In the extended SOT, the relative ranges and angles are implemented using omni-wheeled robots and robotic arms.
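
    The motion partition described above can be sketched as a mapping from a 3D satellite position to two command channels: a planar range/bearing for the omni-wheeled robot and a height for the ball on the rod. The scale factors and coordinate conventions here are hypothetical, not the SOT's actual calibration.

```python
# Split a 3-D satellite position into robot-plane and rod-height commands.
# Scale factors and frames are hypothetical illustrations.
import math

def partition(x, y, z, plane_scale=1e-6, rod_scale=1e-6):
    """Map a satellite position (meters) to testbed commands."""
    robot_range = math.hypot(x, y) * plane_scale   # planar distance for robot
    robot_bearing = math.atan2(y, x)               # heading in the plane
    ball_height = z * rod_scale                    # up-down motion on the rod
    return robot_range, robot_bearing, ball_height

r, b, h = partition(3e6, 4e6, 1e6)  # a hypothetical position
```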

  10. Assessing the Performance of Computationally Simple and Complex Representations of Aerosol Processes using a Testbed Methodology

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Ma, P.; Easter, R. C.; Liu, X.; Zaveri, R. A.; Rasch, P.

    2012-12-01

    Predictions of aerosol radiative forcing in climate models still contain large uncertainties, resulting from a poor understanding of certain aerosol processes, the level of complexity of aerosol processes represented in models, and the ability of models to account for sub-grid scale variability of aerosols and processes affecting them. In addition, comparing the performance and computational efficiency of new aerosol process modules used in various studies is problematic because different studies often employ different grid configurations, meteorology, trace gas chemistry, and emissions that affect the temporal and spatial evolution of aerosols. To address this issue, we have developed an Aerosol Modeling Testbed (AMT) to systematically and objectively evaluate aerosol process modules. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from a global climate model, Community Atmosphere Model version 5 (CAM5), has also been ported to WRF so that these parameterizations can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. In this study, we evaluate simple and complex treatments of the aerosol size distribution and secondary organic aerosols using the AMT and measurements collected during three field campaigns: the Megacities Initiative Local and Global Observations (MILAGRO) campaign conducted in the vicinity of Mexico City during March 2006, the

  11. Cloud Front

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA02171 Cloud Front

    These clouds formed in the south polar region. The faintness of the cloud system likely indicates that these are mainly ice clouds, with relatively little dust content.

    Image information: VIS instrument. Latitude -86.7N, Longitude 212.3E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  12. Cloud Arcs

    Atmospheric Science Data Center

    2013-04-19

    ... a sinking motion elsewhere, are very common, the degree of organization exhibited here is relatively rare, as the wind field at different altitudes usually disrupts such patterns. The degree of self organization of this cloud image, whereby three or four such circular events ...

  13. Three-dimensional geospatial information service based on cloud computing

    NASA Astrophysics Data System (ADS)

    Zhai, Xi; Yue, Peng; Jiang, Liangcun; Wang, Linnan

    2014-01-01

    Cloud computing technologies can support high-performance geospatial services in various domains, such as smart city and agriculture. Apache Hadoop, an open-source software framework, can be used to build a cloud environment on commodity clusters for storage and large-scale processing of data sets. The Open Geospatial Consortium (OGC) Web 3-D Service (W3DS) is a portrayal service for three-dimensional (3-D) geospatial data. Its performance could be improved by cloud computing technologies. This paper investigates how OGC W3DS could be developed in a cloud computing environment. It adopts the Apache Hadoop as the framework to provide a cloud implementation. The design and implementation of the 3-D geospatial information cloud service is presented. The performance evaluation is performed over data retrieval tests running in a cloud platform built by Hadoop clusters. The evaluation results provide a valuable reference on providing high-performance 3-D geospatial information cloud services.

  14. Description of the SSF PMAD DC testbed control system data acquisition function

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Mackin, Michael; Wright, Theodore

    1992-01-01

    The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced-scale representation of the end-to-end, sources-to-loads Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for the Space Station Freedom. A key capability of the testbed is its ability to be configured to address system-level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground-based systems is evolving. Initially, ground-based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher-level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF ground-based Control Center operation. The lower-level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher-level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy. Data

  15. Emulating JWST Exoplanet Transit Observations in a Testbed laboratory experiment

    NASA Astrophysics Data System (ADS)

    Touli, D.; Beichman, C. A.; Vasisht, G.; Smith, R.; Krist, J. E.

    2014-12-01

    The transit technique is used for the detection and characterization of exoplanets. The combination of transit and radial velocity (RV) measurements gives information about a planet's radius and mass, respectively, leading to an estimate of the planet's density (Borucki et al. 2011) and therefore to its composition and evolutionary history. Transit spectroscopy can provide information on atmospheric composition and structure (Fortney et al. 2013). Spectroscopic observations of individual planets have revealed atomic and molecular species such as H2O, CO2 and CH4 in atmospheres of planets orbiting bright stars, e.g. Deming et al. (2013). Transit observations require extremely precise photometry. For instance, a Jupiter transit results in a 1% brightness decrease of a solar-type star, while the Earth causes only a 0.0084% decrease (84 ppm). Spectroscopic measurements require still greater precision, <30 ppm. The Precision Projector Laboratory (PPL) is a collaboration between the Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech) to characterize and validate detectors through emulation of science images. At PPL we have developed a testbed to project simulated spectra and other images onto a HgCdTe array in order to assess precision photometry for transits, weak lensing, etc. for mission concepts like JWST, WFIRST, and EUCLID. In our controlled laboratory experiment, the goal is to demonstrate the ability to extract weak transit spectra as expected for NIRCam, NIRISS and NIRSpec. Two lamps of variable intensity, along with spectral-line and photometric simulation masks, emulate the signals from a star only, from a planet only and, finally, from a combination of a planet + star. Three masks have been used to simulate spectra in monochromatic light. These masks, which are fabricated at JPL, have a length of 1000 pixels and widths of 2 pixels, 10 pixels and 1 pixel, corresponding respectively to the JWST instruments noted above. From many-hour long
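The depths quoted above follow directly from transit geometry: the fractional flux drop equals the ratio of the planet's and star's disk areas. A minimal sketch (the radii are nominal solar-system values, not parameters from the testbed):

```python
# Transit depth = (R_planet / R_star)^2: the fraction of the stellar
# disk blocked by the transiting planet. Radii in km.
R_SUN = 696_000.0
R_JUPITER = 69_911.0
R_EARTH = 6_371.0

def transit_depth(r_planet, r_star=R_SUN):
    """Fractional brightness decrease during a central transit."""
    return (r_planet / r_star) ** 2

print(f"Jupiter: {transit_depth(R_JUPITER):.2%}")          # about 1%
print(f"Earth:   {transit_depth(R_EARTH) * 1e6:.0f} ppm")  # about 84 ppm
```

The Earth case makes concrete why <30 ppm spectroscopic precision is so demanding: the entire broadband transit signal is only ~84 ppm.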

  16. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    NASA Astrophysics Data System (ADS)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham LPSC 2014) that will produce substantial amounts of data. Power, bandwidth and processing constraints impose limitations on the number of operations which can be performed on these data as well as the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected-components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis and intelligent cropping routines, can be coupled with traditional packet-compression routines to reduce the data transferred per image as well as provide a first-order filter of which data are most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant data downlinked. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on the incorporated science objectives, can continue from the spacecraft until the data are no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking the data deemed most relevant in a double-blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation that produces significant data volumes on small spacecraft.
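As an illustration of the kind of lightweight onboard reduction described in this record, a centroid-plus-crop pass can shrink each frame to the window around the brightest feature before downlink. This is a hypothetical sketch, not AOSAT flight code; `centroid` and `crop_around` are names invented here:

```python
def centroid(image, threshold=0.0):
    """Intensity-weighted centroid (row, col) of pixels above threshold.
    `image` is a list of rows of pixel values."""
    total = rsum = csum = 0.0
    for r, row in enumerate(image):
        for c, v in enumerate(row):
            if v > threshold:
                total += v
                rsum += r * v
                csum += c * v
    if total == 0:
        return None  # nothing above threshold: frame may be skipped entirely
    return rsum / total, csum / total

def crop_around(image, center, half=2):
    """Crop a (2*half+1)-wide window around the centroid, clipped to the
    frame, so only the scientifically relevant region is downlinked."""
    r0, c0 = (int(round(x)) for x in center)
    rows = range(max(r0 - half, 0), min(r0 + half + 1, len(image)))
    cols = range(max(c0 - half, 0), min(c0 + half + 1, len(image[0])))
    return [[image[r][c] for c in cols] for r in rows]
```

On a frame dominated by one bright feature this keeps a small window around it, so downlink volume scales with the feature size rather than the detector size, before any packet-level compression is applied.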

  17. Earthbound Unmanned Autonomous Vehicles (UAVS) As Planetary Science Testbeds

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Bland, G.; Diaz, J. A.; Fladeland, M. M.

    2014-12-01

    Recent advances in the technology of unmanned vehicles have greatly expanded the range of contemplated terrestrial operational environments for their use, including aerial, surface, and submarine. The advances have been most pronounced in the areas of autonomy, miniaturization, durability, standardization, and ease of operation, most notably (especially in the popular press) for airborne vehicles. Of course, for a wide range of planetary venues, autonomy, at high cost in both money and risk, has always been a requirement. Most recently, missions to Mars have also featured an unprecedented degree of mobility. Combining the traditional planetary surface deployment operational and science imperatives with emerging, very accessible, and relatively economical small UAV platforms on Earth can provide flexible, rugged, self-directed test-bed platforms for landed instruments and strategies that will ultimately be directed elsewhere, and, in the process, provide valuable Earth science data. While the most direct transfer of technology from terrestrial to planetary venues is perhaps for bodies with atmospheres (and oceans), with appropriate technology and strategy accommodations, single and networked UAVs can be designed to operate even on airless bodies, under a variety of gravities. In this presentation, we present and use results and lessons learned from our recent earth-bound UAV volcano deployments, as well as our future plans for such, to conceptualize a range of planetary and small-body missions. We gratefully acknowledge the assistance of students and colleagues at our home institutions, and the government of Costa Rica, without which our UAV deployments would not have been possible. This work was carried out, in part, at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.

  18. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    NASA Technical Reports Server (NTRS)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  19. Modeling of clouds and radiation for developing parameterizations for general circulation models. Annual report, 1995

    SciTech Connect

    Toon, O.B.; Westphal, D.L.

    1996-07-01

    We have used a hierarchy of numerical models for cirrus and stratus clouds and for radiative transfer to improve the reliability of general circulation models. Our detailed cloud microphysical model includes all of the physical processes believed to control the lifecycles of liquid and ice clouds in the troposphere. We have worked on specific GCM parameterizations for the radiative properties of cirrus clouds, making use of a mesoscale model as the test-bed for the parameterizations. We have also modeled cirrus cloud properties with a detailed cloud physics model to better understand how the radiatively important properties of cirrus are controlled by their environment. We have used another cloud microphysics model to investigate the interactions between aerosols and clouds. This work is some of the first to follow the details of interactions between aerosols and cloud droplets and has shown some unexpected relations between clouds and aerosols. We have also used line-by-line radiative transfer results, verified with ARM data, to derive a GCMS.

  20. The Development of a Smart Distribution Grid Testbed for Integrated Information Management Systems

    SciTech Connect

    Lu, Ning; Du, Pengwei; Paulson, Patrick R.; Greitzer, Frank L.; Guo, Xinxin; Hadley, Mark D.

    2011-07-28

    This paper presents a smart distribution grid testbed to test or compare designs of integrated information management systems (I2MSs). An I2MS extracts and synthesizes information from a wide range of data sources to detect abnormal system behaviors, identify possible causes, assess the system status, and provide grid operators with response suggestions. The objective of the testbed is to provide a modeling environment with sufficient data sources for the I2MS design. The testbed includes five information layers and a physical layer; it generates multi-layer chronological data based on actual measurement playbacks or simulated data sets produced by the physical layer. The testbed models random hardware failures, human errors, extreme weather events, and deliberate tampering attempts to allow users to evaluate the performance of different I2MS designs. Initial results of I2MS performance tests showed that the testbed created a close-to-real-world environment that allowed key performance metrics of the I2MS to be evaluated.

  1. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    NASA Technical Reports Server (NTRS)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system generates inevitable calibration discrepancies between them. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for the reasons of path previewing, as a window on teleoperation, and for calibration of simulated vs. real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. An option of the interface has been established such that joint angle information generated by supporting testbed algorithms (e.g., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  2. Preliminary results of the LLNL airborne experimental test-bed SAR system

    SciTech Connect

    Miller, M.G.; Mullenhoff, C.J.; Kiefer, R.D.; Brase, J.M.; Wieting, M.G.; Berry, G.L.; Jones, H.E.

    1996-01-16

    The Imaging and Detection Program (IDP) within Laser Programs at Lawrence Livermore National Laboratory (LLNL), in cooperation with the Hughes Aircraft Company, has developed a versatile, high-performance, airborne experimental test-bed (AETB) capability. The test-bed has been developed for a wide range of research and development experimental applications, including radar and radiometry plus, with additional aircraft modifications, optical systems. The airborne test-bed capability has been developed within a Douglas EA-3B Skywarrior jet aircraft provided and flown by Hughes Aircraft Company. The current test-bed payload consists of an X-band radar system, a high-speed data acquisition system, and a real-time processing capability. The medium-power radar system is configured to operate in a high-resolution, synthetic aperture radar (SAR) mode and is highly configurable in terms of waveforms, PRF, bandwidth, etc. Antennas are mounted on a 2-axis gimbal in the belly radome of the aircraft, which provides pointing and stabilization. Aircraft position and antenna attitude are derived from a dedicated navigational system and provided to the real-time SAR image processor for instant image reconstruction and analysis. This paper presents a further description of the test-bed and payload subsystems plus preliminary results of SAR imagery.

  3. Linear Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA03667 Linear Clouds

    These clouds are located near the edge of the south polar region. The cloud tops are the puffy white features in the bottom half of the image.

    Image information: VIS instrument. Latitude -80.1N, Longitude 52.1E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically or geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  4. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed the Test-Bed Telescope (TBT), is being developed under ESA's General Studies and Technology Programme and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real-time. This contribution describes the current status of the project.

  5. Definition study for variable cycle engine testbed engine and associated test program

    NASA Technical Reports Server (NTRS)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  6. Model-based beam control for illumination of remote objects, part II: laboratory testbed

    NASA Astrophysics Data System (ADS)

    Basu, Santasri; Voelz, David; Chandler, Susan M.; Lukesh, Gordon W.; Sjogren, Jon

    2004-10-01

    When a laser beam propagates through the atmosphere, it is subject to corrupting influences including mechanical vibrations, turbulence and tracker limitations. As a result, pointing errors can occur, causing loss of energy or signal at the target. Nukove Scientific Consulting has developed algorithms to estimate these pointing errors from the statistics of the return photons from the target. To prove the feasibility of this approach for real-time estimation, an analysis tool called RHINO was developed by Nukove. Associated with this effort, New Mexico State University developed a laboratory testbed, the ultimate objective being to test the estimation algorithms under controlled conditions and to stream data into RHINO to prove the feasibility of real-time operation. The present paper outlines the description of this testbed and the results obtained through RHINO when the testbed was used to test the estimation approach.

  7. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    NASA Technical Reports Server (NTRS)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage mounted modern high bypass ratio engines in conjunction with engine exhaust shielding structures to provide a low noise testbed. The testbed also integrates a natural laminar flow wing section and active flow control for the vertical tail. An eight year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  8. Multivesicular Assemblies as Real-World Testbeds for Embryogenic Evolutionary Systems

    NASA Astrophysics Data System (ADS)

    Hadorn, Maik; Eggenberger Hotz, Peter

    Embryogenic evolution emulates cell-like entities in silico to obtain more powerful methods for complex evolutionary tasks. As simulations have to abstract from the biological model, implicit information hidden in its physics is lost. Here, we propose to use cell-like entities as a real-world in vitro testbed. In analogy to evolutionary robotics, where solutions evolved in simulation may be tested in the real world at the macroscale, the proposed vesicular testbed would do the same for embryogenic evolutionary tasks at the mesoscale. As a first step towards a vesicular testbed emulating growth, cell division, and cell differentiation, we present a modified vesicle production method, providing custom-tailored chemical cargo, and a novel self-assembly procedure that provides vesicle aggregates of programmable composition.

  9. Flight Testing of Guidance, Navigation and Control Systems on the Mighty Eagle Robotic Lander Testbed

    NASA Technical Reports Server (NTRS)

    Hannan, Mike; Rickman, Doug; Chavers, Greg; Adam, Jason; Becker, Chris; Eliser, Joshua; Gunter, Dan; Kennedy, Logan; O'Leary, Patrick

    2015-01-01

    During 2011 a series of progressively more challenging flight tests of the Mighty Eagle autonomous terrestrial lander testbed were conducted, primarily to validate the GNC system for a proposed lunar lander. With the successful completion of this GNC validation objective, the opportunity existed to utilize the Mighty Eagle as a flying testbed for a variety of technologies. In 2012 an Autonomous Rendezvous and Capture (AR&C) algorithm was implemented in flight software and demonstrated in a series of flight tests; a hazard avoidance system was also developed and flight tested on the Mighty Eagle that year, along with GNC algorithms from Moon Express and a MEMS IMU. All of the testing described herein was above and beyond the original charter for the Mighty Eagle. In addition to being an excellent testbed for a wide variety of systems, the Mighty Eagle also provided a great learning opportunity for many engineers and technicians to work a flight program.

  10. Web Solutions Inspire Cloud Computing Software

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  11. Estimating Cloud Cover

    ERIC Educational Resources Information Center

    Moseley, Christine

    2007-01-01

    The purpose of this activity was to help students understand the percentage of cloud cover and make more accurate cloud cover observations. Students estimated the percentage of cloud cover represented by simulated clouds and assigned a cloud cover classification to those simulations. (Contains 2 notes and 3 tables.)
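The estimation the students practice can be mimicked computationally by thresholding a grayscale sky image; the threshold value and the classification bands below are illustrative choices, not the activity's own rubric:

```python
def cloud_cover_percent(sky, threshold=200):
    """Percent of pixels at or above `threshold` in a grayscale image
    (list of rows) -- a crude cloud/clear-sky split."""
    pixels = [v for row in sky for v in row]
    cloudy = sum(1 for v in pixels if v >= threshold)
    return 100.0 * cloudy / len(pixels)

def classify(percent):
    """Map percent cover to a coarse category (bands are illustrative)."""
    if percent < 10:
        return "clear"
    if percent < 50:
        return "scattered"
    if percent < 90:
        return "broken"
    return "overcast"

# A toy 4x4 "sky": bright pixels (250) are cloud, dark pixels (30) are sky.
sky = [[250, 250, 30, 30],
       [250,  30, 30, 30],
       [ 30,  30, 30, 30],
       [ 30,  30, 30, 30]]
pct = cloud_cover_percent(sky)
print(pct, classify(pct))  # 18.75 scattered
```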

  12. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system be available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and

  13. The Bermuda Testbed Mooring and Emerging Technologies for Interdisciplinary Research

    NASA Astrophysics Data System (ADS)

    Dickey, T. D.

    2001-12-01

    The Bermuda Testbed Mooring (BTM) program provides the oceanographic community with a deep-water platform for testing new instrumentation. Scientific studies also utilize data collected from the BTM, particularly in conjunction with the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) program. Additionally, the BTM has been used for groundtruthing of satellite ocean color imager (SeaWiFS) data. The mooring is located about 80 km southeast of Bermuda. Surface instruments have collected meteorological and spectral radiometric data from the buoy tower, and measurements at depth have included currents, temperature, bio-optical, chemical, and acoustical variables. The BTM captures a broad dynamic range of oceanic variability (minutes to years). Key results include: 1. Data obtained during passages of cold-core eddies have been used to estimate the role of such features on new production and carbon flux to the deep ocean. One of the observed features contained the greatest values of chlorophyll observed during the decade of observations at the site (based on the BATS historical data base). The measurements provide high-frequency, long-term data, which can be used for a) detailed studies of a variety of physical, chemical, bio-optical, and ecological processes on time scales from minutes to years, b) contextual information for many other observations made near the BTM/BATS sites, c) evaluation of undersampling/aliasing effects, and d) developing/testing models. 2. The dynamics of the upper ocean have been observed during transient re-stratification events and during passages of hurricanes and other intense storms. These observations are unique and the subject of ongoing modeling efforts. 3. BTM papers have provided new insights concerning bio-optical variability on short (minutes to day) time scales and have proven valuable for ocean color satellite groundtruthing. 4. During the BTM project, several new sensors and systems have been tested by U.S. and international groups.

  14. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  15. Aerosol-Cloud Interactions in Ship Tracks Using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Y. C.; Christensen, M.; Diner, D. J.; Garay, M. J.; Nelson, D. L.

    2014-12-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds under different cloud types and environmental conditions. Taking advantage of the high spatial resolution multiangle observations uniquely available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine the cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were applied. Under open cell cloud structure, the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+27%), cloud top height (+11%), and cloud albedo (+40%) for open cell clouds, whereas under closed cell clouds, little changes in cloud properties were observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed has been used to derive cloud top divergence. Statistically averaging the results from many plume segments to reduce random noise, we have found that in ship-polluted clouds there is stronger cloud top divergence, and in nearby unpolluted clouds, convergence occurs and leads to downdrafts, providing observational evidence for cloud top entrainment feedback. These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling studies to reduce uncertainties of aerosol indirect forcing. Reference: Chen, Y.-C. et al. Occurrence of lower cloud albedo in ship tracks, Atmos. Chem. Phys., 12, 8223-8235, doi:10.5194/acp-12

  16. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.

  17. Feedback in Clouds II: UV Photoionisation and the first supernova in a massive cloud

    NASA Astrophysics Data System (ADS)

    Geen, Sam; Hennebelle, Patrick; Tremblin, Pascal; Rosdahl, Joakim

    2016-09-01

    Molecular cloud structure is regulated by stellar feedback in various forms. Two of the most important feedback processes are UV photoionisation and supernovae from massive stars. However, the precise response of the cloud to these processes, and the interaction between them, remains an open question. In particular, we wish to know under which conditions the cloud can be dispersed by feedback, which in turn can give us hints as to how feedback regulates the star formation inside the cloud. We perform a suite of radiative magnetohydrodynamic simulations of a 10^5 solar mass cloud with embedded sources of ionising radiation and supernovae, including multiple supernovae and a hypernova model. A UV source corresponding to 10% of the mass of the cloud is required to disperse the cloud, suggesting that the star formation efficiency should be on the order of 10%. A single supernova is unable to significantly affect the evolution of the cloud. However, energetic hypernovae and multiple supernovae are able to add significant quantities of momentum to the cloud, approximately 10^43 g cm/s of momentum per 10^51 ergs of supernova energy. We argue that supernovae alone are unable to regulate star formation in molecular clouds. We stress the importance of ram pressure from turbulence in regulating feedback in molecular clouds.
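A rough order-of-magnitude check makes the "single supernova is unable to disperse the cloud" conclusion concrete. The arithmetic below uses only the quantities quoted in the abstract plus the standard solar-mass constant; it is an illustrative consistency check, not a calculation from the paper.

```python
M_SUN_G = 1.989e33                  # solar mass in grams (standard value)

cloud_mass_g = 1e5 * M_SUN_G        # the 10^5 solar-mass cloud simulated
momentum_per_sn = 1e43              # g cm/s injected per 10^51 erg (quoted)

# Bulk speed the whole cloud would reach if one supernova's momentum
# were shared evenly across its mass.
v_cm_s = momentum_per_sn / cloud_mass_g
v_km_s = v_cm_s / 1e5               # ~0.5 km/s
```

At ~0.5 km/s, the imparted speed is well below the few-km/s turbulent and escape speeds typical of such clouds, consistent with the authors' argument that supernovae alone cannot regulate star formation there.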

  18. Interferometric Testbed for Nanometer Level Stabilization of Environmental Motion Over Long Timescales

    NASA Technical Reports Server (NTRS)

    Numata, Kenji; Camp, Jordan

    2008-01-01

    We developed an interferometric testbed to stabilize environmental motions over timescales of several hours and a length scale of 1 m. Typically, thermal and seismic motions on the ground are larger than 1 micron over these scales, affecting the precision of more sensitive measurements. To suppress such motions, we built an active stabilization system composed of interferometric sensors, a hexapod actuator, and a frequency-stabilized laser. With this stabilized testbed, environmental motions were suppressed down to the nanometer level. This system will allow us to perform sensitive measurements, such as ground testing of LISA (Laser Interferometer Space Antenna), in the presence of environmental noise.

  19. Virtual Pipeline System Testbed to Optimize the U.S. Natural Gas Transmission Pipeline System

    SciTech Connect

    Kirby S. Chapman; Prakash Krishniswami; Virg Wallentine; Mohammed Abbaspour; Revathi Ranganathan; Ravi Addanki; Jeet Sengupta; Liubo Chen

    2005-06-01

    The goal of this project is to develop a Virtual Pipeline System Testbed (VPST) for natural gas transmission. This study uses a fully implicit finite difference method to analyze transient, nonisothermal compressible gas flow through a gas pipeline system. The inertia term of the momentum equation is included in the analysis. The testbed simulates compressor stations, the pipe that connects these compressor stations, the supply sources, and the end-user demand markets. The compressor station is described by identifying the make, model, and number of engines, gas turbines, and compressors. System operators and engineers can analyze the impact of system changes on the dynamic deliverability of gas and on the environment.
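The abstract does not spell out its fully implicit discretization, so the toy below only illustrates what "fully implicit" means for a 1-D pipeline-like transport equation: each backward-Euler step couples all interior nodes and is solved as a linear system, which is what makes the scheme stable at large time steps. The grid, coefficients, and boundary values are invented for illustration and are far simpler than VPST's nonisothermal model with inertia.

```python
import numpy as np

def implicit_step(u, dt, dx, K):
    """One backward-Euler step of du/dt = K d2u/dx2 on a 1-D grid with
    fixed (Dirichlet) boundary values, solved as a dense linear system
    for clarity (a production code would use a tridiagonal solver)."""
    n = len(u)
    r = K * dt / dx**2
    A = np.eye(n)                       # boundary rows stay identity
    for i in range(1, n - 1):
        A[i, i - 1] = -r
        A[i, i]     = 1 + 2 * r
        A[i, i + 1] = -r
    return np.linalg.solve(A, u)

# Hypothetical line: inlet held at a high value, outlet at a low one
# (e.g. normalized pressure-squared in an isothermal gas-flow analogy).
u = np.linspace(1.0, 0.25, 11)
for _ in range(50):
    u = implicit_step(u, dt=1.0, dx=0.1, K=0.01)
```

Note that r = 1 here, a regime where an explicit scheme would be unstable; the implicit solve remains well behaved, which is why implicit methods suit slow transients over long pipelines.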

  20. Experimental demonstration of a classical approach for flexible structure control - The ACES testbed

    NASA Technical Reports Server (NTRS)

    Wie, Bong

    1991-01-01

    This paper describes the results of an active structural control experiment performed for the Advanced Control Evaluation for Structures (ACES) testbed at NASA-Marshall as part of the NASA Control-Structure Interaction Guest Investigator Program. The experimental results successfully demonstrate the effectiveness of a 'dipole' concept for line-of-sight control of a pointing system mounted on a flexible structure. The simplicity and effectiveness of a classical 'single-loop-at-a-time' approach for the active structural control design for a complex structure, such as the ACES testbed, are demonstrated.

  1. Regenerative Fuel Cell System Testbed Program for Government and Commercial Applications

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Electrochemical Technology Branch has led a multiagency effort to design, fabricate, and operate a regenerative fuel cell (RFC) system testbed. Key objectives of this program are to evaluate, characterize, and demonstrate fully integrated RFC's for space, military, and commercial applications. The Lewis-led team is implementing the program through a unique international coalition that encompasses both Government and industry participants. Construction of the 25-kW RFC testbed at the NASA facility at Edwards Air Force Base was completed in January 1995, and the system has been operational since that time.

  2. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental test-bed to investigate the integration and control of perception in a continuously operating vision system is described. The test-bed integrates a 12-axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) operate continuously, (2) integrate software contributions from geographically dispersed laboratories, (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects, (4) support diverse experiments in gaze control, visual servoing, navigation, and object surveillance, and (5) be dynamically reconfigurable.

  3. Testing Cloud Microphysics Parameterizations in NCAR CAM5 with ISDAC and M-PACE Observations

    SciTech Connect

    Liu, Xiaohong; Xie, Shaocheng; Boyle, James; Klein, Stephen A.; Shi, Xiangjun; Wang, Zhien; Lin, Wuyin; Ghan, Steven J.; Earle, Michael; Liu, Peter; Zelenyuk, Alla

    2011-12-24

    Arctic clouds simulated by the NCAR Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic Spring and Fall seasons performed under the Cloud-Associated Parameterizations Testbed (CAPT) framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary layer mixed-phase stratocumulus, and multilayer or deep frontal clouds. However, for low-level clouds, the model significantly underestimates the observed cloud liquid water content in both seasons and cloud fraction in the Spring season. As a result, CAM5 significantly underestimates the surface downward longwave (LW) radiative fluxes by 20-40 W m-2. The model with a new ice nucleation parameterization moderately improves the model simulations by increasing cloud liquid water content in mixed-phase clouds through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen (WBF) process. The CAM5 single column model testing shows that change in the homogeneous freezing temperature of rain to form snow from -5 C to -40 C has a substantial impact on the modeled liquid water content through the slowing-down of liquid and rain-related processes. In contrast, collections of cloud ice by snow and cloud liquid by rain are of minor importance for single-layer boundary layer mixed-phase clouds in the Arctic.

  4. Testing cloud microphysics parameterizations in NCAR CAM5 with ISDAC and M-PACE observations

    SciTech Connect

    Liu X.; Lin W.; Xie, S.; Boyle, J.; Klein, S. A.; Shi, X.; Wang, Z.; Ghan, S. J.; Earle, M.; Liu, P. S. K.; Zelenyuk, A.

    2011-12-24

    Arctic clouds simulated by the National Center for Atmospheric Research (NCAR) Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus and multilayer or deep frontal clouds. However, for low-level stratocumulus, the model significantly underestimates the observed cloud liquid water content in both seasons. As a result, CAM5 significantly underestimates the surface downward longwave radiative fluxes by 20-40 W m-2. Introducing a new ice nucleation parameterization slightly improves the model performance for low-level mixed-phase clouds by increasing cloud liquid water content through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen process. The CAM5 single-column model testing shows that changing the instantaneous freezing temperature of rain to form snow from -5 C to -40 C causes a large increase in modeled cloud liquid water content through the slowing down of cloud liquid and rain-related processes (e.g., autoconversion of cloud liquid to rain). The underestimation of aerosol concentrations in CAM5 in the Arctic also plays an important role in the low bias of cloud liquid water in the single-layer mixed-phase clouds. In addition, numerical issues related to the coupling of model physics and time stepping in CAM5 are responsible for the model biases and will be explored in future studies.

  5. High Vertically Resolved Atmospheric and Surface/Cloud Parameters Retrieved with Infrared Atmospheric Sounding Interferometer (IASI)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, WIlliam L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the IASI on the MetOp satellite. IASI possesses an ultra-spectral resolution of 0.25 cm-1 and a spectral coverage from 645 to 2760 cm-1. Ultra-spectral resolution infrared spectral radiances obtained from near-nadir observations provide atmospheric, surface, and cloud property information. An advanced retrieval algorithm with a fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. This physical inversion scheme has been developed to deal with cloudy as well as cloud-free radiances observed with ultra-spectral infrared sounders and to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals are achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). Preliminary retrievals of atmospheric soundings, surface properties, and cloud optical/microphysical properties with the IASI observations are obtained and presented. These retrievals will be further inter-compared with those obtained from airborne FTS systems, such as the NPOESS Airborne Sounder Testbed - Interferometer (NAST-I), dedicated dropsondes, radiosondes, and ground-based Raman Lidar. The
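The iterated 1-d variational step described above is commonly written in standard optimal-estimation notation (the symbols below are the conventional ones and are assumed here, not taken from the paper):

```latex
\mathbf{x}_{n+1} = \mathbf{x}_b
  + \mathbf{B}\,\mathbf{K}_n^{\mathsf{T}}
    \left(\mathbf{K}_n \mathbf{B}\,\mathbf{K}_n^{\mathsf{T}} + \mathbf{R}\right)^{-1}
    \left[\,\mathbf{y} - F(\mathbf{x}_n) + \mathbf{K}_n(\mathbf{x}_n - \mathbf{x}_b)\,\right]
```

where x_b is the background state (here the eigenvector-regression retrieval), B and R are the background and observation error covariances, y is the observed radiance vector, F is the fast radiative transfer model, and K_n is its Jacobian at iteration n; repeating the step handles the non-linearity mentioned in the abstract.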

  6. Martian Clouds

    NASA Technical Reports Server (NTRS)

    2004-01-01

    [figure removed for brevity, see original site]

    Released 28 June 2004. The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during early spring near the North Pole. The linear 'ripples' are transparent water-ice clouds. This linear form is typical for polar clouds. The black regions on the margins of this image are areas of saturation caused by the build up of scattered light from the bright polar material during the long image exposure.

    Image information: VIS instrument. Latitude 68.1, Longitude 147.9 East (212.1 West). 38 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS

  7. Corona-producing ice clouds: a case study of a cold mid-latitude cirrus layer.

    PubMed

    Sassen, K; Mace, G G; Hallett, J; Poellot, M R

    1998-03-20

    A high (14.0-km), cold (-71.0 degrees C) cirrus cloud was studied by ground-based polarization lidar and millimeter radar and aircraft probes on the night of 19 April 1994 from the Cloud and Radiation Testbed site in northern Oklahoma. A rare cirrus cloud lunar corona was generated by this 1-2-km-deep cloud, thus providing an opportunity to measure the composition in situ, which had previously been assumed only on the basis of lidar depolarization data and simple diffraction theory for spheres. In this case, corona ring analysis indicated an effective particle diameter of ~22 µm. A variety of in situ data corroborates the approximate ice-particle size derived from the passive retrieval method, especially near the cloud top, where impacted cloud samples show simple solid crystals. The homogeneous freezing of sulfuric acid droplets of stratospheric origin is assumed to be the dominant ice-particle nucleation mode acting in corona-producing cirrus clouds. It is speculated that this process results in a previously unrecognized mode of acid-contaminated ice-particle growth and that such small-particle cold cirrus clouds are potentially a radiatively distinct type of cloud.
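The corona ring sizing referred to above follows from simple diffraction theory. As a hedged sketch, the snippet below inverts the Airy first-minimum relation for the quoted ~22 µm effective diameter; the assumed mid-visible wavelength and the identification of the ring with the first Airy minimum are illustrative assumptions, not details from the abstract.

```python
import math

WAVELENGTH_UM = 0.55   # mid-visible wavelength (assumed for illustration)
DIAMETER_UM = 22.0     # effective ice-particle diameter from the ring fit

# Airy first minimum for diffraction by a particle of diameter d:
#   sin(theta) = 1.22 * lambda / d
theta_rad = math.asin(1.22 * WAVELENGTH_UM / DIAMETER_UM)
theta_deg = math.degrees(theta_rad)
```

The resulting angular radius of roughly 1.7 degrees is consistent with the few-degree size of observed lunar coronas, which is why small, uniform ~20 µm particles are required to produce one.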

  8. Corona-producing ice clouds: A case study of a cold mid-latitude cirrus layer

    SciTech Connect

    Sassen, K.; Mace, G.G.; Hallett, J.; Poellot, M.R.

    1998-03-01

    A high (14.0-km), cold (-71.0 degrees C) cirrus cloud was studied by ground-based polarization lidar and millimeter radar and aircraft probes on the night of 19 April 1994 from the Cloud and Radiation Testbed site in northern Oklahoma. A rare cirrus cloud lunar corona was generated by this 1-2-km-deep cloud, thus providing an opportunity to measure the composition in situ, which had previously been assumed only on the basis of lidar depolarization data and simple diffraction theory for spheres. In this case, corona ring analysis indicated an effective particle diameter of ~22 µm. A variety of in situ data corroborates the approximate ice-particle size derived from the passive retrieval method, especially near the cloud top, where impacted cloud samples show simple solid crystals. The homogeneous freezing of sulfuric acid droplets of stratospheric origin is assumed to be the dominant ice-particle nucleation mode acting in corona-producing cirrus clouds. It is speculated that this process results in a previously unrecognized mode of acid-contaminated ice-particle growth and that such small-particle cold cirrus clouds are potentially a radiatively distinct type of cloud. © 1998 Optical Society of America

  9. Climatic Implications of the Observed Temperature Dependence of the Liquid Water Path of Low Clouds

    NASA Technical Reports Server (NTRS)

    DelGenio, Anthony

    1999-01-01

    The uncertainty in the global climate sensitivity to an equilibrium doubling of carbon dioxide is often stated to be 1.5-4.5 K, largely due to uncertainties in cloud feedbacks. The lower end of this range is based on the assumption or prediction in some GCMs that cloud liquid water behaves adiabatically, thus implying that cloud optical thickness will increase in a warming climate if the physical thickness of clouds is invariant. Satellite observations of low-level cloud optical thickness and liquid water path have challenged this assumption, however, at low and middle latitudes. We attempt to explain the satellite results using four years of surface remote sensing data from the Atmospheric Radiation Measurement (ARM) Cloud And Radiation Testbed (CART) site in the Southern Great Plains. We find that low cloud liquid water path is insensitive to temperature in winter but strongly decreases with temperature in summer. The latter occurs because surface relative humidity decreases with warming, causing cloud base to rise and clouds to geometrically thin. Meanwhile, inferred liquid water contents hardly vary with temperature, suggesting entrainment depletion. Physically, the temperature dependence appears to represent a transition from higher probabilities of stratified boundary layers at cold temperatures to a higher incidence of convective boundary layers at warm temperatures. The combination of our results and the earlier satellite findings imply that the minimum climate sensitivity should be revised upward from 1.5 K.

  10. Crater Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    [figure removed for brevity, see original site] Context image for PIA06085 Crater Clouds

    The crater on the right side of this image is affecting the local wind regime. Note the bright line of clouds streaming off the north rim of the crater.

    Image information: VIS instrument. Latitude -78.8, Longitude 320.0 East. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  11. Atmospheric Correction of Aviris Data of Monterey Bay Contaminated by Thin Cirrus Clouds

    NASA Technical Reports Server (NTRS)

    van den Bosch, Jeannette; Davis, Curtiss O.; Mobley, Curtis D.; Rhea, W. Joseph

    1993-01-01

    Aviris scenes are often rejected when cloud cover exceeds 10 percent. However, if the cloud cover is determined to be primarily cirrus rather than cumulus, in-water optical properties may still be extracted over open ocean.

  12. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  13. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open source software platform, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across the mainland of China. Users can use and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.

  14. Cloud Infrastructure & Applications - CloudIA

    NASA Astrophysics Data System (ADS)

    Sulistio, Anthony; Reich, Christoph; Doelitzscher, Frank

    The idea behind Cloud Computing is to deliver Infrastructure-as-a-Service and Software-as-a-Service over the Internet on an easy pay-per-use business model. To harness the potential of Cloud Computing for e-Learning and research purposes, and for small- and medium-sized enterprises, the Hochschule Furtwangen University established a new project, called Cloud Infrastructure & Applications (CloudIA). The CloudIA project is a market-oriented cloud infrastructure that leverages different virtualization technologies by supporting Service-Level Agreements for various service offerings. This paper describes the CloudIA project in detail and mentions our early experiences in building a private cloud using an existing infrastructure.

  15. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPU), for web-mapping of hazard specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) just need a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage so that the system is applicable for any region in the world and allow the integration of different sensor systems as well as the integration of other hazard types and use cases different to tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  16. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    NASA Technical Reports Server (NTRS)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC), to promote a cross-subsystem approach to developing Mars-enabling technologies with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at a low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between the life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, infrastructure, simulation, and modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear-ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg will be required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions for different subsystems. If hardware could be developed that meets the requirements of both life support and ISRU, it could result in the reduction of primary hardware and/or reduction in spares. Minimization of commodities delivered to the surface of Mars can be achieved through the creation of higher-efficiency systems producing little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise for reducing demands on life support hardware, there exist concerns as to the purity and transportation of commodities. To date, ISRU has been focused on production rates and purities for
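The quoted 226:1 gear ratio translates surface-mass savings directly into launch-mass savings. A trivial sketch (every number other than the ratio itself is hypothetical):

```python
GEAR_RATIO = 226.0   # kg required on Earth per kg landed on Mars (from the text)

def earth_mass_saved(surface_kg_saved):
    """Launch mass avoided for each kilogram of Mars surface hardware
    or commodities eliminated by integration or ISRU."""
    return GEAR_RATIO * surface_kg_saved

# Hypothetical example: eliminating 100 kg of duplicated hardware
saving = earth_mass_saved(100.0)
```

Even modest hardware consolidation between life support and ISRU therefore compounds into tens of tonnes of avoided Earth-departure mass, which is the motivation behind the cross-subsystem approach.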

  17. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  18. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    SciTech Connect

    Melin, Alexander M.; Kisner, Roger A.; Drira, Anis; Reed, Frederick K.

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to develop due to restrictions on sensors and materials. As a part of the Advanced Sensors and Instrumentation topic of the Department of Energy's Nuclear Energy Enabling Technology cross-cutting technology development program, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a re-configurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high-temperature (700 C) pumps for liquid salt reactors that operate in an extreme environment, and it provides many engineering challenges that can be overcome with embedded instrumentation and control. This report gives details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.

  19. Description of New Inflatable/Rigidizable Hexapod Structure Testbed for Shape and Vibration Control

    NASA Technical Reports Server (NTRS)

    Adetona, O.; Keel, L. H.; Horta, L. G.; Cadogan, D. P.; Sapna, G. H.; Scarborough, S. E.

    2002-01-01

    Larger and more powerful space-based instruments are needed to meet increasingly sophisticated scientific demand. To support this need, concepts for telescopes with apertures of 100 meters are being investigated, but the required technologies are not in hand today. Due to the capacity limits of launch vehicles, the idea of deploying, erecting, or inflating large structures in space is being considered. Recently, rigidization concepts of large inflatable structures have demonstrated the capability of weight reductions of up to 50% from current concepts with packaging efficiencies near 80%. One of the important aspects of inflatable structures is vibration mitigation and line-of-sight control. Such control tasks are possible only after actuators/sensors are properly integrated into a rigidizable concept. To study these issues, we have developed an inflatable/rigidizable hexapod structure testbed. The testbed integrates state-of-the-art piezoelectric self-sensing actuators into an inflatable/rigidizable structure and a flat membrane reflector. Using this testbed, we plan to experimentally demonstrate achievable vibration and line-of-sight control. This paper contains a description of the testbed and an outline of the test plan.

  20. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed-wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  1. James Webb Space Telescope Optical Simulation Testbed I: overview and first results

    NASA Astrophysics Data System (ADS)

    Perrin, Marshall D.; Soummer, Rémi; Choquet, Élodie; N'Diaye, Mamadou; Levecq, Olivier; Lajoie, Charles-Philippe; Ygouf, Marie; Leboulleux, Lucie; Egron, Sylvain; Anderson, Rachel; Long, Chris; Elliott, Erin; Hartig, George; Pueyo, Laurent; van der Marel, Roeland; Mountain, Matt

    2014-08-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop workbench to study aspects of wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing optomechanical testbeds for JWST (e.g. the Ball Aerospace Testbed Telescope, TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science & Operations Center. We have developed an optical design that reproduces the physics of JWST's three-mirror anastigmat using three aspheric lenses; it provides image quality similar to JWST's (80% Strehl ratio) over a field equivalent to a NIRCam module, but at HeNe wavelength. A segmented deformable mirror stands in for the segmented primary mirror and allows control of the 18 segments in piston, tip, and tilt, while the secondary can be controlled in tip, tilt and x, y, z position. This will be sufficient to model many commissioning activities, to investigate field dependence and multiple field point sensing & control, to evaluate alternate sensing algorithms, and to develop contingency plans. Testbed data will also be usable for cross-checking of the WFS&C Software Subsystem, and for staff training and development during JWST's five- to ten-year mission.

  2. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our contrast assessment and the development, sensing and control of the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet detection and characterization. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center has an established effort to develop VNC technologies, and an incremental sequence of testbeds to advance this approach and its critical technologies. We discuss the development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the laboratory results, optical configuration, critical technologies and the null sensing and control approach.

  3. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control, and our first results with the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable vibration isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2*lambda/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies and the null sensing and control approach.

  4. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Steincamp, James; Taylor, Jaime

    2003-01-01

    A reduced surrogate, one point crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.
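    The operators named in the abstract can be sketched in a few lines. The toy Python example below (with an invented quadratic merit function and target phases; the actual testbed derives fitness from image-based phase retrieval, and the reduced-surrogate refinement is omitted here) illustrates random rank-based selection combined with one-point crossover for a seven-segment phase estimate:

```python
import random

random.seed(1)
SEGMENTS = 7                     # seven-mirror system, as on the testbed
POP, GENS = 60, 200
TARGET = [0.3, -0.5, 0.1, 0.0, 0.7, -0.2, 0.4]   # hypothetical "true" pistons

def fitness(ind):
    # Stand-in merit function: negative squared phase error.
    # (The testbed instead scores candidates via image-based phase retrieval.)
    return -sum((a - b) ** 2 for a, b in zip(ind, TARGET))

def rank_select(pop):
    # Random rank-based selection: pick with probability proportional to rank.
    ranked = sorted(pop, key=fitness)            # worst individual first
    return random.choices(ranked, weights=list(range(1, len(ranked) + 1)))[0]

def crossover(p1, p2):
    # One-point crossover on the segment-phase vector.
    cut = random.randrange(1, SEGMENTS)
    return p1[:cut] + p2[cut:]

def mutate(ind, rate=0.2, sigma=0.05):
    # Gaussian perturbation of a fraction of the genes.
    return [g + random.gauss(0, sigma) if random.random() < rate else g
            for g in ind]

def evolve():
    pop = [[random.uniform(-1, 1) for _ in range(SEGMENTS)] for _ in range(POP)]
    for _ in range(GENS):
        elite = max(pop, key=fitness)            # elitism: keep the best
        pop = [mutate(crossover(rank_select(pop), rank_select(pop)))
               for _ in range(POP - 1)] + [elite]
    return max(pop, key=fitness)

best = evolve()                  # estimated segment phases
```

    Rank-based selection is attractive here because it keeps selection pressure stable even when raw fitness values span orders of magnitude, as phase-error metrics often do.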

  5. The Making of America II Testbed Project: A Digital Library Service Model.

    ERIC Educational Resources Information Center

    Hurley, Bernard J.; Price-Wilkin, John; Proffitt, Merrilee; Besser, Howard

    The Making of America (MoA) Testbed Project, coordinated by the Digital Library Federation, is a multi-phase endeavor. Its purpose is to investigate important issues in the creation of an integrated but distributed digital library of archival materials (i.e., digitized surrogates of primary source materials found in archives and special…

  6. Results from the TOM3 testbed: thermal deformation of optics at the picometer Level

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Lindensmith, C. A.; Hahn, I.

    2006-01-01

    We discuss the TOM3 testbed, developed to assess the thermo-opto-mechanical stability of optical assemblies such as SIM's siderostat and telescope under flight-like thermal conditions. Although limited by the metrology sensor noise, test results show that the optical wavefront stability of SIM's optical assembly is compatible with single micro-arcsecond astrometry.

  7. Model-Based Diagnosis in a Power Distribution Test-Bed

    NASA Technical Reports Server (NTRS)

    Scarl, E.; McCall, K.

    1998-01-01

    The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.

  8. Venus: uniformity of clouds, and photography.

    PubMed

    Keene, G T

    1968-01-19

    Photographs of Earth at a resolution of about 600 kilometers were compared to pictures of Venus taken from Earth at about the same resolution. Under these conditions Earth appears very heavily covered by clouds. Since details on the surface of Earth can be recorded from Earth orbit, it may be possible to photograph portions of the surface of Venus, through openings in the clouds, from an orbiting satellite.

  9. Statistical Analyses of Satellite Cloud Object Data From CERES. Part 4; Boundary-layer Cloud Objects During 1998 El Nino

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce A.; Parker, Lindsay

    2006-01-01

    Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions, among different size categories, and among small geographic areas. Both the frequencies of occurrence and statistical distributions of cloud physical properties are analyzed. In terms of frequencies of occurrence, stratocumulus clouds dominate the entire boundary layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic

  10. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  11. Scalable cloud without dedicated storage

    NASA Astrophysics Data System (ADS)

    Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.

    2015-05-01

    We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without the separate dedicated storage. The dedicated storage is replaced by the distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of the open source components like OpenStack, CEPH, etc.

  12. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.
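    An easily-parseable state log with query scripts, as described above, can be sketched in a few lines of Python. The entry format and field names below are invented for illustration (the abstract does not describe SCDU's actual log format): each entry is a line of tab-separated key=value fields, and a query returns every entry matching a set of conditions.

```python
import io

def log_state(fh, **fields):
    # Hypothetical format: one entry per line, tab-separated key=value fields.
    fh.write("\t".join(f"{k}={v}" for k, v in sorted(fields.items())) + "\n")

def query(fh, **conditions):
    """Return every logged entry whose fields match all given conditions."""
    hits = []
    for line in fh:
        entry = dict(tok.split("=", 1) for tok in line.rstrip("\n").split("\t"))
        if all(entry.get(k) == str(v) for k, v in conditions.items()):
            hits.append(entry)
    return hits

# Usage: log two experiment states, then compare results by condition.
log = io.StringIO()                      # stands in for the on-disk log file
log_state(log, run=1, grating="A", contrast=1.2e-9)
log_state(log, run=2, grating="B", contrast=3.4e-9)
log.seek(0)
matches = query(log, grating="A")        # all entries taken with grating "A"
```

    The point of such a flat, line-oriented format is exactly what the abstract highlights: results collected under different conditions can be compared with a one-line query rather than by hand.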

  13. Laser Communications Airborne Testbed: Potential For An Air-To-Satellite Laser Communications Link

    NASA Astrophysics Data System (ADS)

    Feldmann, Robert J.

    1988-05-01

    The Laser Communications Airborne Testbed (LCAT) offers an excellent opportunity for testing of an air-to-satellite laser communications link with the NASA Advanced Communications Technology Satellite (ACTS). The direct detection laser portion of the ACTS is suitable for examining the feasibility of an airborne terminal. Development of an airborne laser communications terminal is not currently part of the ACTS program; however, an air-to-satellite link is of interest. The Air Force performs airborne laser communications experiments to examine the potential usefulness of this technology to future aircraft. Lasers could be used, for example, by future airborne command posts and reconnaissance aircraft to communicate via satellite over long distances and transmit large quantities of data in the fastest way possible from one aircraft to another or to ground sites. Lasers are potentially secure, jam resistant and hard to detect and in this regard increase the survivability of the users. Under a contract awarded by Aeronautical Systems Division's Avionics Laboratory, a C-135E testbed aircraft belonging to ASD's 4950th Test Wing will be modified to create a Laser Communications Airborne Testbed. The contract is for development and fabrication of laser testbed equipment and support of the aircraft modification effort by the Test Wing. The plane to be modified is already in use as a testbed for other satellite communications projects and the LCAT effort will expand those capabilities. This analysis examines the characteristics of an LCAT to ACTS direct detection communications link. The link analysis provides a measure of the feasibility of developing an airborne laser terminal which will interface directly to the LCAT. Through the existence of the LCAT, the potential for development of an air-to-satellite laser communications terminal for the experimentation with the ACTS system is greatly enhanced.

  14. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  15. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  16. Chapter 25: Cloud-Resolving Modeling: ARM and the GCSS Story

    NASA Technical Reports Server (NTRS)

    Krueger, Steven K.; Morrison, Hugh; Fridlind, Ann M.

    2016-01-01

    The Global Energy and Water Cycle Experiment (GEWEX) Cloud System Study (GCSS) was created in 1992. As described by Browning et al., "The focus of GCSS is on cloud systems spanning the mesoscale rather than on individual clouds. Observations from field programs will be used to develop and validate the cloud-resolving models, which in turn will be used as test-beds to develop the parameterizations for the large-scale models." The most important activities that GCSS promoted were the following: Identify key questions about cloud systems relating to parameterization issues and suggest approaches to address them, and Organize model intercomparison studies relevant to cloud parameterization. Four different cloud system types were chosen for GCSS to study: boundary layer, cirrus, frontal, and deep precipitating convective. A working group (WG) was formed for each of the cloud system types. The WGs organized model intercomparison studies and meetings to present results of the intercomparisons. The first such intercomparison study took place in 1994.

  17. FY 2011 Second Quarter: Demonstration of New Aerosol Measurement Verification Testbed for Present-Day Global Aerosol Simulations

    SciTech Connect

    Koch, D

    2011-03-20

    The regional-scale Weather Research and Forecasting (WRF) model is being used by a DOE Earth System Modeling (ESM) project titled “Improving the Characterization of Clouds, Aerosols and the Cryosphere in Climate Models” to evaluate the performance of atmospheric process modules that treat aerosols and aerosol radiative forcing in the Arctic. We are using a regional-scale modeling framework for three reasons: (1) It is easier to produce a useful comparison to observations with a high resolution model; (2) We can compare the behavior of the CAM parameterization suite with some of the more complex and computationally expensive parameterizations used in WRF; (3) we can explore the behavior of this parameterization suite at high resolution. Climate models like the Community Atmosphere Model version 5 (CAM5) being used within the Community Earth System Model (CESM) will not likely be run at mesoscale spatial resolutions (10–20 km) until 5–10 years from now. The performance of the current suite of physics modules in CAM5 at such resolutions is not known, and current computing resources do not permit high-resolution global simulations to be performed routinely. We are taking advantage of two tools recently developed under PNNL Laboratory Directed Research and Development (LDRD) projects for this activity. The first is the Aerosol Modeling Testbed (Fast et al., 2011b), a new computational framework designed to streamline the process of testing and evaluating aerosol process modules over a range of spatial and temporal scales. The second is the CAM5 suite of physics parameterizations that have been ported into WRF so that their performance and scale dependency can be quantified at mesoscale spatial resolutions (Gustafson et al., 2010; with more publications in preparation).

  18. The Oort cloud

    NASA Technical Reports Server (NTRS)

    Marochnik, Leonid S.; Mukhin, Lev M.; Sagdeev, Roald Z.

    1991-01-01

    Views of the large-scale structure of the solar system, consisting of the Sun, the nine planets and their satellites, changed when Oort demonstrated that a gigantic cloud of comets (the Oort cloud) is located on the periphery of the solar system. The following subject areas are covered: (1) the Oort cloud's mass; (2) Hill's cloud mass; (3) angular momentum distribution in the solar system; and (4) the cometary cloud around other stars.

  19. Dim star fringe stabilization demonstration using pathlength feed-forward on the SIM testbed 3 (STB3)

    NASA Astrophysics Data System (ADS)

    Goullioud, Renaud; Alvarez-Salazar, Oscar S.; Nemati, Bijan

    2003-02-01

    Future space-based optical interferometers such as the Space Interferometer Mission require fringe stabilization to the level of nanometers in order to produce astrometric data at the micro-arc-second level. Even the best attitude control system available to date will not be able to stabilize the attitude of a several thousand pound spacecraft to a few milli-arc-seconds. Active pathlength control is usually implemented to compensate for attitude drift of the spacecraft. This issue has been addressed in previous experiments while tracking bright stars. In the case of dim stars, as the sensor bandwidth falls below one hertz, feedback control will not provide sufficient rejection. However, stabilization of the fringes from a dim star down to the nanometer level can be done open loop using information from additional interferometers looking at bright guide stars. The STB3 testbed developed at the Jet Propulsion Laboratory features three optical interferometers sharing a common baseline, dynamically representative of the SIM interferometer. An artificial star feeding the interferometers is installed on a separate optics bench. Voice coils are used to simulate the attitude motion of the spacecraft by moving the entire bench. Data measured on STB3 show that fringe motion of a dim star due to spacecraft attitude changes can be attenuated by 80 dB at 0.1Hz without feedback control, using only information from two guide stars. This paper describes the STB3 setup, the pathlength feed-forward architecture, implementation issues and data collected with the system.
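    The open-loop rejection idea can be illustrated with a noiseless linear sketch (the baseline sensitivities and rotation below are invented numbers; the real testbed geometry, calibration, and dynamics are far more involved): a common bench rotation is estimated from the two guide-star pathlength measurements, and the predicted science-baseline pathlength error is subtracted without any science-fringe feedback.

```python
# Hypothetical baseline sensitivities (meters of OPD per radian of rotation).
B_G1, B_G2, B_SCI = 0.8, 1.2, 1.0

def feed_forward(opd_g1, opd_g2):
    # Least-squares estimate of the bench rotation from the two guide
    # interferometers, then prediction of the science-baseline OPD.
    theta_est = (B_G1 * opd_g1 + B_G2 * opd_g2) / (B_G1**2 + B_G2**2)
    return B_SCI * theta_est

# A small bench rotation produces OPDs on all three baselines; the science
# fringe is corrected open loop using only the guide-star measurements.
theta_true = 1e-6                         # bench rotation, radians
residual = B_SCI * theta_true - feed_forward(B_G1 * theta_true,
                                             B_G2 * theta_true)
```

    In this idealized noiseless case the residual vanishes; in practice, sensor noise and calibration errors of the sensitivity terms set the 80 dB-class rejection floor reported for STB3.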

  20. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computation challenges such as anomaly detection and classification and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources such as potentially hazardous asteroids or transient objects such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources: i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach.
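    Co-addition maps naturally onto the MapReduce model: the map phase applies the astrometric registration and emits (sky-pixel, value) pairs, and the reduce phase performs the per-pixel summation. A toy pure-Python sketch of that decomposition follows (integer pixel offsets stand in for real astrometric solutions, which require resampling and interpolation; the data are invented):

```python
from collections import defaultdict

# Two toy "exposures", each a small pixel grid plus an integer astrometric
# offset (dy, dx) into a common sky frame.
exposures = [
    {"offset": (0, 0), "data": [[1, 2], [3, 4]]},
    {"offset": (0, 1), "data": [[10, 20], [30, 40]]},
]

def map_phase(exp):
    # Registration: shift each pixel into the common sky frame and
    # emit a (sky_coordinate, value) key-value pair.
    dy, dx = exp["offset"]
    for y, row in enumerate(exp["data"]):
        for x, v in enumerate(row):
            yield (y + dy, x + dx), v

def reduce_phase(pairs):
    # Per-pixel summation over all exposures sharing a sky coordinate
    # (Hadoop's shuffle groups the pairs by key before this step).
    coadd = defaultdict(float)
    for coord, v in pairs:
        coadd[coord] += v
    return dict(coadd)

coadd = reduce_phase(kv for exp in exposures for kv in map_phase(exp))
```

    In Hadoop the same two functions become the mapper and reducer, and the framework handles partitioning the exposures across worker nodes and grouping the emitted pairs by sky pixel.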

  1. Virtual infrastructure management in private and hybrid clouds.

    SciTech Connect

    Sotomayor, B.; Montero, R. S.; Llorente, I. M.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago; Univ. Complutense of Madrid

    2009-01-01

    One of the many definitions of 'cloud' is that of an infrastructure-as-a-service (IaaS) system, in which IT infrastructure is deployed in a provider's data center as virtual machines. With IaaS clouds' growing popularity, tools and technologies are emerging that can transform an organization's existing infrastructure into a private or hybrid cloud. OpenNebula is an open source, virtual infrastructure manager that deploys virtualized services on both a local pool of resources and external IaaS clouds. Haizea, a resource lease manager, can act as a scheduling back end for OpenNebula, providing features not found in other cloud software or virtualization-based data center management software.

  2. Microphysical and macrophysical responses of marine stratocumulus polluted by underlying ships: Evidence of cloud deepening

    NASA Astrophysics Data System (ADS)

    Christensen, Matthew W.; Stephens, Graeme L.

    2011-02-01

    Ship tracks observed by the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) were analyzed to determine the extent to which aerosol plumes from ships passing below marine stratocumulus alter the microphysical and macrophysical properties of the clouds. Moderate Resolution Imaging Spectroradiometer (MODIS) imagery was used to distinguish ship tracks embedded in closed, open, and undefined mesoscale cellular cloud structures. The impact of aerosol on the microphysical cloud properties in both the closed and open cell regimes were consistent with the changes predicted by the Twomey hypothesis. For the macrophysical changes, differences were observed between regimes. In the open cell regime, polluted clouds had significantly higher cloud tops (16%) and more liquid water (39%) than nearby unpolluted clouds. However, in the closed cell regime, polluted clouds exhibited no change in cloud top height and had less liquid water (-6%). Both microphysical (effective radius) and macrophysical (liquid water path) cloud properties contribute to a fractional change in cloud optical depth; in the closed cell regime the microphysical contribution was 3 times larger than the macrophysical contribution. However, the opposite was true in the open cell regime where the macrophysical contribution was nearly 2 times larger than the microphysical contribution because the aerosol probably increased cloud coverage. The results presented here demonstrate key differences aerosols have on the microphysical and macrophysical responses of boundary layer clouds between mesoscale stratocumulus convective regimes.
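    The split of the fractional optical-depth change into microphysical and macrophysical contributions follows from the standard thin-cloud relation between optical depth, liquid water path, and effective radius (a textbook approximation, not derived in the abstract; \(\rho_w\) is the density of liquid water):

```latex
\tau \;\approx\; \frac{3\,\mathrm{LWP}}{2\,\rho_w\, r_e}
\qquad\Longrightarrow\qquad
\frac{\Delta\tau}{\tau} \;\approx\;
\underbrace{\frac{\Delta\,\mathrm{LWP}}{\mathrm{LWP}}}_{\text{macrophysical}}
\;-\;
\underbrace{\frac{\Delta r_e}{r_e}}_{\text{microphysical}}
```

    The two relative changes on the right are the terms whose ratio the study compares between the closed-cell regime (microphysical term roughly 3 times larger) and the open-cell regime (macrophysical term nearly 2 times larger).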

  3. Aerosol and cloud droplet number concentrations observed in marine stratocumulus

    SciTech Connect

    Vong, R.J.; Covert, D.S.

    1995-12-01

    The relationship between measurements of cloud droplet number concentration and cloud condensation nuclei (CCN) concentration, as inferred from aerosol size spectra, was investigated at a 'clean air', marine site (Cheeka Peak) located near the coast of the Olympic Peninsula in Washington State. Preliminary results demonstrated that cloud droplet number increased and droplet diameter decreased as aerosol number concentration (CCN) increased. These results support predictions of a climate cooling due to any future increases in marine aerosol concentrations.

  4. Computational relativistic astrophysics with adaptive mesh refinement: Testbeds

    SciTech Connect

    Evans, Edwin; Iyer, Sai; Tao Jian; Wolfmeyer, Randy; Zhang Huimin; Schnetter, Erik; Suen, Wai-Mo

    2005-04-15

    We have carried out numerical simulations of strongly gravitating systems based on the Einstein equations coupled to the relativistic hydrodynamic equations using adaptive mesh refinement (AMR) techniques. We show AMR simulations of NS binary inspiral and coalescence carried out on a workstation having an accuracy equivalent to that of a 1025^3 regular unigrid simulation, which is, to the best of our knowledge, larger than all previous simulations of similar NS systems on supercomputers. We believe the capability opens new possibilities in general relativistic simulations.

  5. A Cloud Microphysics Model for the Gas Giant Planets

    NASA Astrophysics Data System (ADS)

    Palotai, Csaba J.; Le Beau, Raymond P.; Shankar, Ramanakumar; Flom, Abigail; Lashley, Jacob; McCabe, Tyler

    2016-10-01

    Recent studies have significantly increased the quality and the number of observed meteorological features on the jovian planets, revealing banded cloud structures and discrete features. Our current understanding of the formation and decay of those clouds also defines the conceptual models of the underlying atmospheric dynamics. The full interpretation of the new observational data set and the related theories requires modeling these features in a general circulation model (GCM). Here, we present details of our bulk cloud microphysics model that was designed to simulate clouds in the Explicit Planetary Hybrid-Isentropic Coordinate (EPIC) GCM for the jovian planets. The cloud module includes hydrological cycles for each condensable species that consist of interactive vapor, cloud and precipitation phases, and it also accounts for latent heating and cooling throughout the transfer processes (Palotai and Dowling, 2008. Icarus, 194, 303–326). Previously, the self-organizing clouds in our simulations successfully reproduced the vertical and horizontal ammonia cloud structure in the vicinity of Jupiter's Great Red Spot and Oval BA (Palotai et al. 2014, Icarus, 232, 141–156). In our recent work, we extended this model to include water clouds on Jupiter and Saturn, ammonia clouds on Saturn, and methane clouds on Uranus and Neptune. Details of our cloud parameterization scheme, our initial results and their comparison with observations will be shown. The latest version of the EPIC model is available as open source software from NASA's PDS Atmospheres Node.

  6. Testbed for development of a DSP-based signal processing subsystem for an Earth-orbiting radar scatterometer

    NASA Technical Reports Server (NTRS)

    Clark, Douglas J.; Lux, James P.; Shirbacheh, Mike

    2002-01-01

    A testbed for evaluation of general-purpose digital signal processors in earth-orbiting radar scatterometers is discussed. Because general purpose DSP represents a departure from previous radar signal processing techniques used on scatterometers, there was a need to demonstrate key elements of the system to verify feasibility for potential future scatterometer instruments. Construction of the testbed also facilitated identification of an appropriate software development environment and the skills mix necessary to perform the work.

  7. TEXSYS. [a knowledge based system for the Space Station Freedom thermal control system test-bed

    NASA Technical Reports Server (NTRS)

    Bull, John

    1990-01-01

    The Systems Autonomy Demonstration Project has recently completed a major test and evaluation of TEXSYS, a knowledge-based system (KBS) which demonstrates real-time control and FDIR for the Space Station Freedom thermal control system test-bed. TEXSYS is the largest KBS ever developed by NASA and offers a unique opportunity for the study of technical issues associated with the use of advanced KBS concepts including: model-based reasoning and diagnosis, quantitative and qualitative reasoning, integrated use of model-based and rule-based representations, temporal reasoning, and scale-up performance issues. TEXSYS represents a major achievement in advanced automation that has the potential to significantly influence Space Station Freedom's design for the thermal control system. An overview of the Systems Autonomy Demonstration Project, the thermal control system test-bed, the TEXSYS architecture, preliminary test results, and thermal domain expert feedback are presented.

  8. NASA's flight-technological development program - A 650 Mbps laser communications testbed

    NASA Technical Reports Server (NTRS)

    Hayden, W. L.; Fitzmaurice, M. W.; Nace, D. A.; Lokerson, D. C.; Minott, P. O.; Chapman, W. W.

    1991-01-01

    A 650 Mbps laser communications testbed under construction for the development of flight qualifiable hardware suitable for near-term operation on geosynchronous-to-geosynchronous crosslink missions is presented. The program's primary purpose is to develop and optimize laser communications unique subsystems. Requirements for the testbed experiments are to optimize the acquisition processes, to fully simulate the long range (up to 21,000 km) and the fine tracking characteristics of two narrow-beam laser communications terminals, and to fully test communications performance which will include average and burst bit error rates, effects of laser diode coalignment, degradation due to internal and external stray light, and the impact of drifts in the optical components.

  9. Visible-band testbed projector with a replicated diffractive optical element.

    PubMed

    Chen, C B; Hegg, R G; Johnson, W T; King, W B; Rock, D F; Spande, R

    1999-12-01

    Raytheon has designed, fabricated, and tested a diffractive-optical-element-based (DOE-based) testbed projector for direct and indirect visual optical applications. By use of a low-cost replicated DOE surface from Rochester Photonics Corporation for color correction, the projector optics bettered the modulation transfer function of an equivalent commercial camera lens. The testbed demonstrates that a practical DOE-based optical system is suitable for both visual applications (e.g., head-mounted displays) and visual projection (e.g., tactical sensors). The need for and the proper application of DOE's in visual optical systems, the nature and the performance of the projector optical design, and test results are described. PMID:18324257

  10. Model evaluation, recommendation and prioritizing of future work for the manipulator emulator testbed

    NASA Technical Reports Server (NTRS)

    Kelly, Frederick A.

    1989-01-01

    The Manipulator Emulator Testbed (MET) is to provide a facility capable of hosting the simulation of various manipulator configurations to support concept studies, evaluation, and other engineering development activities. Specifically, the testbed is intended to support development of the Space Station Remote Manipulator System (SSRMS) and related systems. The objective of this study is to evaluate the math models developed for the MET simulation of a manipulator's rigid body dynamics and the servo systems for each of the driven manipulator joints. Specifically, the math models are examined with regard to their amenability to pipeline and parallel processing. Based on this evaluation and the project objectives, a set of prioritized recommendations is offered for future work.

  11. Development Of Electrical Power Systems For Test-Bed Rovers Aiming To Planetary Surface Exploration

    NASA Astrophysics Data System (ADS)

    Shimada, Takanobu; Otsuki, Massatsugu; Toyota, Hiroyuki; Ishigami, Genya; Kubota, Takashi

    2011-10-01

    Planetary surface exploration rovers are expected to travel safely over long distances and to make in-situ scientific observations. The authors have developed an innovative test-bed rover having a novel mobility system, lightweight manipulators, and advanced guidance and navigation functions. Electrical power systems (EPSs) of rovers require stable power supply to realize long-range travel on planetary surfaces. However, a power management scheme for the rover is different from that used for orbiting (or interplanetary) spacecraft because the power spent by the rover varies significantly along with the rover's motion profile. The authors performed several field tests in a desert using the test-bed rover that uses the newly developed EPS. The developed autonomous power management and control for the rover have been tested and evaluated through the field tests. This paper reports the functions and performance of the developed EPS and the obtained experimental results via the field tests.

  12. Utilizing the EUVE Innovative Technology Testbed to Reduce Operations Cost for Present and Future Orbiting Mission

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This report summarizes work done under Cooperative Agreement (CA) on the following testbed projects: TERRIERS - The development of the ground systems to support the TERRIERS satellite mission at Boston University (BU). HSTS - The application of ARC's Heuristic Scheduling Testbed System (HSTS) to the EUVE satellite mission. SELMON - The application of NASA's Jet Propulsion Laboratory's (JPL) Selective Monitoring (SELMON) system to the EUVE satellite mission. EVE - The development of the EUVE Virtual Environment (EVE), a prototype three-dimensional (3-D) visualization environment for the EUVE satellite and its sensors, instruments, and communications antennae. FIDO - The development of the Fault-Induced Document Officer (FIDO) system, a prototype application to respond to anomalous conditions by automatically searching for, retrieving, and displaying relevant documentation for an operator's use.

  13. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Taylor, Jaime; Rakoczy, John; Steincamp, James

    2003-01-01

    Phase retrieval requires calculation of the real-valued phase of the pupil function from the image intensity distribution and characteristics of an optical system. Genetic algorithms (GAs) were used to solve two one-dimensional phase retrieval problems. A GA successfully estimated the coefficients of a polynomial expansion of the phase when the number of coefficients was correctly specified. A GA also successfully estimated the multiple phases of a segmented optical system analogous to the seven-mirror Systematic Image-Based Optical Alignment (SIBOA) testbed located at NASA's Marshall Space Flight Center. The SIBOA testbed was developed to investigate phase retrieval techniques. Tip/tilt and piston motions of the mirrors accomplish phase corrections. A constant phase over each mirror can be achieved by an independent tip/tilt correction: the phase correction term can then be factored out of the Discrete Fourier Transform (DFT), greatly reducing computations.
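
    The first problem described, a GA estimating the coefficients of a polynomial expansion of the phase, can be sketched with a toy GA. The mean-squared-error fitness below is a stand-in for the actual image-plane metric used on SIBOA, and all parameter choices (population size, mutation width) are illustrative:

```python
import random

random.seed(0)

def fitness(coeffs, target_coeffs, xs):
    """Mean-squared error between a candidate polynomial phase and the
    'measured' one (stand-in for an image-intensity-based fitness)."""
    err = 0.0
    for x in xs:
        model = sum(c * x**i for i, c in enumerate(coeffs))
        truth = sum(c * x**i for i, c in enumerate(target_coeffs))
        err += (model - truth) ** 2
    return err / len(xs)

def ga(target, pop_size=60, gens=200, sigma=0.3):
    """Elitist GA with one-point crossover and Gaussian mutation."""
    n_coeffs = len(target)
    xs = [i / 20.0 for i in range(-20, 21)]   # 1-D pupil samples on [-1, 1]
    pop = [[random.uniform(-2, 2) for _ in range(n_coeffs)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=lambda c: fitness(c, target, xs))
        elite = pop[: pop_size // 4]                  # selection: keep best quarter
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = random.sample(elite, 2)
            cut = random.randrange(n_coeffs)          # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.5:                 # Gaussian mutation of one gene
                i = random.randrange(n_coeffs)
                child = child[:]
                child[i] += random.gauss(0, sigma)
            children.append(child)
        pop = elite + children
    return min(pop, key=lambda c: fitness(c, target, xs))
```

    As in the abstract, the GA only works when the number of coefficients is specified correctly; here that is enforced by sizing each chromosome to `len(target)`.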

  14. Design of a laboratory testbed for external occulters at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Kim, Yunjong; Galvin, Mike; Kasdin, N. Jeremy; Vanderbei, Robert J.; Ryu, Dongok; Kim, Ki-Won; Kim, Sug-Whan; Sirbu, Dan

    2015-09-01

    One of the main candidates for creating high-contrast for future Exo-Earth detection is an external occulter, or starshade. A starshade blocks the light from the parent star by flying in formation along the line-of-sight from a space telescope. Because of its large size and scale it is impossible to fully test a starshade system on the ground before launch. Instead, we rely on modeling supported by subscale laboratory tests to verify the models. At Princeton, we are designing and building a subscale testbed to verify the suppression and contrast of a starshade at the same Fresnel number as a flight system, and thus mathematically identical to a realistic space mission. Here we present the mechanical design of the testbed and simulations predicting the ultimate contrast performance. We will also present progress in implementation and preliminary results.
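
    Matching the flight Fresnel number N = a^2/(lambda*z) (occulter radius a, wavelength lambda, telescope-starshade separation z) is what makes the subscale experiment "mathematically identical" to the space mission. A small sketch; the flight-like numbers in the example are illustrative, not Princeton's design values:

```python
def fresnel_number(radius_m, wavelength_m, distance_m):
    """N = a^2 / (lambda * z) for an occulter of radius a at separation z."""
    return radius_m**2 / (wavelength_m * distance_m)

def subscale_distance(radius_m, wavelength_m, target_n):
    """Separation a lab occulter needs to reproduce a target Fresnel number."""
    return radius_m**2 / (wavelength_m * target_n)
```

    For example, an illustrative 13 m flight occulter at 600 nm and 37,000 km separation gives N of order 10; a centimeter-scale lab mask can hit the same N on a bench tens of meters long, which is the scaling the testbed exploits.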

  15. An investigation of DTNS2D for use as an incompressible turbulence modelling test-bed

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J., Jr.

    1992-01-01

    This paper documents an investigation of a two-dimensional, incompressible Navier-Stokes solver for use as a test-bed for turbulence modelling. DTNS2D is the code under consideration for use at the Center for Modelling of Turbulence and Transition (CMOTT). This code was created by Gorski at the David Taylor Research Center and incorporates the pseudo-compressibility method. Two laminar benchmark flows are used to measure the performance and implementation of the method. The classical solution of the Blasius boundary layer is used for validating the flat plate flow, while experimental data is incorporated in the validation of backward facing step flow. Velocity profiles, convergence histories, and reattachment lengths are used to quantify these calculations. The organization and adaptability of the code are also examined in light of its role as a numerical test-bed.
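
    The pseudo-compressibility (artificial compressibility) method mentioned above, due to Chorin, replaces the incompressible continuity equation with one carrying a pseudo-time pressure derivative so the coupled system can be marched to a steady state. In standard notation, with beta the artificial compressibility parameter and tau the pseudo-time:

```latex
\frac{1}{\beta}\frac{\partial p}{\partial \tau} + \nabla \cdot \mathbf{u} = 0,
\qquad
\frac{\partial \mathbf{u}}{\partial \tau} + (\mathbf{u} \cdot \nabla)\mathbf{u}
  = -\nabla p + \nu \, \nabla^{2} \mathbf{u}.
```

    At pseudo-time convergence the pressure derivative vanishes and the divergence-free constraint of incompressible flow is recovered exactly.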

  16. Towards an Experimental Testbed Facility for Cyber-Physical Security Research

    SciTech Connect

    Edgar, Thomas W.; Manz, David O.; Carroll, Thomas E.

    2012-01-07

    Cyber-Physical Systems (CPSs) are under great scrutiny due to large Smart Grid investments and recent high profile security vulnerabilities and attacks. Research into improved security technologies, communication models, and emergent behavior is necessary to protect these systems from sophisticated adversaries and new risks posed by the convergence of CPSs with IT equipment. However, cyber-physical security research is limited by the lack of access to universal cyber-physical testbed facilities that permit flexible, high-fidelity experiments. This paper presents a remotely-configurable and community-accessible testbed design that integrates elements from the virtual, simulated, and physical environments. Fusing data between the three environments enables the creation of realistic and scalable environments where new functionality and ideas can be exercised. This novel design will enable the research community to analyze and evaluate the security of current environments and design future, secure, cyber-physical technologies.

  17. Phased Array Antenna Testbed Development at the NASA Glenn Research Center

    NASA Technical Reports Server (NTRS)

    Lambert, Kevin M.; Kubat, Gregory; Johnson, Sandra K.; Anzic, Godfrey

    2003-01-01

    Ideal phased array antennas offer advantages for communication systems, such as wide-angle scanning and multibeam operation, which can be utilized in certain NASA applications. However, physically realizable, electronically steered, phased array antennas introduce additional system performance parameters, which must be included in the evaluation of the system. The NASA Glenn Research Center (GRC) is currently conducting research to identify these parameters and to develop the tools necessary to measure them. One of these tools is a testbed where phased array antennas may be operated in an environment that simulates their use. This paper describes the development of the testbed and its use in characterizing a particular K-Band, phased array antenna.

  18. The implementation of the Human Exploration Demonstration Project (HEDP), a systems technology testbed

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Korsmeyer, David J.

    1993-01-01

    The Human Exploration Demonstration Project (HEDP) is an ongoing task at the NASA's Ames Research Center to address the advanced technology requirements necessary to implement an integrated working and living environment for a planetary surface habitat. The integrated environment consists of life support systems, physiological monitoring of project crew, a virtual environment work station, and centralized data acquisition and habitat systems health monitoring. The HEDP is an integrated technology demonstrator, as well as an initial operational testbed. There are several robotic systems operational in a simulated planetary landscape external to the habitat environment, to provide representative workloads for the crew. This paper describes the evolution of the HEDP from initial concept to operational project; the status of the HEDP after two years; the final facilities composing the HEDP; the project's role as a NASA Ames Research Center systems technology testbed; and the interim demonstration scenarios that have been run to feature the developing technologies in 1993.

  19. The Langley Research Center CSI phase-0 evolutionary model testbed-design and experimental results

    NASA Technical Reports Server (NTRS)

    Belvin, W. K.; Horta, Lucas G.; Elliott, K. B.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology is described. The design philosophy, capabilities, and early experimental results are presented to introduce some of the ongoing CSI research at NASA-Langley. The testbed, referred to as the Phase 0 version of the CSI Evolutionary model (CEM), is the first stage of model complexity designed to show the benefits of CSI technology and to identify weaknesses in current capabilities. Early closed loop test results have shown non-model based controllers can provide an order of magnitude increase in damping in the first few flexible vibration modes. Model based controllers for higher performance will need to be robust to model uncertainty as verified by System ID tests. Data are presented that show finite element model predictions of frequency differ from those obtained from tests. Plans are also presented for evolution of the CEM to study integrated controller and structure design as well as multiple payload dynamics.

  20. Development and testing of an aerosol-stratus cloud parameterization scheme for middle and high latitudes

    SciTech Connect

    Olsson, P.Q.; Meyers, M.P.; Kreidenweis, S.; Cotton, W.R.

    1996-04-01

    The aim of this new project is to develop an aerosol/cloud microphysics parameterization of mixed-phase stratus and boundary layer clouds. Our approach is to create, test, and implement a bulk-microphysics/aerosol model using data from Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) sites and large-eddy simulation (LES) explicit bin-resolving aerosol/microphysics models. The primary objectives of this work are twofold. First, we need the prediction of number concentrations of activated aerosol which are transferred to the droplet spectrum, so that the aerosol population directly affects the cloud formation and microphysics. Second, we plan to couple the aerosol model to the gas and aqueous-chemistry module that will drive the aerosol formation and growth. We begin by exploring the feasibility of performing cloud-resolving simulations of Arctic stratus clouds over the North Slope CART site. These simulations using Colorado State University's regional atmospheric modeling system (RAMS) will be useful in designing the structure of the cloud-resolving model and in interpreting data acquired at the North Slope site.

  1. Physically-Retrieving Cloud and Thermodynamic Parameters from Ultraspectral IR Measurements

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Mango, Stephen A.; Huang, Hung-Lung

    2007-01-01

    A physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiance observed with ultraspectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression-retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). NPOESS Airborne Sounder Testbed Interferometer (NAST-I) retrievals from the Atlantic-THORPEX Regional Campaign are compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL). This work was motivated by the need to obtain solutions for atmospheric soundings from infrared radiances observed for every individual field of view, regardless of cloud cover, from future ultraspectral geostationary satellite sounding instruments, such as the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Hyperspectral Environmental Suite (HES). However, this retrieval approach can also be applied to the ultraspectral sounding instruments to fly on Polar satellites, such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite, the Cross-track Infrared Sounder (CrIS) on the NPOESS Preparatory Project and the following NPOESS series of satellites.
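
    The iterated 1-d variational solution described above is conventionally written in optimal-estimation form, where x_a is the background state (here supplied by the eigenvector-regression retrieval), S_a and S_epsilon the background and measurement-error covariances, and K_n the Jacobian of the forward model F at iteration n:

```latex
\mathbf{x}_{n+1} = \mathbf{x}_a
  + \left( \mathbf{S}_a^{-1} + \mathbf{K}_n^{T} \mathbf{S}_\epsilon^{-1} \mathbf{K}_n \right)^{-1}
    \mathbf{K}_n^{T} \mathbf{S}_\epsilon^{-1}
    \left[ \mathbf{y} - F(\mathbf{x}_n) + \mathbf{K}_n \left( \mathbf{x}_n - \mathbf{x}_a \right) \right]
```

    Iterating this update, with the Jacobian re-linearized at each x_n, is what accounts for the non-linearity of the radiative transfer in the 1-d variational solution.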

  2. A demonstration of remote survey and characterization of a buried waste site using the SRIP (Soldier Robot Interface Project) testbed

    SciTech Connect

    Burks, B.L.; Richardson, B.S.; Armstrong, G.A.; Hamel, W.R.; Jansen, J.F.; Killough, S.M.; Thompson, D.H.; Emery, M.S.

    1990-01-01

    During FY 1990, the Oak Ridge National Laboratory (ORNL) supported the Department of Energy (DOE) Environmental Restoration and Waste Management (ER&WM) Office of Technology Development through several projects including the development of a semiautonomous survey of a buried waste site using a remotely operated all-terrain robotic testbed borrowed from the US Army. The testbed was developed for the US Army's Human Engineering Laboratory (HEL) for the US Army's Soldier Robot Interface Project (SRIP). Initial development of the SRIP testbed was performed by a team including ORNL, HEL, Tooele Army Depot, and Odetics, Inc., as an experimental testbed for a variety of human factors issues related to military applications of robotics. The SRIP testbed was made available to the DOE and ORNL for the further development required for a remote landfill survey. The robot was modified extensively, equipped with environmental sensors, and used to demonstrate an automated remote survey of Solid Waste Storage Area No. 3 (SWSA 3) at ORNL on Tuesday, September 18, 1990. Burial trenches in this area containing contaminated materials were covered with soil nearly twenty years ago. This paper describes the SRIP testbed and work performed in FY 1990 to demonstrate a semiautonomous landfill survey at ORNL. 5 refs.

  3. TESTING THE APODIZED PUPIL LYOT CORONAGRAPH ON THE LABORATORY FOR ADAPTIVE OPTICS EXTREME ADAPTIVE OPTICS TESTBED

    SciTech Connect

    Thomas, Sandrine J.; Dillon, Daren; Gavel, Donald; Macintosh, Bruce; Sivaramakrishnan, Anand

    2011-10-15

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using the microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  4. Testing the Apodized Pupil Lyot Coronagraph on the Laboratory for Adaptive Optics Extreme Adaptive Optics Testbed

    NASA Astrophysics Data System (ADS)

    Thomas, Sandrine J.; Soummer, Rémi; Dillon, Daren; Macintosh, Bruce; Gavel, Donald; Sivaramakrishnan, Anand

    2011-10-01

    We present testbed results of the Apodized Pupil Lyot Coronagraph (APLC) at the Laboratory for Adaptive Optics (LAO). These results are part of the validation and tests of the coronagraph and of the Extreme Adaptive Optics (ExAO) for the Gemini Planet Imager (GPI). The apodizer component is manufactured with a halftone technique using black chrome microdots on glass. Testing this APLC (like any other coronagraph) requires extremely good wavefront correction, which is obtained to the 1 nm rms level using the microelectromechanical systems (MEMS) technology, on the ExAO visible testbed of the LAO at the University of California, Santa Cruz. We used an APLC coronagraph without central obstruction, both with a reference super-polished flat mirror and with the MEMS to obtain one of the first images of a dark zone in a coronagraphic image with classical adaptive optics using a MEMS deformable mirror (without involving dark hole algorithms). This was done as a complementary test to the GPI coronagraph testbed at the American Museum of Natural History, which studied the coronagraph itself without wavefront correction. Because we needed a full aperture, the coronagraph design is very different from the GPI design. We also tested a coronagraph with central obstruction similar to that of GPI. We investigated the performance of the APLC coronagraph and more particularly the effect of the apodizer profile accuracy on the contrast. Finally, we compared the resulting contrast to predictions made with a wavefront propagation model of the testbed to understand the effects of phase and amplitude errors on the final contrast.

  5. Homothetic mapping as means to obtain a wide field of view: the Delft Testbed Interferometer

    NASA Astrophysics Data System (ADS)

    van Brug, Hedser; Oostdijck, Bastiaan; Snijders, Bart; van der Avoort, Casper; Gori, Pierre-Marie

    2004-10-01

    The Delft Testbed Interferometer (DTI) will be presented. The basics of homothetic mapping will be explained together with the method of fulfilling the requirements as chosen in the DTI setup. The optical layout incorporates a novel tracking concept enabling the use of homothetic mapping in real telescope systems for observations on the sky. The requirements for homothetic mapping and the choices made in the DTI setup will be discussed. Finally the first results and the planned experiments will be presented.

  6. Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed

    NASA Technical Reports Server (NTRS)

    Tian, Ye; Song, Qi; Cattafesta, Louis

    2005-01-01

    This report summarizes the activities on "Implementation of Real-Time Feedback Flow Control Algorithms on a Canonical Testbed." The work consists primarily of two parts. The first summarizes our previous work and the extensions to adaptive identification (ID) and control algorithms. The second concentrates on the validation of the adaptive algorithms by applying them to a vibration-beam testbed. Extensions to flow control problems are discussed.
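
    The report does not specify which adaptive algorithms were used; a common building block for this kind of adaptive identification and control is the least-mean-squares (LMS) filter, sketched here as an illustration rather than the report's actual method:

```python
def lms_filter(reference, desired, n_taps=4, mu=0.05):
    """LMS adaptive FIR filter: adapts weights so that filtering `reference`
    tracks `desired`, driving the error toward zero (the core of adaptive
    system ID and of feedforward vibration/flow control)."""
    w = [0.0] * n_taps          # adaptive filter weights
    buf = [0.0] * n_taps        # delay line of recent reference samples
    errors = []
    for x, d in zip(reference, desired):
        buf = [x] + buf[:-1]                                # shift in newest sample
        y = sum(wi * xi for wi, xi in zip(w, buf))          # filter output
        e = d - y                                           # instantaneous error
        w = [wi + mu * e * xi for wi, xi in zip(w, buf)]    # LMS weight update
        errors.append(e)
    return w, errors
```

    Driven by a persistently exciting reference, the weights converge to the unknown plant's impulse response and the error decays, which is the behavior an adaptive controller on a vibrating beam exploits.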

  7. Remote sensing of sediment and chlorophyll with the test-bed aircraft multispectral scanner

    NASA Technical Reports Server (NTRS)

    Bowker, D. E.; Hardesty, C. A.; Jobson, D. J.

    1983-01-01

    An instrument known as the test-bed aircraft multispectral scanner (TBAMS) was used in a research flight over the entrance to the Chesapeake Bay. Upwelled radiances from the TBAMS data were correlated with the water parameters, particularly sediment and chlorophyll a. Several algorithms were demonstrated for monitoring sediment and chlorophyll, with a three-band ratio being the best. The primary advantage of the three-band ratio was found to be its apparent insensitivity to atmospheric and Sun-angle variations.

  8. Re-START: The second operational test of the String Thermionic Assembly Research Testbed

    SciTech Connect

    Wyant, F.J.; Luchau, D.; McCarson, T.D.

    1998-01-01

    The second operational test of the String Thermionic Assembly Research Testbed -- Re-START -- was carried out from June 9 to June 14, 1997. This test series was designed to help qualify and validate the designs and test methods proposed for the Integrated Solar Upper Stage (ISUS) power converters for use during critical evaluations of the complete ISUS bimodal system during the Engine Ground Demonstration (EGD). The test article consisted of eight ISUS prototype thermionic converter diodes electrically connected in series.

  9. Atmospheric cloud physics laboratory project study

    NASA Technical Reports Server (NTRS)

    Schultz, W. E.; Stephen, L. A.; Usher, L. H.

    1976-01-01

    Engineering studies were performed for the Zero-G Cloud Physics Experiment liquid cooling and air pressure control systems. A total of four concepts for the liquid cooling system were evaluated, two of which were found to closely approach the systems requirements. Thermal insulation requirements, system hardware, and control sensor locations were established. The reservoir sizes and initial temperatures were defined as well as system power requirements. In the study of the pressure control system, fluid analyses by the Atmospheric Cloud Physics Laboratory were performed to determine flow characteristics of various orifice sizes, vacuum pump adequacy, and control systems performance. System parameters predicted in these analyses as a function of time include the following for various orifice sizes: (1) chamber and vacuum pump mass flow rates, (2) the number of valve openings or closures, (3) the maximum cloud chamber pressure deviation from the allowable, and (4) cloud chamber and accumulator pressure.

  10. Variable Coding and Modulation Experiment Using NASA's Space Communication and Navigation Testbed

    NASA Technical Reports Server (NTRS)

    Downey, Joseph A.; Mortensen, Dale J.; Evans, Michael A.; Tollis, Nicholas S.

    2016-01-01

National Aeronautics and Space Administration (NASA)'s Space Communication and Navigation (SCaN) Testbed on the International Space Station provides a unique opportunity to evaluate advanced communication techniques in an operational system. The experimental nature of the Testbed allows for rapid demonstrations while using flight hardware in a deployed system within NASA's networks. One example is variable coding and modulation, a method to increase data throughput in a communication link. This paper describes recent flight testing with variable coding and modulation over S-band using a direct-to-earth link between the SCaN Testbed and the Glenn Research Center. The testing leverages the established Digital Video Broadcasting Second Generation (DVB-S2) standard to provide various modulation and coding options. The experiment was conducted in a challenging environment due to the multipath and shadowing caused by the International Space Station structure. Performance of the variable coding and modulation system is evaluated and compared to the capacity of the link, as well as to standard NASA waveforms.
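Adaptive selection among DVB-S2 modulation-and-coding pairs can be sketched as a threshold-table lookup. The MODCOD subset and Es/N0 thresholds below are approximate illustrations, not the experiment's actual configuration (the standard, ETSI EN 302 307, tabulates the exact figures):

```python
# Illustrative subset of DVB-S2 MODCODs:
# (name, spectral efficiency in bits/symbol, required Es/N0 in dB).
MODCODS = [
    ("QPSK 1/2", 1.0, 1.0),
    ("QPSK 3/4", 1.5, 4.0),
    ("8PSK 2/3", 2.0, 6.6),
    ("8PSK 5/6", 2.5, 9.4),
    ("16APSK 3/4", 3.0, 10.2),
]

def select_modcod(esn0_db, margin_db=1.0):
    """Pick the highest-throughput MODCOD whose threshold, plus a link
    margin, is met; return None if even the most robust one cannot close."""
    best = None
    for name, eff, req in MODCODS:
        if esn0_db >= req + margin_db and (best is None or eff > best[1]):
            best = (name, eff, req)
    return best
```

As multipath and shadowing from the station structure vary the received Es/N0 over a pass, a transmitter steps through such a table: `select_modcod(2.5)` falls back to QPSK 1/2, while `select_modcod(11.5)` reaches 16APSK 3/4.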

  11. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limit the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.

  12. James Webb Space Telescope optical simulation testbed III: first experimental results with linear-control alignment

    NASA Astrophysics Data System (ADS)

    Egron, Sylvain; Lajoie, Charles-Philippe; Leboulleux, Lucie; N'Diaye, Mamadou; Pueyo, Laurent; Choquet, Élodie; Perrin, Marshall D.; Ygouf, Marie; Michau, Vincent; Bonnefois, Aurélie; Fusco, Thierry; Escolle, Clément; Ferrari, Marc; Hugot, Emmanuel; Soummer, Rémi

    2016-07-01

The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop experiment designed to study wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing testbeds for JWST (e.g. the Ball Aerospace Testbed Telescope TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science and Operations Center. The design of JOST reproduces the physics of JWST's three-mirror anastigmat (TMA) using three custom aspheric lenses. It provides image quality similar to that of JWST (80% Strehl ratio) over a field equivalent to a NIRCam module, but at 633 nm. An Iris AO segmented mirror stands for the segmented primary mirror of JWST. Actuators allow us to control (1) the 18 segments of the segmented mirror in piston, tip, and tilt and (2) the second lens, which stands for the secondary mirror, in tip, tilt and x, y, z positions. We present the full linear control alignment infrastructure developed for JOST, with an emphasis on multi-field wavefront sensing and control. Our implementation of the Wavefront Sensing (WFS) algorithms using phase diversity is experimentally tested. The wavefront control (WFC) algorithms, which rely on a linear model for optical aberrations induced by small misalignments of the three lenses, are tested and validated on simulations.
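The linear wavefront control described above can be sketched generically: assuming a calibrated interaction (Jacobian) matrix that maps small misalignments to measured aberration coefficients (the matrix here is random and purely illustrative; on a real testbed it is calibrated by poking each degree of freedom), a least-squares solve yields the corrective command:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed linear model: measured aberration coefficients z = J @ x, where x
# holds small misalignments (e.g., lens tip/tilt/despace).
n_modes, n_dof = 8, 5
J = rng.standard_normal((n_modes, n_dof))  # illustrative Jacobian

x_true = np.array([2e-3, -1e-3, 5e-4, 0.0, 1e-3])          # hidden misalignments
z_meas = J @ x_true + 1e-6 * rng.standard_normal(n_modes)  # noisy sensing

# Least-squares estimate of the state; the corrective command is its negative
x_hat, *_ = np.linalg.lstsq(J, z_meas, rcond=None)
command = -x_hat

residual = J @ (x_true + command)  # aberrations left after the correction
```

With sensing noise small relative to the misalignments, the estimate recovers the state and the post-correction residual drops to the noise floor.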

  13. SPHERES as Formation Flight Algorithm Development and Validation Testbed: Current Progress and Beyond

    NASA Technical Reports Server (NTRS)

    Kong, Edmund M.; Saenz-Otero, Alvar; Nolet, Simon; Berkovitz, Dustin S.; Miller, David W.; Sell, Steve W.

    2004-01-01

The MIT-SSL SPHERES testbed provides a facility for the development of algorithms necessary for the success of Distributed Satellite Systems (DSS). The initial development contemplated formation flight and docking control algorithms; SPHERES now supports the study of metrology, control, autonomy, artificial intelligence, and communications algorithms and their effects on DSS projects. To support this wide range of topics, the SPHERES design anticipated the need to support multiple researchers, as reflected in both the hardware and software designs. The SPHERES operational plan further facilitates the development of algorithms by multiple researchers, while the operational locations incrementally increase the ability of the tests to operate in a representative environment. In this paper, an overview of the SPHERES testbed is first presented. The SPHERES testbed serves as a model of the design philosophies that enable the varied research carried out on such a facility. The implementation of these philosophies is further highlighted in the three different programs that are currently scheduled for testing onboard the International Space Station (ISS) and three that are proposed for a re-flight mission: Mass Property Identification, Autonomous Rendezvous and Docking, and TPF Multiple Spacecraft Formation Flight in the first flight, and Precision Optical Pointing, Tethered Formation Flight and Mars Orbit Sample Retrieval for the re-flight mission.

  14. Design and construction of the magnetic driving vehicle in a two-dimensional testbed

    NASA Astrophysics Data System (ADS)

    Gu, Chen; Guo, Yunzheng; Lin, Jiaying; Qu, Timing; Han, Zhenghe

    2009-07-01

A two-dimensional testbed that aims to demonstrate the feasibility of magnetically driving several spacecraft was successfully constructed. The testbed has two vehicles, each of which consists of two Bi2223 coils placed in an orthogonal way. The magnetic force between the two vehicles at a 1 m separation distance is less than 1 N, yet is large enough to attract, repel and rotate two 50 kg vehicles on a 1.2 m × 2 m platform. The hardware for the testbed, which includes the coils, a cryostat and an air-pressure system, is introduced. A parallel power supply circuit was specifically designed for the high temperature superconductor (HTS) coil operating in space, for the purpose of reducing the Joule heat and providing redundancy. Two key questions relating to the HTS coil used in the magnetic driving are discussed. First, the magnetic force between the two coils, a preliminary index directing the coil parameters, is concisely expressed as an arithmetical equation. The accuracy of the equation is checked against a more accurate model based on the Biot-Savart law and an FEA calculation. Second, the interaction between the orthogonal coils with respect to the Ic(B) characteristic of the Bi2223 tape is discussed. A minimum safe distance between the two coils is defined and numerically calculated. Two coils whose separation distance is larger than the minimum safe distance can be regarded as having no magnetic interaction.
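The "arithmetical equation" for the coil force is not reproduced in the abstract; a minimal sketch of the coaxial dipole approximation (valid only when the separation is large compared with the coil radii; all numbers are illustrative, not the paper's):

```python
import math

MU0 = 4.0e-7 * math.pi  # vacuum permeability, H/m

def coaxial_coil_force(n1, i1, r1, n2, i2, r2, z):
    """Force (N) between two coaxial current loops a distance z apart, in the
    dipole approximation F = 3*mu0*m1*m2 / (2*pi*z**4), with magnetic
    moments m = N*I*pi*R**2. Valid when z is much larger than the radii.
    """
    m1 = n1 * i1 * math.pi * r1 ** 2
    m2 = n2 * i2 * math.pi * r2 ** 2
    return 3.0 * MU0 * m1 * m2 / (2.0 * math.pi * z ** 4)

# Illustrative numbers (not the paper's): two 100-turn, 0.2 m radius coils
# carrying 100 A, separated by 1 m.
f = coaxial_coil_force(100, 100.0, 0.2, 100, 100.0, 0.2, 1.0)
```

With these made-up parameters the force comes out just under 1 N, the same order as the testbed's quoted figure; the paper's Biot-Savart and FEA models account for finite coil size, which the dipole form ignores.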

  15. Large Scale Data Mining to Improve Usability of Data: An Intelligent Archive Testbed

    NASA Technical Reports Server (NTRS)

    Ramapriyan, Hampapuram; Isaac, David; Yang, Wenli; Morse, Steve

    2005-01-01

    Research in certain scientific disciplines - including Earth science, particle physics, and astrophysics - continually faces the challenge that the volume of data needed to perform valid scientific research can at times overwhelm even a sizable research community. The desire to improve utilization of this data gave rise to the Intelligent Archives project, which seeks to make data archives active participants in a knowledge building system capable of discovering events or patterns that represent new information or knowledge. Data mining can automatically discover patterns and events, but it is generally viewed as unsuited for large-scale use in disciplines like Earth science that routinely involve very high data volumes. Dozens of research projects have shown promising uses of data mining in Earth science, but all of these are based on experiments with data subsets of a few gigabytes or less, rather than the terabytes or petabytes typically encountered in operational systems. To bridge this gap, the Intelligent Archives project is establishing a testbed with the goal of demonstrating the use of data mining techniques in an operationally-relevant environment. This paper discusses the goals of the testbed and the design choices surrounding critical issues that arose during testbed implementation.

  16. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

Herein we report on our Visible Nulling Coronagraph high-contrast result of 10⁹ contrast averaged over a focal plane region extending from 1-4 λ/D with the Vacuum Nuller Testbed (VNT) in a vibration-isolated vacuum chamber. The VNC is a hybrid interferometric/coronagraphic approach for exoplanet science. It operates with high Lyot stop efficiency for filled, segmented and sparse or diluted-aperture telescopes, thereby spanning the range of potential future NASA flight telescopes. NASA Goddard Space Flight Center (GSFC) has a well-established effort to develop the VNC and its technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and its enabling technologies. These testbeds have enabled advancement of high-contrast, visible-light nulling interferometry to unprecedented levels. The VNC is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle and achromatic phase shifters. We give an overview of the VNT and discuss the high-contrast laboratory results, the optical configuration, critical technologies and null sensing and control.

  17. Autonomous power expert fault diagnostic system for Space Station Freedom electrical power system testbed

    NASA Technical Reports Server (NTRS)

    Truong, Long V.; Walters, Jerry L.; Roth, Mary Ellen; Quinn, Todd M.; Krawczonek, Walter M.

    1990-01-01

    The goal of the Autonomous Power System (APS) program is to develop and apply intelligent problem solving and control to the Space Station Freedom Electrical Power System (SSF/EPS) testbed being developed and demonstrated at NASA Lewis Research Center. The objectives of the program are to establish artificial intelligence technology paths, to craft knowledge-based tools with advanced human-operator interfaces for power systems, and to interface and integrate knowledge-based systems with conventional controllers. The Autonomous Power EXpert (APEX) portion of the APS program will integrate a knowledge-based fault diagnostic system and a power resource planner-scheduler. Then APEX will interface on-line with the SSF/EPS testbed and its Power Management Controller (PMC). The key tasks include establishing knowledge bases for system diagnostics, fault detection and isolation analysis, on-line information accessing through PMC, enhanced data management, and multiple-level, object-oriented operator displays. The first prototype of the diagnostic expert system for fault detection and isolation has been developed. The knowledge bases and the rule-based model that were developed for the Power Distribution Control Unit subsystem of the SSF/EPS testbed are described. A corresponding troubleshooting technique is also described.
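As a toy illustration of the rule-based fault isolation idea described above (the symptom and fault names below are invented for illustration, not drawn from APEX's actual knowledge base, which modeled the SSF/EPS Power Distribution Control Unit):

```python
# Toy rule base: each rule maps a set of observed symptoms to a candidate
# fault. A rule fires only when its full symptom pattern is present.
RULES = [
    ({"bus_undervoltage", "breaker_open"}, "tripped remote bus isolator"),
    ({"bus_undervoltage", "load_current_zero"}, "open feeder cable"),
    ({"bus_overvoltage"}, "regulator failure"),
]

def diagnose(symptoms):
    """Return candidate faults whose full symptom pattern is present,
    most specific (largest pattern) first."""
    matches = [(len(pattern), fault) for pattern, fault in RULES
               if pattern <= symptoms]
    return [fault for _, fault in sorted(matches, reverse=True)]

print(diagnose({"bus_undervoltage", "breaker_open"}))
# -> ['tripped remote bus isolator']
```

A production expert system adds certainty factors, telemetry access, and explanation facilities, but the pattern-matching core is the same.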

  18. Cloud Processed CCN Affect Cloud Microphysics

    NASA Astrophysics Data System (ADS)

    Hudson, J. G.; Noble, S. R., Jr.; Tabor, S. S.

    2015-12-01

Variations in the bimodality/monomodality of CCN spectra (Hudson et al. 2015) exert opposite effects on cloud microphysics in two aircraft field projects. The figure shows two examples, droplet concentration, Nc, and drizzle liquid water content, Ld, against classification of CCN spectral modality. Low ratings go to balanced, separated bimodal spectra; high ratings go to single-mode spectra, with strictly monomodal spectra rated 8. Intermediate ratings go to merged modes, e.g., one mode a shoulder of another. Bimodality is caused by mass or hygroscopicity increases that go only to CCN that made activated cloud droplets. In the Ice in Clouds Experiment-Tropical (ICE-T), small cumuli with lower Nc and greater droplet mean diameters, MD, effective radii, re, spectral widths, σ, cloud liquid water contents, Lc, and Ld were closer to more bimodal (lower modal ratings) below-cloud CCN spectra, whereas clouds with higher Nc and smaller MD, re, σ, and Ld were closer to more monomodal CCN (higher modal ratings). In polluted stratus clouds of the MArine Stratus/Stratocumulus Experiment (MASE), clouds that had greater Nc and smaller MD, re, σ, Lc, and Ld were closer to more bimodal CCN spectra, whereas clouds with lower Nc and greater MD, re, σ, Lc, and Ld were closer to more monomodal CCN. These relationships are opposite because the dominant ICE-T cloud processing was coalescence, whereas chemical transformations (e.g., SO2 to SO4) were dominant in MASE. Coalescence reduces Nc and thus also CCN concentrations (NCCN) when droplets evaporate. In subsequent clouds the reduced competition increases MD and σ, which further enhance coalescence and drizzle. Chemical transformations do not change Nc, but added sulfate enhances droplet and CCN solubility. Thus, lower critical supersaturation (S) CCN can produce more cloud droplets in subsequent cloud cycles, especially for the low W and effective S of stratus. The increased competition reduces MD, re, and σ, which inhibit coalescence and thus reduce drizzle.

  19. Absorption of solar radiation in broken clouds

    SciTech Connect

    Zuev, V.E.; Titov, G.A.; Zhuravleva, T.B.

    1996-04-01

It is recognized now that the plane-parallel model unsatisfactorily describes the transfer of radiation through broken clouds and that, consequently, the radiation codes of general circulation models (GCMs) must be refined. However, before any refinement in a GCM code is made, it is necessary to investigate the dependence of radiative characteristics on the effects caused by the random geometry of cloud fields. Such studies for mean fluxes of downwelling and upwelling solar radiation in the visible and near-infrared (IR) spectral range were performed by Zuev et al. In this work, we investigate the mean spectral and integrated absorption of solar radiation by broken clouds (in what follows, the term "mean" will be implied but not used, for convenience). To evaluate the potential effect of stochastic geometry, we will compare the absorption by cumulus (0.5 ≤ γ ≤ 2) to that by equivalent stratus (γ ≪ 1) clouds; here γ = H/D, H is the cloud layer thickness and D the characteristic horizontal cloud size. The equivalent stratus clouds differ from cumulus only in the aspect ratio γ, all the other parameters coinciding.

  20. Cloud computing for geophysical applications (Invited)

    NASA Astrophysics Data System (ADS)

    Zhizhin, M.; Kihn, E. A.; Mishin, D.; Medvedev, D.; Weigel, R. S.

    2010-12-01

Cloud computing offers a scalable on-demand resource allocation model for evolving needs in data-intensive geophysical applications, where computational needs in CPU and storage can vary over time depending on the modeling effort or field campaign. Separate, sometimes incompatible cloud platforms and services are already available from major computing vendors (Amazon AWS, Microsoft Azure, Google App Engine), government agencies (NASA Nebula) and the Open Source community (Eucalyptus). Multiple cloud platforms with layered virtualization patterns (hardware-, platform-, software-, data- or everything-as-a-service) provide a feature-rich environment and encourage experimentation with distributed data modeling, processing and storage. However, application and especially database development in the Cloud is different from development for the desktop and the compute cluster. In this presentation we will review scientific cloud applications relevant to geophysical research and present our results in building software components and cloud services for a virtual geophysical data center. We will discuss in depth the economics, scalability and reliability of the distributed array and image data stores, synchronous and asynchronous RESTful services to access and model georeferenced data, virtual observatory services for metadata management, and data visualization for web applications in the Cloud.

  1. Evaluation of Cloud Parameterizations in a High Resolution Atmospheric General Circulation Model Using ARM Data

    SciTech Connect

    Govindasamy, B; Duffy, P

    2002-04-12

Typical state of the art atmospheric general circulation models used in climate change studies have horizontal resolution of approximately 300 km. As computing power increases, many climate modeling groups are working toward enhancing the resolution of global models. An important issue that arises when resolution of a model is changed is whether cloud and convective parameterizations, which were developed for use at coarser resolutions, will need to be reformulated or re-tuned. We propose to investigate this issue and specifically cloud statistics using ARM data. The data streams produced by highly instrumented sections of the Cloud and Radiation Testbeds (CART) of the ARM program will provide a significant aid in the evaluation of cloud and convection parameterization in high-resolution models. Recently, we have performed multiyear global-climate simulations at T170 and T239 resolutions, corresponding to grid cell sizes of 0.7° and 0.5° respectively, using the NCAR Community Climate Model. We have also performed a climate change simulation at T170. On the scales of a T42 grid cell (300 km) and larger, nearly all quantities we examined in the T170 simulation agree better with observations in terms of spatial patterns than do results in a comparable simulation at T42. Increasing the resolution to T239 brings significant further improvement. At T239, the high-resolution model grid cells approach the dimensions of the highly instrumented sections of ARM Cloud and Radiation Testbed (CART) sites. We propose to form a cloud climatology using ARM data for its CART sites and evaluate cloud statistics of the NCAR Community Atmosphere Model (CAM) at higher resolutions over those sites using this ARM cloud climatology. We will then modify the physical parameterizations of CAM for better agreement with ARM data. We will work closely with NCAR in modifying the parameters in cloud and convection parameterizations for the high-resolution model.
Our proposal to evaluate the cloud

  2. Noctilucent Cloud Sightings

    NASA Video Gallery

    Polar Mesospheric Clouds form during each polar region's summer months in the coldest place in the atmosphere, 50 miles above Earth's surface. Noctilucent Clouds were first observed in 1885 by an a...

  3. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It sets radiology free from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  4. Private Cloud Communities for Faculty and Students

    ERIC Educational Resources Information Center

    Tomal, Daniel R.; Grant, Cynthia

    2015-01-01

    Massive open online courses (MOOCs) and public and private cloud communities continue to flourish in the field of higher education. However, MOOCs have received criticism in recent years and offer little benefit to students already enrolled at an institution. This article advocates for the collaborative creation and use of institutional, program…

  5. The NSA/SHEBA Cloud & Radiation Comparison Study

    SciTech Connect

    Janet M. Intrieri; Matthew D. Shupe

    2004-08-23

    Cloud and radiation data from two distinctly different Arctic areas are analyzed to study the differences between coastal Alaskan and open Arctic Ocean region clouds and their respective influence on the surface radiation budget. The cloud and radiation datasets were obtained from 1) the DOE North Slope of Alaska (NSA) facility in the coastal town of Barrow, Alaska, and 2) the SHEBA field program, which was conducted from an icebreaker frozen in, and drifting with, the sea-ice for one year in the Western Arctic Ocean. Radar, lidar, radiometer, and sounding measurements from both locations were used to produce annual cycles of cloud occurrence and height, atmospheric temperature and humidity, surface longwave and shortwave broadband fluxes, surface albedo, and cloud radiative forcing. In general, both regions revealed a similar annual trend of cloud occurrence fraction with minimum values in winter (60-75%) and maximum values during spring, summer and fall (80-90%). However, the annual average cloud occurrence fraction for SHEBA (76%) was lower than the 6-year average cloud occurrence at NSA (92%). Both Arctic areas also showed similar annual cycle trends of cloud forcing with clouds warming the surface through most of the year and a period of surface cooling during the summer, when cloud shading effects overwhelm cloud greenhouse effects. The greatest difference between the two regions was observed in the magnitude of the cloud cooling effect (i.e., shortwave cloud forcing), which was significantly stronger at NSA and lasted for a longer period of time than at SHEBA. This is predominantly due to the longer and stronger melt season at NSA (i.e., albedo values that are much lower coupled with Sun angles that are somewhat higher) than the melt season observed over the ice pack at SHEBA. Longwave cloud forcing values were comparable between the two sites indicating a general similarity in cloudiness and atmospheric temperature and humidity structure between the two

  6. Computer animation of clouds

    SciTech Connect

    Max, N.

    1994-01-28

    Computer animation of outdoor scenes is enhanced by realistic clouds. I will discuss several different modeling and rendering schemes for clouds, and show how they evolved in my animation work. These include transparency-textured clouds on a 2-D plane, smooth shaded or textured 3-D clouds surfaces, and 3-D volume rendering. For the volume rendering, I will present various illumination schemes, including the density emitter, single scattering, and multiple scattering models.

  7. Cloud Coverage and Height Distribution from the GLAS Polar Orbiting Lidar: Comparison to Passive Cloud Retrievals

    NASA Technical Reports Server (NTRS)

Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.

    2004-01-01

The Geoscience Laser Altimeter System (GLAS) began full on-orbit operations in September 2003. A main application of the two-wavelength GLAS lidar is highly accurate detection and profiling of global cloud cover. Initial analysis indicates that cloud and aerosol layers are consistently detected on a global basis at cross sections down to 10⁻⁶ m⁻¹. Images of the lidar data dramatically and accurately show the vertical structure of cloud and aerosol to the limit of signal attenuation. The GLAS lidar has made the most accurate measurement of global cloud coverage and height to date. In addition to the calibrated lidar signal, GLAS data products include multi-level boundaries and optical depth of all transmissive layers. Processing includes a multi-variable separation of cloud and aerosol layers. An initial application of the data is to compare monthly cloud means from several months of GLAS observations in 2003 to existing cloud climatologies from other satellite measurements. In some cases direct comparison to passive cloud retrievals is possible. A limitation of the lidar measurements is nadir-only sampling. However, monthly means exhibit reasonably good global statistics, and coverage results outside the polar regions compare well with other measurements but show significant differences in height distribution. For polar regions, where passive cloud retrievals are problematic and where orbit track density is greatest, the GLAS results are a particular advance in cloud cover information. Direct comparison to MODIS retrievals shows a better than 90% agreement in cloud detection for daytime, but less than 60% at night. Height retrievals are in much less agreement. GLAS is a part of the NASA EOS project and data products are thus openly available to the science community (see http://glo.gsfc.nasa.gov).
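The detection-agreement figures quoted above (better than 90% daytime, under 60% at night) are collocated-mask statistics; a minimal sketch with synthetic data (the masks below are invented, purely to show the computation):

```python
import numpy as np

# Toy collocated cloud masks (1 = cloud detected) illustrating the kind of
# detection-agreement statistic quoted above; the data here are synthetic.
lidar_mask = np.array([1, 1, 0, 1, 0, 1, 1, 0, 1, 1])
passive_mask = np.array([1, 1, 0, 0, 0, 1, 1, 0, 1, 0])

agreement_pct = 100.0 * np.mean(lidar_mask == passive_mask)
```

With these toy masks, agreement is 80%; the real comparison applies the same idea to globally collocated GLAS and MODIS retrievals.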

  8. Cloud Scaling Properties and Cloud Parameterization

    NASA Technical Reports Server (NTRS)

    Cahalan, R. F.; Morcrette, J. J.

    1998-01-01

Cloud liquid and cloud fraction variability is studied as a function of horizontal scale in the ECMWF forecast model during several 10-day runs at the highest available model resolution, recently refined from approximately 60 km (T213) down to approximately 20 km (T639). At higher resolutions, model plane-parallel albedo biases are reduced, so that models may be tuned to have larger, more realistic cloud liquid water amounts. However, the distribution of cloud liquid assumed within each gridbox, for radiative and thermodynamic computations, depends on ad hoc assumptions that are not necessarily consistent with observed scaling properties, or with scaling properties produced by the model at larger scales. To study the larger-scale cloud properties, ten locations on the Earth are chosen to coincide with locations having considerable surface data available for validation, and representing a variety of climatic regimes. Scaling exponents are determined from a range of scales down to model resolution, and are re-computed every three hours, separately for low, medium and high clouds, as well as column-integrated cloudiness. Cloud variability fluctuates in time, due to diurnal, synoptic and other processes, but scaling exponents are found to be relatively stable. Various approaches are considered for applying computed cloud scaling to subgrid cloud distributions used for radiation, beyond the simple random or maximal overlap now in common use. Considerably more work is needed to compare model cloud scaling with observations. This will be aided by increased availability of high-resolution surface, aircraft and satellite data, and by the increasing resolution of global models.
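The scaling exponents described above are slopes of power-law fits across scales. A minimal sketch, assuming a spectral-slope definition (the abstract does not specify the estimator): generate a synthetic series with a known power-law spectrum and recover the exponent by a log-log fit:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D "cloud liquid" series with a known power-law spectrum
# E(k) ~ k**(-beta); beta = 5/3 is the classic turbulence value.
n, beta = 4096, 5.0 / 3.0
k = np.fft.rfftfreq(n, d=1.0)[1:]          # positive wavenumbers
amp = k ** (-beta / 2.0)                   # |F(k)| ~ k**(-beta/2)
phase = rng.uniform(0.0, 2.0 * np.pi, k.size)
spec = np.concatenate(([0.0], amp * np.exp(1j * phase)))
series = np.fft.irfft(spec, n)

# Recover the exponent as the slope of log E(k) vs log k
# (the Nyquist bin is dropped: irfft forces it to be real).
power = np.abs(np.fft.rfft(series))[1:-1] ** 2
slope, _ = np.polyfit(np.log(k[:-1]), np.log(power), 1)
beta_hat = -slope
```

Applied to observed or modeled cloud fields, the stability of `beta_hat` across recomputations is what the study reports as "relatively stable" scaling.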

  9. New MISR Cloud Data

    Atmospheric Science Data Center

    2013-08-06

    ... are provided for 70% of clouds observed by MISR with vector RMS difference from atmospheric motion vectors from other sources ranging from ... m/s. Cloud top heights are provided for 80% of clouds with RMS differences of less than 1 km (the same as for the existing Level 2 Stereo ...

  10. Clouds in Planetary Atmospheres

    NASA Technical Reports Server (NTRS)

    West, R.

    1999-01-01

In the terrestrial atmosphere clouds are familiar as vast collections of small water drops or ice crystals suspended in the air. The study of clouds touches on many facets of atmospheric science. The chemistry of clouds is tied to the chemistry of the surrounding atmosphere.

  11. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institute of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  12. Security in the cloud.

    PubMed

    Degaspari, John

    2011-08-01

    As more provider organizations look to the cloud computing model, they face a host of security-related questions. What are the appropriate applications for the cloud, what is the best cloud model, and what do they need to know to choose the best vendor? Hospital CIOs and security experts weigh in.

  13. Advanced Information Management Services in SCOOP, an IOOS Testbed

    NASA Astrophysics Data System (ADS)

    Conover, H.; Keiser, K.; Graves, S.; Beaumont, B.; Drewry, M.; Maskey, M.

    2006-05-01

The Integrated Ocean Observing System (IOOS) represents a national initiative to create a new system for collecting and disseminating information about the oceans. The system will support a variety of practical applications, along with enabling research. A key partner in IOOS design and development, the Southeastern Universities Research Association (SURA) is a consortium of over sixty universities across the US. Building on the capabilities of its member universities, SURA seeks to develop a network of sensors and linked computers as part of the SURA Coastal Ocean Observing and Prediction (SCOOP) program, fully integrating several observing systems in the southern US. SCOOP's goal is to create a scalable, modular prediction system for storm surge and wind waves. The system will enable a "transition to operations" of cutting-edge modeling activities from the research community. This network will provide data in real-time and at high speed, for more reliable, accurate and timely information to help guide effective coastal stewardship, plan for extreme events, facilitate safe maritime operations, and support coastal security. The University of Alabama in Huntsville is developing a suite of advanced technologies to provide core data and information management services for SCOOP. The Scientific Catalog for Open Resource Exchange (SCORE) is built on UAH's information technology research for a variety of projects, including the NASA-funded Global Hydrology Resource Center and DISCOVER REASoN projects, the NSF-funded Linked Environments for Atmospheric Discovery (LEAD) large Information Technology Research project, as well as SCOOP, which is funded by NOAA and ONR. Key technologies include an extensible database schema and ontology for the target science domain. Web services provide low-level catalog access, while an integrated search capability includes semantic searching and browsing, with the potential for specialized, innovative interfaces for specific research

  14. The Roles of Cloud Drop Effective Radius and LWP in Determining Rain Properties in Marine Stratocumulus

    SciTech Connect

    Rosenfeld, Daniel; Wang, Hailong; Rasch, Philip J.

    2012-07-04

    Numerical simulations described in previous studies showed that adding cloud condensation nuclei to marine stratocumulus can prevent their breakup from closed into open cells. Additional analyses of the same simulations show that the suppression of rain is well described in terms of cloud drop effective radius (re). Rain is initiated when re near cloud top is around 12-14 µm. Cloud water starts to get depleted when column-maximum rain intensity (Rmax) exceeds 0.1 mm h-1. This happens when cloud-top re reaches 14 µm. Rmax is mostly less than 0.1 mm h-1 at re < 14 µm, regardless of the cloud water path, but increases rapidly when re exceeds 14 µm. This is in agreement with recent aircraft observations and theoretical results for convective clouds, indicating that the mechanism is not limited to marine stratocumulus. These results support the hypothesis that the onset of significant precipitation is determined by the number of nucleated cloud drops and the height (H) above cloud base that is required for cloud drops to reach re of 14 µm. In turn, this can explain the conditions for initiation of significant drizzle and the opening of closed cells, providing the basis for a simple parameterization for GCMs that unifies the representation of both precipitating and non-precipitating clouds as well as the transition between them. Furthermore, satellite global observations of cloud depth (from base to top) and cloud-top re can be used to derive and validate this parameterization.
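    The threshold behavior described in this abstract can be sketched as a toy parameterization. Everything here is illustrative, not taken from the paper: the function names, the assumed linear (adiabatic-like) growth of liquid water content with height, and the default lapse value are placeholders chosen only to show how a drizzle-onset rule keyed to a 14 µm effective radius might be coded.

```python
import math

RE_RAIN_ONSET_UM = 14.0  # cloud-top re threshold reported in the abstract


def effective_radius_um(height_above_base_m, droplet_number_cm3,
                        lwc_lapse_g_m3_per_km=1.9):
    """Rough estimate of cloud-drop effective radius (micrometres).

    Illustrative assumptions: liquid water content grows linearly with
    height above cloud base, drops are monodisperse spheres, so
    re ~ (3 * LWC / (4 * pi * rho_w * N))**(1/3).
    """
    lwc_g_m3 = lwc_lapse_g_m3_per_km * height_above_base_m / 1000.0
    rho_w_g_m3 = 1.0e6                      # density of water, g m^-3
    n_m3 = droplet_number_cm3 * 1.0e6       # cm^-3 -> m^-3
    r_m = (3.0 * lwc_g_m3 / (4.0 * math.pi * rho_w_g_m3 * n_m3)) ** (1.0 / 3.0)
    return r_m * 1.0e6                      # metres -> micrometres


def drizzle_expected(height_above_base_m, droplet_number_cm3):
    """True when the toy re estimate crosses the 14 micrometre threshold."""
    return effective_radius_um(height_above_base_m,
                               droplet_number_cm3) >= RE_RAIN_ONSET_UM
```

    The sketch reproduces the qualitative result: for the same cloud depth, a clean cloud (few, large drops) crosses the threshold while a polluted cloud (many, small drops) does not.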

  15. Cloud microstructure studies

    NASA Technical Reports Server (NTRS)

    Blau, H. H., Jr.; Fowler, M. G.; Chang, D. T.; Ryan, R. T.

    1972-01-01

    Over two thousand individual cloud droplet size distributions were measured with an optical cloud particle spectrometer flown on the NASA Convair 990 aircraft. Representative droplet spectra and liquid water content, L (gm/cu m) were obtained for oceanic stratiform and cumuliform clouds. For non-precipitating clouds, values of L range from 0.1 gm/cu m to 0.5 gm/cu m; with precipitation, L is often greater than 1 gm/cu m. Measurements were also made in a newly formed contrail and in cirrus clouds.
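    The liquid water contents quoted above follow from integrating the measured droplet size distribution. A minimal sketch of that bookkeeping, assuming spherical drops of unit density and binned spectrometer output (the bin values below are made up for illustration):

```python
import math


def liquid_water_content(diam_um, conc_cm3):
    """Liquid water content L (g m^-3) from a binned droplet spectrum.

    diam_um:  bin-centre droplet diameters in micrometres
    conc_cm3: droplet number concentration in each bin, cm^-3
    Assumes spherical drops with density 1 g cm^-3; the factor 1e-6
    collects the unit conversions (um^3 -> m^3, cm^-3 -> m^-3, g cm^-3
    -> g m^-3).
    """
    return sum((math.pi / 6.0) * 1.0e-6 * n * d ** 3
               for d, n in zip(diam_um, conc_cm3))
```

    For example, 100 drops per cubic centimetre at 20 µm diameter gives about 0.42 g/cu m, squarely in the non-precipitating stratiform range reported in the abstract.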

  16. Intercomparison of model simulations of mixed-phase clouds observed during the ARM Mixed-Phase Arctic Cloud Experiment. Part II: Multi-layered cloud

    SciTech Connect

    Morrison, H; McCoy, R B; Klein, S A; Xie, S; Luo, Y; Avramov, A; Chen, M; Cole, J; Falk, M; Foster, M; Genio, A D; Harrington, J; Hoose, C; Khairoutdinov, M; Larson, V; Liu, X; McFarquhar, G; Poellot, M; Shipway, B; Shupe, M; Sud, Y; Turner, D; Veron, D; Walker, G; Wang, Z; Wolf, A; Xu, K; Yang, F; Zhang, G

    2008-02-27

    Results are presented from an intercomparison of single-column and cloud-resolving model simulations of a deep, multi-layered, mixed-phase cloud system observed during the ARM Mixed-Phase Arctic Cloud Experiment. This cloud system was associated with strong surface turbulent sensible and latent heat fluxes as cold air flowed over the open Arctic Ocean, combined with a low pressure system that supplied moisture at mid-level. The simulations, performed by 13 single-column and 4 cloud-resolving models, generally overestimate the liquid water path and strongly underestimate the ice water path, although there is a large spread among the models. This finding is in contrast with results for the single-layer, low-level mixed-phase stratocumulus case in Part I of this study, as well as previous studies of shallow mixed-phase Arctic clouds, that showed an underprediction of liquid water path. The overestimate of liquid water path and underestimate of ice water path occur primarily when deeper mixed-phase clouds extending into the mid-troposphere were observed. These results suggest important differences in the ability of models to simulate Arctic mixed-phase clouds that are deep and multi-layered versus shallow and single-layered. In general, models with a more sophisticated, two-moment treatment of the cloud microphysics produce a somewhat smaller liquid water path that is closer to observations. The cloud-resolving models tend to produce a larger cloud fraction than the single-column models. The liquid water path and especially the cloud fraction have a large impact on the cloud radiative forcing at the surface, which is dominated by the longwave flux for this case.

  17. Clouds Over Crater Rim

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Clouds above the rim of 'Endurance Crater' in this image from NASA's Mars Exploration Rover Opportunity can remind the viewer that Mars, our celestial neighbor, is subject to weather. On Earth, clouds like these would be referred to as 'cirrus' or the aptly nicknamed 'mare's tails.' These clouds occur in a region of strong vertical shear. The cloud particles (ice in this martian case) fall out and get dragged along away from the location where they originally condensed, forming characteristic streamers. Opportunity took this picture with its navigation camera during the rover's 269th martian day (Oct. 26, 2004).

    The mission's atmospheric science team is studying cloud observations to deduce seasonal and time-of-day behavior of the clouds. This helps them gain a better understanding of processes that control cloud formation.

  18. Study of Multi-Scale Cloud Processes Over the Tropical Western Pacific Using Cloud-Resolving Models Constrained by Satellite Data

    SciTech Connect

    Dudhia, Jimy

    2013-03-12

    TWP-ICE using satellite and ground-based observations.
    -- Perform numerical experiments using WRF to investigate how convection over tropical islands in the Maritime Continent interacts with large-scale circulation and affects convection in nearby regions.
    -- Evaluate and apply WRF as a testbed for GCM cloud parameterizations, utilizing the ability of WRF to run on multiple scales (from cloud resolving to global) to isolate resolution and physics issues from dynamical and model framework issues.
    Key products will be disseminated to the ARM and larger community through distribution of data archives, including model outputs from the data assimilation products and cloud resolving simulations, and publications.

  19. Towards Efficient Scientific Data Management Using Cloud Storage

    NASA Technical Reports Server (NTRS)

    He, Qiming

    2013-01-01

    A software prototype allows users to back up and restore data to/from both public and private cloud storage such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption), and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
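    The space- and bandwidth-efficient incremental backup described above is commonly built on content-addressed chunks: only chunks the store has not seen before are compressed and uploaded. The sketch below is a toy illustration of that idea, not the prototype's actual design; the class name and chunk size are invented, an in-memory dict stands in for the cloud object store, and a production version would also encrypt each chunk with a vetted cryptography library before upload.

```python
import hashlib
import zlib


class CloudBackupStore:
    """Toy content-addressed store illustrating incremental backup."""

    def __init__(self, chunk_size=4096):
        self.chunk_size = chunk_size
        self.objects = {}  # sha256 hex digest -> compressed chunk bytes

    def backup(self, data):
        """Split data into chunks; compress and store only new chunks.

        Returns a manifest (ordered list of chunk digests) that is all
        a later restore needs.
        """
        manifest = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in self.objects:   # incremental: skip known chunks
                self.objects[digest] = zlib.compress(chunk)
            manifest.append(digest)
        return manifest

    def restore(self, manifest):
        """Reassemble the original bytes from a manifest."""
        return b"".join(zlib.decompress(self.objects[d]) for d in manifest)
```

    Backing up a second, slightly changed file uploads only the chunks that actually differ, which is where the bandwidth savings come from.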

  20. Dual-wavelength millimeter-wave radar measurements of cirrus clouds

    SciTech Connect

    Sekelsky, S.M.; Firda, J.M.; McIntosh, R.E.

    1996-04-01

    In April 1994, the University of Massachusetts' 33-GHz/95-GHz Cloud Profiling Radar System (CPRS) participated in the multi-sensor Remote Cloud Sensing (RCS) Intensive Operation Period (IOP), which was conducted at the Southern Great Plains Cloud and Radiation Testbed (CART). During the 3-week experiment, CPRS measured a variety of cloud types and severe weather. In the context of global warming, the most significant measurements are dual-frequency observations of cirrus clouds, which may eventually be used to estimate ice crystal size and shape. Much of the cirrus data collected with CPRS show differences between 33-GHz and 95-GHz reflectivity measurements that are correlated with Doppler estimates of fall velocity. Because of the small range of reflectivity differences, a precise calibration of the radar is required and differential attenuation must also be removed from the data. Depolarization, which is an indicator of crystal shape, was also observed in several clouds. In this abstract we present examples of Mie scattering from cirrus and estimates of differential attenuation due to water vapor and oxygen that were derived from CART radiosonde measurements.
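    The dual-frequency comparison described above boils down to a dual-wavelength ratio in decibels, computed after the two-way gaseous attenuation along the path is added back to each channel. A minimal sketch, under stated assumptions: the one-way attenuation coefficients below are rough placeholder constants, whereas the abstract derives them from CART radiosonde profiles.

```python
def dual_wavelength_ratio(z33_dbz, z95_dbz, range_km,
                          k_gas_33=0.06, k_gas_95=0.45):
    """Attenuation-corrected dual-wavelength ratio, DWR = Z33 - Z95 (dB).

    z33_dbz, z95_dbz: measured reflectivities at 33 and 95 GHz (dBZ)
    range_km:         distance to the cloud volume (km)
    k_gas_33/95:      assumed one-way gas attenuation (dB/km); placeholder
                      values standing in for radiosonde-derived profiles
    """
    z33 = z33_dbz + 2.0 * k_gas_33 * range_km  # two-way path correction
    z95 = z95_dbz + 2.0 * k_gas_95 * range_km
    return z33 - z95
```

    Because the gas correction is much larger at 95 GHz, neglecting it would bias the small DWR signal the crystal-size retrieval depends on, which is why the abstract stresses precise calibration and attenuation removal.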