Science.gov

Sample records for open cloud testbed

  1. Southern Great Plains cloud and radiation testbed site

    SciTech Connect

    1996-09-01

    This document presents information about the Cloud and Radiation Testbed Site and the Atmospheric Radiation Measurement program. Topics include; measuring methods, general circulation methods, milestones, instrumentation, meteorological observations, and computing facilities.

  2. Use of cloud and radiation testbed measurements to evaluate cloud cover and convective parameterizations

    SciTech Connect

    Walcek, C.J.; Hu, Q.

    1995-04-01

    We have used temperature and humidity soundings and radiation measurements from the Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) site in northern Oklahoma to evaluate an improved cloud cover algorithm. We have also used a new single-column model cumulus parameterization to estimate convective heating and moistening tendencies at the CART site. Our earlier analysis of cloud cover showed that relatively dry atmospheres contain small cloud amounts. We have found numerous periods during 1993 where maximum relative humidities within any layer of the atmosphere over the CART site are well below 60-80%, yet clouds are clearly reducing shortwave irradiance measured by a rotating shadowband radiometer. These ARM measurements support our earlier findings that most current climate models probably underestimate cloud coverage when relative humidities fall below the threshold humidities where clear skies are assumed. We have applied a "detraining-plume" model of cumulus convection to the June 1993 intensive observation period (16-25 June 1993). This model was previously verified with GARP Atlantic Tropical Experiment (GATE) measurements. During the June intensive observing period (IOP), relative humidities over the CART site are typically 20% less than tropical Atlantic GATE relative humidities. Our convective model calculates that evaporation of convectively induced cloud and rainwater plays a much more important role in the heating and moistening convective tendencies at the drier CART location. In particular, we predict that considerable cooling and moistening in the lower troposphere should occur due to the evaporation of convectively initiated precipitation.

  3. A boundary-layer cloud study using Southern Great Plains Cloud and radiation testbed (CART) data

    SciTech Connect

    Albrecht, B.; Mace, G.; Dong, X.; Syrett, W.

    1996-04-01

    Boundary layer clouds (stratus and fair-weather cumulus) are closely coupled to the surface: the coupling involves the radiative impact of the clouds on the surface energy budget and the strong dependence of cloud formation and maintenance on the turbulent fluxes of heat and moisture in the boundary layer. The continuous data collection at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site provides a unique opportunity to study components of the coupling processes associated with boundary layer clouds and to provide descriptions of cloud and boundary layer structure that can be used to test parameterizations used in climate models. But before the CART data can be used for process studies and parameterization testing, it is necessary to evaluate and validate the data and to develop techniques for effectively combining the data to provide meaningful descriptions of cloud and boundary layer characteristics. In this study we use measurements made during an intensive observing period and consider a case where low-level stratus were observed at the site for about 18 hours. This case is being used to examine the temporal evolution of cloud base, cloud top, cloud liquid water content, surface radiative fluxes, and boundary layer structure. A method for inferring cloud microphysics from these parameters is currently being evaluated.

  4. Automatic Integration Testbeds validation on Open Science Grid

    NASA Astrophysics Data System (ADS)

    Caballero, J.; Thapa, S.; Gardner, R.; Potekhin, M.

    2011-12-01

    A recurring challenge in deploying high-quality production middleware is the extent to which realistic testing occurs before release of the software into the production environment. We describe here an automated system for validating releases of the Open Science Grid software stack that leverages the (pilot-based) PanDA job management system developed and used by the ATLAS experiment. The system was motivated by a desire to subject the OSG Integration Testbed to more realistic validation tests, in particular tests that resemble as closely as possible the actual job workflows used by the experiments, thus exercising job scheduling at the compute element (CE), the worker node execution environment, transfer of data to/from the local storage element (SE), etc. The context is that candidate releases of OSG compute and storage elements can be tested by injecting large numbers of synthetic jobs varying in complexity and coverage of services tested. The native capabilities of the PanDA system can thus be used to define jobs, monitor their execution, and archive the resulting run statistics, including success and failure modes. A repository of generic workflows and job types to measure various metrics of interest has been created. A command-line toolset has been developed so that testbed managers can quickly submit "VO-like" jobs into the system when newly deployed services are ready for testing. A system for automatic submission has been crafted to send jobs to integration testbed sites, collecting the results in a central service and generating regular reports for performance and reliability.
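
    The abstract describes a command-line toolset for injecting synthetic "VO-like" jobs. As a rough sketch of what such a driver could look like, the Python below loops over generic job templates and submits each to a testbed site; the "panda-submit" command name, its flags, and the payload scripts are hypothetical stand-ins, not actual OSG or PanDA interfaces.

        # Hypothetical driver for synthetic validation jobs. The "panda-submit"
        # CLI, its flags, and the payload scripts are illustrative assumptions,
        # not real OSG/PanDA tooling.
        import subprocess

        JOB_TEMPLATES = {
            "cpu_burn": ["--payload", "cpu_burn.sh", "--walltime", "30m"],   # exercises the CE
            "stage_io": ["--payload", "stage_io.sh", "--input", "1GB.dat"],  # exercises the SE
        }

        def submit(site, template, copies):
            """Submit `copies` synthetic jobs of one template to a testbed site."""
            codes = []
            for i in range(copies):
                cmd = ["panda-submit", "--site", site,
                       "--label", "%s-%d" % (template, i)] + JOB_TEMPLATES[template]
                codes.append(subprocess.run(cmd).returncode)
            return codes

        if __name__ == "__main__":
            for name in JOB_TEMPLATES:
                results = submit("ITB_TEST_SITE", name, copies=5)
                print(name, "failures:", sum(1 for c in results if c != 0))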

  5. Status of instrumentation for the Southern Great Plains Clouds and Radiation Testbed

    SciTech Connect

    Wesely, M.L.

    1991-12-31

    Planning for the initial complement of instrumentation at the first Clouds and Radiation Testbed (CART) site has concentrated on obtaining a sufficient level of instrumentation at the central facility for studies of radiative transfer processes in a narrow column above the site. The auxiliary facilities, whose sole purpose is cloud mapping above the central facility, will not be activated as such until provisions are made for all-sky imaging systems. In the meantime, the auxiliary facilities will be instrumented as extended facilities if the locations are suitable, which would be the case if they serve the primary purpose of the extended facilities of obtaining representative observations of surface energy exchanges, state variables, precipitation, soil and vegetative conditions, and other factors that must be considered in terms of boundary conditions by single-column and related models. The National Oceanic and Atmospheric Administration (NOAA) radar wind profiler network is being considered to provide observations of vertical profiles at the boundaries of the CART site. If possible, these locations will be used for boundary facilities. Efforts are proceeding to gain access to the wind profiler network data and to determine if a sufficient number of the profilers can be equipped as Radio Acoustic Sounding Systems (RASS). Profiles of temperature as well as winds are needed at the boundary facilities for studies with single-column models and four-dimensional data assimilation models. Balloon-borne sounding systems will be used there initially for both temperature and moisture profiles. Infrared spectrometers will eventually be used to infer moisture profiles at these boundary facilities.

  6. An open science cloud for scientific research

    NASA Astrophysics Data System (ADS)

    Jones, Bob

    2016-04-01

    The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.

  7. Environmental assessment for the Atmospheric Radiation Measurement (ARM) Program: Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Policastro, A.J.; Pfingston, J.M.; Maloney, D.M.; Wasmer, F.; Pentecost, E.D.

    1992-03-01

    The Atmospheric Radiation Measurement (ARM) Program is aimed at supplying improved predictive capability of climate change, particularly the prediction of cloud-climate feedback. The objective will be achieved by measuring the atmospheric radiation and physical and meteorological quantities that control solar radiation in the earth's atmosphere and using this information to test global climate and related models. The proposed action is to construct and operate a Cloud and Radiation Testbed (CART) research site in the southern Great Plains as part of the Department of Energy's Atmospheric Radiation Measurement Program whose objective is to develop an improved predictive capability of global climate change. The purpose of this CART research site in southern Kansas and northern Oklahoma would be to collect meteorological and other scientific information to better characterize the processes controlling radiation transfer on a global scale. Impacts which could result from this facility are described.

  8. Evaluating open-source cloud computing solutions for geosciences

    NASA Astrophysics Data System (ADS)

    Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong

    2013-09-01

    Many organizations are starting to adopt cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performances including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating the cloud resources as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in central processing unit (CPU), memory, or I/O of virtual machines created and managed by different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing intensive applications, and small-scale model simulations without intensive data communication.
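
    The per-operation timings summarized in finding (3) can be collected with a very small harness. The sketch below (Python) is a generic illustration; the operation callables it times are placeholders for each platform's own client calls (OpenNebula, Eucalyptus, or CloudStack), which are not shown.

        # Minimal timing harness for cloud-management operations. The callables
        # passed to time_operation() are placeholders for platform client calls.
        import time
        import statistics

        def time_operation(op, repeats=10):
            """Run `op` repeatedly; report mean and stdev wall-clock seconds."""
            samples = []
            for _ in range(repeats):
                start = time.perf_counter()
                op()                     # e.g., launch a VM, take a snapshot
                samples.append(time.perf_counter() - start)
            return {"mean_s": statistics.mean(samples),
                    "stdev_s": statistics.stdev(samples)}

        # Usage sketch: apply the same harness per solution and per operation
        # (VM launch, image registration, volume attach, ...), e.g.:
        # print(time_operation(lambda: provision_vm("m1.small")))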

  9. ProteoCloud: a full-featured open source proteomics cloud computing pipeline.

    PubMed

    Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart

    2013-08-02

    We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy-to-use, cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com.

  10. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    NASA Astrophysics Data System (ADS)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process-level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting forecast skills of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  11. Open-cell cloud formation over the Bahamas

    NASA Technical Reports Server (NTRS)

    2002-01-01

    What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC

  12. Open Reading Frame Phylogenetic Analysis on the Cloud

    PubMed Central

    2013-01-01

    Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus strains. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships among the Norovirus strains analyzed. PMID:23671843
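
    Since the analysis works on open reading frames rather than complete sequences, the preprocessing step amounts to ORF extraction. The Python sketch below is a deliberately naive illustration (forward strand only, standard start and stop codons); it does not reproduce the paper's Hadoop-based pipeline.

        # Naive ORF finder: forward strand, ATG start, standard stop codons.
        # A simplification for illustration only.
        STOPS = {"TAA", "TAG", "TGA"}

        def find_orfs(seq, min_len=300):
            """Return ORF substrings (start..stop, inclusive) of >= min_len nt."""
            seq = seq.upper()
            orfs = []
            for frame in range(3):
                for i in range(frame, len(seq) - 2, 3):
                    if seq[i:i+3] == "ATG":                    # candidate start
                        for j in range(i + 3, len(seq) - 2, 3):
                            if seq[j:j+3] in STOPS:            # in-frame stop
                                if j + 3 - i >= min_len:
                                    orfs.append(seq[i:j+3])
                                break
            return orfs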

  13. Analytical study of the effects of the Low-Level Jet on moisture convergence and vertical motion fields at the Southern Great Plains Cloud and Radiation Testbed site

    SciTech Connect

    Bian, X.; Zhong, S.; Whiteman, C.D.; Stage, S.A.

    1996-04-01

    The Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) is located in a region that is strongly affected by a prominent meteorological phenomenon--the Great Plains Low-Level Jet (LLJ). Observations have shown that the LLJ plays a vital role in spring and summertime cloud formation and precipitation over the Great Plains. An improved understanding of the LLJ characteristics and its impact on the environment is necessary for addressing the fundamental issue of development and testing of radiative transfer and cloud parameterization schemes for the general circulation models (GCMs) using data from the SGP CART site. A climatological analysis of the summertime LLJ over the SGP has been carried out using hourly observations from the National Oceanic and Atmospheric Administration (NOAA) Wind Profiler Demonstration Network and from the ARM June 1993 Intensive Observation Period (IOP). The hourly data provide an enhanced temporal and spatial resolution relative to earlier studies which used 6- and 12-hourly rawinsonde observations at fewer stations.

  14. Cloud-Based Model Calibration Using OpenStudio: Preprint

    SciTech Connect

    Hale, E.; Lisell, L.; Goldwasser, D.; Macumber, D.; Dean, J.; Metzger, I.; Parker, A.; Long, N.; Ball, B.; Schott, M.; Weaver, E.; Brackney, L.

    2014-03-01

    OpenStudio is a free, open source Software Development Kit (SDK) and application suite for performing building energy modeling and analysis. The OpenStudio Parametric Analysis Tool has been extended to allow cloud-based simulation of multiple OpenStudio models parametrically related to a baseline model. This paper describes the new cloud-based simulation functionality and presents a model calibration case study. Calibration is initiated by entering actual monthly utility bill data into the baseline model. Multiple parameters are then varied over multiple iterations to reduce the difference between actual energy consumption and model simulation results, as calculated and visualized by billing period and by fuel type. Simulations are performed in parallel using the Amazon Elastic Compute Cloud (EC2) service. This paper highlights model parameterizations (measures) used for calibration, but the same multi-nodal computing architecture is available for other purposes, for example, recommending combinations of retrofit energy saving measures using the calibrated model as the new baseline.
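
    The calibration loop described here drives down the discrepancy between billed and simulated consumption by billing period and fuel type. One common way to score that discrepancy is CVRMSE, as in ASHRAE Guideline 14; the sketch below assumes that metric and uses made-up bill values, and is not taken from the OpenStudio implementation.

        # Scoring a calibration iteration: CVRMSE (%) between utility-bill and
        # simulated consumption, computed per fuel type over billing periods.
        # Metric choice and numbers are illustrative assumptions.
        import math

        def cvrmse(actual, simulated):
            """Coefficient of variation of the RMSE, in percent."""
            n = len(actual)
            mean_actual = sum(actual) / n
            sse = sum((a - s) ** 2 for a, s in zip(actual, simulated))
            return 100.0 * math.sqrt(sse / n) / mean_actual

        bills = {"electricity_kWh": [1200, 1150, 980], "gas_therms": [80, 95, 60]}
        sim   = {"electricity_kWh": [1100, 1190, 1010], "gas_therms": [88, 90, 66]}
        for fuel in bills:
            print(fuel, round(cvrmse(bills[fuel], sim[fuel]), 1), "% CVRMSE")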

  15. Tidal disruption of open clusters in their parent molecular clouds

    NASA Technical Reports Server (NTRS)

    Long, Kevin

    1989-01-01

    A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.

  16. The EPOS Vision for the Open Science Cloud

    NASA Astrophysics Data System (ADS)

    Jeffery, Keith; Harrison, Matt; Cocco, Massimo

    2016-04-01

    Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, and so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix-Nebula group has proposed the architecture for the European Open Science Cloud. They are discussing with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation) and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs including EPOS. Many of these RIs are either e-RIs (electronic-RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed) since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science); effectively an ICS-d specializing in high performance computation, analytics, simulation or visualization offered by a TCS for others to use. Already discussions are underway between EPOS and EGI, EUDAT, AARC and Helix-Nebula for those offerings to be

  17. A comparison of radiometric fluxes influenced by parameterized cirrus clouds with observed fluxes at the Southern Great Plains (SGP) cloud and radiation testbed (CART) site

    SciTech Connect

    Mace, G.G.; Ackerman, T.P.; George, A.T.

    1996-04-01

    The data from the Atmospheric Radiation Measurement (ARM) Program's Southern Great Plains (SGP) site are a valuable resource. We have developed an operational data processing and analysis methodology that allows us to examine continuously the influence of clouds on the radiation field and to test new and existing cloud and radiation parameterizations.

  18. Cooling Earth's temperature by seeding marine stratocumulus clouds for increasing cloud cover by closing open cells

    NASA Astrophysics Data System (ADS)

    Daniel, R.

    2008-12-01

    The transition from open to closed cellular convection in marine stratocumulus is very sensitive to small concentrations of cloud condensation nuclei (CCN) aerosols. Addition of small amounts of CCN (about 100 cm-3) to the marine boundary layer (MBL) can close the open cells and thereby increase the cloud cover from about 40% to nearly 100%, with negative radiative forcing exceeding 100 W m-2. We show satellite measurements that demonstrate this sensitivity through the inadvertent experiments of old and diluted ship tracks. With the methodology suggested by Salter and Latham for spraying sub-micron sea water drops that serve as CCN, it is possible to close a sufficiently large area of open cells to achieve the negative radiative forcing necessary to balance the positive forcing of greenhouse gases. We show calculations of the feasibility of such an undertaking, and suggest that this is an economically feasible method with the least potential risks, when compared to seeding marine stratocumulus for enhancing their albedo or seeding the stratosphere with bright or dark aerosols. Global circulation models coupled with ocean and ice models are necessary to calculate the impact and the possible side effects.
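
    The quoted forcing can be sanity-checked with simple shortwave arithmetic: raising cloud fraction from about 0.4 to 1.0 over dark ocean increases the scene albedo enough to reflect roughly an extra 100 W m-2. The albedo and insolation values in the sketch below are illustrative assumptions, not numbers from the paper.

        # Back-of-envelope check: extra reflected shortwave when open cells
        # (cloud fraction ~0.4) close to overcast (~1.0). Values are assumed.
        insolation = 400.0              # W m-2, illustrative mean shortwave
        alb_cloud, alb_ocean = 0.5, 0.06

        def reflected(cloud_fraction):
            albedo = cloud_fraction * alb_cloud + (1 - cloud_fraction) * alb_ocean
            return insolation * albedo

        forcing = -(reflected(1.0) - reflected(0.4))    # negative = cooling
        print("local shortwave forcing ~ %.0f W m-2" % forcing)   # ~ -106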

  19. Cloud based, Open Source Software Application for Mitigating Herbicide Drift

    NASA Astrophysics Data System (ADS)

    Saraswat, D.; Scott, B.

    2014-12-01

    The spread of herbicide resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags risk being misplaced or lost through mischief or severe windstorms and thunderstorms. This presentation will discuss the design and development of a cloud-based, free application utilizing open-source technologies, called Flag the Technology Cloud (FTTCloud), for allowing agricultural stakeholders to color code their farm fields for indicating herbicide resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.

  20. OpenID connect as a security service in Cloud-based diagnostic imaging systems

    NASA Astrophysics Data System (ADS)

    Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter

    2015-03-01

    The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained by their own physical facilities. However, privacy and security concerns have been consistently regarded as the major obstacle to adoption of cloud computing by healthcare domains. Furthermore, traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. REST is an ideal technology for provisioning both mobile services and cloud computing. OpenID Connect, combining OpenID and OAuth together, is an emerging REST-based federated identity solution. It is one of the most promising open standards to potentially become the de-facto standard for securing cloud computing and mobile applications, and has been regarded as the "Kerberos of the Cloud". We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow for incorporating this technology within a distributed enterprise environment. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. Through using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds should provide a security level equivalent to that of the traditional computing model.
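
    As a rough illustration of the flow named here, the Python sketch below performs the token-exchange step of the OpenID Connect authorization code flow, using the provider's standard discovery document; the issuer URL, client credentials, and redirect URI are placeholders rather than a real DI deployment.

        # Minimal OpenID Connect authorization-code token exchange. Issuer,
        # client credentials, and redirect URI are placeholder values.
        import requests

        ISSUER = "https://idp.example.org"    # hypothetical identity provider

        def exchange_code_for_tokens(code):
            """Trade an authorization code for ID and access tokens."""
            disc = requests.get(ISSUER + "/.well-known/openid-configuration").json()
            resp = requests.post(disc["token_endpoint"], data={
                "grant_type": "authorization_code",
                "code": code,
                "redirect_uri": "https://di-client.example.org/callback",
                "client_id": "di-imaging-client",
                "client_secret": "CHANGE_ME",
            })
            resp.raise_for_status()
            return resp.json()    # id_token (JWT), access_token, expires_in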

  1. Fast Physics Testbed for the FASTER Project

    SciTech Connect

    Lin, W.; Liu, Y.; Hogan, R.; Neggers, R.; Jensen, M.; Fridlind, A.; Lin, Y.; Wolf, A.

    2010-03-15

    This poster describes the Fast Physics Testbed for the new FAst-physics System Testbed and Research (FASTER) project. The overall objective is to provide a convenient and comprehensive platform for fast turn-around model evaluation against ARM observations and to facilitate development of parameterizations for cloud-related fast processes represented in global climate models. The testbed features three major components: a single-column model (SCM) testbed, an NWP-Testbed, and high-resolution modeling (HRM). The web-based SCM-Testbed features multiple SCMs from major climate modeling centers and aims to maximize the potential of the SCM approach to enhance and accelerate the evaluation and improvement of fast physics parameterizations through continuous evaluation of existing and evolving models against historical as well as new/improved ARM and other complementary measurements. The NWP-Testbed aims to capitalize on the large pool of operational numerical weather prediction products. Continuous evaluations of NWP forecasts against observations at ARM sites are carried out to systematically identify the biases and skills of physical parameterizations under all weather conditions. The high-resolution modeling (HRM) activities aim to simulate the fast processes at high resolution to aid in the understanding of the fast processes and their parameterizations. A four-tier HRM framework is established to augment the SCM- and NWP-Testbeds towards eventual improvement of the parameterizations.

  2. OpenID Connect as a security service in cloud-based medical imaging systems.

    PubMed

    Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter

    2016-04-01

    The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have been consistently regarded as the major obstacles to adoption of cloud computing by healthcare domains. OpenID Connect, combining OpenID and OAuth together, is an emerging representational state transfer-based federated identity solution. It is one of the most adopted open standards to potentially become the de facto standard for securing cloud computing and mobile applications, and is also regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow for incorporating this technology within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repository (DI-r) and heterogeneous picture archiving and communication systems (PACS) as well as Web-based and mobile clients in the cloud ecosystem. The main objective is to use the OpenID Connect open-source single sign-on and authorization service in a user-centric manner, while ensuring that deploying DI-r and PACS to private or community clouds provides security levels equivalent to those of the traditional computing model.

  3. Cloud computing geospatial application for water resources based on free and open source software and open standards - a prototype

    NASA Astrophysics Data System (ADS)

    Delipetrev, Blagoj

    2016-04-01

    Presently, most existing software is desktop-based, designed to work on a single computer, which represents a major limitation in many ways, starting from limited computer processing and storage power, accessibility, availability, etc. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid deployment model of public-private cloud, running on two separate virtual machines (VMs). The first one (VM1) is running on Amazon Web Services (AWS) and the second one (VM2) is running on a Xen cloud platform. The presented cloud application is developed using free and open source software, open standards and prototype code. The cloud application presents a framework for developing specialized cloud geospatial applications that need only a web browser to be used. This cloud application is the ultimate collaboration geospatial platform, because multiple users across the globe with an internet connection and browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computer environment, creates a real-time multiuser collaboration platform, uses interoperable programming-language code and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services for 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services are running on two VMs that communicate over the internet, providing services to users. The application was tested on the Zletovica river basin case study with concurrent multiple users. The application is a state

  4. Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide

    DTIC Science & Technology

    2016-03-01

    GIFT is a modular, open-source framework for building, deploying, and managing adaptive training. GIFT Cloud allows learners, authors, and researchers to access GIFT without downloading and installing software. This quick-start guide describes the primary interfaces and basic functions of GIFT.

  5. A Business-to-Business Interoperability Testbed: An Overview

    SciTech Connect

    Kulvatunyou, Boonserm; Ivezic, Nenad; Martin, Monica; Jones, Albert

    2003-10-01

    In this paper, we describe a business-to-business (B2B) testbed co-sponsored by the Open Applications Group, Inc. (OAGI) and the National Institute of Standards and Technology (NIST) to advance enterprise e-commerce standards. We describe the business and technical objectives and initial activities within the B2B Testbed. We summarize our initial lessons learned, which form the requirements that drive the next-generation testbed development. We also give an overview of a promising testing framework architecture with which to drive the testbed developments. We outline the future plans for the testbed development.

  6. Point Cloud Visualization in AN Open Source 3d Globe

    NASA Astrophysics Data System (ADS)

    De La Calle, M.; Gómez-Deck, D.; Koehler, O.; Pulido, F.

    2011-09-01

    During the last years the usage of 3D applications in GIS has become more popular. Since the appearance of Google Earth, users have become familiar with 3D environments. On the other hand, nowadays computers with 3D acceleration are common, broadband access is widespread, and the public information that can be used in GIS clients able to use data from the Internet is constantly increasing. There are currently several libraries suitable for this kind of application. Based on these facts, and using libraries that are already developed and connected to our own developments, we are working on the implementation of a real 3D GIS with analysis capabilities. Since a 3D GIS such as this can be very interesting for tasks like rendering and analysing LiDAR or laser scanner point clouds, special attention is given to optimal handling of very large data sets. Glob3 will be a multidimensional GIS in which 3D point clouds can be explored and analysed, even if they consist of several million points. The latest addition to our visualization libraries is the development of a point cloud server that works regardless of the cloud's size. The server receives and processes requests from a 3D client (for example glob3, but it could be any other, such as one based on WebGL) and delivers the data in the form of pre-processed tiles, depending on the required level of detail.
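
    The tile-delivery idea (pre-processed tiles served by required level of detail) can be illustrated with a small sketch; the tiling scheme and thresholds below are assumptions for illustration, not the actual glob3 server design.

        # Level-of-detail tile selection for a point cloud server: tiles are
        # keyed by (level, x, y); deeper levels hold denser subsets of points.
        # Scheme and thresholds are illustrative assumptions.
        def tile_key(x, y, level, extent=65536.0):
            """Map a point to its tile indices at a given LOD level."""
            size = extent / (2 ** level)     # tile edge shrinks with depth
            return level, int(x // size), int(y // size)

        def level_for_distance(view_distance, max_level=12):
            """Closer views request deeper (denser) levels."""
            level = max_level
            while view_distance > 100.0 and level > 0:
                view_distance /= 2.0
                level -= 1
            return level

        print(tile_key(12000.0, 34000.0, level_for_distance(800.0)))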

  7. Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun

    NASA Technical Reports Server (NTRS)

    Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.

    2008-01-01

    Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
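
    The polarity determination described above reduces to the sign of the alignment between the suprathermal electron (strahl) flow and the local magnetic field: strahl electrons stream anti-sunward along open field lines, so flow parallel to B implies a field pointing away from the Sun. A minimal sketch, with made-up vectors:

        # Polarity from strahl flow vs. field direction. Strahl electrons move
        # anti-sunward along open field lines, so flow parallel to B means the
        # field points away from the Sun. Example vectors are made up.
        def polarity(B, strahl):
            """+1 (outward) if the strahl flows parallel to B, else -1 (inward)."""
            dot = sum(b * s for b, s in zip(B, strahl))
            return 1 if dot > 0 else -1

        print(polarity(B=(4.0, -2.0, 1.0), strahl=(0.9, -0.3, 0.2)))   # -> 1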

  8. The Integration of CloudStack and OCCI/OpenNebula with DIRAC

    NASA Astrophysics Data System (ADS)

    Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan

    2012-12-01

    The increasing availability of Cloud resources is arising as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing is an easy way to obtain efficient access to resources from both systems. This paper explains the integration of DIRAC with two open-source Cloud Managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing the creation of virtual clusters on demand, including public, private and hybrid clouds. This approach has required the development of an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, to allow the connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, which permits other Virtual Machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, like the DIRAC Web Portal. The main purpose of this integration is to get a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC to a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.

  9. Enabling Open Cloud Markets Through WS-Agreement Extensions

    NASA Astrophysics Data System (ADS)

    Risch, Marcel; Altmann, Jörn

    Research into computing resource markets has mainly considered the question of which market mechanisms provide a fair resource allocation. However, while developing such markets, the definition of the unit of trade (i.e. the definition of resource) has not been given much attention. In this paper, we analyze the requirements for tradable resource goods. Based on the results, we suggest a detailed goods definition, which is easy to understand, can be used with many market mechanisms, and addresses the needs of a Cloud resource market. The goods definition captures the complete system resource, including hardware specifications, software specifications, the terms of use, and a pricing function. To demonstrate the usefulness of such a standardized goods definition, we demonstrate its application in the form of a WS-Agreement template for a number of market mechanisms for commodity system resources.

  10. ATHLETE: Low Gravity Testbed

    NASA Technical Reports Server (NTRS)

    Qi, Jay Y.

    2011-01-01

    The All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) is a vehicle concept developed at the Jet Propulsion Laboratory as a multipurpose robot for exploration. Currently, the ATHLETE team is working on creating a low gravity testbed to physically simulate ATHLETE landing on an asteroid. Several projects were undertaken this summer to support the low gravity testbed.

  11. Aspects of the quality of data from the Southern Great Plains (SGP) cloud and radiation testbed (CART) site broadband radiation sensors

    SciTech Connect

    Splitt, M.E.; Wesely, M.L.

    1996-04-01

    A systematic evaluation of the performance of broadband radiometers at the Cloud and Radiation Testbed (CART) site is needed to estimate the uncertainties of the irradiance observations. Here, net radiation observed with the net radiometer in the energy balance Bowen ratio (EBBR) station at the central facility is compared with the net radiation computed as the sum of component irradiances recorded by nearby pyranometers and pyrgeometers. In addition, data obtained from the central facility pyranometers, pyrgeometers, and pyrheliometers are examined for April 1994, when intensive operations periods were being carried out. The data used in this study are from central facility radiometers in a solar and infrared observation station, an EBBR station, the so-called 'BSRN' set of upward-pointing radiometers, and a set of radiometers pointed down at the 25-m level of a 60-m tower.

  12. Building a Parallel Cloud Storage System using OpenStack’s Swift Object Store and Transformative Parallel I/O

    SciTech Connect

    Burns, Andrew J.; Lora, Kaleb D.; Martinez, Esteban; Shorter, Martel L.

    2012-07-30

    Our project consists of bleeding-edge research into replacing traditional storage archives with a parallel, cloud-based storage solution. It uses OpenStack's Swift Object Store cloud software. We benchmarked Swift for write speed and scalability. Our project is unique because Swift is typically used for reads, whereas we are mostly concerned with write speeds. Cloud storage is a viable archive solution because: (1) container management for larger parallel archives might ease the migration workload; (2) many tools written for cloud storage could be utilized for a local archive; and (3) current large-scale cloud storage practices in industry could be utilized to manage a scalable archive solution.
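
    Write-speed benchmarking of this kind can be as simple as timing PUT requests against Swift's object API (PUT /v1/{account}/{container}/{object} with an X-Auth-Token header). The sketch below uses the requests library with a placeholder endpoint and token; a real benchmark would parallelize across many client processes.

        # Timed writes against OpenStack Swift's object API. Endpoint, token,
        # and object sizes are placeholders for illustration.
        import os
        import time
        import requests

        STORAGE_URL = "https://swift.example.org/v1/AUTH_test"   # placeholder
        TOKEN = os.environ.get("OS_AUTH_TOKEN", "CHANGE_ME")

        def timed_put(container, name, size_mb):
            """PUT one object of size_mb MiB; return throughput in MiB/s."""
            payload = os.urandom(size_mb * 1024 * 1024)
            start = time.perf_counter()
            r = requests.put("%s/%s/%s" % (STORAGE_URL, container, name),
                             data=payload, headers={"X-Auth-Token": TOKEN})
            r.raise_for_status()
            return size_mb / (time.perf_counter() - start)

        if __name__ == "__main__":
            rates = [timed_put("bench", "obj-%d" % i, 64) for i in range(8)]
            print("mean write rate: %.1f MiB/s" % (sum(rates) / len(rates)))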

  13. The Fizeau Interferometer Testbed

    NASA Technical Reports Server (NTRS)

    Zhang, Xiaolei; Carpenter, Kenneth G.; Lyon, Richard G.; Huet, Hubert; Marzouk, Joe; Solyar, Gregory

    2003-01-01

    The Fizeau Interferometer Testbed (FIT) is a collaborative effort between NASA's Goddard Space Flight Center, the Naval Research Laboratory, Sigma Space Corporation, and the University of Maryland. The testbed will be used to explore the principles of and the requirements for the full, as well as the pathfinder, Stellar Imager mission concept. It has a long term goal of demonstrating closed-loop control of a sparse array of numerous articulated mirrors to keep optical beams in phase and optimize interferometric synthesis imaging. In this paper we present the optical and data acquisition system design of the testbed, and discuss the wavefront sensing and control algorithms to be used. Currently we have completed the initial design and hardware procurement for the FIT. The assembly and testing of the Testbed will be underway at Goddard's Instrument Development Lab in the coming months.

  14. Features of Transitional Regimes for Hydrocarbon Combustion in Closed Volumes and in Opened Clouds

    NASA Astrophysics Data System (ADS)

    Lin, E. E.; Tanakov, Z. V.

    2006-08-01

    We present a brief review and analysis of experimental results concerning the simulation of combustion processes, both in power plants and in open-air surface space, for gaseous hydrocarbon mixtures. Combustion regimes in closed volumes are considered for the acetylene mixtures C2H2 + mO2 + nN2 and C2H2 + mN2O + nN2 in tubes with relative length L/d = 4 - 60. Combustion of open fuel-air clouds under the regime of cloud collisions is considered for propane-butane dispersing into the atmosphere from several closely spaced reservoirs of liquefied gas.

  15. Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem

    NASA Astrophysics Data System (ADS)

    Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.

    2015-12-01

    Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.

  16. AutoGNC Testbed

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Vaughan, Andrew T.; Bayard, David S.; Riedel, Joseph E.; Balaram, J.

    2010-01-01

    A simulation testbed architecture was developed and implemented for the integration, test, and development of a TRL-6 flight software set called AutoGNC. The AutoGNC software will combine the TRL-9 Deep Impact AutoNAV flight software suite, the TRL-9 Virtual Machine Language (VML) executive, and the TRL-3 G-REX guidance, estimation, and control algorithms. The AutoGNC testbed was architected to provide software interface connections among the AutoNAV and VML flight code written in C, the G-REX algorithms in MATLAB and C, stand-alone image rendering algorithms in C, and other Fortran algorithms, such as the OBIRON landmark tracking suite. The testbed architecture incorporates software components for propagating a high-fidelity truth model of the environment and the spacecraft dynamics, along with the flight software components for onboard guidance, navigation, and control (GN&C). The interface allows for the rapid integration and testing of new algorithms prior to development of the C code for implementation in flight software. This testbed is designed to test autonomous spacecraft proximity operations around small celestial bodies, moons, or other spacecraft. The software is baselined for upcoming comet and asteroid sample return missions. This architecture and testbed will provide a direct improvement upon the onboard flight software utilized for missions such as Deep Impact, Stardust, and Deep Space 1.

  17. The WFI Hα spectroscopic survey of the Magellanic Clouds: Be stars in SMC open clusters

    NASA Astrophysics Data System (ADS)

    Martayan, Christophe; Baade, Dietrich; Fabregat, Juan

    2009-03-01

    At low metallicity, B-type stars show lower loss of mass and, therefore, of angular momentum, so it is expected that there are more Be stars in the Magellanic Clouds than in the Milky Way. However, until now, searches for Be stars have been performed in only a very small number of open clusters in the Magellanic Clouds. Using the ESO/WFI in its slitless spectroscopic mode, we performed an Hα survey of the Large and Small Magellanic Clouds. Eight million low-resolution spectra centered on Hα were obtained. For their automatic analysis, we developed the ALBUM code. Here, we present the observations, the method used to exploit the data, and first results for 84 open clusters in the SMC. In particular, by cross-correlating our catalogs with OGLE positional and photometric data, we classified more than 4000 stars and were able to find the B and Be stars among them. We show the evolution of the rates of Be stars as functions of area density, metallicity, spectral type, and age.

  18. plas.io: Open Source, Browser-based WebGL Point Cloud Visualization

    NASA Astrophysics Data System (ADS)

    Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.

    2014-12-01

    Point cloud data, in the form of Light Detection and Ranging (LiDAR), RADAR, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type for quantifying and characterizing geospatial processes. Visualization of these data, due to their overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made ubiquitously available the interactivity and visualization capabilities that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices -- from phones and tablets to high-end workstations -- with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data to devices to which everyone has easy access.

  19. MIT's interferometer CST testbed

    NASA Technical Reports Server (NTRS)

    Hyde, Tupper; Kim, ED; Anderson, Eric; Blackwood, Gary; Lublin, Leonard

    1990-01-01

    The MIT Space Engineering Research Center (SERC) has developed a controlled structures technology (CST) testbed based on one design for a space-based optical interferometer. The role of the testbed is to provide a versatile platform for experimental investigation and discovery of CST approaches. In particular, it will serve as the focus for experimental verification of CSI methodologies and control strategies at SERC. The testbed program has an emphasis on experimental CST--incorporating a broad suite of actuators and sensors, active struts, system identification, passive damping, active mirror mounts, and precision component characterization. The SERC testbed represents a one-tenth scaled version of an optical interferometer concept based on an inherently rigid tetrahedral configuration with collecting apertures on one face. The testbed consists of six 3.5 meter long truss legs joined at four vertices and is suspended with attachment points at three vertices. Each aluminum leg has a 0.2 m by 0.2 m by 0.25 m triangular cross-section. The structure has a first flexible mode at 31 Hz and has over 50 global modes below 200 Hz. The stiff tetrahedral design differs from similar testbeds (such as the JPL Phase B) in that the structural topology is closed. The tetrahedral design minimizes structural deflections at the vertices (site of optical components for maximum baseline) resulting in reduced stroke requirements for isolation and pointing of optics. Typical total light path length stability goals are on the order of lambda/20, with a wavelength of light, lambda, of roughly 500 nanometers. It is expected that active structural control will be necessary to achieve this goal in the presence of disturbances.

  20. Use of AVHRR-derived spectral reflectances to estimate surface albedo across the Southern Great Plains Cloud and Radiation Testbed (CART) site

    SciTech Connect

    Qiu, J.; Gao, W.

    1997-03-01

    Substantial variations in surface albedo across a large area cause difficulty in estimating regional net solar radiation and atmospheric absorption of shortwave radiation when only ground point measurements of surface albedo are used to represent the whole area. Information on spatial variations and site-wide averages of surface albedo, which vary with the underlying surface type and conditions and the solar zenith angle, is important for studies of clouds and atmospheric radiation over a large surface area. In this study, a bidirectional reflectance model was used to inversely retrieve surface properties such as leaf area index and then the bidirectional reflectance distribution was calculated by using the same radiation model. The albedo was calculated by converting the narrowband reflectance to broadband reflectance and then integrating over the upper hemisphere.
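
    The final step named here, integrating the bidirectional reflectance distribution over the upper hemisphere, can be written out explicitly: for a (dimensionless) reflectance factor R, albedo = (1/pi) * ∫∫ R(theta, phi) cos(theta) sin(theta) dtheta dphi. The sketch below integrates numerically with a placeholder R; the model-derived reflectance distribution itself is not shown.

        # Hemispherical integration of a bidirectional reflectance factor (BRF)
        # to a broadband albedo. The brf() function is a placeholder standing
        # in for the model-derived reflectance distribution.
        import math

        def brf(theta_v, phi_v):
            """Placeholder BRF; a real one depends on LAI, sun angle, bands."""
            return 0.20 * (1.0 + 0.1 * math.cos(phi_v) * math.sin(theta_v))

        def hemispheric_albedo(n_theta=90, n_phi=180):
            dt = (math.pi / 2) / n_theta
            dp = (2 * math.pi) / n_phi
            total = 0.0
            for i in range(n_theta):
                t = (i + 0.5) * dt
                for j in range(n_phi):
                    p = (j + 0.5) * dp
                    total += brf(t, p) * math.cos(t) * math.sin(t) * dt * dp
            return total / math.pi

        print("albedo ~ %.3f" % hemispheric_albedo())   # 0.200 for constant BRF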

  1. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program is developing initial recommendations for requirements and design approaches for the information systems of the Space Station era. During this quarter, drafting of the final reports of the various participants was initiated. Several drafts are included in this report as the University technical reports.

  2. Establishment of an NWP testbed using ARM data

    SciTech Connect

    O'Connor, E.; Liu, Y.; Hogan, R.

    2010-03-15

    The aim of the FAst-physics System TEstbed and Research (FASTER) project is to evaluate and improve the parameterizations of fast physics (involving clouds, precipitation, and aerosol) in numerical models using ARM measurements. One objective within FASTER is to evaluate model representations of fast physics with long-term continuous cloud observations by use of an 'NWP testbed'. This approach was successful in the European Cloudnet project. NWP model data (NCEP, ECMWF, etc.) are routinely output at ARM sites, and model evaluation can potentially be achieved in quasi-real time. In this poster, we will outline our progress in the development of the NWP testbed and discuss the successful integration of ARM algorithms, such as ARSCL, with algorithms and lessons learned from Cloudnet. Preliminary results will be presented of the evaluation of the ECMWF, NCEP, and UK Met Office models over the SGP site using this approach.

  3. Continuation: The EOSDIS testbed data system

    NASA Technical Reports Server (NTRS)

    Emery, Bill; Kelley, Timothy D.

    1995-01-01

    The continuation of the EOSDIS testbed ('Testbed') has matured from a multi-task, X-Windows-driven system into a fully functional, stand-alone data archive and distribution center accessible by all types of users and computers via the World Wide Web. Throughout the past months, the Testbed has evolved into a completely new system. The current system is accessible through Netscape, Mosaic, and all other clients that can reach the World Wide Web. On October 1, 1995, we will open to the public, and we expect that the statistics on the type of user, where they are located, and what they are looking for will change drastically. The most important change in the Testbed has been the Web interface. This interface will allow more users to access the system and will walk them through the data types with more ease than before. All of the callbacks are written in such a way that icons can be used to move around easily in the program's interface. The homepage offers the user the opportunity to get more information about each satellite data type and also information on free programs. These programs are grouped into categories by the types of computers for which they are compiled, along with information on how to FTP the programs back to the end user's computer. The heart of the Testbed is still the acquisition of satellite data. From the Testbed homepage, the user selects the 'access to data system' icon, which takes them to the world map and allows them to select an area for which they would like coverage by simply clicking that area of the map. This creates a new map where other similar choices can be made to get the latitude and longitude of the region the satellite data will cover. Once a selection has been made, the search parameters page will appear to be filled out. Afterwards, the browse image can be called up once the search is completed and the images for viewing can be selected. There are several other option pages

  4. Network testbed creation and validation

    DOEpatents

    Thai, Tan Q.; Urias, Vincent; Van Leeuwen, Brian P.; Watts, Kristopher K.; Sweeney, Andrew John

    2017-03-21

    Embodiments of network testbed creation and validation processes are described herein. A "network testbed" is a replicated environment used to validate a target network or an aspect of its design. Embodiments describe a network testbed that comprises virtual testbed nodes executed via a plurality of physical infrastructure nodes. The virtual testbed nodes utilize these hardware resources as a network "fabric," thereby enabling rapid configuration and reconfiguration of the virtual testbed nodes without requiring reconfiguration of the physical infrastructure nodes. Thus, in contrast to prior-art solutions, which require that a tester manually build an emulated environment of physically connected network devices, embodiments receive or derive a target network description and build out a replica of this description using virtual testbed nodes executed via the physical infrastructure nodes. This process allows for the creation of very large (e.g., tens of thousands of network elements) and/or very topologically complex test networks.
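
    As a schematic of the "derive a target description, then replicate it virtually" idea (the data structure and round-robin placement below are illustrative assumptions, not the patented method):

    from itertools import cycle

    # Hypothetical target-network description and physical fabric.
    target = {
        "nodes": ["router-a", "router-b", "host-1", "host-2"],
        "links": [("router-a", "router-b"), ("router-a", "host-1"),
                  ("router-b", "host-2")],
    }
    infrastructure = ["phys-0", "phys-1"]  # physical hosts acting as the fabric

    # Virtual testbed nodes are assigned to physical hosts; the hosts
    # themselves need no reconfiguration when the topology changes.
    placement = dict(zip(target["nodes"], cycle(infrastructure)))
    for vnode, phys in placement.items():
        print(f"launch {vnode} on {phys}")
    for a, b in target["links"]:
        print(f"create virtual link {a} <-> {b}")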

  5. Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud

    NASA Astrophysics Data System (ADS)

    Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde

    2014-06-01

    The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
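
    The project's custom launch scripts are not given in the abstract, but the basic step of booting a worker VM can be sketched with the openstacksdk Python client (the cloud name and the image, flavor, and network IDs below are placeholders):

    import openstack

    # Credentials are read from a clouds.yaml entry named "research-cloud".
    conn = openstack.connect(cloud="research-cloud")
    server = conn.compute.create_server(
        name="t3-worker-01",
        image_id="IMAGE-UUID",      # placeholder: base Scientific Linux image
        flavor_id="FLAVOR-UUID",    # placeholder
        networks=[{"uuid": "NETWORK-UUID"}],  # placeholder
    )
    server = conn.compute.wait_for_server(server)
    print(server.status)  # ACTIVE once booted; configuration management
                          # (e.g., Puppet) would then configure the node

    After boot, the abstract's approach joins each node to the dynamic Torque cluster and mounts remote file systems; those site-specific steps are omitted here.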

  6. Robot graphic simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.

    1991-01-01

    The objective of this research was twofold. First, the basic capabilities of ROBOSIM (graphical simulation system) were improved and extended by taking advantage of advanced graphic workstation technology and artificial intelligence programming techniques. Second, the scope of the graphic simulation testbed was extended to include general problems of Space Station automation. Hardware support for 3-D graphics and high processing performance make high resolution solid modeling, collision detection, and simulation of structural dynamics computationally feasible. The Space Station is a complex system with many interacting subsystems. Design and testing of automation concepts demand modeling of the affected processes, their interactions, and that of the proposed control systems. The automation testbed was designed to facilitate studies in Space Station automation concepts.

  7. The Palomar Testbed Interferometer

    NASA Technical Reports Server (NTRS)

    Colavita, M. M.; Wallace, J. K.; Hines, B. E.; Gursel, Y.; Malbet, F.; Palmer, D. L.; Pan, X. P.; Shao, M.; Yu, J. W.; Boden, A. F.

    1999-01-01

    The Palomar Testbed Interferometer (PTI) is a long-baseline infrared interferometer located at Palomar Observatory, California. It was built as a testbed for interferometric techniques applicable to the Keck Interferometer. First fringes were obtained in 1995 July. PTI implements a dual-star architecture, tracking two stars simultaneously for phase referencing and narrow-angle astrometry. The three fixed 40 cm apertures can be combined pairwise to provide baselines to 110 m. The interferometer actively tracks the white-light fringe using an array detector at 2.2 microns and active delay lines with a range of +/-38 m. Laser metrology of the delay lines allows for servo control, and laser metrology of the complete optical path enables narrow-angle astrometric measurements. The instrument is highly automated, using a multiprocessing computer system for instrument control and sequencing.

  8. Testbed for LISA Photodetectors

    NASA Technical Reports Server (NTRS)

    Guzman, Felipe; Livas, Jeffrey; Silverberg, Robert

    2009-01-01

    The Laser Interferometer Space Antenna (LISA) is a gravitational wave observatory consisting of three spacecraft separated by 5 million km in an equilateral triangle whose center follows the Earth in orbit around the Sun but offset in orbital phase by 20 degrees. LISA is designed to observe sources in the frequency range of 0.1 mHz-100 mHz by measuring fluctuations of the inter-spacecraft separation with laser interferometry. Quadrant photodetectors are used to measure both separation and angular orientation. Noise level, phase and amplitude inhomogeneities of the semiconductor response, and channel cross-talk between quadrant cells need to be assessed in order to ensure the 10 pm/Square root(Hz) sensitivity required for the interferometric length measurement in LISA. To this end, we are currently developing a testbed that allows us to evaluate photodetectors to the sensitivity levels required for LISA. A detailed description of the testbed and preliminary results will be presented.
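
    For scale, a displacement sensitivity can be expressed as an equivalent interferometer phase noise via delta-phi = 2 pi delta-x / lambda. Assuming a 1064 nm laser (typical for LISA-like designs, though not stated in the abstract):

    \[
    \delta\phi = \frac{2\pi \times 10\ \mathrm{pm}}{1064\ \mathrm{nm}} \approx 5.9 \times 10^{-5}\ \mathrm{rad}/\sqrt{\mathrm{Hz}} .
    \]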

  9. Telescience Testbed Pilot Program

    NASA Technical Reports Server (NTRS)

    Gallagher, Maria L. (Editor); Leiner, Barry M. (Editor)

    1988-01-01

    The Telescience Testbed Pilot Program (TTPP) is intended to develop initial recommendations for requirements and design approaches for the information system of the Space Station era. Multiple scientific experiments are being performed, each exploring advanced technologies and technical approaches and each emulating some aspect of Space Station era science. The aggregate results of the program will serve to guide the development of future NASA information systems.

  10. Marshall Avionics Testbed System (MAST)

    NASA Technical Reports Server (NTRS)

    Smith, Wayne D.

    1989-01-01

    Work accomplished in the summer of 1989 in association with the NASA/ASEE Summer Faculty Research Fellowship Program at Marshall Space Flight Center is summarized. The project was aimed at developing detailed specifications for the Marshall Avionics System Testbed (MAST). This activity was to include the definition of the testbed requirements and the development of specifications for a set of standard network nodes for connecting the testbed to a variety of networks. The project was also to include developing a timetable for the design, implementation, programming and testing of the testbed. Specifications of both hardware and software components for the system were to be included.

  11. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor, and modification of existing engine and propeller controls is adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  12. Single link flexible beam testbed project. Thesis

    NASA Technical Reports Server (NTRS)

    Hughes, Declan

    1992-01-01

    This thesis describes the single link flexible beam testbed at the CLaMS laboratory in terms of its hardware, software, and linear model, and presents two controllers: each includes a hub angle proportional-derivative (PD) feedback compensator, and one is augmented by a second static-gain full-state feedback loop based upon a synthesized strictly positive real (SPR) output, which increases the damping ratios of specific flexible-mode poles with respect to the PD-only case and hence reduces unwanted residual oscillation effects. Restricting the full-state feedback gains so as to produce an SPR open-loop transfer function ensures that the open-loop phase remains within (-90, 90) degrees, giving the associated compensator an infinite gain margin. Both experimental and simulation data are evaluated in order to compare the performance of different observers when applied to the real testbed and to the linear model when uncompensated flexible modes are included.
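
    To make the flexible-mode damping computation concrete, here is a small numeric sketch of hub-angle PD feedback on a rigid hub plus one lightly damped flexible mode (all parameter values are invented, not the CLaMS testbed's identified model):

    import numpy as np

    J, g = 1.0, 0.5              # hub inertia; flexible-mode participation gain
    w, z = 2 * np.pi * 5, 0.005  # 5 Hz mode with 0.5% open-loop damping
    Kp, Kd = 4.0, 1.0            # PD gains on the hub angle

    # Plant: theta(s)/u(s) = 1/(J s^2) + g/(s^2 + 2 z w s + w^2)
    D1 = np.array([J, 0.0, 0.0])
    D2 = np.array([1.0, 2 * z * w, w * w])
    num = np.polyadd(D2, g * D1)   # combined numerator
    den = np.polymul(D1, D2)       # combined denominator

    # Closed loop with C(s) = Kd s + Kp: characteristic polynomial den + C*num
    char = np.polyadd(den, np.polymul([Kd, Kp], num))
    for p in np.roots(char):
        if p.imag > 0:  # report each complex pair once
            print(f"pole {p:.2f}  damping ratio {-p.real / abs(p):.3f}")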

  13. Adjustable Autonomy Testbed

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Schreckenghost, Debra K.

    2001-01-01

    The Adjustable Autonomy Testbed (AAT) is a simulation-based testbed located in the Intelligent Systems Laboratory in the Automation, Robotics and Simulation Division at NASA Johnson Space Center. The purpose of the testbed is to support evaluation and validation of prototypes of adjustable autonomous agent software for control and fault management for complex systems. The AAT project has developed prototype adjustable autonomous agent software and human interfaces for cooperative fault management. This software builds on current autonomous agent technology by altering the architecture, components and interfaces for effective teamwork between autonomous systems and human experts. Autonomous agents include a planner, a flexible executive, low-level control, and deductive model-based fault isolation. Adjustable autonomy is intended to increase the flexibility and effectiveness of fault management with an autonomous system. The test domain for this work is control of advanced life support systems for habitats for planetary exploration. The CONFIG hybrid discrete event simulation environment provides flexible and dynamically reconfigurable models of the behavior of components and fluids in the life support systems. Both discrete event and continuous (discrete time) simulation are supported, and flows and pressures are computed globally. This provides fast dynamic simulations of interacting hardware systems in closed loops that can be reconfigured during operations scenarios, producing complex cascading effects of operations and failures. Current object-oriented model libraries support modeling of fluid systems, and models have been developed of physico-chemical and biological subsystems for processing advanced life support gases. In FY01, water recovery system models will be developed.

  14. LISA Optical Bench Testbed

    NASA Astrophysics Data System (ADS)

    Lieser, M.; d'Arcio, L.; Barke, S.; Bogenstahl, J.; Diekmann, C.; Diepholz, I.; Fitzsimons, E. D.; Gerberding, O.; Henning, J.-S.; Hewitson, M.; Hey, F. G.; Hogenhuis, H.; Killow, C. J.; Lucarelli, S.; Nikolov, S.; Perreur-Lloyd, M.; Pijnenburg, J.; Robertson, D. I.; Sohmer, A.; Taylor, A.; Tröbs, M.; Ward, H.; Weise, D.; Heinzel, G.; Danzmann, K.

    2013-01-01

    The optical bench (OB) is a part of the LISA spacecraft, situated between the telescope and the test mass. For measuring the inter-spacecraft distances there are several interferometers on the OB. The elegant breadboard of the OB for LISA is being developed for the European Space Agency (ESA) by EADS Astrium, TNO Science & Industry, the University of Glasgow and the Albert Einstein Institute (AEI); the performance tests will then be done at the AEI. Here we present the testbed that will be used for the performance tests, with a focus on the thermal environment and the laser infrastructure.

  15. Testbed For Telerobotic Servicing

    NASA Technical Reports Server (NTRS)

    Matijevic, Jacob R.

    1991-01-01

    Telerobot testbed used to evaluate technologies for remote servicing, including assembly, maintenance, and repair. Enables study of the advantages and disadvantages of various control modes and of the problems encountered in implementing them, so that the best technologies for implementing those modes can be chosen. Provides delays simulating transmission delays between control stations on the ground and orbiting spacecraft. Includes five major equipment subsystems, each consisting of such commercially available equipment as video cameras, computers, and robot arms. Applicable to the Space Station, the Space Shuttle, and satellites in orbit, as well as to hazardous and underwater environments on Earth.

  16. Advancing global marine biogeography research with open-source GIS software and cloud-computing

    USGS Publications Warehouse

    Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick

    2012-01-01

    Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
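
    The kind of spatial query such a PostgreSQL/PostGIS stack serves can be sketched as follows (the database, table, and column names here are hypothetical, not OBIS's actual schema):

    import psycopg2

    conn = psycopg2.connect("dbname=obis user=reader")  # hypothetical database
    cur = conn.cursor()
    cur.execute(
        """
        SELECT species_name, COUNT(*)
        FROM occurrences
        WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
        GROUP BY species_name
        ORDER BY COUNT(*) DESC
        LIMIT 10;
        """,
        (-10.0, 30.0, 10.0, 45.0),  # lon/lat bounding box: xmin, ymin, xmax, ymax
    )
    for name, count in cur.fetchall():
        print(name, count)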

  17. Aviation Communications Emulation Testbed

    NASA Technical Reports Server (NTRS)

    Sheehe, Charles; Mulkerin, Tom

    2004-01-01

    Aviation-related applications that rely upon datalink for information exchange are increasingly being developed and deployed. The increase in the quantity of applications and associated data communications will expose problems and issues to resolve. NASA's Glenn Research Center has prepared to study the communications issues that will arise as datalink applications are employed within the National Airspace System (NAS) by developing an aviation communications emulation testbed. The testbed is evolving and currently provides the hardware and software needed to study the communications impact of Air Traffic Control (ATC) and surveillance applications in a densely populated environment. The communications load associated with up to 160 aircraft transmitting and receiving ATC and surveillance data can be generated in real time in a sequence similar to what would occur in the NAS. The ATC applications that can be studied are the Aeronautical Telecommunications Network's (ATN) Context Management (CM) and Controller Pilot Data Link Communications (CPDLC). The surveillance applications are Automatic Dependent Surveillance - Broadcast (ADS-B) and Traffic Information Services - Broadcast (TIS-B).

  18. High Resolution Cloud Microphysics and Radiation Studies

    DTIC Science & Technology

    2011-06-16

    characteristics of mid-level altocumulus clouds and upper-level visible and subvisual cirrus clouds. The MPL lidar provided information about the temporal ... balloon, lidar, and radar study of cirrus and altocumulus clouds to further investigate the presence of multi-cloud and nearly cloud-free layers ... data set of the clouds and thermodynamic structure to build a mesoscale and LES test-bed for cirrus and altocumulus cloud layers. The project was

  19. A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system

    NASA Astrophysics Data System (ADS)

    Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.

    2014-06-01

    The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.

  20. Advanced data management system architectures testbed

    NASA Technical Reports Server (NTRS)

    Grant, Terry

    1990-01-01

    The objective of the Architecture and Tools Testbed is to provide a working, experimental focus to the evolving automation applications for the Space Station Freedom data management system. Emphasis is on defining and refining real-world applications including the following: the validation of user needs; understanding system requirements and capabilities; and extending capabilities. The approach is to provide an open, distributed system of high performance workstations representing both the standard data processors and networks and advanced RISC-based processors and multiprocessor systems. The system provides a base from which to develop and evaluate new performance and risk management concepts and for sharing the results. Participants are given a common view of requirements and capability via: remote login to the testbed; standard, natural user interfaces to simulations and emulations; special attention to user manuals for all software tools; and E-mail communication. The testbed elements which instantiate the approach are briefly described including the workstations, the software simulation and monitoring tools, and performance and fault tolerance experiments.

  1. The International Symposium on Grids and Clouds and the Open Grid Forum

    NASA Astrophysics Data System (ADS)

    The International Symposium on Grids and Clouds 2011 was held at Academia Sinica in Taipei, Taiwan on 19th to 25th March 2011. A series of workshops and tutorials preceded the symposium. The aim of ISGC is to promote the use of grid and cloud computing in the Asia Pacific region. Over the 9 years that ISGC has been running, the programme has evolved to become more user-community focused, with subjects reaching out to a larger population. Research communities are making widespread use of distributed computing facilities. Linking together data centers, production grids, desktop systems or public clouds, many researchers are able to do more research and produce results more quickly. They could do much more if the computing infrastructures they use worked together more effectively. Changes in the way we approach distributed computing, and new services from commercial providers, mean that boundaries are starting to blur. This opens the way for hybrid solutions that make it easier for researchers to get their job done. Consequently the theme for ISGC 2011 was the opportunities that better integrated computing infrastructures can bring, and the steps needed to achieve the vision of a seamless global research infrastructure. 2011 is a year of firsts for ISGC. First, the title: while the acronym remains the same, its meaning has changed to reflect the evolution of computing: The International Symposium on Grids and Clouds. Secondly, the programming: ISGC has always included topical workshops and tutorials, but 2011 is the first year that ISGC has been held in conjunction with the Open Grid Forum, which held its 31st meeting with a series of working group sessions. The ISGC plenary session included keynote speakers from OGF who highlighted the relevance of standards for the research community. ISGC, with its focus on applications and operational aspects, complemented OGF's focus on standards development. ISGC brought to OGF real-life use cases and needs to be

  2. Large Scale Monte Carlo Simulation of Neutrino Interactions Using the Open Science Grid and Commercial Clouds

    NASA Astrophysics Data System (ADS)

    Norman, A.; Boyd, J.; Davies, G.; Flumerfelt, E.; Herner, K.; Mayer, N.; Mhashilhar, P.; Tamsett, M.; Timm, S.

    2015-12-01

    Modern long-baseline neutrino experiments, like the NOvA experiment at Fermilab, require large-scale, compute-intensive simulations of their neutrino beam fluxes and of backgrounds induced by cosmic rays. The amount of simulation required to keep the systematic uncertainties in the simulation from dominating the final physics results is often 10x to 100x that of the actual detector exposure. For the first physics results from NOvA this has meant the simulation of more than 2 billion cosmic ray events in the far detector and more than 200 million NuMI beam spill simulations. Performing simulation at these high statistics levels has been made possible for NOvA through the use of the Open Science Grid and through large-scale runs on commercial clouds like Amazon EC2. We detail the challenges in performing large-scale simulation in these environments and how the computing infrastructure for the NOvA experiment has been adapted to seamlessly support the running of different simulation and data processing tasks on these resources.

  3. Space Environments Testbed

    NASA Technical Reports Server (NTRS)

    Leucht, David K.; Koslosky, Marie J.; Kobe, David L.; Wu, Jya-Chang C.; Vavra, David A.

    2011-01-01

    The Space Environments Testbed (SET) is a flight controller data system for the Common Carrier Assembly. The SET-1 flight software provides the command, telemetry, and experiment control to ground operators for the SET-1 mission. Modes of operation include: a) Boot Mode, which is initiated at application of power to the processor card and runs memory diagnostics; it may be entered via ground command or autonomously based upon fault detection. b) Maintenance Mode, which allows for limited carrier health monitoring, including power telemetry monitoring on a non-interference basis. c) Safe Mode, a predefined, minimum-power safehold configuration with power to experiments removed and carrier functionality minimized; it is used to troubleshoot problems that occur during flight. d) Operations Mode, which is used for normal experiment carrier operations; it may be entered only via ground command from Safe Mode.

  4. Autonomous Flying Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.

    2005-01-01

    The Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights.

  5. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts that are needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the thousandths or hundredths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D-printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D-printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional-grade 3D scanner. With this, I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control

  6. Long Duration Sorbent Testbed

    NASA Technical Reports Server (NTRS)

    Howard, David F.; Knox, James C.; Long, David A.; Miller, Lee; Cmaric, Gregory; Thomas, John

    2016-01-01

    The Long Duration Sorbent Testbed (LDST) is a flight experiment demonstration designed to expose current and future candidate carbon dioxide removal system sorbents to an actual crewed space cabin environment to assess and compare sorption working capacity degradation resulting from long term operation. An analysis of sorbent materials returned to Earth after approximately one year of operation in the International Space Station's (ISS) Carbon Dioxide Removal Assembly (CDRA) indicated as much as a 70% loss of working capacity of the silica gel desiccant material at the extreme system inlet location, with a gradient of capacity loss down the bed. The primary science objective is to assess the degradation of potential sorbents for exploration class missions and ISS upgrades when operated in a true crewed space cabin environment. A secondary objective is to compare degradation of flight test to a ground test unit with contaminant dosing to determine applicability of ground testing.

  7. Optical Network Testbeds Workshop

    SciTech Connect

    Joe Mambretti

    2007-06-01

    This is the summary report of the third annual Optical Networking Testbed Workshop (ONT3), which brought together leading members of the international advanced research community to address major challenges in creating next generation communication services and technologies. Networking research and development (R&D) communities throughout the world continue to discover new methods and technologies that are enabling breakthroughs in advanced communications. These discoveries are keystones for building the foundation of the future economy, which requires the sophisticated management of extremely large quantities of digital information through high performance communications. This innovation is made possible by basic research and experiments within laboratories and on specialized testbeds. Initial network research and development initiatives are driven by diverse motives, including attempts to solve existing complex problems, the desire to create powerful new technologies that do not exist using traditional methods, and the need to create tools to address specific challenges, including those mandated by large scale science or government agency mission agendas. Many new discoveries related to communications technologies transition to wide-spread deployment through standards organizations and commercialization. These transition paths allow for new communications capabilities that drive many sectors of the digital economy. In the last few years, networking R&D has increasingly focused on advancing multiple new capabilities enabled by next generation optical networking. Both US Federal networking R&D and other national R&D initiatives, such as those organized by the National Institute of Information and Communications Technology (NICT) of Japan, are creating optical networking technologies that allow for new, powerful communication services. Among the most promising services are those based on new types of multi-service or hybrid networks, which use new optical networking

  8. Updated Electronic Testbed System

    NASA Technical Reports Server (NTRS)

    Brewer, Kevin L.

    2001-01-01

    As we continue to advance in exploring space frontiers, technology must also advance. The need for faster data recovery and data processing is crucial, and the less equipment used, and the lighter that equipment is, the better. Because integrated circuits become more sensitive at high altitude, experimental verification and quantification are required. The Center for Applied Radiation Research (CARR) at Prairie View A&M University was awarded a grant by NASA to participate in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. These programs test anomalous errors in integrated circuits due to single event effects (SEE). CARR had already begun experiments characterizing the SEE behavior of high-speed and high-density SRAMs. The research center built an error-testing system using a PC-104 computer unit, an Iomega Zip drive for storage, a test board with the components under test, and a latchup detection and reset unit. A test program was written to continuously monitor a stored data pattern in the SRAM chip and record errors. The devices under test were eight 4-Mbit memory chips totaling 4 Mbytes of memory. CARR was successful at obtaining data using the Electronic TestBed System (EBS) in various NASA ER-2 test flights. This series of high-altitude flights, at up to 70,000 feet, was effective at yielding the conditions under which single event effects usually occur. However, the data received from the series of flights indicated one error per twenty-four hours. Because flight test time is very expensive, the initial design proved not to be cost effective. The need for orders of magnitude more memory became essential. Therefore, a project which could test more memory within a given time was created. The goal of this project was not only to test more memory within a given time, but also to have a system with a faster processing speed which used fewer peripherals. This paper will describe procedures used to build an

  9. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea-ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Crosier, Jonathan; Bower, Keith N.; Darbyshire, Eoghan; Taylor, Jonathan W.; Liu, Dantong; Allan, James D.; Williams, Paul I.; Gallagher, Martin W.; Choularton, Thomas W.

    2016-04-01

    The Arctic sea-ice is intricately coupled to the atmosphere[1]. The decreasing sea-ice extent with the changing climate raises questions about how Arctic cloud structure will respond. Any effort to answer these questions is hindered by the scarcity of atmospheric observations in this region. Comprehensive cloud and aerosol measurements could allow for an improved understanding of the relationship between surface conditions and cloud structure; knowledge which could be key in validating weather model forecasts. Previous studies[2] have shown via remote sensing that cloudiness increases over the marginal ice zone (MIZ) and ocean with comparison to the sea-ice; however, to our knowledge, detailed in-situ data of this transition have not been previously presented. In 2013, the Aerosol-Cloud Coupling and Climate Interactions in the Arctic (ACCACIA) campaign was carried out in the vicinity of Svalbard, Norway to collect in-situ observations of the Arctic atmosphere and investigate this issue. Fitted with a suite of remote sensing, cloud and aerosol instrumentation, the FAAM BAe-146 aircraft was used during the spring segment of the campaign (Mar-Apr 2013). One case study (23rd Mar 2013) produced excellent coverage of the atmospheric changes when transitioning from sea-ice, through the MIZ, to the open ocean. Clear microphysical changes were observed, with the cloud liquid-water content increasing by almost four times over the transition. Cloud base, depth and droplet number also increased, whilst ice number concentrations decreased slightly. The surface warmed by ~13 K from sea-ice to ocean, with minor differences in aerosol particle number (of sizes corresponding to Cloud Condensation Nuclei or Ice Nucleating Particles) observed, suggesting that the primary driver of these microphysical changes was the increased heat fluxes and induced turbulence from the warm ocean surface as expected. References: [1] Kapsch, M.L., Graversen, R.G. and Tjernström, M. Springtime

  10. Adaptive Signal Processing Testbed

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1991-09-01

    The design and implementation of a system for the acquisition, processing, and analysis of signal data is described. The initial application for the system is the development and analysis of algorithms for excision of interfering tones from direct sequence spread spectrum communication systems. The system is called the Adaptive Signal Processing Testbed (ASPT) and is an integrated hardware and software system built around the TMS320C30 chip. The hardware consists of a radio frequency data source, a digital receiver, and an adaptive signal processor implemented on a Sun workstation. The software components of the ASPT consist of a number of packages, including the Sun driver package; UNIX programs that support software development on the TMS320C30 boards; UNIX programs that provide the control, user interaction, and display capabilities for the data acquisition, processing, and analysis components of the ASPT; and programs that perform the ASPT functions, including data acquisition, despreading, and adaptive filtering. The performance of the ASPT system is evaluated by comparing actual data rates against their desired values. A number of system limitations are identified and recommendations are made for improvements.
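
    The tone-excision task the ASPT targets is the classic adaptive-line-enhancer use of an LMS filter: a one-step predictor captures the predictable (narrowband tone) component, and the prediction error is the tone-suppressed output. A self-contained sketch with invented signal parameters (not the ASPT's actual algorithms):

    import numpy as np

    rng = np.random.default_rng(1)
    n = np.arange(20000)
    wideband = np.sign(rng.standard_normal(n.size))  # chip-like desired signal
    tone = 3.0 * np.cos(2 * np.pi * 0.12 * n)        # narrowband interferer
    x = wideband + tone

    taps, mu = 32, 1e-4
    w = np.zeros(taps)
    out = np.zeros(x.size)
    for k in range(taps, x.size):
        u = x[k - taps:k][::-1]   # regressor of past samples
        e = x[k] - w @ u          # prediction error = excised output
        w += 2 * mu * e * u       # LMS weight update
        out[k] = e

    # Residual tone power drops as the predictor converges.
    print("tone-to-signal power in :", np.var(tone) / np.var(wideband))
    print("output-to-signal power  :", np.var(out[-5000:]) / np.var(wideband))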

  11. An automation simulation testbed

    NASA Technical Reports Server (NTRS)

    Cook, George E.; Sztipanovits, Janos; Biegl, Csaba; Karsai, Gabor; Springfield, James F.; Mutammara, Atheel

    1988-01-01

    The work being done in porting ROBOSIM (a graphical simulation system developed jointly by NASA-MSFC and Vanderbilt University) to the HP350SRX graphics workstation is described. New additional ROBOSIM features, like collision detection and new kinematics simulation methods are also discussed. Based on the experiences of the work on ROBOSIM, a new graphics structural modeling environment is suggested which is intended to be a part of a new knowledge-based multiple aspect modeling testbed. The knowledge-based modeling methodologies and tools already available are described. Three case studies in the area of Space Station automation are also reported. First a geometrical structural model of the station is presented. This model was developed using the ROBOSIM package. Next the possible application areas of an integrated modeling environment in the testing of different Space Station operations are discussed. One of these possible application areas is the modeling of the Environmental Control and Life Support System (ECLSS), which is one of the most complex subsystems of the station. Using the multiple aspect modeling methodology, a fault propagation model of this system is being built and is described.

  12. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT or MRI guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled "Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification" is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  13. NASA Robotic Neurosurgery Testbed

    NASA Technical Reports Server (NTRS)

    Mah, Robert

    1997-01-01

    The detection of tissue interface (e.g., normal tissue, cancer, tumor) has been limited clinically to tactile feedback, temperature monitoring, and the use of a miniature ultrasound probe for tissue differentiation during surgical operations. In neurosurgery, the needle used in the standard stereotactic CT (Computational Tomography) or MRI (Magnetic Resonance Imaging) guided brain biopsy provides no information about the tissue being sampled. The tissue sampled depends entirely upon the accuracy with which the localization provided by the preoperative CT or MRI scan is translated to the intracranial biopsy site. In addition, no information about the tissue being traversed by the needle (e.g., a blood vessel) is provided. Hemorrhage due to the biopsy needle tearing a blood vessel within the brain is the most devastating complication of stereotactic CT/MRI guided brain biopsy. A robotic neurosurgery testbed has been developed at NASA Ames Research Center as a spin-off of technologies from space, aeronautics and medical programs. The invention entitled 'Robotic Neurosurgery Leading to Multimodality Devices for Tissue Identification' is nearing a state ready for commercialization. The devices will: 1) improve diagnostic accuracy and precision of general surgery, with near term emphasis on stereotactic brain biopsy, 2) automate tissue identification, with near term emphasis on stereotactic brain biopsy, to permit remote control of the procedure, and 3) reduce morbidity for stereotactic brain biopsy. The commercial impact from this work is the potential development of a whole new generation of smart surgical tools to increase the safety, accuracy and efficiency of surgical procedures. Other potential markets include smart surgical tools for tumor ablation in neurosurgery, general exploratory surgery, prostate cancer surgery, and breast cancer surgery.

  14. A Cloud Computing Approach to Personal Risk Management: The Open Hazards Group

    NASA Astrophysics Data System (ADS)

    Graves, W. R.; Holliday, J. R.; Rundle, J. B.

    2010-12-01

    According to the California Earthquake Authority, only about 12% of current California residences are covered by any form of earthquake insurance, down from about 30% in 1996 following the 1994 M6.7 Northridge earthquake. Part of the reason for this decreasing rate of insurance uptake is the high deductible, either 10% or 15% of the value of the structure, and the relatively high cost of the premiums, as much as thousands of dollars per year. The earthquake insurance industry is composed of the CEA, a public-private partnership; modeling companies that produce damage and loss models similar to the FEMA HAZUS model; and financial companies such as the insurance, reinsurance, and investment banking companies in New York, London, the Cayman Islands, Zurich, Dubai, Singapore, and elsewhere. In setting earthquake insurance rates, financial companies rely on models like HAZUS that calculate risk and exposure. In California, the process begins with an official earthquake forecast by the Working Group on California Earthquake Probabilities. Modeling companies use these 30-year earthquake probabilities as inputs to their attenuation and damage models to estimate the possible damage factors from scenario earthquakes. Economic loss is then estimated from processes such as structural failure, lost economic activity, demand surge, and fire following the earthquake. Once the potential losses are known, rates can be set so that a target ruin probability of less than 1% or so can be assured. Open Hazards Group was founded with the idea that the global public might be interested in a personal estimate of earthquake risk, computed using data supplied by the public, with models running in a cloud computing environment. These models process data from the ANSS catalog, updated at least daily, to produce rupture forecasts that are backtested with standard Reliability/Attributes and Receiver Operating Characteristic tests, among others. Models for attenuation and structural damage
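
    The rate-setting logic sketched above (choose premiums so the ruin probability stays under a target) can be illustrated with a toy Monte Carlo; every number below is invented and unrelated to CEA rates or Open Hazards models:

    import numpy as np

    rng = np.random.default_rng(7)
    n_paths, years = 100_000, 30
    capital0, premium = 50.0, 1.0        # arbitrary units
    p_event, mean_loss = 0.03, 20.0      # annual event probability; mean payout

    events = rng.binomial(1, p_event, (n_paths, years))
    losses = events * rng.exponential(mean_loss, (n_paths, years))
    capital = capital0 + np.cumsum(premium - losses, axis=1)
    ruin = (capital.min(axis=1) < 0).mean()
    print(f"estimated 30-year ruin probability: {ruin:.2%}")

    Raising the premium (or the initial capital) in this toy model lowers the estimated ruin probability toward the target.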

  15. INDIGO: Building a DataCloud Framework to support Open Science

    NASA Astrophysics Data System (ADS)

    Chen, Yin; de Lucas, Jesus Marco; Aguilar, Fenando; Fiore, Sandro; Rossi, Massimiliano; Ferrari, Tiziana

    2016-04-01

    New solutions are required to support Data Intensive Science in the emerging panorama of e-infrastructures, including Grid, Cloud and HPC services. The architecture proposed by the INDIGO-DataCloud (INtegrating Distributed data Infrastructures for Global ExplOitation) (https://www.indigo-datacloud.eu/) H2020 project provides a path to integrate IaaS resources and PaaS platforms in order to deliver SaaS solutions, while satisfying the requirements posed by different Research Communities, including several in Earth Science. This contribution introduces the INDIGO-DataCloud architecture, describes the methodology followed to ensure that the requirements from different research communities are integrated, with examples such as ENES, LifeWatch and EMSO, and shows how these communities will build their solutions using different INDIGO components.

  16. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  17. The ac power system testbed

    NASA Technical Reports Server (NTRS)

    Mildice, J.; Sundberg, R.

    1987-01-01

    The object of this program was to design, build, test, and deliver a high frequency (20 kHz) Power System Testbed which would electrically approximate a single, separable power channel of an IOC Space Station. That program is described, including the technical background, and the results are discussed showing that the major assumptions about the characteristics of this class of hardware (size, mass, efficiency, control, etc.) were substantially correct. This testbed equipment was completed and delivered and is being operated as part of the Space Station Power System Test Facility.

  18. A Kenyan Cloud School. Massive Open Online & Ongoing Courses for Blended and Lifelong Learning

    ERIC Educational Resources Information Center

    Jobe, William

    2013-01-01

    This research describes the predicted outcomes of a Kenyan Cloud School (KCS), which is a MOOC that contains all courses taught at the secondary school level in Kenya. This MOOC will consist of online, ongoing subjects in both English and Kiswahili. The KCS subjects offer self-testing and peer assessment to maximize scalability, and digital badges…

  19. A Framework to Evaluate Unified Parameterizations for Seasonal Prediction: An LES/SCM Parameterization Test-Bed

    DTIC Science & Technology

    2013-09-30

    Seasonal Prediction: An LES/SCM Parameterization Test-Bed. Joao Teixeira, Jet Propulsion Laboratory, California Institute of Technology, MS 169-237 ... a Single Column Model (SCM) version of the latest operational NAVGEM that can be used to simulate GEWEX Cloud Systems Study (GCSS) case-studies; ii) ... use the NAVGEM SCM and the LES model as a parameterization test-bed. APPROACH: It is well accepted that sub-grid physical processes such as

  20. A Scalable and Dynamic Testbed for Conducting Penetration-Test Training in a Laboratory Environment

    DTIC Science & Technology

    2015-03-01

    capabilities. C. Support for tactical technologies: a way to incorporate tactical Army networks including wireless sensor networks and mobile ad-hoc networks ... security of computer networks. This testbed was generated based on an informal study of the common needs of real-life penetration testers. Only open-source ... technologies are used and step-by-step instructions are provided on how to create the testbed. A case study is also included with a sample scenario

  1. Pico Satellite Solar Cell Testbed (PSSC Testbed) Design

    DTIC Science & Technology

    2007-09-30

    addition, there are two reaction wheels, which are aligned along the long axis of the spacecraft. One reaction wheel will be spun up using power from the ... launch vehicle before ejection from the Picosatellite launcher. After ejection from the Picosatellite launcher, the reaction wheel will spin down and ... reaction wheel will be spun up to reduce the spin rate of the PSSC Testbed to 2 RPM to make measurements of the solar cell current-voltage

  2. Testbed Environment for Distributed Observation (testbed omgeving voor gedistribueerde waarneming)

    DTIC Science & Technology

    2006-05-01

    A preliminary study that led to the specifications of the Testbed is described in [1]. Moreover, a Testbed environment, which in multiple pro ... functions operate locally (for example, switching pumps on, sealing off compartments, et cetera), making the system independent of a central ... unbalanced cooling, et cetera), the system is reconfigured with the aid of actuators (valves, cross-overs, pumps, et cetera) so that the

  3. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    NASA Astrophysics Data System (ADS)

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; Qian, Yun; Zhang, Kai; Wang, Yuhang; Yang, Xiu-Qun

    2016-11-01

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol-radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m-2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m-2), while fire POM induces a small effect (-0.05 and 0.04 ± 0.01 W m-2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol-cloud interactions (REaci) of all fire aerosols is -0.70 ± 0.05 W m-2, resulting mainly from the fire POM effect (-0.59 ± 0.03 W m-2). REari (0.43 ± 0.03 W m-2) and REaci (-1.38 ± 0.23 W m-2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and -0.82 ± 0.09 W m-2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to -15 W m-2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m-2) is small and statistically insignificant and is mainly due to the fire BC-in-snow effect (0.02 W m-2) with the maximum albedo effect
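
    Schematically, each radiative-effect diagnostic is the difference between paired simulations with and without fire emissions, consistent with the "relative to the case of no fires" definition above (F here denotes the corresponding flux diagnostic; the authors' exact diagnostic definitions are not reproduced in the abstract):

    \[
    \mathrm{RE}_x = F_x^{\mathrm{fire}} - F_x^{\mathrm{no\ fire}}, \qquad x \in \{\mathrm{ari},\ \mathrm{aci},\ \mathrm{sac}\}.
    \]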

  4. Generalized Nanosatellite Avionics Testbed Lab

    NASA Technical Reports Server (NTRS)

    Frost, Chad R.; Sorgenfrei, Matthew C.; Nehrenz, Matt

    2015-01-01

    The Generalized Nanosatellite Avionics Testbed (G-NAT) lab at NASA Ames Research Center provides a flexible, easily accessible platform for developing hardware and software for advanced small spacecraft. A collaboration between the Mission Design Division and the Intelligent Systems Division, the objective of the lab is to provide testing data and general test protocols for advanced sensors, actuators, and processors for CubeSat-class spacecraft. By developing test schemes for advanced components outside of the standard mission lifecycle, the lab is able to help reduce the risk carried by advanced nanosatellite or CubeSat missions. Such missions are often allocated very little time for testing, and too often the test facilities must be custom-built for the needs of the mission at hand. The G-NAT lab helps to eliminate these problems by providing an existing suite of testbeds that combines easily accessible, commercial-off-the-shelf (COTS) processors with a collection of existing sensors and actuators.

  5. High-contrast imaging testbed

    SciTech Connect

    Baker, K; Silva, D; Poyneer, L; Macintosh, B; Bauman, B; Palmer, D; Remington, T; Delgadillo-Lariz, M

    2008-01-23

    Several high-contrast imaging systems are currently under construction to enable the detection of extra-solar planets. In order for these systems to achieve their objectives, however, there is considerable developmental work and testing which must take place. Given the need to perform these tests, a spatially-filtered Shack-Hartmann adaptive optics system has been assembled to evaluate new algorithms and hardware configurations which will be implemented in these future high-contrast imaging systems. In this article, construction and phase measurements of a membrane 'woofer' mirror are presented. In addition, results from closed-loop operation of the assembled testbed with static phase plates are presented. The testbed is currently being upgraded to enable operation at speeds approaching 500 Hz and to enable studies of the interactions between the woofer and tweeter deformable mirrors.

  6. Fading testbed for free-space optical communications

    NASA Astrophysics Data System (ADS)

    Shrestha, Amita; Giggenbach, Dirk; Mustafa, Ahmad; Pacheco-Labrador, Jorge; Ramirez, Julio; Rein, Fabian

    2016-10-01

    Free-space optical (FSO) communication is a very attractive technology offering very high throughput without spectral regulation constraints, yet allowing small antennas (telescopes) and tap-proof communication. However, the transmitted signal has to travel through the atmosphere, where it is influenced by atmospheric turbulence, causing scintillation of the received signal. In addition, climatic effects such as fog, clouds, and rain also affect the signal significantly. Moreover, because FSO is a line-of-sight technology, it requires precise pointing and tracking of the telescopes; pointing errors otherwise cause additional fading. To achieve error-free transmission, various mitigation techniques such as aperture averaging, adaptive optics, transmitter diversity, and sophisticated coding and modulation schemes are being investigated and implemented. Evaluating the performance of such systems under controlled conditions is very difficult in field trials since the atmospheric situation constantly changes, and the target scenario (e.g. on aircraft or satellites) is not easily accessible for test purposes. Therefore, with the motivation to be able to test and verify a system under laboratory conditions, DLR has developed a fading testbed that can emulate most realistic channel conditions. The main principle of the fading testbed is to control the input current of a variable optical attenuator such that it attenuates the incoming signal according to the loaded power vector. The sampling frequency and mean power of the vector can be optionally changed according to requirements. This paper provides a brief introduction to the software and hardware development of the fading testbed and measurement results showing its accuracy and application scenarios.
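
    As an illustration of the replay principle described above, the following minimal Python sketch rescales a stored power vector to a target mean and streams it to a variable optical attenuator at a chosen sampling frequency. The attenuator driver callback and the toy scintillation vector are assumptions for illustration; they do not represent DLR's actual hardware interface.

      import time
      import numpy as np

      def replay_fading(power_vector_db, mean_power_db, f_sample_hz, set_attenuation_db):
          """Replay a channel power vector through a variable optical attenuator.

          power_vector_db    : received-power samples [dB]
          mean_power_db      : desired mean received power after rescaling [dB]
          f_sample_hz        : playback sampling frequency [Hz]
          set_attenuation_db : callback driving the attenuator (assumed interface)
          """
          vec = np.asarray(power_vector_db, dtype=float)
          vec = vec - vec.mean() + mean_power_db   # shift vector to the target mean
          atten = vec.max() - vec                  # attenuation relative to the peak
          period = 1.0 / f_sample_hz
          for a in atten:
              set_attenuation_db(a)                # drive the hardware (stub here)
              time.sleep(period)                   # crude pacing; real systems use HW timing

      # Usage with a stub driver and a toy scintillation vector:
      rng = np.random.default_rng(0)
      fading = 10 * np.log10(rng.gamma(4.0, 0.25, size=200))
      replay_fading(fading, mean_power_db=-30.0, f_sample_hz=100.0,
                    set_attenuation_db=lambda a: None)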

  7. Software Defined Networking challenges and future direction: A case study of implementing SDN features on OpenStack private cloud

    NASA Astrophysics Data System (ADS)

    Shamugam, Veeramani; Murray, I.; Leong, J. A.; Sidhu, Amandeep S.

    2016-03-01

    Cloud computing provides services on demand instantly, such as access to network infrastructure consisting of computing hardware, operating systems, network storage, database and applications. Network usage and demands are growing at a very fast rate, and to meet the current requirements, there is a need for automatic infrastructure scaling. Traditional networks are difficult to automate because the distributed decision making for switching and routing is collocated on the same device. Managing complex environments using traditional networks is time-consuming and expensive, especially in the case of generating virtual machines, migration and network configuration. To mitigate these challenges, network operations require efficient, flexible, agile and scalable software defined networks (SDN). This paper discusses various issues in SDN and suggests how to mitigate network management related issues. A private cloud prototype test bed was set up to implement SDN on the OpenStack platform and to test and evaluate network performance under various configurations.
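
    For readers unfamiliar with programmatic network control on OpenStack, the sketch below uses the openstacksdk library to create a tenant network and subnet, the kind of operation such a test bed automates. The cloud name "testbed" and the addressing are assumptions that presuppose a configured clouds.yaml; this is an illustrative sketch, not the configuration used in the paper.

      import openstack

      # Connect using credentials from a clouds.yaml entry named "testbed" (assumed)
      conn = openstack.connect(cloud="testbed")

      # Create a tenant network and an IPv4 subnet on it
      network = conn.network.create_network(name="sdn-demo-net")
      subnet = conn.network.create_subnet(
          network_id=network.id,
          name="sdn-demo-subnet",
          ip_version=4,
          cidr="192.168.50.0/24",
      )
      print("created", network.name, subnet.cidr)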

  8. Advanced Wavefront Sensing and Control Testbed (AWCT)

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Basinger, Scott A.; Diaz, Rosemary T.; Gappinger, Robert O.; Tang, Hong; Lam, Raymond K.; Sidick, Erkin; Hein, Randall C.; Rud, Mayer; Troy, Mitchell

    2010-01-01

    The Advanced Wavefront Sensing and Control Testbed (AWCT) is built as a versatile facility for developing and demonstrating, in hardware, future technologies of wavefront sensing and control algorithms for active optical systems. The testbed includes a source projector for a broadband point-source and a suite of extended scene targets, a dispersed fringe sensor, a Shack-Hartmann camera, and an imaging camera capable of phase retrieval wavefront sensing. The testbed also provides two easily accessible conjugated pupil planes which can accommodate active optical devices such as fast-steering mirrors, deformable mirrors, and segmented mirrors. In this paper, we describe the testbed optical design, testbed configurations and capabilities, as well as initial results from testbed hardware integration and tests.

  9. The NASA/OAST telerobot testbed architecture

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.; Zimmerman, W. F.; Dolinsky, S.

    1989-01-01

    Through a phased development such as a laboratory-based research testbed, the NASA/OAST Telerobot Testbed provides an environment for system test and demonstration of the technology which will usefully complement, significantly enhance, or even replace manned space activities. By integrating advanced sensing, robotic manipulation and intelligent control under human-interactive supervision, the Testbed will ultimately demonstrate execution of a variety of generic tasks suggestive of space assembly, maintenance, repair, and telescience. The Testbed system features a hierarchical layered control structure compatible with the incorporation of evolving technologies as they become available. The Testbed system is physically implemented in a computing architecture which allows for ease of integration of these technologies while preserving the flexibility for test of a variety of man-machine modes. The development currently in progress on the functional and implementation architectures of the NASA/OAST Testbed is presented, along with the capabilities planned for the coming years.

  10. The computational structural mechanics testbed procedures manual

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1991-01-01

    The purpose of this manual is to document the standard high level command language procedures of the Computational Structural Mechanics (CSM) Testbed software system. A description of each procedure including its function, commands, data interface, and use is presented. This manual is designed to assist users in defining and using command procedures to perform structural analysis; companion documents are the CSM Testbed User's Manual and the CSM Testbed Data Library Description.

  11. Exploiting Open Environmental Data using Linked Data and Cloud Computing: the MELODIES project

    NASA Astrophysics Data System (ADS)

    Blower, Jon; Gonçalves, Pedro; Caumont, Hervé; Koubarakis, Manolis; Perkins, Bethan

    2015-04-01

    The European Open Data Strategy establishes important new principles that ensure that European public sector data will be released at no cost (or marginal cost), in machine-readable, commonly-understood formats, and with liberal licences enabling wide reuse. These data encompass both scientific data about the environment (from Earth Observation and other fields) and other public sector information, including diverse topics such as demographics, health and crime. Many open geospatial datasets (e.g. land use) are already available through the INSPIRE directive and made available through infrastructures such as the Global Earth Observation System of Systems (GEOSS). The intention of the Open Data Strategy is to stimulate the growth of research and value-adding services that build upon these data streams; however, the potential value inherent in open data, and the benefits that can be gained by combining previously-disparate sources of information, are only just starting to become understood. The MELODIES project (Maximising the Exploitation of Linked Open Data In Enterprise and Science) is developing eight innovative and sustainable services, based upon Open Data, for users in research, government, industry and the general public in a broad range of societal and environmental benefit areas. MELODIES (http://melodiesproject.eu) is a European FP7 project that is coordinated by the University of Reading and has sixteen partners (including nine SMEs) from eight European countries. It started in November 2013 and will run for three years. The project is therefore in its early stages, and we will value the opportunity that this workshop affords to present our plans and interact with the wider Linked Geospatial Data community. The project is developing eight new services[1] covering a range of domains including agriculture, urban ecosystems, land use management, marine information, desertification, crisis management and hydrology. These services will combine Earth

  12. NASA's telemedicine testbeds: Commercial benefit

    NASA Astrophysics Data System (ADS)

    Doarn, Charles R.; Whitten, Raymond

    1998-01-01

    The National Aeronautics and Space Administration (NASA) has been developing and applying telemedicine to support space flight since the Agency's beginning. Telemetry of physiological parameters from spacecraft to ground controllers is critical to assess the health status of humans in extreme and remote environments. Requisite systems to support medical care and maintain readiness will evolve as mission duration and complexity increase. Developing appropriate protocols and procedures to support multinational, multicultural missions is a key objective of this activity. NASA has created an Agency-wide strategic plan that focuses on the development and integration of technology into the health care delivery systems for space flight to meet these challenges. In order to evaluate technology and systems that can enhance inflight medical care and medical education, NASA has established and conducted several testbeds. Additionally, in June of 1997, NASA established a Commercial Space Center (CSC) for Medical Informatics and Technology Applications at Yale University School of Medicine. These testbeds and the CSC foster the leveraging of technology and resources between government, academia and industry to enhance health care. This commercial endeavor will influence both the delivery of health care in space and on the ground. To date, NASA's activities in telemedicine have provided new ideas in the application of telecommunications and information systems to health care. NASA's Spacebridge to Russia, an Internet-based telemedicine testbed, is one example of how telemedicine and medical education can be conducted using the Internet and its associated tools. Other NASA activities, including the development of a portable telemedicine workstation, which has been demonstrated on the Crow Indian Reservation and in the Texas Prison System, show promise in serving as significant adjuncts to the delivery of health care. As NASA continues to meet the challenges of space flight, the

  13. A Variable Dynamic Testbed Vehicle

    NASA Technical Reports Server (NTRS)

    Marriott, A.

    1995-01-01

    This paper describes the concept of a potential test vehicle for the National Highway Traffic Safety Administration (NHTSA) that is designed to evaluate the dynamics, human factors, and safety aspects of advanced technologies in passenger class automobiles expected to be introduced as a result of the Intelligent Vehicle/Highway System (IVHS) Program. The Variable Dynamic Testbed Vehicle (VDTV) requirements were determined from the inputs of anticipated users and possible research needs of NHTSA. Design and implementation approaches are described, the benefits of the vehicle are discussed and costs for several options presented.

  14. Open Science in the Cloud: Towards a Universal Platform for Scientific and Statistical Computing

    NASA Astrophysics Data System (ADS)

    Chine, Karim

    The UK, through the e-Science program, the US through the NSF-funded cyber infrastructure and the European Union through the ICT Calls aimed to provide "the technological solution to the problem of efficiently connecting data, computers, and people with the goal of enabling derivation of novel scientific theories and knowledge". The Grid (Foster, 2002; Foster; Kesselman, Nick, & Tuecke, 2002), foreseen as a major accelerator of discovery, didn't meet the expectations it had excited at its beginnings and was not adopted by the broad population of research professionals. The Grid is a good tool for particle physicists and it has allowed them to tackle the tremendous computational challenges inherent to their field. However, as a technology and paradigm for delivering computing on demand, it doesn't work and it can't be fixed. On one hand, "the abstractions that Grids expose - to the end-user, to the deployers and to application developers - are inappropriate and they need to be higher level" (Jha, Merzky, & Fox), and on the other hand, academic Grids are inherently economically unsustainable. They can't compete with a service outsourced to the Industry whose quality and price would be driven by market forces. The virtualization technologies and their corollary, the Infrastructure-as-a-Service (IaaS) style cloud, hold the promise to enable what the Grid failed to deliver: a sustainable environment for computational sciences that would lower the barriers for accessing federated computational resources, software tools and data; enable collaboration and resources sharing and provide the building blocks of a ubiquitous platform for traceable and reproducible computational research.

  15. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    SciTech Connect

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; Qian, Yun; Zhang, Kai; Wang, Yuhang; Yang, Xiu-Qun

    2016-11-29

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m-2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m-2), while fire POM induces a small effect (-0.05 and 0.04 ± 0.01 W m-2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is -0.70 ± 0.05 W m-2, resulting mainly from the fire POM effect (-0.59 ± 0.03 W m-2). REari (0.43 ± 0.03 W m-2) and REaci (-1.38 ± 0.23 W m-2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and -0.82 ± 0.09 W m-2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to -15 W m-2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m-2) is small

  16. Impacts of global open-fire aerosols on direct radiative, cloud and surface-albedo effects simulated with CAM5

    DOE PAGES

    Jiang, Yiquan; Lu, Zheng; Liu, Xiaohong; ...

    2016-11-29

    Aerosols from open-land fires could significantly perturb the global radiation balance and induce climate change. In this study, Community Atmosphere Model version 5 (CAM5) with prescribed daily fire aerosol emissions is used to investigate the spatial and seasonal characteristics of radiative effects (REs, relative to the case of no fires) of open-fire aerosols including black carbon (BC) and particulate organic matter (POM) from 2003 to 2011. The global annual mean RE from aerosol–radiation interactions (REari) of all fire aerosols is 0.16 ± 0.01 W m−2 (1σ uncertainty), mainly due to the absorption of fire BC (0.25 ± 0.01 W m−2), while fire POM induces a small effect (−0.05 and 0.04 ± 0.01 W m−2 based on two different methods). Strong positive REari is found in the Arctic and in the oceanic regions west of southern Africa and South America as a result of amplified absorption of fire BC above low-level clouds, in general agreement with satellite observations. The global annual mean RE due to aerosol–cloud interactions (REaci) of all fire aerosols is −0.70 ± 0.05 W m−2, resulting mainly from the fire POM effect (−0.59 ± 0.03 W m−2). REari (0.43 ± 0.03 W m−2) and REaci (−1.38 ± 0.23 W m−2) in the Arctic are stronger than in the tropics (0.17 ± 0.02 and −0.82 ± 0.09 W m−2 for REari and REaci), although the fire aerosol burden is higher in the tropics. The large cloud liquid water path over land areas and low solar zenith angle of the Arctic favor the strong fire aerosol REaci (up to −15 W m−2) during the Arctic summer. Significant surface cooling, precipitation reduction and increasing amounts of low-level cloud are also found in the Arctic summer as a result of the fire aerosol REaci based on the atmosphere-only simulations. The global annual mean RE due to surface-albedo changes (REsac) over land areas (0.03 ± 0.10 W m−2) is small
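
    The radiative effects quoted in this abstract are diagnosed as differences between paired simulations, with and without fire emissions. A minimal numpy sketch of the area-weighted global mean behind such numbers follows; the variable names are illustrative and are not CAM5 output conventions.

      import numpy as np

      def global_mean_re(flux_fire, flux_nofire, lat_deg):
          """Area-weighted global mean radiative effect [W m-2].

          flux_fire, flux_nofire : TOA flux fields, shape (nlat, nlon)
          lat_deg                : latitude centers in degrees, shape (nlat,)
          """
          re = flux_fire - flux_nofire                   # RE relative to no fires
          w = np.cos(np.deg2rad(lat_deg))                # area weight ~ cos(latitude)
          return np.average(re.mean(axis=1), weights=w)  # zonal mean, then weighted mean

      # Toy check with synthetic fields differing by a uniform 0.16 W m-2:
      lat = np.linspace(-89.5, 89.5, 180)
      base = np.random.default_rng(1).normal(240.0, 2.0, (180, 360))
      print(round(global_mean_re(base + 0.16, base, lat), 3))   # recovers ~0.16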

  17. Control design for the SERC experimental testbeds

    NASA Technical Reports Server (NTRS)

    Jacques, Robert; Blackwood, Gary; Macmartin, Douglas G.; How, Jonathan; Anderson, Eric

    1992-01-01

    Viewgraphs on control design for the Space Engineering Research Center experimental testbeds are presented. Topics covered include: SISO control design and results; sensor and actuator location; model identification; control design; experimental results; preliminary LAC experimental results; active vibration isolation problem statement; base flexibility coupling into isolation feedback loop; cantilever beam testbed; and closed loop results.

  18. THE HERSCHEL INVENTORY OF THE AGENTS OF GALAXY EVOLUTION IN THE MAGELLANIC CLOUDS, A HERSCHEL OPEN TIME KEY PROGRAM

    SciTech Connect

    Meixner, M.; Roman-Duval, J.; Seale, J.; Gordon, K.; Beck, T.; Boyer, M. L.; Panuzzo, P.; Hony, S.; Sauvage, M.; Okumura, K.; Chanial, P.; Babler, B.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Carlson, L. R.; Clayton, G. C.; and others

    2013-09-15

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 µm with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 µm band is the most sensitive and the source catalogs for this band have ~25,000 objects for the LMC and ~5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  19. The HERschel Inventory of the Agents of Galaxy Evolution in the Magellanic Clouds, a HERschel Open Time Key Program

    NASA Technical Reports Server (NTRS)

    Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.; Misselt, K.; Okumura, K.; Chanial, P.; Beck, T.; Bernard, J.-P.; Bolatto, A.; Bot, C.; Boyer, M. L.; Carlson, L. R.; Clayton, G. C.; Chen, C.-H. R.; Cormier, D.; Fukui, Y.; Galametz, M.; Galliano, F.

    2013-01-01

    We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and overall the large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support

  20. Formation Algorithms and Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Wette, Matthew; Sohl, Garett; Scharf, Daniel; Benowitz, Edward

    2004-01-01

    Formation flying for spacecraft is a rapidly developing field that will enable a new era of space science. For one of its missions, the Terrestrial Planet Finder (TPF) project has selected a formation flying interferometer design to detect earth-like planets orbiting distant stars. In order to advance technology needed for the TPF formation flying interferometer, the TPF project has been developing a distributed real-time testbed to demonstrate end-to-end operation of formation flying with TPF-like functionality and precision. This is the Formation Algorithms and Simulation Testbed (FAST). FAST was conceived to bring out issues in timing, data fusion, inter-spacecraft communication, inter-spacecraft sensing and system-wide formation robustness. In this paper we describe the FAST and show results from a two-spacecraft formation scenario. The two-spacecraft simulation is the first time that precision end-to-end formation flying operation has been demonstrated in a distributed real-time simulation environment.

  1. CRYOTE (Cryogenic Orbital Testbed) Concept

    NASA Technical Reports Server (NTRS)

    Gravlee, Mari; Kutter, Bernard; Wollen, Mark; Rhys, Noah; Walls, Laurie

    2009-01-01

    Demonstrating cryo-fluid management (CFM) technologies in space is critical for advances in long duration space missions. Current space-based cryogenic propulsion is viable for hours, not the weeks to years needed by space exploration and space science. CRYogenic Orbital TEstbed (CRYOTE) provides an affordable low-risk environment to demonstrate a broad array of critical CFM technologies that cannot be tested in Earth's gravity. These technologies include system chilldown, transfer, handling, health management, mixing, pressure control, active cooling, and long-term storage. United Launch Alliance is partnering with Innovative Engineering Solutions, the National Aeronautics and Space Administration, and others to develop CRYOTE to fly as an auxiliary payload between the primary payload and the Centaur upper stage on an Atlas V rocket. Because satellites are expensive, the space industry is largely risk averse to incorporating unproven systems or conducting experiments using flight hardware that is supporting a primary mission. To minimize launch risk, the CRYOTE system will only activate after the primary payload is separated from the rocket. Flying the testbed as an auxiliary payload utilizes Evolved Expendable Launch Vehicle performance excess to cost-effectively demonstrate enhanced CFM.

  2. Cloud Computing

    DTIC Science & Technology

    2009-11-12

    Eucalyptus Systems
    • Provides an open-source application that can be used to implement a cloud computing environment on a datacenter
    • Trying to establish an...
    Summary: Cloud Computing is in essence an economic model
    • It is a different way to acquire and manage IT resources
    There are multiple cloud providers...edgeplatform.html
    • Amazon Elastic Compute Cloud (EC2): http://aws.amazon.com/ec2/
    • Amazon Simple Storage Solution (S3): http://aws.amazon.com/s3/
    • Eucalyptus

  3. Inferring spatial clouds statistics from limited field-of-view, zenith observations

    SciTech Connect

    Sun, C.H.; Thorne, L.R.

    1996-04-01

    Many of the Cloud and Radiation Testbed (CART) measurements produce a time series of zenith observations, but spatial averages are often the desired data product. One possible approach to deriving spatial averages from temporal averages is to invoke Taylor's hypothesis where and when it is valid. Taylor's hypothesis states that when the turbulence is small compared with the mean flow, the covariance in time is related to the covariance in space by the speed of the mean flow. For cloud fields, Taylor's hypothesis would apply when the 'local' turbulence is small compared with advective flow (mean wind). The objective of this study is to determine under what conditions Taylor's hypothesis holds or does not hold true for broken cloud fields.
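
    Under Taylor's hypothesis a temporal lag tau in the zenith time series maps to a spatial lag x = U * tau, where U is the mean advection speed. The sketch below converts the temporal autocovariance of a zenith series into a spatial autocovariance on that assumption; it is an illustration of the mapping, not a CART processing code.

      import numpy as np

      def spatial_autocov(series, dt_s, wind_speed_ms, max_lag=100):
          """Map temporal autocovariance to spatial lags via Taylor's hypothesis.

          series        : zenith time series (e.g., cloud flag or LWP), 1-D
          dt_s          : sampling interval [s]
          wind_speed_ms : mean advection speed U [m/s]
          """
          s = np.asarray(series, float) - np.mean(series)
          n = len(s)
          lags = np.arange(min(max_lag, n - 1))
          acov = np.array([np.mean(s[: n - k] * s[k:]) for k in lags])
          x = lags * dt_s * wind_speed_ms   # x = U * tau (frozen-turbulence mapping)
          return x, acov

      # Example: a 1 Hz synthetic cloud-flag series advected at 10 m/s
      flags = (np.random.default_rng(4).random(3600) < 0.4).astype(float)
      x_m, c = spatial_autocov(flags, dt_s=1.0, wind_speed_ms=10.0)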

  4. Internet-Based Software Tools for Analysis and Processing of LIDAR Point Cloud Data via the OpenTopography Portal

    NASA Astrophysics Data System (ADS)

    Nandigam, V.; Crosby, C. J.; Baru, C.; Arrowsmith, R.

    2009-12-01

    LIDAR is an excellent example of the new generation of powerful remote sensing data now available to Earth science researchers. Capable of producing digital elevation models (DEMs) at more than an order of magnitude higher resolution than those previously available, LIDAR data allows Earth scientists to study the processes that contribute to landscape evolution at resolutions not previously possible, yet essential for their appropriate representation. Along with these high-resolution datasets comes an increase in the volume and complexity of data that the user must efficiently manage and process in order for it to be scientifically useful. Although there are expensive commercial LIDAR software applications available, processing and analysis of these datasets are typically computationally inefficient on the conventional hardware and software that is currently available to most of the Earth science community. We have designed and implemented an Internet-based system, the OpenTopography Portal, that provides integrated access to high-resolution LIDAR data as well as web-based tools for processing of these datasets. By using remote data storage and high performance compute resources, the OpenTopography Portal attempts to simplify data access and standard LIDAR processing tasks for the Earth Science community. The OpenTopography Portal allows users to access massive amounts of raw point cloud LIDAR data as well as a suite of DEM generation tools to enable users to generate custom digital elevation models to best fit their science applications. The Cyberinfrastructure software tools for processing the data are freely available via the portal and conveniently integrated with the data selection in a single user-friendly interface. The ability to run these tools on powerful Cyberinfrastructure resources instead of in their own labs provides a huge advantage in terms of performance and compute power. The system also encourages users to explore data processing methods and the
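
    As a schematic of the simplest kind of DEM generation described here (local binning of point elevations onto a regular grid), consider the sketch below. Real OpenTopography processing offers more sophisticated gridding, so this is purely illustrative.

      import numpy as np

      def bin_points_to_dem(x, y, z, cell=1.0):
          """Grid LIDAR returns into a DEM by mean elevation per square cell."""
          ix = ((x - x.min()) / cell).astype(int)
          iy = ((y - y.min()) / cell).astype(int)
          dem_sum = np.zeros((iy.max() + 1, ix.max() + 1))
          dem_cnt = np.zeros_like(dem_sum)
          np.add.at(dem_sum, (iy, ix), z)     # accumulate elevations per cell
          np.add.at(dem_cnt, (iy, ix), 1.0)   # count points per cell
          with np.errstate(invalid="ignore"):
              return dem_sum / dem_cnt        # NaN where a cell holds no points

      # Toy usage with 10,000 random points on a tilted plane:
      rng = np.random.default_rng(5)
      px, py = rng.uniform(0, 100, (2, 10000))
      dem = bin_points_to_dem(px, py, z=0.05 * px + rng.normal(0, 0.1, 10000))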

  5. BluePyOpt: Leveraging Open Source Software and Cloud Infrastructure to Optimise Model Parameters in Neuroscience

    PubMed Central

    Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry

    2016-01-01

    At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
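
    To make the optimisation idea concrete without reproducing the BluePyOpt API itself, the following self-contained sketch implements a toy (mu + lambda) evolution strategy for fitting model parameters. It illustrates the class of stochastic optimisers the package wraps and standardizes; it is not BluePyOpt code.

      import random

      def evolve(fitness, bounds, mu=10, lam=40, ngen=50, sigma=0.1, seed=0):
          """Minimal (mu + lambda) evolution strategy over box-bounded parameters."""
          rng = random.Random(seed)
          span = [hi - lo for lo, hi in bounds]
          pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(mu)]
          for _ in range(ngen):
              children = []
              for _ in range(lam):
                  parent = rng.choice(pop)
                  children.append([min(max(p + rng.gauss(0, sigma * s), lo), hi)
                                   for p, s, (lo, hi) in zip(parent, span, bounds)])
              pop = sorted(pop + children, key=fitness)[:mu]   # keep the mu best
          return pop[0]

      # Toy use: recover (a, b) of a line a*t + b from noiseless targets
      fit = lambda p: sum((p[0] * t + p[1] - (2.0 * t - 1.0)) ** 2 for t in range(10))
      print(evolve(fit, bounds=[(-5, 5), (-5, 5)]))    # approaches [2.0, -1.0]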

  6. Testbed for an autonomous system

    NASA Technical Reports Server (NTRS)

    Dikshit, Piyush; Guimaraes, Katia; Ramamurthy, Maya; Agrawala, Ashok K.; Larsen, Ronald L.

    1989-01-01

    In previous works we have defined a general architectural model for autonomous systems, which can easily be mapped to describe the functions of any automated system (SDAG-86-01), and we illustrated that model by applying it to the thermal management system of a space station (SDAG-87-01). In this note, we will further develop that application and design the details of the implementation of such a model. First we present the environment of our application by describing the thermal management problem and an abstraction, which was called TESTBED, that includes a specific function for each module in the architecture, and the nature of the interfaces between each pair of blocks.

  7. A Space Testbed for Photovoltaics

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.; Bailey, Sheila G.

    1998-01-01

    The Ohio Aerospace Institute and the NASA Lewis Research Center are designing and building a solar-cell calibration facility, the Photovoltaic Engineering Testbed (PET) to fly on the International Space Station to test advanced solar cell types in the space environment. A wide variety of advanced solar cell types have become available in the last decade. Some of these solar cells offer more than twice the power per unit area of the silicon cells used for the space station power system. They also offer the possibilities of lower cost, lighter weight, and longer lifetime. The purpose of the PET facility is to reduce the cost of validating new technologies and bringing them to spaceflight readiness. The facility will be used for three primary functions: calibration, measurement, and qualification. It is scheduled to be launched in June of 2002.

  8. The SMART-NAS Testbed

    NASA Technical Reports Server (NTRS)

    Aquilina, Rudolph A.

    2015-01-01

    The SMART-NAS Testbed for Safe Trajectory Based Operations Project will deliver an evaluation capability, critical to the ATM community, allowing full NextGen and beyond-NextGen concepts to be assessed and developed. To meet this objective a strong focus will be placed on concept integration and validation to enable a gate-to-gate trajectory-based system capability that satisfies a full vision for NextGen. The SMART-NAS for Safe TBO Project consists of six sub-projects. Three of the sub-projects are focused on exploring and developing technologies, concepts and models for evolving and transforming air traffic management operations in the ATM+2 time horizon, while the remaining three sub-projects are focused on developing the tools and capabilities needed for testing these advanced concepts. Function Allocation, Networked Air Traffic Management and Trajectory Based Operations are developing concepts and models. SMART-NAS Test-bed, System Assurance Technologies and Real-time Safety Modeling are developing the tools and capabilities to test these concepts. Simulation and modeling capabilities will include the ability to assess multiple operational scenarios of the national airspace system, accept data feeds, allowing shadowing of actual operations in either real-time, fast-time and/or hybrid modes of operations in distributed environments, and enable integrated examinations of concepts, algorithms, technologies, and NAS architectures. An important focus within this project is to enable the development of a real-time, system-wide safety assurance system. The basis of such a system is a continuum of information acquisition, analysis, and assessment that enables awareness and corrective action to detect and mitigate potential threats to continuous system-wide safety at all levels. This process, which currently can only be done post operations, will be driven towards "real-time" assessments in the 2035 time frame.

  9. Experiments Program for NASA's Space Communications Testbed

    NASA Technical Reports Server (NTRS)

    Chelmins, David; Reinhart, Richard

    2012-01-01

    NASA developed a testbed for communications and navigation that was launched to the International Space Station in 2012. The testbed promotes new software defined radio (SDR) technologies and addresses associated operational concepts for space-based SDRs, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. The experiments program consists of a mix of in-house and external experiments from partners in industry, academia, and government. The experiments will investigate key challenges in communications, networking, and global positioning system navigation both on the ground and on orbit. This presentation will discuss some of the key opportunities and challenges for the testbed experiments program.

  10. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve a continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  11. Testbed for Tactical Networking and Collaboration

    DTIC Science & Technology

    2010-01-01

    ...interface, self-aligning directional antennas Hyper-Nodes with 8th Layer (Bordetsky & Hayes-Roth, 2007) Extending tactical self-forming...the "flattened" infrastructure of committee, team, and group team working clusters, as depicted in Figure 18.

  12. Testbed for Satellite and Terrestrial Interoperability (TSTI)

    NASA Technical Reports Server (NTRS)

    Gary, J. Patrick

    1998-01-01

    Various issues associated with the "Testbed for Satellite and Terrestrial Interoperability (TSTI)" are presented in viewgraph form. Specific topics include: 1) General and specific scientific technical objectives; 2) ACTS experiment No. 118: 622 Mbps network tests between ATDNet and MAGIC via ACTS; 3) ATDNet SONET/ATM gigabit network; 4) Testbed infrastructure, collaborations and end sites in TSTI based evaluations; 5) the Trans-Pacific digital library experiment; and 6) ESDCD on-going network projects.

  13. A satellite observation test bed for cloud parameterization development

    NASA Astrophysics Data System (ADS)

    Lebsock, M. D.; Suselj, K.

    2015-12-01

    We present an observational test-bed of cloud and precipitation properties derived from CloudSat, CALIPSO, and the A-Train. The focus of the test-bed is on marine boundary layer clouds including stratocumulus and cumulus and the transition between these cloud regimes. Test-bed properties include the cloud cover and three-dimensional cloud fraction along with the cloud water path and precipitation water content, and associated radiative fluxes. We also include the subgrid scale distribution of cloud and precipitation, and radiative quantities, which must be diagnosed by a model parameterization. The test-bed further includes meteorological variables from the Modern-Era Retrospective analysis for Research and Applications (MERRA). MERRA variables provide the initialization and forcing datasets to run a parameterization in Single Column Model (SCM) mode. We show comparisons of an Eddy-Diffusivity/Mass-Flux (EDMF) parameterization coupled to microphysics and macrophysics packages run in SCM mode with observed clouds. Comparisons are performed regionally in areas of climatological subsidence as well as stratified by dynamical and thermodynamical variables. Comparisons demonstrate the ability of the EDMF model to capture the observed transitions between subtropical stratocumulus and cumulus cloud regimes.
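
    A sketch of the regime stratification described above, compositing an observed cloud property in bins of a large-scale control variable; the arrays, the stability proxy, and the bin edges are hypothetical stand-ins for the test-bed's actual variables.

      import numpy as np

      def composite_by_regime(cloud_frac, stability, edges):
          """Mean cloud fraction within bins of a stability proxy (e.g., EIS [K])."""
          idx = np.digitize(stability, edges)
          return [float(cloud_frac[idx == k].mean()) if np.any(idx == k) else np.nan
                  for k in range(1, len(edges))]

      # Hypothetical samples: cloud fraction rising with inversion strength
      rng = np.random.default_rng(2)
      eis = rng.uniform(-2.0, 12.0, 5000)
      cf = np.clip(0.1 + 0.06 * eis + rng.normal(0, 0.1, 5000), 0.0, 1.0)
      print(composite_by_regime(cf, eis, edges=[0, 3, 6, 9, 12]))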

  14. GATECloud.net: a platform for large-scale, open-source text processing on the cloud.

    PubMed

    Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina

    2013-01-28

    Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.

  15. Observed microphysical changes in Arctic mixed-phase clouds when transitioning from sea ice to open ocean

    NASA Astrophysics Data System (ADS)

    Young, Gillian; Jones, Hazel M.; Choularton, Thomas W.; Crosier, Jonathan; Bower, Keith N.; Gallagher, Martin W.; Davies, Rhiannon S.; Renfrew, Ian A.; Elvidge, Andrew D.; Darbyshire, Eoghan; Marenco, Franco; Brown, Philip R. A.; Ricketts, Hugo M. A.; Connolly, Paul J.; Lloyd, Gary; Williams, Paul I.; Allan, James D.; Taylor, Jonathan W.; Liu, Dantong; Flynn, Michael J.

    2016-11-01

    In situ airborne observations of cloud microphysics, aerosol properties, and thermodynamic structure over the transition from sea ice to ocean are presented from the Aerosol-Cloud Coupling And Climate Interactions in the Arctic (ACCACIA) campaign. A case study from 23 March 2013 provides a unique view of the cloud microphysical changes over this transition under cold-air outbreak conditions. Cloud base lifted and cloud depth increased over the transition from sea ice to ocean. Mean droplet number concentrations, Ndrop, also increased from 110 ± 36 cm-3 over the sea ice to 145 ± 54 cm-3 over the marginal ice zone (MIZ). Downstream over the ocean, Ndrop decreased to 63 ± 30 cm-3. This reduction was attributed to enhanced collision-coalescence of droplets within the deep ocean cloud layer. The liquid water content increased almost fourfold over the transition and this, in conjunction with the deeper cloud layer, allowed rimed snowflakes to develop and precipitate out of cloud base downstream over the ocean. The ice properties of the cloud remained approximately constant over the transition. Observed ice crystal number concentrations averaged approximately 0.5-1.5 L-1, suggesting only primary ice nucleation was active; however, there was evidence of crystal fragmentation at cloud base over the ocean. Little variation in aerosol particle number concentrations was observed between the different surface conditions; however, some variability with altitude was observed, with notably greater concentrations measured at higher altitudes (>800 m) over the sea ice. Near-surface boundary layer temperatures increased by 13 °C from sea ice to ocean, with corresponding increases in surface heat fluxes and turbulent kinetic energy. These significant thermodynamic changes were concluded to be the primary driver of the microphysical evolution of the cloud. This study represents the first investigation, using in situ airborne observations, of cloud microphysical changes with

  16. The CMS integration grid testbed

    SciTech Connect

    Graham, Gregory E.

    2004-08-26

    The CMS Integration Grid Testbed (IGT) comprises USCMS Tier-1 and Tier-2 hardware at the following sites: the California Institute of Technology, Fermi National Accelerator Laboratory, the University of California at San Diego, and the University of Florida at Gainesville. The IGT runs jobs using the Globus Toolkit with a DAGMan and Condor-G front end. The virtual organization (VO) is managed using VO management scripts from the European Data Grid (EDG). Gridwide monitoring is accomplished using local tools such as Ganglia interfaced into the Globus Metadata Directory Service (MDS) and the agent-based MonALISA. Domain specific software is packaged and installed using the Distribution After Release (DAR) tool of CMS, while middleware under the auspices of the Virtual Data Toolkit (VDT) is distributed using Pacman. During a continuous two month span in Fall of 2002, over 1 million official CMS GEANT based Monte Carlo events were generated and returned to CERN for analysis while being demonstrated at SC2002. In this paper, we describe the process that led to one of the world's first continuously available, functioning grids.
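
    For orientation, a grid-universe job of the kind Condor-G managed can be described with the modern HTCondor Python bindings as in the hedged sketch below. The gatekeeper host, executable, and file names are placeholders, and the IGT itself drove submissions through DAGMan and condor_submit rather than this API.

      import htcondor

      # Submit description for a Condor-G style grid-universe job (placeholders)
      sub = htcondor.Submit({
          "universe": "grid",
          "grid_resource": "gt2 gatekeeper.tier2.example.edu/jobmanager-condor",
          "executable": "run_cmsim.sh",
          "output": "job.out",
          "error": "job.err",
          "log": "job.log",
      })
      schedd = htcondor.Schedd()     # local scheduler daemon
      result = schedd.submit(sub)    # returns a SubmitResult with the cluster id
      print("submitted cluster", result.cluster())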

  17. Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Melnick, Gary; Tolls, Volker; Woodruff, Robert; Vasudevan, Gopal; Rizzo, Maxime; Thompson, Patrick

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a NASA Astrophysics Strategic Mission Concept study and a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC would provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres of planets around A and F stars, and observe the inner spatial structure and colors of Spitzer-selected debris disks. EPIC would be launched into a heliocentric, Earth-trailing drift-away orbit, with a 5-year mission lifetime. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables starlight suppression in broadband light from 480-960 nm. To demonstrate the VNC approach and advance its technology readiness, we have developed a laboratory VNC and have demonstrated white-light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  18. Contrasting sea-ice and open-water boundary layers during melt and freeze-up seasons: Some result from the Arctic Clouds in Summer Experiment.

    NASA Astrophysics Data System (ADS)

    Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan

    2016-04-01

    With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition, from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere and the freeze-up followed as the warm surface lost heat to the atmosphere. While the surface type had a pronounced impact on boundary

  19. First field demonstration of cloud datacenter workflow automation employing dynamic optical transport network resources under OpenStack and OpenFlow orchestration.

    PubMed

    Szyrkowiec, Thomas; Autenrieth, Achim; Gunning, Paul; Wright, Paul; Lord, Andrew; Elbers, Jörg-Peter; Lumb, Alan

    2014-02-10

    For the first time, we demonstrate the orchestration of elastic datacenter and inter-datacenter transport network resources using a combination of OpenStack and OpenFlow. Programmatic control allows a datacenter operator to dynamically request optical lightpaths from a transport network operator to accommodate rapid changes of inter-datacenter workflows.

  20. Thermal structure analyses for CSM testbed (COMET)

    NASA Technical Reports Server (NTRS)

    Xue, David Y.; Mei, Chuh

    1994-01-01

    This document is the final report for the project entitled 'Thermal Structure Analyses for CSM Testbed (COMET),' for the period of May 16, 1992 - August 15, 1994. The project was focused on the investigation and development of finite element analysis capability of the computational structural mechanics (CSM) testbed (COMET) software system in the field of thermal structural responses. The stages of this project consisted of investigating existing capabilities, developing new functions, demonstrating analyses, and pursuing research topics. The appendices of this report list the detailed documents of major accomplishments and demonstration runstreams for future reference.

  1. Design of testbed and emulation tools

    NASA Technical Reports Server (NTRS)

    Lundstrom, S. F.; Flynn, M. J.

    1986-01-01

    The research summarized was concerned with the design of testbed and emulation tools suitable to assist in projecting, with reasonable accuracy, the expected performance of highly concurrent computing systems on large, complete applications. Such testbed and emulation tools are intended for the eventual use of those exploring new concurrent system architectures and organizations, either as users or as designers of such systems. While a range of alternatives was considered, a software-based set of hierarchical tools was chosen to provide maximum flexibility, to ease moving to new computers as technology improves, and to take advantage of the inherent reliability and availability of commercially available computing systems.

  2. Towards a testbed for malicious code detection

    SciTech Connect

    Lo, R.; Kerchen, P.; Crawford, R.; Ho, W.; Crossley, J.; Fink, G.; Levitt, K.; Olsson, R.; Archer, M. . Div. of Computer Science)

    1991-01-01

    This paper proposes an environment for detecting many types of malicious code, including computer viruses, Trojan horses, and time/logic bombs. This malicious code testbed (MCT) is based upon both static and dynamic analysis tools developed at the University of California, Davis, which have been shown to be effective against certain types of malicious code. The testbed extends the usefulness of these tools by using them in a complementary fashion to detect more general cases of malicious code. Perhaps more importantly, the MCT allows administrators and security analysts to check a program before installation, thereby avoiding any damage a malicious program might inflict. 5 refs., 2 figs., 2 tabs.

  3. The design and implementation of the LLNL gigabit testbed

    SciTech Connect

    Garcia, D.

    1994-12-01

    This paper will look at the design and implementation of the LLNL Gigabit testbed (LGTB), where various high-speed networking products can be tested in one environment. The paper will discuss the philosophy behind the design of and the need for the testbed, the tests that are performed in the testbed, and the tools used to implement those tests.

  4. Observed and simulated temperature dependence of the liquid water path of low clouds

    SciTech Connect

    Del Genio, A.D.; Wolf, A.B.

    1996-04-01

    Data being acquired at the Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site can be used to examine the factors determining the temperature dependence of cloud optical thickness. We focus on cloud liquid water and physical thickness variations which can be derived from existing ARM measurements.

  5. Time-multiplexed open-path TDLAS spectrometer for dynamic, sampling-free, interstitial H2 18O and H2 16O vapor detection in ice clouds

    NASA Astrophysics Data System (ADS)

    Kühnreich, B.; Wagner, S.; Habig, J. C.; Möhler, O.; Saathoff, H.; Ebert, V.

    2015-04-01

    An advanced in situ diode laser hygrometer for simultaneous, sampling-free detection of interstitial H2 16O and H2 18O vapor was developed and tested in the Aerosol Interaction and Dynamics in the Atmosphere (AIDA) cloud chamber during dynamic cloud formation processes. The spectrometer to measure isotope-resolved water vapor concentrations comprises two rapidly time-multiplexed DFB lasers near 1.4 and 2.7 µm and an open-path White cell with 227-m absorption path length and 4-m mirror separation. A dynamic water concentration range from 2.6 ppb to 87 ppm for H2 16O and 87 ppt to 3.6 ppm for H2 18O could be achieved and was used to enable a fast and direct detection of dynamic isotope ratio changes during ice cloud formation in the AIDA chamber at temperatures between 190 and 230 K. Relative changes in the H2 18O/H2 16O isotope ratio of 1 % could be detected and resolved with a signal-to-noise ratio of 7. This converts to an isotope ratio resolution limit of 0.15 % at 1-s time resolution.
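
    Isotope-ratio changes of this kind are conventionally reported in delta notation, delta-18O = (R_sample / R_VSMOW - 1) x 1000 per mil, where R is the H2 18O / H2 16O ratio and R_VSMOW is the standard-water reference ratio. A short numeric sketch follows; the sample ratio used is illustrative, not a measured value from this work.

      R_VSMOW = 2005.2e-6   # 18O/16O ratio of the VSMOW reference water

      def delta18o_permil(n_h218o, n_h216o):
          """Delta-18O [per mil] from measured isotopologue concentrations."""
          return (n_h218o / n_h216o / R_VSMOW - 1.0) * 1000.0

      # An illustrative sample ratio of 1.90e-3 gives a depleted value near -52.5
      print(round(delta18o_permil(1.90e-3, 1.0), 1))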

  6. A Survey of Cyber Ranges and Testbeds

    DTIC Science & Technology

    2013-10-01

    ...level information such as packet-level data is produced. 4.2.5 OPNET-based: Other testbeds have used commercial simulation software as their basis... OPNET was used to generate probe and DoS attacks in an evaluation of a frequency-based IDS [54]. It has also been used to examine network

  7. Cognitive nonlinear radar test-bed

    NASA Astrophysics Data System (ADS)

    Hedden, Abigail S.; Wikner, David A.; Martone, Anthony; McNamara, David

    2013-05-01

    Providing situational awareness to the warfighter requires radar, communications, and other electronic systems that operate in increasingly cluttered and dynamic electromagnetic environments. There is a growing need for cognitive RF systems that are capable of monitoring, adapting to, and learning from their environments in order to maintain their effectiveness and functionality. Additionally, radar systems are needed that are capable of adapting to an increased number of targets of interest. Cognitive nonlinear radar may offer critical solutions to these growing problems. This work focuses on ongoing efforts at the U.S. Army Research Laboratory (ARL) to develop a cognitive nonlinear radar test-bed. ARL is working toward developing a test-bed that uses spectrum sensing to monitor the RF environment and dynamically change the transmit waveforms to achieve detection of nonlinear targets with high confidence. This work presents the architecture of the test-bed system along with a discussion of its current capabilities and limitations. A brief outlook is presented for the project along with a discussion of a future cognitive nonlinear radar test-bed.
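
    A toy version of the sense-then-adapt loop described above: estimate band occupancy from an FFT of a sensed snapshot, then place the next transmit tone in the quietest band. This is purely illustrative and does not reflect ARL's waveform design.

      import numpy as np

      def quietest_band(samples, fs, n_bands=8):
          """Return (f_lo, f_hi) of the least-occupied band in a sensed snapshot."""
          psd = np.abs(np.fft.rfft(samples)) ** 2
          freqs = np.fft.rfftfreq(len(samples), 1.0 / fs)
          edges = np.linspace(0.0, fs / 2.0, n_bands + 1)
          power = [psd[(freqs >= lo) & (freqs < hi)].sum()
                   for lo, hi in zip(edges[:-1], edges[1:])]
          k = int(np.argmin(power))
          return edges[k], edges[k + 1]

      # Sense a snapshot containing a 150 kHz interferer, transmit in the gap:
      fs = 1e6
      t = np.arange(4096) / fs
      snap = np.sin(2 * np.pi * 150e3 * t)
      snap += 0.01 * np.random.default_rng(3).normal(size=t.size)
      lo, hi = quietest_band(snap, fs)
      tx = np.sin(2 * np.pi * 0.5 * (lo + hi) * t)   # next transmit waveform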

  8. A Laboratory Testbed for Embedded Fuzzy Control

    ERIC Educational Resources Information Center

    Srivastava, S.; Sukumar, V.; Bhasin, P. S.; Arun Kumar, D.

    2011-01-01

    This paper presents a novel scheme called "Laboratory Testbed for Embedded Fuzzy Control of a Real Time Nonlinear System." The idea is based upon the fact that project-based learning motivates students to learn actively and to use their engineering skills acquired in their previous years of study. It also fosters initiative and focuses…

  9. ARA testbed template based UHE neutrino search

    NASA Astrophysics Data System (ADS)

    Prohira, Steven

    2014-03-01

    The Askaryan Radio Array (ARA) is an in-ice Antarctic neutrino detector deployed near the South Pole. ARA is designed to detect ultra-high-energy (UHE) neutrinos in the range of 0.1-10 EeV. Data from the ARA testbed, deployed in the 2010-2011 season, are used for a template-based neutrino search.

  10. Embedded Data Processor and Portable Computer Technology testbeds

    NASA Technical Reports Server (NTRS)

    Alena, Richard; Liu, Yuan-Kwei; Goforth, Andre; Fernquist, Alan R.

    1993-01-01

    Attention is given to current activities in the Embedded Data Processor and Portable Computer Technology testbed configurations that are part of the Advanced Data Systems Architectures Testbed at the Information Sciences Division at NASA Ames Research Center. The Embedded Data Processor Testbed evaluates advanced microprocessors for potential use in mission and payload applications within the Space Station Freedom Program. The Portable Computer Technology (PCT) Testbed integrates and demonstrates advanced portable computing devices and data system architectures. The PCT Testbed uses both commercial and custom-developed devices to demonstrate the feasibility of functional expansion and networking for portable computers in flight missions.

  11. Rover Attitude and Pointing System Simulation Testbed

    NASA Technical Reports Server (NTRS)

    Vanelli, Charles A.; Grinblat, Jonathan F.; Sirlin, Samuel W.; Pfister, Sam

    2009-01-01

    The MER (Mars Exploration Rover) Attitude and Pointing System Simulation Testbed Environment (RAPSSTER) provides a simulation platform used for the development and test of GNC (guidance, navigation, and control) flight algorithm designs for the Mars rovers. It was specifically tailored to the MERs, but has since been used in the development of rover algorithms for the Mars Science Laboratory (MSL) as well. The software provides an integrated simulation and software testbed environment for the development of Mars rover attitude and pointing flight software. It provides an environment that is able to run the MER GNC flight software directly (as opposed to running an algorithmic model of the MER GNC flight code). This improves simulation fidelity and confidence in the results. Furthermore, the simulation environment allows the user to single-step through its execution, pausing and restarting at will. The system also provides for the introduction of simulated faults specific to Mars rover environments that cannot be replicated in other testbed platforms, to stress test the GNC flight algorithms under examination. The software provides facilities to do these stress tests in ways that cannot be done in the real-time flight system testbeds, such as time-jumping (both forwards and backwards), and introduction of simulated actuator faults that would be difficult, expensive, and/or destructive to implement in the real-time testbeds. Actual flight-quality codes can be incorporated back into the development-test suite of GNC developers, closing the loop between the GNC developers and the flight software developers. The software provides fully automated scripting, allowing multiple tests to be run with varying parameters, without human supervision.

  12. Sparse matrix methods research using the CSM testbed software system

    NASA Technical Reports Server (NTRS)

    Chu, Eleanor; George, J. Alan

    1989-01-01

    Research is described on sparse matrix techniques for the Computational Structural Mechanics (CSM) Testbed. The primary objective was to compare the performance of state-of-the-art techniques for solving sparse systems with those that are currently available in the CSM Testbed. Thus, one of the first tasks was to become familiar with the structure of the testbed, and to install some or all of the SPARSPAK package in the testbed. A suite of subroutines to extract from the database the relevant structural and numerical information about the matrix equations was written, and all the demonstration problems distributed with the testbed were successfully solved. These codes were documented, and performance studies comparing the SPARSPAK technology to the methods currently in the testbed were completed. In addition, some preliminary studies were done comparing some recently developed out-of-core techniques with the performance of the testbed processor INV.

  13. Automatic Cloud Bursting under FermiCloud

    SciTech Connect

    Wu, Hao; Ren, Shangping; Garzoglio, Gabriele; Timm, Steven; Bernabeu, Gerard; Kim, Hyun Woo; Chadwick, Keith; Jang, Haengjin; Noh, Seo-Young

    2013-01-01

    Cloud computing is changing the infrastructure upon which scientific computing depends from supercomputers and distributed computing clusters to a more elastic cloud-based structure. The service-oriented focus and elasticity of clouds can not only facilitate the technology needs of emerging businesses but also shorten response time and reduce operational costs of traditional scientific applications. Fermi National Accelerator Laboratory (Fermilab) is currently in the process of building its own private cloud, FermiCloud, which allows the existing grid infrastructure to use dynamically provisioned resources on FermiCloud to accommodate increased but dynamic computation demand from scientists in the domains of High Energy Physics (HEP) and other research areas. Cloud infrastructure also allows a private cloud’s resource capacity to be increased through “bursting”: borrowing or renting resources from other community or commercial clouds when needed. This paper introduces a joint project on building a cloud federation to support HEP applications between Fermi National Accelerator Laboratory and the Korea Institute of Science and Technology Information, with technical contributions from the Illinois Institute of Technology. In particular, this paper presents two recent accomplishments of the joint project: (a) cloud bursting automation and (b) a load balancer. Automatic cloud bursting allows computer resources to be dynamically reconfigured to meet users’ demands. The load-balancing algorithm on which cloud bursting depends decides when and where new resources need to be allocated. Our preliminary prototyping and experiments have shown promising success, yet they have also opened new challenges to be studied.
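    The paper defines its own bursting and load-balancing algorithms; purely as a hedged illustration of the kind of decision such an algorithm makes, the sketch below fills the private pool first and bursts the remainder to an external cloud. Pool names, capacities, and the fill-first policy are invented for illustration.

        # Illustrative fill-first bursting decision (not the FermiCloud
        # algorithm): place jobs locally while slots remain, then burst
        # the overflow to an external community or commercial cloud.

        from dataclasses import dataclass

        @dataclass
        class Pool:
            name: str
            capacity_slots: int
            used_slots: int

            @property
            def free_slots(self) -> int:
                return self.capacity_slots - self.used_slots

        def place_jobs(queued_jobs: int, private: Pool, external: Pool) -> dict:
            """Fill the private pool first; burst the remainder externally."""
            local = min(queued_jobs, private.free_slots)
            burst = min(queued_jobs - local, external.free_slots)
            return {private.name: local,
                    external.name: burst,
                    "deferred": queued_jobs - local - burst}

        private = Pool("fermicloud", capacity_slots=100, used_slots=92)
        external = Pool("community-cloud", capacity_slots=500, used_slots=100)
        print(place_jobs(30, private, external))
        # {'fermicloud': 8, 'community-cloud': 22, 'deferred': 0}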

  14. Mini-mast CSI testbed user's guide

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Pappa, Richard S.; Sulla, Jeffrey L.; Elliott, Kenny B.; Miserentino, Robert; Bailey, James P.; Cooper, Paul A.; Williams, Boyd L., Jr.; Bruner, Anne M.

    1992-01-01

    The Mini-Mast testbed is a 20 m generic truss highly representative of future deployable trusses for space applications. It is fully instrumented for system identification and active vibration control experiments and is used as a ground testbed at NASA-Langley. The facility has actuators and feedback sensors linked via fiber optic cables to the Advanced Real Time Simulation (ARTS) system, where user-defined control laws are incorporated into generic controls software. The object of the facility is to conduct comprehensive active vibration control experiments on a dynamically realistic large space structure. A primary goal is to understand the practical effects of simplifying theoretical assumptions. This User's Guide describes the hardware and its primary components, the dynamic characteristics of the test article, the control law implementation process, and the necessary safeguards employed to protect the test article. Suggestions for a strawman controls experiment are also included.

  15. VCE testbed program planning and definition study

    NASA Technical Reports Server (NTRS)

    Westmoreland, J. S.; Godston, J.

    1978-01-01

    The flight definition of the Variable Stream Control Engine (VSCE) was updated to reflect design improvements in the two key components: (1) the low emissions duct burner, and (2) the coannular exhaust nozzle. The testbed design was defined and plans for the overall program were formulated. The effect of these improvements was evaluated for performance, emissions, noise, weight, and length. For experimental large scale testing of the duct burner and coannular nozzle, a design definition of the VCE testbed configuration was made. This included selecting the core engine, determining instrumentation requirements, and selecting the test facilities, in addition to defining control system and assembly requirements. Plans for a comprehensive test program to demonstrate the duct burner and nozzle technologies were formulated. The plans include both aeroacoustic and emissions testing.

  16. Overview of the Telescience Testbed Program

    NASA Technical Reports Server (NTRS)

    Rasmussen, Daryl N.; Mian, Arshad; Leiner, Barry M.

    1991-01-01

    NASA's Telescience Testbed Program (TTP), conducted by the Ames Research Center, is described with particular attention to the objectives, the approach used to achieve these objectives, and the expected benefits of the program. The goal of the TTP is to gain operational experience for the Space Station Freedom and the Earth Observing System programs, using ground testbeds, and to define the information and communication systems requirements for the development and operation of these programs. The results of the TTP are expected to include requirements for remote coaching, command and control, monitoring and maintenance, payload design, and operations management. In addition, requirements for technologies such as workstations, software, video, automation, data management, and networking will be defined.

  17. DEVELOPMENT OF A FACILITY MONITORING TESTBED

    SciTech Connect

    A. M. MIELKE; C. M. BOYLE; ET AL

    2001-06-01

    The Advanced Surveillance Technology (AST) project at Los Alamos National Laboratory (LANL), funded by the Nonproliferation Research and Engineering Group (NN-20) of the National Nuclear Security Administration (NNSA), is fielding a facility monitoring application testbed at the National High Magnetic Field Laboratory-Pulsed Field Laboratory (NHMFL-PFL). This application is designed to utilize continuous remote monitoring technology to provide an additional layer of personnel safety assurance and equipment fault prediction capability in the laboratory. Various off-the-shelf surveillance sensor technologies are evaluated. In this testbed environment, several of the deployed monitoring sensors have detected transient precursor equipment-fault events. Additionally, the prototype remote monitoring system employs specialized video state recognition software to determine whether the operations occurring within the facility are acceptable, given the observed equipment status. By integrating the Guardian reasoning system developed at LANL, anomalous facility events trigger alarms alerting personnel to the likelihood of an equipment failure or unsafe operation.

  18. Dynamic federation of grid and cloud storage

    NASA Astrophysics Data System (ADS)

    Furano, Fabrizio; Keeble, Oliver; Field, Laurence

    2016-09-01

    The Dynamic Federations project ("Dynafed") enables the deployment of scalable, distributed storage systems composed of independent storage endpoints. While the Uniform Generic Redirector at the heart of the project is protocol-agnostic, we have focused our effort on HTTP-based protocols, including S3 and WebDAV. The system has been deployed on testbeds covering the majority of the ATLAS and LHCb data, and supports geography-aware replica selection. The work done exploits the federation potential of HTTP to build systems that offer uniform, scalable, catalogue-less access to the storage and metadata ensemble, along with the possibility of seamless integration of other compatible resources such as those from cloud providers. Dynafed can exploit the potential of the S3 delegation scheme, effectively federating on the fly any number of S3 buckets from different providers and applying a uniform authorization to them. This feature has been used to deploy in production the BOINC Data Bridge, which uses the Uniform Generic Redirector with S3 buckets to harmonize the BOINC authorization scheme with the Grid/X509. The Data Bridge has been deployed in production with good results. We believe that the features of a loosely coupled federation of open-protocol-based storage elements open many possibilities for smoothly evolving the current computing models and for supporting new scientific computing projects that rely on massive distribution of data and that would appreciate systems that can more easily be interfaced with commercial providers and can work natively with Web browsers and clients.
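    The abstract mentions geography-aware replica selection without detailing it; one plausible form of the idea, sketched below with invented endpoints and a great-circle distance metric (this is not Dynafed's implementation), is to redirect each client to the nearest replica.

        # Hedged sketch of geography-aware replica selection: redirect the
        # client to the closest storage endpoint. Endpoints, coordinates,
        # and the distance metric are illustrative assumptions.

        from math import asin, cos, radians, sin, sqrt

        def haversine_km(lat1, lon1, lat2, lon2):
            """Great-circle distance between two points, in kilometres."""
            dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
            a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
            return 2 * 6371.0 * asin(sqrt(a))

        REPLICAS = [  # (endpoint URL, latitude, longitude), all hypothetical
            ("https://se-geneva.example/data/f1", 46.2, 6.1),
            ("https://se-newyork.example/data/f1", 40.7, -74.0),
            ("https://s3-dublin.example/bucket/f1", 53.3, -6.3),
        ]

        def select_replica(client_lat, client_lon):
            return min(REPLICAS,
                       key=lambda r: haversine_km(client_lat, client_lon, r[1], r[2]))[0]

        print(select_replica(48.9, 2.4))  # client near Paris -> Geneva endpoint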

  19. Thermodynamic and cloud parameter retrieval using infrared spectral data

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Huang, Hung-Lung A.; Li, Jun; McGill, Matthew J.; Mango, Stephen A.

    2005-01-01

    High-resolution infrared radiance spectra obtained from near nadir observations provide atmospheric, surface, and cloud property information. A fast radiative transfer model, including cloud effects, is used for atmospheric profile and cloud parameter retrieval. The retrieval algorithm is presented along with its application to recent field experiment data from the NPOESS Airborne Sounding Testbed - Interferometer (NAST-I). The retrieval accuracy dependence on cloud properties is discussed. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with an accuracy of approximately 1.0 km. Preliminary NAST-I retrieval results from the recent Atlantic-THORPEX Regional Campaign (ATReC) are presented and compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL).

  20. Commissioning Results on the JWST Testbed Telescope

    NASA Technical Reports Server (NTRS)

    Dean, Bruce H.; Acton, D. Scott

    2006-01-01

    The one-meter 18 segment JWST Testbed Telescope (TBT) has been developed at Ball Aerospace to facilitate commissioning operations for the JWST Observatory. Eight different commissioning activities were tested on the TBT: telescope focus sweep, segment ID and Search, image array, global alignment, image stacking, coarse phasing, fine phasing, and multi-field phasing. This paper describes recent commissioning results from experiments performed on the TBT.

  1. Variable Dynamic Testbed Vehicle: Dynamics Analysis

    NASA Technical Reports Server (NTRS)

    Lee, A. Y.; Le, N. T.; Marriott, A. T.

    1997-01-01

    The Variable Dynamic Testbed Vehicle (VDTV) concept has been proposed as a tool to evaluate collision avoidance systems and to perform driving-related human factors research. The goal of this study is to analytically investigate to what extent a VDTV with adjustable front and rear anti-roll bar stiffnesses, programmable damping rates, and four-wheel-steering can emulate the lateral dynamics of a broad range of passenger vehicles.

  2. SSERVI Analog Regolith Simulant Testbed Facility

    NASA Astrophysics Data System (ADS)

    Minafra, Joseph; Schmidt, Gregory; Bailey, Brad; Gibbs, Kristina

    2016-10-01

    The Solar System Exploration Research Virtual Institute (SSERVI) at NASA's Ames Research Center in California's Silicon Valley was founded in 2013 to act as a virtual institute that provides interdisciplinary research centered on the goals of its supporting directorates: the NASA Science Mission Directorate (SMD) and the Human Exploration & Operations Mission Directorate (HEOMD). Primary research goals of the Institute revolve around the integration of science and exploration to gain the knowledge required for the future of human space exploration beyond low Earth orbit. SSERVI intends to leverage existing JSC-1A regolith simulant resources to create a regolith simulant testbed facility. The purpose of this testbed concept is to provide the planetary exploration community with a readily available capability to test hardware and conduct research in a large simulant environment. SSERVI's goals include supporting planetary researchers within NASA and other government agencies; private-sector researchers and hardware developers; competitors in focused prize design competitions; and academic researchers. SSERVI provides opportunities for research scientists and engineers to study the effects of regolith analog testbed research in the planetary exploration field. This capability is essential to understanding the basic effects of continued long-term exposure to a simulated analog test environment. The current facility houses approximately eight tons of JSC-1A lunar regolith simulant in a test bin covering a 4 meter by 4 meter area, with dust mitigation and safety oversight. Facility hardware and environment testing scenarios could include lunar surface mobility, dust exposure and mitigation, regolith handling and excavation, solar-like illumination, lunar surface compaction profiles, lofted dust, mechanical properties of lunar regolith, and surface features (i.e., grades and rocks). Numerous benefits range from easy access to a controlled analog regolith simulant testbed, and

  3. Evaluating Aerosol Process Modules within the Framework of the Aerosol Modeling Testbed

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Velu, V.; Gustafson, W. I.; Chapman, E.; Easter, R. C.; Shrivastava, M.; Singh, B.

    2012-12-01

    Factors that influence predictions of aerosol direct and indirect forcing, such as aerosol mass, composition, size distribution, hygroscopicity, and optical properties, still contain large uncertainties in both regional and global models. New aerosol treatments are usually implemented into a 3-D atmospheric model and evaluated using a limited number of measurements from a specific case study. Under this modeling paradigm, the performance and computational efficiency of several treatments for a specific aerosol process cannot be adequately quantified because many other processes among various modeling studies (e.g. grid configuration, meteorology, emission rates) are different as well. The scientific community needs to know the advantages and disadvantages of specific aerosol treatments when the meteorology, chemistry, and other aerosol processes are identical in order to reduce the uncertainties associated with aerosol predictions. To address these issues, an Aerosol Modeling Testbed (AMT) has been developed that systematically and objectively evaluates new aerosol treatments for use in regional and global models. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from the Community Atmosphere Model version 5 (CAM5) has also been ported to WRF so that it can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. Data from several campaigns, including the 2006

  4. The Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Hines, Braden; Bell, Charles; Shen, Tsae-Pyng; Bloemhof, Eric; Zhao, Feng; Regehr, Martin; Holmes, Howard; Irigoyen, Robert; Neat, Gregory

    2003-01-01

    The Micro-Arcsecond Metrology (MAM) testbed is a ground-based system of optical and electronic equipment for testing components, systems, and engineering concepts for the Space Interferometer Mission (SIM) and similar future missions, in which optical interferometers will be operated in outer space. In addition, the MAM testbed is of interest in its own right as a highly precise metrological system. The designs of the SIM interferometer and the MAM testbed reflect a requirement to measure both the position of the starlight central fringe and the change in the internal optical path of the interferometer with sufficient spatial resolution to generate astrometric data with angular resolution at the microarcsecond level. The internal path is to be measured by use of a small metrological laser beam of 1,319-nm wavelength, whereas the position of the starlight fringe is to be estimated by use of a charge-coupled-device (CCD) image detector sampling a large concentric annular beam. For the SIM to succeed, the optical path length determined from the interferometer fringes must be tracked by the metrological subsystem to within tens of picometers, through all operational motions of an interferometer delay line and siderostats. The purpose of the experiments performed on the MAM testbed is to demonstrate this agreement in a large-scale simulation that includes a substantial portion of the system in the planned configuration for operation in outer space. A major challenge in this endeavor is to align the metrological beam with the starlight beam in order to maintain consistency between the metrological and starlight subsystems at the system level. The MAM testbed includes an optical interferometer with a white light source, all major optical components of a stellar interferometer, and heterodyne metrological sensors. The aforementioned subsystems are installed in a large vacuum chamber in order to suppress atmospheric and thermal disturbances. The MAM is divided into two

  5. Overview on In-Space Internet Node Testbed (ISINT)

    NASA Technical Reports Server (NTRS)

    Richard, Alan M.; Kachmar, Brian A.; Fabian, Theodore; Kerczewski, Robert J.

    2000-01-01

    The Satellite Networks and Architecture Branch has developed the In-Space Internet Node Technology testbed (ISINT) for investigating the use of commercial Internet products for NASA missions. The testbed connects two closed subnets over a tabletop Ka-band transponder by using commercial routers and modems. Since many NASA assets are in low Earth orbits (LEO's), the testbed simulates the varying signal strength, changing propagation delay, and varying connection times that are normally experienced when communicating to the Earth via a geosynchronous orbiting (GEO) communications satellite. Research results from using this testbed will be used to determine which Internet technologies are appropriate for NASA's future communication needs.

  6. ISS Update: ISTAR -- International Space Station Testbed for Analog Research

    NASA Video Gallery

    NASA Public Affairs Officer Kelly Humphries interviews Sandra Fletcher, EVA Systems Flight Controller. They discuss the International Space Station Testbed for Analog Research (ISTAR) activity that...

  7. Airborne Open Polar/Imaging Nephelometer for Ice Particles in Cirrus Clouds and Aerosols Field Campaign Report

    SciTech Connect

    Martins, JV

    2016-04-01

    The Open Imaging Nephelometer (O-I-Neph) instrument is an adaptation of a proven laboratory instrument built and tested at the University of Maryland, Baltimore County (UMBC), the Polarized Imaging Nephelometer (PI-Neph). The instrument design of both imaging nephelometers uses a narrow-beam laser source and a wide-field-of-view imaging camera to capture the entire scattering phase function in one image, quasi-instantaneously.

  8. Design of the Dual Conjugate Adaptive Optics Test-bed

    NASA Astrophysics Data System (ADS)

    Sharf, Inna; Bell, K.; Crampton, D.; Fitzsimmons, J.; Herriot, Glen; Jolissaint, Laurent; Lee, B.; Richardson, H.; van der Kamp, D.; Veran, Jean-Pierre

    In this paper, we describe the Multi-Conjugate Adaptive Optics laboratory test-bed presently under construction at the University of Victoria, Canada. The test-bench will be used to support research in the performance of multi-conjugate adaptive optics, turbulence simulators, laser guide stars and miniaturizing adaptive optics. The main components of the test-bed include two micro-machined deformable mirrors, a tip-tilt mirror, four wavefront sensors, a source simulator, a dual-layer turbulence simulator, as well as computational and control hardware. The paper will describe in detail the opto-mechanical design of the adaptive optics module, the design of the hot-air turbulence generator and the configuration chosen for the source simulator. Below, we present a summary of these aspects of the bench. The optical and mechanical design of the test-bed has been largely driven by the particular choice of the deformable mirrors. These are continuous micro-machined mirrors manufactured by Boston Micromachines Corporation. They have a clear aperture of 3.3 mm and are deformed with 140 actuators arranged in a square grid. Although the mirrors have an open-loop bandwidth of 6.6 kHz, their shape can be updated at a sampling rate of 100 Hz. In our optical design, the mirrors are conjugated at 0 km and 10 km in the atmosphere. A planar optical layout was achieved by using four off-axis paraboloids and several folding mirrors. These optics will be mounted on two solid blocks which can be aligned with respect to each other. The wavefront path design accommodates 3 monochromatic guide stars that can be placed at either 90 km or at infinity. The design relies on the natural separation of the beam into 3 parts because of differences in locations of the guide stars in the field of view. In total four wavefront sensors will be procured from Adaptive Optics Associates (AOA) or built in-house: three for the guide stars and the fourth to collect data from the science source output in

  9. Stochastic Radiative Transfer in Polar Mixed Phase Clouds

    NASA Astrophysics Data System (ADS)

    Brodie, J.; Veron, D. E.

    2004-12-01

    According to recent research, mixed phase clouds comprise one third of the overall annual cloud cover in the Arctic region. These clouds contain distinct regions of liquid water and ice, which have a different impact on radiation than single-phase clouds. Despite the prevalence of mixed phase clouds in the polar regions, many modern atmospheric general circulation models use single-phase clouds in their radiation routines. A stochastic approach to representing the transfer of shortwave radiation through a cloud layer, where the distribution of the ice and liquid is governed by observed statistics, is being assessed. Data from the Surface Heat Budget of the Arctic (SHEBA) program and the Atmospheric Radiation Measurement (ARM) program's North Slope of Alaska Cloud and Radiation Testbed site will be used to determine the characteristic features of the cloud field and to evaluate the performance of this statistical model.
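    As a toy illustration of the stochastic idea (not the authors' model), one can sample the phase of each cloudy column from an observed liquid fraction and average the resulting direct-beam transmittances; the optical depths and liquid fraction below are invented.

        # Toy stochastic mixed-phase cloud: sample liquid vs. ice columns
        # and average Beer-Lambert direct-beam transmittances. All inputs
        # are illustrative assumptions.

        import math
        import random

        def mean_transmittance(n_samples=100_000, f_liquid=0.6,
                               tau_liquid=8.0, tau_ice=2.0, mu0=0.5):
            """Monte Carlo mean transmittance of a two-phase cloud layer."""
            total = 0.0
            for _ in range(n_samples):
                tau = tau_liquid if random.random() < f_liquid else tau_ice
                total += math.exp(-tau / mu0)  # direct-beam attenuation
            return total / n_samples

        # Single-phase shortcut: one transmittance from the mean optical depth.
        tau_mean = 0.6 * 8.0 + 0.4 * 2.0
        print(f"stochastic: {mean_transmittance():.4f}")
        print(f"mean-tau:   {math.exp(-tau_mean / 0.5):.6f}")

    Because exp(-tau) is convex, the stochastic average always exceeds the transmittance computed from the mean optical depth, which is one reason single-phase treatments can bias the simulated radiation budget.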

  10. Application of Model-based Prognostics to a Pneumatic Valves Testbed

    NASA Technical Reports Server (NTRS)

    Daigle, Matthew; Kulkarni, Chetan S.; Gorospe, George

    2014-01-01

    Pneumatic-actuated valves play an important role in many applications, including cryogenic propellant loading for space operations. Model-based prognostics emphasizes the importance of a model that describes the nominal and faulty behavior of a system, and how faulty behavior progresses in time, causing the end of useful life of the system. We describe the construction of a testbed consisting of a pneumatic valve that allows the injection of faulty behavior and controllable fault progression. The valve opens discretely, and is controlled through a solenoid valve. Controllable leaks of pneumatic gas in the testbed are introduced through proportional valves, allowing the testing and validation of prognostics algorithms for pneumatic valves. A new valve prognostics approach is developed that estimates fault progression and predicts remaining life based only on valve timing measurements. Simulation experiments demonstrate and validate the approach.
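    The abstract states that remaining life is predicted from valve timing measurements alone; the heavily simplified sketch below (hypothetical failure threshold and a linear degradation fit, not the authors' algorithm) illustrates that idea.

        # Simplified timing-based prognostics sketch: fit a linear trend to
        # observed valve opening times and extrapolate to a failure
        # threshold. Threshold and measurements are illustrative.

        import numpy as np

        def remaining_useful_life(cycles, open_times_s, fail_threshold_s):
            """Cycles left until the fitted opening time crosses the threshold."""
            slope, intercept = np.polyfit(cycles, open_times_s, 1)
            if slope <= 0:
                return float("inf")  # no degradation trend detected
            fail_cycle = (fail_threshold_s - intercept) / slope
            return max(0.0, fail_cycle - cycles[-1])

        cycles = np.array([0, 50, 100, 150, 200])
        open_times = np.array([2.00, 2.06, 2.11, 2.17, 2.22])  # slowly degrading
        rul = remaining_useful_life(cycles, open_times, fail_threshold_s=3.0)
        print(f"predicted remaining useful life: {rul:.0f} cycles")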

  11. xGDBvm: A Web GUI-Driven Workflow for Annotating Eukaryotic Genomes in the Cloud

    PubMed Central

    Merchant, Nirav

    2016-01-01

    Genome-wide annotation of gene structure requires the integration of numerous computational steps. Currently, annotation is arguably best accomplished through collaboration of bioinformatics and domain experts, with broad community involvement. However, such a collaborative approach is not scalable at today’s pace of sequence generation. To address this problem, we developed the xGDBvm software, which uses an intuitive graphical user interface to access a number of common genome analysis and gene structure tools, preconfigured in a self-contained virtual machine image. Once their virtual machine instance is deployed through iPlant’s Atmosphere cloud services, users access the xGDBvm workflow via a unified Web interface to manage inputs, set program parameters, configure links to high-performance computing (HPC) resources, view and manage output, apply analysis and editing tools, or access contextual help. The xGDBvm workflow will mask the genome, compute spliced alignments from transcript and/or protein inputs (locally or on a remote HPC cluster), predict gene structures and gene structure quality, and display output in a public or private genome browser complete with accessory tools. Problematic gene predictions are flagged and can be reannotated using the integrated yrGATE annotation tool. xGDBvm can also be configured to append or replace existing data or load precomputed data. Multiple genomes can be annotated and displayed, and outputs can be archived for sharing or backup. xGDBvm can be adapted to a variety of use cases including de novo genome annotation, reannotation, comparison of different annotations, and training or teaching. PMID:27020957

  12. Wavefront Control Testbed (WCT) Experiment Results

    NASA Technical Reports Server (NTRS)

    Burns, Laura A.; Basinger, Scott A.; Campion, Scott D.; Faust, Jessica A.; Feinberg, Lee D.; Hayden, William L.; Lowman, Andrew E.; Ohara, Catherine M.; Petrone, Peter P., III

    2004-01-01

    The Wavefront Control Testbed (WCT) was created to develop and test wavefront sensing and control algorithms and software for the segmented James Webb Space Telescope (JWST). Last year, we changed the system configuration from three sparse aperture segments to a filled aperture with three pie-shaped segments. With this upgrade we have performed experiments on fine phasing with line-of-sight and segment-to-segment jitter; dispersed fringe visibility and grism angle; high dynamic range tilt sensing; coarse phasing with large aberrations; and sampled sub-aperture testing. This paper reviews the results of these experiments.

  13. Telescience testbed result for Japanese experiment module

    NASA Astrophysics Data System (ADS)

    Matsumoto, K.; Higuchi, K.; Kimura, H.; Takeda, N.; Matsubara, S.; Izumita, M.; Toyama, Y.; Kato, M.; Kato, H.

    1990-10-01

    The first telescience testbed experiments for the Japanese Experiment Module (JEM) of the Space Station Freedom, conducted after three years of studies of its system requirements, are described. Three experiment themes of the First Material Processing Test (FMPT) of the Japanese Spacelab Mission were chosen for estimating communications requirements between the JEM and a ground station. A paper folding experiment was used to examine instruction aspects of onboard manual processing and onboard coaching. More than 10 principal investigators participated in the experiments and were requested to answer a rating questionnaire for data acquisition. The results extracted from the questionnaire are summarized.

  14. ITS detector testbed system design and implementation

    NASA Astrophysics Data System (ADS)

    Chang, Edmond C. P.

    1999-03-01

    Intelligent Transportation Systems (ITS), implemented all over the world, have become an important and practical traffic management technique. Among all ITS subsystems, the detection system is the integral element that provides the necessary environmental information to the ITS infrastructure. This paper describes the ITS detector testbed design, currently being implemented with these potential ITS applications on State Highway 6 in College Station, Texas, to provide a multi-sensor, multi-source fusion environment that supports testing of both multi-sensor and distributed sensor systems.

  15. The Magellan Final Report on Cloud Computing

    SciTech Connect

    Coghlan, Susan; Yelick, Katherine

    2011-12-21

    The goal of Magellan, a project funded through the U.S. Department of Energy (DOE) Office of Advanced Scientific Computing Research (ASCR), was to investigate the potential role of cloud computing in addressing the computing needs of the DOE Office of Science (SC), particularly related to serving the needs of mid-range computing and future data-intensive computing workloads. A set of research questions was formed to probe various aspects of cloud computing, including performance, usability, and cost. To address these questions, a distributed testbed infrastructure was deployed at the Argonne Leadership Computing Facility (ALCF) and the National Energy Research Scientific Computing Center (NERSC). The testbed was designed to be flexible and capable enough to explore a variety of computing models and hardware design points in order to understand the impact for various scientific applications. During the project, the testbed also served as a valuable resource to application scientists. Applications from a diverse set of projects such as MG-RAST (a metagenomics analysis server), the Joint Genome Institute, the STAR experiment at the Relativistic Heavy Ion Collider, and the Laser Interferometer Gravitational Wave Observatory (LIGO) were used by the Magellan project for benchmarking within the cloud, but the project teams were also able to accomplish important production science utilizing the Magellan cloud resources.

  16. Cross layer optimization for cloud-based radio over optical fiber networks

    NASA Astrophysics Data System (ADS)

    Shao, Sujie; Guo, Shaoyong; Qiu, Xuesong; Yang, Hui; Meng, Luoming

    2016-07-01

    To adapt to 5G communication, the cloud radio access network is a paradigm introduced by operators that aggregates all base stations' computational resources into a cloud BBU pool. Interactions between RRHs and BBUs, and resource scheduling among BBUs in the cloud, have become more frequent and complex with the growth of system scale and user requirements. This increases the networking demand among RRHs and BBUs and calls for elastic optical fiber switching and networking. In such a network, multiple strata of resources (radio, optical spectrum, and BBU processing) are interwoven with each other. In this paper, we propose a novel multiple stratum optimization (MSO) architecture for cloud-based radio over optical fiber networks (C-RoFN) with software defined networking. Additionally, a global evaluation strategy (GES) is introduced in the proposed architecture. MSO can enhance the responsiveness to end-to-end user demands and globally optimize radio frequency, optical spectrum, and BBU processing resources effectively to maximize radio coverage. The feasibility and efficiency of the proposed architecture with the GES strategy are experimentally verified on an OpenFlow-enabled testbed in terms of resource occupation and path provisioning latency.
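    The abstract does not spell out how the global evaluation strategy weighs the three resource strata; purely as a hypothetical illustration of multi-stratum scoring (weights, candidate paths, and field names are invented), one might rank candidate RRH-BBU paths as follows.

        # Hypothetical multi-stratum path scoring in the spirit of a global
        # evaluation strategy: combine radio, optical-spectrum, and BBU
        # occupation into one score and pick the least-loaded candidate.
        # This is not the paper's GES.

        CANDIDATES = [
            {"path": "RRH1-BBU3", "radio": 0.40, "optical": 0.30, "bbu": 0.70},
            {"path": "RRH1-BBU5", "radio": 0.40, "optical": 0.55, "bbu": 0.20},
        ]

        WEIGHTS = {"radio": 0.4, "optical": 0.3, "bbu": 0.3}

        def score(candidate):
            """Lower is better: weighted sum of per-stratum occupation."""
            return sum(WEIGHTS[k] * candidate[k] for k in WEIGHTS)

        best = min(CANDIDATES, key=score)
        print(best["path"], f"score={score(best):.3f}")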

  17. Application developer's tutorial for the CSM testbed architecture

    NASA Technical Reports Server (NTRS)

    Underwood, Phillip; Felippa, Carlos A.

    1988-01-01

    This tutorial serves as an illustration of the use of the programmer interface on the CSM Testbed Architecture (NICE). It presents a complete, but simple, introduction to using both the GAL-DBM (Global Access Library-Database Manager) and CLIP (Command Language Interface Program) to write a NICE processor. Familiarity with the CSM Testbed architecture is required.

  18. Testbed for the development of intelligent robot control

    SciTech Connect

    Harrigan, R.W.

    1986-01-01

    The Sensor Driven Robot Systems Testbed has been constructed to provide a working environment to aid in the development of intelligent robot control software. The Testbed employs vision and force as the robot's means of interrogating its environment. The Testbed, which has been operational for approximately 24 months, consists of a PUMA-560 robot manipulator coupled to a 2-dimensional vision system and a force and torque sensing wrist. Recent work within the Testbed environment has led to a highly modularized control software concept with emphasis on detection and resolution of error situations. The objective of the Testbed is to develop intelligent robot control concepts incorporating planning and error recovery which are transportable to a wide variety of robot applications. This is an ongoing, long-term development project; as such, this paper represents a status report of the development work.

  19. Nuclear Instrumentation and Control Cyber Testbed Considerations – Lessons Learned

    SciTech Connect

    Jonathan Gray; Robert Anderson; Julio G. Rodriguez; Cheol-Kwon Lee

    2014-08-01

    Identifying and understanding digital instrumentation and control (I&C) cyber vulnerabilities within nuclear power plants and other nuclear facilities is critical if nation states desire to operate nuclear facilities safely, reliably, and securely. In order to demonstrate objective evidence that cyber vulnerabilities have been adequately identified and mitigated, a testbed representing a facility’s critical nuclear equipment must be replicated. Idaho National Laboratory (INL) has built and operated similar testbeds for common critical infrastructure I&C for over ten years. This experience developing, operating, and maintaining an I&C testbed in support of research identifying cyber vulnerabilities has led the Korea Atomic Energy Research Institute of the Republic of Korea to solicit the experiences of INL to help mitigate problems early in the design, development, operation, and maintenance of a similar testbed. The following discusses I&C testbed lessons learned and the relevance of these experiences to KAERI.

  20. A Turbine-powered UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; High, James W.; Guerreiro, Nelson M.; Chambers, Ryan S.; Howard, Keith D.

    2007-01-01

    The latest version of the NASA Flying Controls Testbed (FLiC) integrates commercial off-the-shelf components including airframe, autopilot, and a small turbine engine to provide a low-cost experimental flight controls testbed capable of sustained speeds up to 200 mph. The series of flight tests leading up to the demonstrated performance of the vehicle in sustained, autopiloted 200 mph flight at NASA Wallops Flight Facility's UAV runway in August 2006 will be described. Earlier versions of the FLiC were based on a modified Army target drone, the AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate at Fort Eustis, Virginia and NASA Langley Research Center. The newer turbine-powered platform (J-FLiC) builds on the successes using the relatively smaller, slower, and less expensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches with the implementation of C-coded experimental controllers. Tracking video was taken during the test flights at Wallops and will be available for presentation at the conference. Analysis of flight data from both remotely piloted and autopiloted flights will be presented. Candidate experimental controllers for implementation will be discussed. It is anticipated that flight testing will resume in Spring 2007 and those results will be included, if possible.

  1. The Gemini Planet Imager coronagraph testbed

    NASA Astrophysics Data System (ADS)

    Soummer, Rémi; Sivaramakrishnan, Anand; Oppenheimer, Ben R.; Roberts, Robin; Brenner, Douglas; Carlotti, Alexis; Pueyo, Laurent; Macintosh, Bruce; Bauman, Brian; Saddlemyer, Les; Palmer, David; Erickson, Darren; Dorrer, Christophe; Caputa, Kris; Marois, Christian; Wallace, Kent; Griffiths, Emily; Mey, Jacob

    2009-08-01

    The Gemini Planet Imager (GPI) is a new facility instrument to be commissioned at the 8-m Gemini South telescope in early 2011. It combines several subsystems including a 1500-subaperture Extreme Adaptive Optics system, an Apodized Pupil Lyot Coronagraph, a near-infrared high-accuracy interferometric wavefront sensor, and an Integral Field Unit Spectrograph, which serves as the science instrument. GPI's main scientific goal is to detect and characterize relatively young (<2 Gyr), self-luminous planets with planet-star brightness ratios of <= 10^-7 in the near infrared. Here we present an overview of the coronagraph subsystem, which includes a pupil apodization, a hard-edged focal plane mask and a Lyot stop. We discuss design optimization, mask fabrication and testing. We describe a near-infrared testbed, which achieved broadband contrast (H-band) below 10^-6 at separations > 5λ/D, without active wavefront control (no deformable mirror). We use Fresnel propagation modeling to analyze the testbed results.

  2. Gemini Planet Imager coronagraph testbed results

    NASA Astrophysics Data System (ADS)

    Sivaramakrishnan, Anand; Soummer, Rémi; Oppenheimer, Ben R.; Carr, G. Lawrence; Mey, Jacob L.; Brenner, Doug; Mandeville, Charles W.; Zimmerman, Neil; Macintosh, Bruce A.; Graham, James R.; Saddlemyer, Les; Bauman, Brian; Carlotti, Alexis; Pueyo, Laurent; Tuthill, Peter G.; Dorrer, Christophe; Roberts, Robin; Greenbaum, Alexandra

    2010-07-01

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8m Gemini South telescope in 2011. GPI fields a 1500-channel AO system feeding an apodized pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self-luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below one part in a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  3. Gemini Planet Imager Coronagraph Testbed Results

    SciTech Connect

    Sivaramakrishnan, A.; Carr, G.; Soummer, R.; Oppenheimer, B.R.; Mey, J.L.; Brenner, D.; Mandeville, C.W.; Zimmerman, N.; Macintosh, B.A.; Graham, J.R.; Saddlemyer, L.; Bauman, B.; Carlotti, A.; Pueyo, L.; Tuthill, P.G.; Dorrer, C.; Roberts, R.; Greenbaum, A.

    2010-12-08

    The Gemini Planet Imager (GPI) is an extreme AO coronagraphic integral field unit YJHK spectrograph destined for first light on the 8m Gemini South telescope in 2011. GPI fields a 1500-channel AO system feeding an apodized pupil Lyot coronagraph, and a nIR non-common-path slow wavefront sensor. It targets detection and characterization of relatively young (<2 Gyr), self-luminous planets up to 10 million times as faint as their primary star. We present the coronagraph subsystem's in-lab performance, and describe the studies required to specify and fabricate the coronagraph. Coronagraphic pupil apodization is implemented with metallic half-tone screens on glass, and the focal plane occulters are deep reactive ion etched holes in optically polished silicon mirrors. Our JH testbed achieves H-band contrast below one part in a million at separations above 5 resolution elements, without using an AO system. We present an overview of the coronagraphic masks and our testbed coronagraphic data. We also demonstrate the performance of an astrometric and photometric grid that enables coronagraphic astrometry relative to the primary star in every exposure, a proven technique that has yielded on-sky precision of the order of a milliarcsecond.

  4. Exploring the "what if?" in geology through a RESTful open-source framework for cloud-based simulation and analysis

    NASA Astrophysics Data System (ADS)

    Klump, Jens; Robertson, Jess

    2016-04-01

    The spatial and temporal extent of geological phenomena makes experiments in geology difficult, if not entirely impossible, to conduct, and collection of data is laborious and expensive: so expensive that most of the time we cannot test a hypothesis. The aim, in many cases, is to gather enough data to build a predictive geological model. Even in a mine, where data are abundant, a model remains incomplete because the information at the level of a blasting block is two orders of magnitude larger than the sample from a drill core, and we have to take measurement errors into account. So, what confidence can we have in a model based on sparse data, uncertainties and measurement error? Our framework consists of two layers: (a) a ground-truth layer that contains geological models, which can be statistically based on historical operations data, and (b) a network of RESTful synthetic sensor microservices which can query the ground truth for underlying properties and produce a simulated measurement for a control layer, which could be a database or LIMS, a machine learner, or a company's existing data infrastructure. Ground-truth data are generated by an implicit geological model which serves as a host for nested models of geological processes at smaller scales. Our two layers are implemented using Flask and Gunicorn (an open-source Python web application framework and server, respectively), the PyData stack (numpy, scipy, etc.) and RabbitMQ (an open-source message broker). Sensor data are encoded using a JSON-LD version of the SensorML and Observations and Measurements standards. Containerisation of the synthetic sensors using Docker and CoreOS allows rapid and scalable deployment of large numbers of sensors, as well as sensor discovery to form a self-organized dynamic network of sensors. Real-time simulation of data sources can be used to investigate crucial questions such as the potential information gain from future sensing capabilities, or from new sampling strategies, or the
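    As a hedged sketch of the synthetic-sensor idea (the route, the stand-in ground-truth model, and the noise level are assumptions for illustration, not the framework's actual API), a minimal Flask microservice could look like this:

        # Minimal synthetic-sensor microservice: a Flask endpoint queries a
        # stand-in ground-truth model and returns a noisy "measurement" as a
        # simplified Observations and Measurements style record. All names
        # and values are illustrative assumptions.

        import random

        from flask import Flask, jsonify

        app = Flask(__name__)

        def ground_truth_grade(x, y, z):
            """Stand-in implicit geological model: grade as a smooth function."""
            return max(0.0, 2.5 - 0.01 * (x ** 2 + y ** 2) - 0.05 * z)

        @app.route("/sensor/<float:x>/<float:y>/<float:z>")
        def measure(x, y, z):
            observed = random.gauss(ground_truth_grade(x, y, z), 0.1)
            return jsonify({
                "observedProperty": "ore_grade",
                "result": round(observed, 3),
                "location": {"x": x, "y": y, "z": z},
            })

        if __name__ == "__main__":
            app.run(port=5000)  # in production, serve with Gunicorn instead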

  5. Expediting Experiments across Testbeds with AnyBed: A Testbed-Independent Topology Configuration System and Its Tool Set

    NASA Astrophysics Data System (ADS)

    Suzuki, Mio; Hazeyama, Hiroaki; Miyamoto, Daisuke; Miwa, Shinsuke; Kadobayashi, Youki

    Building an experimental network within a testbed has been a tiresome process for experimenters, due to the complexity of physical resource assignment and the configuration overhead. Moreover, the process cannot be expedited across testbeds, because the syntax of a configuration file varies depending on the specific hardware and software. Re-configuring an experimental topology for each testbed wastes time; at worst, an experimenter may be unable to complete the experiments within the limited lease time of a testbed. In this paper, we propose AnyBed, an experimental network-building system. The conceptual idea of AnyBed is: if experimental network topologies are portable across any kind of testbed, then building an experimental network on a testbed is expedited while experiments are manipulated by each testbed's support tools. To achieve this concept, AnyBed divides an experimental network configuration into logical and physical network topologies. By mapping these two topologies, AnyBed can build the intended logical network topology on any PC cluster. We have evaluated the AnyBed implementation using two distinct clusters. The evaluation shows a BGP topology with 150 nodes can be constructed on a large-scale testbed in less than 113 seconds.
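    A minimal sketch of the logical/physical separation at the heart of AnyBed (the data structures are invented for illustration; AnyBed's own configuration formats are not shown in the abstract):

        # Illustration of the AnyBed idea: keep the logical topology
        # testbed-independent and bind its nodes to whatever physical
        # cluster is available. Formats are invented, not AnyBed's.

        LOGICAL_TOPOLOGY = {
            "nodes": ["as1", "as2", "as3"],
            "links": [("as1", "as2"), ("as2", "as3")],
        }

        PHYSICAL_CLUSTER = ["pc01.testbed-a", "pc02.testbed-a", "pc03.testbed-a"]

        def map_topology(logical, machines):
            """Bind each logical node to a physical machine and derive links."""
            if len(logical["nodes"]) > len(machines):
                raise ValueError("not enough physical machines for this topology")
            binding = dict(zip(logical["nodes"], machines))
            links = [(binding[a], binding[b]) for a, b in logical["links"]]
            return binding, links

        binding, links = map_topology(LOGICAL_TOPOLOGY, PHYSICAL_CLUSTER)
        print(binding)  # {'as1': 'pc01.testbed-a', ...}
        print(links)    # physical link list to configure on this cluster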

  6. easyGWAS: A Cloud-Based Platform for Comparing the Results of Genome-Wide Association Studies

    PubMed Central

    Roqueiro, Damian; Salomé, Patrice A.; Kleeberger, Stefan; Zhu, Wangsheng; Lippert, Christoph; Stegle, Oliver; Schölkopf, Bernhard

    2017-01-01

    The ever-growing availability of high-quality genotypes for a multitude of species has enabled researchers to explore the underlying genetic architecture of complex phenotypes at an unprecedented level of detail using genome-wide association studies (GWAS). The systematic comparison of results obtained from GWAS of different traits opens up new possibilities, including the analysis of pleiotropic effects. Other advantages that result from the integration of multiple GWAS are the ability to replicate GWAS signals and to increase statistical power to detect such signals through meta-analyses. In order to facilitate the simple comparison of GWAS results, we present easyGWAS, a powerful, species-independent online resource for computing, storing, sharing, annotating, and comparing GWAS. The easyGWAS tool supports multiple species, the uploading of private genotype data and summary statistics of existing GWAS, as well as advanced methods for comparing GWAS results across different experiments and data sets in an interactive and user-friendly interface. easyGWAS is also a public data repository for GWAS data and summary statistics and already includes published data and results from several major GWAS. We demonstrate the potential of easyGWAS with a case study of the model organism Arabidopsis thaliana, using flowering and growth-related traits. PMID:27986896

  7. A numerical testbed for the characterization and optimization of aerosol remote sensing

    NASA Astrophysics Data System (ADS)

    Wang, J.; Xu, X.; Ding, S.; Zeng, J.; Spurr, R. J.; Liu, X.; Chance, K.; Holben, B. N.; Dubovik, O.; Mishchenko, M. I.

    2013-12-01

    Remote sensing of aerosols from satellite and ground-based platforms provides key datasets to help understand the effect of airborne particulates on air quality, visibility, surface temperature, clouds, and precipitation. However, global measurements of aerosol parameters have only been generated in the last decade or so, with the advent of dedicated low-earth-orbit sun-synchronous satellite sensors such as those of NASA's Earth Observation System (EOS). Many EOS sensors are now past their design lifetimes. Meanwhile, a number of aerosol-related satellite missions are planned for the future, and several of these will have measurements of polarization. A common question often arises: how can a sensor be optimally configured (in terms of spectral wavelength ranges, viewing angles, and measurement quantities such as radiance and polarization) to best fulfill the scientific requirements within the mission's budget constraints? To address these kinds of questions in a cost-effective manner, a numerical testbed for aerosol remote sensing is an important requirement. This testbed is a tool that can generate an objective assessment of the aerosol information content anticipated from any (planned or real) instrument configuration. Here, we present a numerical testbed that combines inverse optimal estimation theory with a forward model containing linearized particle scattering and radiative transfer code. Specifically, the testbed comprises the following components: (1) a linearized vector radiative transfer model that computes the Stokes 4-vector elements and their sensitivities (Jacobians) with respect to the aerosol single scattering parameters at each layer and over the column; (2) linearized Mie and T-matrix electromagnetic scattering codes to compute the macroscopic aerosol single scattering optical properties and their sensitivities with respect to refractive index, size, and shape; (3) a linearized land surface model that uses the Lambertian, Ross-Thick, and Li
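    In the optimal-estimation framework referenced above, a standard information-content diagnostic is the degrees of freedom for signal: the trace of the averaging kernel A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K, built from the Jacobian K, the measurement-noise covariance Se, and the prior covariance Sa. The generic numpy sketch below uses synthetic matrices, not the testbed's code.

        # Generic degrees-of-freedom-for-signal (DFS) computation from a
        # Jacobian K, noise covariance Se, and prior covariance Sa.
        # All matrices here are synthetic placeholders.

        import numpy as np

        rng = np.random.default_rng(0)
        n_obs, n_state = 20, 5  # e.g. 20 channels/angles, 5 aerosol parameters

        K = rng.normal(size=(n_obs, n_state))  # d(signal)/d(state)
        Se_inv = np.eye(n_obs) / 0.01 ** 2     # 1% measurement noise
        Sa_inv = np.eye(n_state) / 0.5 ** 2    # 50% prior uncertainty

        # Averaging kernel A = (K^T Se^-1 K + Sa^-1)^-1 K^T Se^-1 K
        gain = np.linalg.solve(K.T @ Se_inv @ K + Sa_inv, K.T @ Se_inv)
        A = gain @ K
        print(f"DFS = {np.trace(A):.2f} of {n_state} state elements")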

  8. Search Cloud

    MedlinePlus

    MedlinePlus search cloud page (https://medlineplus.gov/cloud.html); the search cloud can be shared and embedded in other sites.

  9. The NASA LeRC regenerative fuel cell system testbed program for government and commercial applications

    NASA Astrophysics Data System (ADS)

    Maloney, Thomas M.; Prokopius, Paul R.; Voecks, Gerald E.

    1995-01-01

    The Electrochemical Technology Branch of the NASA Lewis Research Center (LeRC) has initiated a program to develop a renewable energy system testbed to evaluate, characterize, and demonstrate a fully integrated regenerative fuel cell (RFC) system for space, military, and commercial applications. A multi-agency management team, led by NASA LeRC, is implementing the program through a unique international coalition which encompasses both government and industry participants. This open-ended teaming strategy optimizes the development of space, military, and commercial RFC system technologies. Program activities to date include system design and analysis, and reactant storage sub-system design, with a major emphasis centered upon testbed fabrication and installation and the testing of two key RFC system components, namely, the fuel cells and electrolyzers. Construction of the LeRC 25 kW RFC system testbed at the NASA Jet Propulsion Laboratory (JPL) facility at Edwards Air Force Base (EAFB) is nearly complete and some sub-system components have already been installed. Furthermore, planning for the first commercial RFC system demonstration is underway.

  10. Development of a Scalable Testbed for Mobile Olfaction Verification.

    PubMed

    Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-12-09

    The lack of information on ground truth gas dispersion and experiment verification has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2-h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments.

  11. Development of a Scalable Testbed for Mobile Olfaction Verification

    PubMed Central

    Syed Zakaria, Syed Muhammad Mamduh; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Ali Yeon, Ahmad Shakaff; Md. Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah

    2015-01-01

    The lack of information on ground truth gas dispersion and experiment verification has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of a gas plume in a 2-h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments. PMID:26690175
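    The construction of the time-dependent concentration maps is only summarized above; as a hedged sketch (the grid pitch and readings are invented; only the 3 m × 6 m footprint comes from the abstract), a snapshot map can be queried by nearest-sensor lookup:

        # Sketch: query a snapshot of a 72-sensor grid (6 x 12 sensors over
        # the 3 m x 6 m testbed) by nearest-sensor lookup. Geometry and
        # readings are illustrative, not the testbed's software.

        import numpy as np

        sensor_x, sensor_y = np.meshgrid(np.linspace(0.0, 6.0, 12),
                                         np.linspace(0.0, 3.0, 6))
        # Fake plume centred at (2.0 m, 1.5 m) standing in for real readings.
        readings = np.exp(-((sensor_x - 2.0) ** 2 + (sensor_y - 1.5) ** 2))

        def concentration_at(x, y):
            """Nearest-sensor estimate of concentration at a query point."""
            d2 = (sensor_x - x) ** 2 + (sensor_y - y) ** 2
            i, j = np.unravel_index(np.argmin(d2), d2.shape)
            return readings[i, j]

        print(f"C(2.3, 1.4) ~= {concentration_at(2.3, 1.4):.3f}")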

  12. The Goddard Space Flight Center (GSFC) robotics technology testbed

    NASA Technical Reports Server (NTRS)

    Schnurr, Rick; Obrien, Maureen; Cofer, Sue

    1989-01-01

    Much of the technology planned for use in NASA's Flight Telerobotic Servicer (FTS) and the Demonstration Test Flight (DTF) is relatively new and untested. To provide the answers needed to design safe, reliable, and fully functional robotics for flight, NASA/GSFC is developing a robotics technology testbed for research on issues such as zero-g robot control, dual-arm teleoperation, simulations, and hierarchical control using a high-level programming language. The testbed will be used to investigate these high-risk technologies required for the FTS and DTF projects. The robotics technology testbed is centered around the dual-arm teleoperation of a pair of 7-degree-of-freedom (DOF) manipulators, each with its own 6-DOF mini-master hand controller. Several levels of safety are implemented using the control processor, a separate watchdog computer, and other low-level features. High-speed input/output ports allow the control processor to interface to a simulation workstation: all or part of the testbed hardware can be used in real-time dynamic simulation of testbed operations, allowing a quick and safe means of testing new control strategies. The NASA/National Bureau of Standards Standard Reference Model for Telerobot Control System Architecture (NASREM) hierarchical control scheme is being used as the reference standard for system design. All software developed for the testbed, excluding some of the simulation workstation software, is being developed in Ada. The testbed is being developed in phases. The first phase, which is nearing completion, is described, and future developments are highlighted.

  13. Technology Developments Integrating a Space Network Communications Testbed

    NASA Technical Reports Server (NTRS)

    Kwong, Winston; Jennings, Esther; Clare, Loren; Leang, Dee

    2006-01-01

    As future manned and robotic space exploration missions involve more complex systems, it is essential to verify, validate, and optimize such systems through simulation and emulation in a low-cost testbed environment. The goal of such a testbed is to perform detailed testing of advanced space and ground communications networks, technologies, and client applications that are essential for future space exploration missions. We describe the development of new technologies enhancing our Multi-mission Advanced Communications Hybrid Environment for Test and Evaluation (MACHETE) that enable its integration in a distributed space communications testbed. MACHETE combines orbital modeling, link analysis, and protocol and service modeling to quantify system performance based on comprehensive consideration of different aspects of space missions. It can simulate entire networks and can interface with external (testbed) systems. The key technology developments enabling the integration of MACHETE into a distributed testbed are the Monitor and Control module and the QualNet IP Network Emulator module. Specifically, the Monitor and Control module establishes a standard interface mechanism to centralize the management of each testbed component. The QualNet IP Network Emulator module allows externally generated network traffic to be passed through MACHETE to experience simulated network behaviors such as propagation delay, data loss, orbital effects, and other communications characteristics, including entire network behaviors. We report a successful integration of MACHETE with a space communication testbed modeling a lunar exploration scenario. This document comprises the viewgraph slides of the presentation.
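
    As a toy illustration of the emulation idea above (externally generated traffic experiencing simulated link behaviors), the sketch below delays or drops packets according to assumed channel parameters. It is not MACHETE or QualNet code; the 1.28 s figure merely approximates one-way Earth-Moon light time.

      import random
      import time

      def emulate_link(packet, delay_s=1.28, loss_prob=0.01):
          """Pass one packet through an emulated lunar link (parameters assumed)."""
          if random.random() < loss_prob:
              return None           # packet lost on the emulated channel
          time.sleep(delay_s)       # propagation delay before delivery
          return packet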

  14. Custom data support for the FAst-physics System Testbed and Research (FASTER) Project

    SciTech Connect

    Toto, T.; Jensen, M.; Vogelmann, A.; Wagener, R.; Liu, Y.; Lin, W.

    2010-03-15

    The multi-institution FAst-physics System Testbed and Research (FASTER) project, funded by the DOE Earth System Modeling program, aims to evaluate and improve the parameterizations of fast processes (those involving clouds, precipitation, and aerosols) in global climate models, using a combination of numerical prediction models, single-column models, cloud-resolving models, large-eddy simulations, full global climate model output, and ARM active and passive remote sensing and in-situ data. This poster presents the Custom Data Support effort for the FASTER project. The effort will provide tailored datasets, statistics, best estimates, and quality-control data, as needed and defined by FASTER participants, for use in evaluating and improving parameterizations of fast processes in GCMs. The data support will include custom gridding and averaging, for the model of interest, using high-time-resolution and pixel-level data from continuous ARM observations and complementary datasets. In addition to the FASTER team, these datasets will be made available to the ARM Science Team. Initial efforts with respect to data product development, priorities, availability, and distribution are summarized here, with an emphasis on cloud, atmospheric state, and aerosol properties as observed during the Spring 2000 Cloud IOP and the Spring 2003 Aerosol IOP at the ARM Southern Great Plains site.
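
    The custom gridding and averaging described above can be pictured with a short pandas sketch: high-rate observations are averaged onto a coarser model time step, with sample counts kept for quality control. The column handling and the 30-minute step are assumptions for illustration.

      import pandas as pd

      def to_model_grid(obs, step="30min"):
          """obs: time-indexed DataFrame of high-rate observations
          (e.g., cloud fraction, liquid water path); returns means on the
          model time step plus a sample count for quality control."""
          grouped = obs.resample(step)
          out = grouped.mean()
          out["n_samples"] = grouped.size()
          return out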

  15. Construction and Modeling of a Controls Testbed

    NASA Technical Reports Server (NTRS)

    Nagle, James C.; Homaifar, Abdollah; Nasser, Ahmed A.; Bikdash, Marwan

    1997-01-01

    This paper describes the construction and modeling of a control system testbed to be used for the comparison of various control methodologies. We specifically wish to test fuzzy logic control and compare the performance of various fuzzy controllers, including Hybrid Fuzzy-PID (HFPID) and Hierarchical Hybrid Fuzzy-PID (HHFPID), to other controllers including localized rate feedback, LQR/LTR, and H2/H-infinity. The control problem is that of vibration suppression in a thin plate, with inputs coming from accelerometers and outputs going to piezoelectric actuators or 'patches'. A model based on an experimental modal analysis of the plate is constructed and compared with an analytical model. The analytical model uses a boundary condition which is a mix of clamped and simply supported.

  16. Supersonic combustion engine testbed, heat lightning

    NASA Technical Reports Server (NTRS)

    Hoying, D.; Kelble, C.; Langenbahn, A.; Stahl, M.; Tincher, M.; Walsh, M.; Wisler, S.

    1990-01-01

    The design of a supersonic combustion engine testbed (SCET) aircraft is presented. The hypersonic waverider will utilize both supersonic combustion ramjet (SCRAMjet) and turbofan-ramjet engines. The waverider concept, system integration, electrical power, weight analysis, cockpit, landing skids, and configuration modeling are addressed in the configuration considerations. The subsonic, supersonic and hypersonic aerodynamics are presented along with the aerodynamic stability and landing analysis of the aircraft. The propulsion design considerations include: engine selection, turbofan ramjet inlets, SCRAMjet inlets and the SCRAMjet diffuser. The cooling requirements and system are covered along with the topics of materials and the hydrogen fuel tanks and insulation system. A cost analysis is presented and the appendices include: information about the subsonic wind tunnel test, shock expansion calculations, and an aerodynamic heat flux program.

  17. Aerodynamic design of the National Rotor Testbed.

    SciTech Connect

    Kelley, Christopher Lee

    2015-10-01

    A new wind turbine blade has been designed for the National Rotor Testbed (NRT) project and for future experiments at the Scaled Wind Farm Technology (SWiFT) facility, with a specific focus on scaled wakes. This report shows the aerodynamic design of new blades that can produce a wake that has similitude to utility-scale blades despite the difference in size and location in the atmospheric boundary layer. The dimensionless quantities of circulation, induction, thrust coefficient, and tip-speed ratio were kept equal between rotor scales in region 2 of operation. The new NRT design matched the aerodynamic quantities of the most common wind turbine in the United States, the GE 1.5sle turbine with 37c model blades. The NRT blade design is presented along with its performance subject to the winds at SWiFT. The design requirements determined by the SWiFT experimental test campaign are shown to be met.
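
    For reference, two of the dimensionless quantities named above have standard definitions (rotor speed \Omega, radius R, wind speed U, thrust T, air density \rho, rotor area A); these are the textbook forms, not necessarily the exact expressions used in the report:

      \lambda = \frac{\Omega R}{U}, \qquad
      C_T = \frac{T}{\tfrac{1}{2} \rho U^{2} A}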

  18. Introduction to the computational structural mechanics testbed

    NASA Technical Reports Server (NTRS)

    Lotts, C. G.; Greene, W. H.; Mccleary, S. L.; Knight, N. F., Jr.; Paulson, S. S.; Gillian, R. E.

    1987-01-01

    The Computational Structural Mechanics (CSM) testbed software system, based on the SPAR finite element code and the NICE system, is described. This software is denoted NICE/SPAR. NICE was developed at Lockheed Palo Alto Research Laboratory and contains data management utilities, a command language interpreter, and a command language definition for integrating engineering computational modules. SPAR is a system of programs used for finite element structural analysis developed for NASA by Lockheed and Engineering Information Systems, Inc. It includes many complementary structural analysis, thermal analysis, and utility functions which communicate through a common database. The work on NICE/SPAR was motivated by requirements for a highly modular and flexible structural analysis system to use as a tool in carrying out research in computational methods and in exploring computer hardware. Analysis examples are presented which demonstrate the benefits gained from a combination of the NICE command language with SPAR computational modules.

  19. A land-surface Testbed for EOSDIS

    NASA Technical Reports Server (NTRS)

    Emery, William; Kelley, Tim

    1994-01-01

    The main objective of the Testbed project was to deliver satellite images via the Internet to scientific and educational users free of charge. The main method of operations was to store satellite images on a low-cost tape library system, visually browse the raw satellite data, access the raw data files, navigate the imagery through 'C' programming and X-Windows interface software, and deliver the finished image to the end user over the Internet by means of file transfer protocol methods. The conclusion is that the distribution of satellite imagery by means of the Internet is feasible, and that the archiving of large data sets can be accomplished with low-cost storage systems that allow multiple users.

  20. JINR cloud infrastructure evolution

    NASA Astrophysics Data System (ADS)

    Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.

    2016-09-01

    To fulfil JINR commitments in different national and international projects related to the use of modern information technologies, such as cloud and grid computing, and to provide a modern tool for JINR users for their scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to fit local needs: a web form in the cloud web-interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover increasing users' needs in capacity, availability, and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.

  1. Development of Hardware-in-the-loop Microgrid Testbed

    SciTech Connect

    Xiao, Bailu; Prabakar, Kumaraguru; Starke, Michael R; Liu, Guodong; Dowling, Kevin; Ollis, T Ben; Irminger, Philip; Xu, Yan; Dimitrovski, Aleksandar D

    2015-01-01

    A hardware-in-the-loop (HIL) microgrid testbed for the evaluation and assessment of microgrid operation and control systems is presented in this paper. The HIL testbed is composed of a real-time digital simulator (RTDS) for modeling of the microgrid, multiple NI CompactRIOs for device-level control, a prototype microgrid energy management system (MicroEMS), and a relay protection system. The applied communication-assisted hybrid control system is also discussed. Results of function testing of the HIL controller, communication, and the relay protection system are presented to show the effectiveness of the proposed HIL microgrid testbed.

  2. Event metadata records as a testbed for scalable data mining

    NASA Astrophysics Data System (ADS)

    van Gemmeren, P.; Malon, D.

    2010-04-01

    At a data rate of 200 hertz, event metadata records ("TAGs," in ATLAS parlance) provide fertile grounds for development and evaluation of tools for scalable data mining. It is easy, of course, to apply HEP-specific selection or classification rules to event records and to label such an exercise "data mining," but our interest is different. Advanced statistical methods and tools such as classification, association rule mining, and cluster analysis are common outside the high energy physics community. These tools can prove useful, not for discovery physics, but for learning about our data, our detector, and our software. A fixed and relatively simple schema makes TAG export to other storage technologies such as HDF5 straightforward. This simplifies the task of exploiting very-large-scale parallel platforms such as Argonne National Laboratory's BlueGene/P, currently the largest supercomputer in the world for open science, in the development of scalable tools for data mining. Using a domain-neutral scientific data format may also enable us to take advantage of existing data mining components from other communities. There is, further, a substantial literature on the topic of one-pass algorithms and stream mining techniques, and such tools may be inserted naturally at various points in the event data processing and distribution chain. This paper describes early experience with event metadata records from ATLAS simulation and commissioning as a testbed for scalable data mining tool development and evaluation.
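
    The fixed, simple schema makes the HDF5 export mentioned above easy to picture; the hedged h5py sketch below uses invented field names standing in for real TAG attributes.

      import h5py
      import numpy as np

      # Invented stand-in for a fixed-schema TAG record.
      tag_dtype = np.dtype([("run", np.int32), ("event", np.int64),
                            ("n_jets", np.int16), ("missing_et", np.float32)])
      tags = np.zeros(1000, dtype=tag_dtype)    # placeholder records

      with h5py.File("tags.h5", "w") as f:
          # Chunked, compressed dataset ready for parallel data-mining tools.
          f.create_dataset("tags", data=tags, chunks=True, compression="gzip")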

  3. Extraction of Profile Information from Cloud Contaminated Radiances. Appendixes 2

    NASA Technical Reports Server (NTRS)

    Smith, W. L.; Zhou, D. K.; Huang, H.-L.; Li, Jun; Liu, X.; Larar, A. M.

    2003-01-01

    Clouds act to reduce the signal level and may produce noise, depending on the complexity of the cloud properties and the manner in which they are treated in the profile retrieval process. There are essentially three ways to extract profile information from cloud contaminated radiances: (1) cloud-clearing using spatially adjacent cloud contaminated radiance measurements, (2) retrieval based upon the assumption of opaque cloud conditions, and (3) retrieval or radiance assimilation using a physically correct cloud radiative transfer model which accounts for the absorption and scattering of the radiance observed. Cloud clearing extracts the radiance arising from the clear-air portion of partly clouded fields of view, permitting soundings to the surface or the assimilation of radiances as in the clear field-of-view case. However, the accuracy of the clear-air radiance signal depends upon the uniformity of cloud height and optical properties across the two fields of view used in the cloud clearing process. The assumption of opaque clouds within the field of view permits relatively accurate profiles to be retrieved down to near cloud-top levels, the accuracy near the cloud-top level being dependent upon the actual microphysical properties of the cloud. The use of a physically correct cloud radiative transfer model enables accurate retrievals down to cloud-top levels and below semi-transparent cloud layers (e.g., cirrus). It should also be possible to assimilate cloudy radiances directly into the model given a physically correct cloud radiative transfer model, using geometric and microphysical cloud parameters retrieved from the radiance spectra as initial cloud variables in the radiance assimilation process. This presentation reviews the above three ways to extract profile information from cloud contaminated radiances. NPOESS Airborne Sounder Testbed-Interferometer radiance spectra and Aqua satellite AIRS radiance spectra are used to illustrate how cloudy radiances can be used
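
    Option (1) is often formulated as the classical N* method, shown here for illustration: two adjacent fields of view are assumed to share the same clear-column and cloudy radiances, differing only in cloud fraction f_1 < f_2. With N* = f_1 / f_2, the clear-column radiance follows as

      R_{\mathrm{clear}} = \frac{R_1 - N^{*} R_2}{1 - N^{*}}

    which makes the stated sensitivity explicit: any difference in cloud height or optical properties between the two fields of view violates the shared-cloud assumption and biases the recovered clear-column radiance.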

  4. Survalytics: An Open-Source Cloud-Integrated Experience Sampling, Survey, and Analytics and Metadata Collection Module for Android Operating System Apps

    PubMed Central

    Mackey, Sean

    2016-01-01

    Background We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Objective Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Methods Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. Results The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. Conclusions The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes. PMID:27261155
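
    The storage pattern described in the Methods (survey responses as items in a DynamoDB table) might look like the boto3 sketch below; the table name and attribute schema are assumptions for illustration, not Survalytics' actual layout.

      import boto3

      dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
      table = dynamodb.Table("SurveyResponses")   # hypothetical table name

      # One experience-sampling response as a single item (attributes assumed).
      table.put_item(Item={
          "user_id": "anon-1234",                 # partition key (assumed)
          "timestamp": "2016-01-01T12:00:00Z",    # sort key (assumed)
          "question_id": "glucose_daily",
          "response": "5.8 mmol/L",
      })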

  5. The Living With a Star Space Environment Testbed Payload

    NASA Technical Reports Server (NTRS)

    Xapsos, Mike

    2015-01-01

    This presentation outlines a brief description of the Living With a Star (LWS) Program missions and detailed information about the Space Environment Testbed (SET) payload consisting of a space weather monitor and carrier containing 4 board experiments.

  6. The telerobot testbed: An architecture for remote servicing

    NASA Technical Reports Server (NTRS)

    Matijevic, J. R.

    1990-01-01

    The NASA/OAST Telerobot Testbed will reach its next increment in development by the end of FY-89. The testbed will have the capability for: force reflection in teleoperation, shared control, traded control, operator designate and relative update. These five capabilities will be shown in a module release and exchange operation using mockups of Orbital Replacement Units (ORU). This development of the testbed shows examples of the technologies needed for remote servicing, particularly under conditions of delay in transmissions to the servicing site. Here, the following topics are presented: the system architecture of the testbed which incorporates these telerobotic technologies for servicing, the implementation of the five capabilities and the operation of the ORU mockups.

  7. Situational descriptions of behavioral procedures: the in situ testbed.

    PubMed Central

    Kemp, S M; Eckerman, D A

    2001-01-01

    We demonstrate the In Situ testbed, a system that aids in evaluating computational models of learning, including artificial neural networks. The testbed models contingencies of reinforcement using an extension of Mechner's (1959) notational system for the description of behavioral procedures. These contingencies are input to the model under test. The model's output is displayed as cumulative records. The cumulative record can then be compared to one produced by a pigeon exposed to the same contingencies. The testbed is tried with three published models of learning. Each model is exposed to up to three reinforcement schedules (testing ends when the model does not produce acceptable cumulative records): continuous reinforcement and extinction, fixed ratio, and fixed interval. The In Situ testbed appears to be a reliable and valid testing procedure for comparing models of learning. PMID:11394484

  8. CT-directed robotic biopsy testbed: motivation and concept

    NASA Astrophysics Data System (ADS)

    Cleary, Kevin R.; Stoianovici, Dan S.; Glossop, Neil D.; Gary, Kevin A.; Onda, Sumiyo; Cody, Richard; Lindisch, David; Stanimir, Alexandru; Mazilu, Dumitru; Patriciu, Alexandru; Watson, Vance; Levy, Elliot

    2001-05-01

    As a demonstration platform, we are developing a robotic biopsy testbed incorporating a mobile CT scanner, a small needle driver robot, and an optical localizer. This testbed will be used to compare robotically assisted biopsy to the current manual technique, and allow us to investigate software architectures for integrating multiple medical devices. This is a collaboration between engineers and physicians from three universities and a commercial vendor. In this paper we describe the CT-directed biopsy technique, review some other biopsy systems including passive and semi- autonomous devices, describe our testbed components, and present our software architecture. This testbed is a first step in developing the image-guided, robotically assisted, physician directed, biopsy systems of the future.

  9. The computational structural mechanics testbed data library description

    NASA Technical Reports Server (NTRS)

    Stewart, Caroline B. (Compiler)

    1988-01-01

    The datasets created and used by the Computational Structural Mechanics Testbed software system are documented by this manual. A description of each dataset including its form, contents, and organization is presented.

  11. Experimental demonstration of multi-dimensional resources integration for service provisioning in cloud radio over fiber network

    NASA Astrophysics Data System (ADS)

    Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young

    2016-07-01

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, optical network, and processing-unit cloud have been decoupled from each other, so that their resources are controlled independently. With the growing number of mobile internet users, traditional architecture cannot implement the resource optimization and scheduling needed for high-level service guarantees, owing to the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical network, and processing resources effectively to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy traffic load scenario is also quantitatively evaluated to demonstrate the efficiency of the proposal based on the MDRI architecture, in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes.
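
    The auxiliary-graph idea can be made concrete with a small sketch: nodes from the radio, optical, and processing strata are joined in one graph whose edge weights encode cross-stratum costs, so provisioning reduces to a shortest-path query. The topology, names, and weights below are invented.

      import networkx as nx

      g = nx.Graph()
      # Edge weights stand in for combined resource-utilization costs (assumed).
      g.add_edge("RRH1", "OXC1", weight=0.5)       # radio -> optical
      g.add_edge("OXC1", "BBU-pool", weight=0.7)   # optical -> processing
      g.add_edge("RRH1", "BBU-pool", weight=1.5)   # direct (costlier) option

      path = nx.shortest_path(g, "RRH1", "BBU-pool", weight="weight")
      print(path)   # the auxiliary-graph route selected for the demand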

  13. Phoenix Missile Hypersonic Testbed (PMHT): System Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    A viewgraph presentation of the Phoenix Missile Hypersonic Testbed (PMHT) is shown. The contents include: 1) Need and Goals; 2) Phoenix Missile Hypersonic Testbed; 3) PMHT Concept; 4) Development Objectives; 5) Possible Research Payloads; 6) Possible Research Program Participants; 7) PMHT Configuration; 8) AIM-54 Internal Hardware Schematic; 9) PMHT Configuration; 10) New Guidance and Armament Section Profiles; 11) Nomenclature; 12) PMHT Stack; 13) Systems Concept; 14) PMHT Preflight Activities; 15) Notional Ground Path; and 16) Sample Theoretical Trajectories.

  14. X.509 Authentication/Authorization in FermiCloud

    SciTech Connect

    Kim, Hyunwoo; Timm, Steven

    2014-11-11

    We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.

  15. Ames life science telescience testbed evaluation

    NASA Technical Reports Server (NTRS)

    Haines, Richard F.; Johnson, Vicki; Vogelsong, Kristofer H.; Froloff, Walt

    1989-01-01

    Eight surrogate spaceflight mission specialists participated in a real-time evaluation of remote coaching using the Ames Life Science Telescience Testbed facility. This facility consisted of three remotely located nodes: (1) a prototype Space Station glovebox; (2) a ground control station; and (3) a principal investigator's (PI) work area. The major objective of this project was to evaluate the effectiveness of telescience techniques and hardware to support three realistic remote coaching science procedures: plant seed germinator charging, plant sample acquisition and preservation, and remote plant observation with ground coaching. Each scenario was performed by a subject acting as flight mission specialist, interacting with a payload operations manager and a principal investigator expert. All three groups were physically isolated from each other yet linked by duplex audio and color video communication channels and networked computer workstations. Workload ratings were made by the flight and ground crewpersons immediately after completing their assigned tasks. Time to complete each scientific procedural step was recorded automatically. Two expert observers also made performance ratings and various error assessments. The results are presented and discussed.

  16. Optical testbed for the LISA phasemeter

    NASA Astrophysics Data System (ADS)

    Schwarze, T. S.; Fernández Barranco, G.; Penkert, D.; Gerberding, O.; Heinzel, G.; Danzmann, K.

    2016-05-01

    The planned spaceborne gravitational wave detector LISA will allow the detection of gravitational waves at frequencies between 0.1 mHz and 1 Hz. A breadboard model for the metrology system, a.k.a. the phasemeter, was developed in the scope of an ESA technology development project by a collaboration between the Albert Einstein Institute, the Technical University of Denmark, and the Danish industry partner Axcon ApS. In particular, it provides the electronic readout of the main interferometer phases, besides auxiliary functions. These include clock noise transfer, ADC pilot tone correction, inter-satellite ranging, and data transfer. Besides LISA, the phasemeter can also be applied in future satellite geodesy missions. Here we show the planning and advances in the implementation of an optical testbed for the full metrology chain. It is based on an ultra-stable hexagonal optical bench. This bench allows the generation of three unequal heterodyne beatnotes with a zero phase combination, thus providing the possibility to probe the phase readout for non-linearities in an optical three-signal test. Additionally, the utilization of three independent phasemeters will allow the testing of the auxiliary functions. Once working, components can individually be replaced with flight-qualified hardware in this setup.
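
    The zero phase combination behind the three-signal test can be stated compactly: the three pairwise beatnotes between lasers 1, 2, and 3 sum to zero by construction, so any residual in the measured sum exposes readout non-linearity (notation assumed for illustration):

      \varphi_{12} + \varphi_{23} + \varphi_{31} = 0, \qquad
      \varepsilon \equiv \hat{\varphi}_{12} + \hat{\varphi}_{23} + \hat{\varphi}_{31} \neq 0
      \;\Rightarrow\; \text{readout non-linearity}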

  17. ooi: OpenStack OCCI interface

    NASA Astrophysics Data System (ADS)

    López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo

    In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.

  18. Cloud regimes as phase transitions

    NASA Astrophysics Data System (ADS)

    Stechmann, Samuel N.; Hottovy, Scott

    2016-06-01

    Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes -- open versus closed cells -- fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process. With this new conceptual viewpoint, ideas from statistical mechanics could potentially be used for understanding uncertainties related to clouds in the climate system and climate predictions.
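
    One minimal instance of such a stochastic diffusion process is an Ornstein-Uhlenbeck form, in which atmospheric water q relaxes toward a base state q_0 on a timescale \tau under white-noise forcing; this illustrative equation is a generic stand-in, not necessarily the authors' exact model:

      dq_t = -\frac{q_t - q_0}{\tau}\,dt + \sigma\,dW_t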

  19. Advanced turboprop testbed systems study. Volume 1: Testbed program objectives and priorities, drive system and aircraft design studies, evaluation and recommendations and wind tunnel test plans

    NASA Technical Reports Server (NTRS)

    Bradley, E. S.; Little, B. H.; Warnock, W.; Jenness, C. M.; Wilson, J. M.; Powell, C. W.; Shoaf, L.

    1982-01-01

    The establishment of propfan technology readiness was addressed, and candidate drive systems for propfan application were identified. Candidate aircraft were investigated for testbed suitability, and four were selected as possible propfan testbed vehicles. An evaluation of the four candidates was performed, and the Boeing KC-135A and the Gulfstream American Gulfstream II were recommended as the most suitable aircraft for test application. Conceptual designs of the two recommended aircraft were performed, and cost and schedule data for the entire testbed program were generated. The program total cost was estimated, and a wind tunnel program cost and schedule were generated in support of the testbed program.

  20. Cloud Computing for DoD

    DTIC Science & Technology

    2012-05-01

    NASA Nebula Platform: a cloud computing pilot program at NASA Ames that integrates open-source components into a seamless, self... Mission support. Education and public outreach (NASA Nebula, 2010). NSF Supported Cloud Research: support for cloud computing in... References: Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145; NASA Nebula (2010). Retrieved from

  1. Ensemble forecasting for a hydrological testbed

    NASA Astrophysics Data System (ADS)

    Jankov, Isidora; Albers, Steve; Wharton, Linda; Tollerud, Ed; Yuan, Huiling; Toth, Zoltan

    2010-05-01

    Significant precipitation events in California during the winter season are often caused by land-falling "atmospheric rivers" associated with extratropical cyclones from the Pacific Ocean. Atmospheric rivers are narrow, elongated plumes of enhanced water vapor transport over the Pacific and Atlantic oceans that can extend from the tropics and subtropics into the extratropics. Large values of integrated water vapor are advected within the warm sector of extratropical cyclones immediately ahead of polar cold fronts, although these vapor plumes can originate in the tropics beyond the cyclone warm sector. When an atmospheric river makes landfall on the coast of California, the northwest-to-southeast orientation of the Sierra Mountain chain exerts orographic forcing on the southwesterly low-level flow in the warm sector of approaching extratropical cyclones. As a result, sustained precipitation is typically enhanced and modified by the complex terrain. This has major hydrological consequences. The National Oceanic and Atmospheric Administration (NOAA) has established the Hydrometeorological Testbed (HMT) to design and support a series of field and numerical modeling experiments to better understand and forecast precipitation in the Central Valley. The main role of the Forecast Application Branch (NOAA/ESRL/GSD) in HMT has been in supporting real-time numerical forecasts as well as research activities targeting better understanding and improvement of Quantitative Precipitation Forecasts (QPF). For this purpose, an ensemble modeling system has been developed. The ensemble system consists of mixed dynamic cores, mixed physics, and mixed lateral boundary conditions. Performance evaluation results for this system will be presented at the conference.

  2. Optimization Testbed Cometboards Extended into Stochastic Domain

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Pai, Shantaram S.; Coroneos, Rula M.; Patnaik, Surya N.

    2010-01-01

    COMparative Evaluation Testbed of Optimization and Analysis Routines for the Design of Structures (CometBoards) is multidisciplinary design optimization software. It was originally developed for deterministic calculation and has now been extended into the stochastic domain for structural design problems. For deterministic problems, CometBoards is introduced through its subproblem solution strategy as well as the approximation concept in optimization. In the stochastic domain, a design is formulated as a function of the risk or reliability. The optimum solution, including the weight of a structure, is also obtained as a function of reliability. Weight versus reliability traces out an inverted-S-shaped graph. The center of the graph corresponds to a 50 percent probability of success, or one failure in two samples. A heavy design with weight approaching infinity could be produced for a near-zero rate of failure, which corresponds to a reliability of unity. Weight can be reduced to a small value for the most failure-prone design, with a compromised reliability approaching zero. The stochastic design optimization (SDO) capability for an industrial problem was obtained by combining three codes: the MSC/Nastran code was the deterministic analysis tool, the fast probabilistic integrator (the FPI module of the NESSUS software) was the probabilistic calculator, and CometBoards became the optimizer. The SDO capability requires a finite element structural model, a material model, a load model, and a design model. The stochastic optimization concept is illustrated considering an academic example and a real-life airframe component made of metallic and composite materials.

  3. A Testbed for Evaluating Lunar Habitat Autonomy Architectures

    NASA Technical Reports Server (NTRS)

    Lawler, Dennis G.

    2008-01-01

    A lunar outpost will involve a habitat with an integrated set of hardware and software that will maintain a safe environment for human activities. There is a desire for a paradigm shift whereby crew will be the primary mission operators, not ground controllers. There will also be significant periods when the outpost is uncrewed. This will require that significant automation software be resident in the habitat to maintain all system functions and respond to faults. JSC is developing a testbed to allow for early testing and evaluation of different autonomy architectures. This will allow evaluation of different software configurations in order to: 1) understand different operational concepts; 2) assess the impact of failures and perturbations on the system; and 3) mitigate software and hardware integration risks. The testbed will provide an environment in which habitat hardware simulations can interact with autonomous control software. Faults can be injected into the simulations and different mission scenarios can be scripted. The testbed allows for logging, replaying and re-initializing mission scenarios. An initial testbed configuration has been developed by combining an existing life support simulation and an existing simulation of the space station power distribution system. Results from this initial configuration will be presented along with suggested requirements and designs for the incremental development of a more sophisticated lunar habitat testbed.
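
    The scripted fault-injection concept might be sketched as below; the scenario format, component names, and simulator interface are all invented for illustration.

      import json

      SCENARIO = json.loads("""
      [{"t": 120, "component": "o2_generator", "fault": "stuck_valve"},
       {"t": 300, "component": "power_bus_a",  "fault": "undervoltage"}]
      """)

      def run(sim, log, duration=600):
          """Step a habitat simulation, injecting scripted faults and logging
          state for replay and re-initialization (sim API is hypothetical)."""
          for t in range(duration):
              for event in SCENARIO:
                  if event["t"] == t:
                      sim.inject_fault(event["component"], event["fault"])
              log.append((t, sim.step()))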

  4. Development of the CSI phase-3 evolutionary model testbed

    NASA Technical Reports Server (NTRS)

    Gronet, M. J.; Davis, D. A.; Tan, M. K.

    1994-01-01

    This report documents the development effort for the reconfiguration of the Controls-Structures Integration (CSI) Evolutionary Model (CEM) Phase-2 testbed into the CEM Phase-3 configuration. This step responds to the need to develop and test CSI technologies associated with typical planned earth science and remote sensing platforms. The primary objective of the CEM Phase-3 ground testbed is to simulate the overall on-orbit dynamic behavior of the EOS AM-1 spacecraft. Key elements of the objective include approximating the low-frequency appendage dynamic interaction of EOS AM-1, allowing for the changeout of components, and simulating the free-free on-orbit environment using an advanced suspension system. The fundamentals of appendage dynamic interaction are reviewed. A new version of the multiple scaling method is used to design the testbed to have the full-scale geometry and dynamics of the EOS AM-1 spacecraft, but at one-tenth the weight. The testbed design is discussed, along with the testing of the solar array, high gain antenna, and strut components. Analytical performance comparisons show that the CEM Phase-3 testbed simulates the EOS AM-1 spacecraft with good fidelity for the important parameters of interest.

  5. Kite: Status of the External Metrology Testbed for SIM

    NASA Technical Reports Server (NTRS)

    Dekens, Frank G.; Alvarez-Salazar, Oscar; Azizi, Alireza; Moser, Steven; Nemati, Bijan; Negron, John; Neville, Timothy; Ryan, Daniel

    2004-01-01

    Kite is a system-level testbed for the External Metrology system of the Space Interferometry Mission (SIM). The External Metrology System is used to track the fiducials that are located at the centers of the interferometer's siderostats. The relative changes in their positions need to be tracked to tens of picometers in order to correct for thermal deformations. The Kite testbed was built to test both the metrology gauges and our ability to optically model the system at these levels. The Kite testbed is an over-constrained system in which six lengths are measured but only five are needed to determine the system. The agreement in the over-constrained length needs to be on the order of 140 pm for the SIM Wide-Angle observing scenario and 8 pm for the Narrow-Angle observing scenario. We demonstrate that we have met the Wide-Angle goal with our current setup. For the Narrow-Angle case, we have only reached the goal for on-axis observations. We describe the testbed improvements that have been made since our initial results, and outline future Kite changes that will add further effects that SIM faces in order to make the testbed more SIM-like.
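
    The over-constraint can be made concrete: four coplanar fiducials have six pairwise lengths but only five shape degrees of freedom, so the six measured lengths must satisfy one closure relation, the vanishing of the Cayley-Menger determinant (zero tetrahedral volume). This framing is an illustration consistent with the description above, not necessarily the exact consistency metric used by the Kite team:

      288\,V^{2} \;=\;
      \begin{vmatrix}
      0 & 1 & 1 & 1 & 1 \\
      1 & 0 & d_{12}^{2} & d_{13}^{2} & d_{14}^{2} \\
      1 & d_{12}^{2} & 0 & d_{23}^{2} & d_{24}^{2} \\
      1 & d_{13}^{2} & d_{23}^{2} & 0 & d_{34}^{2} \\
      1 & d_{14}^{2} & d_{24}^{2} & d_{34}^{2} & 0
      \end{vmatrix}
      \;=\; 0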

  6. TopCAT and PySESA: Open-source software tools for point cloud decimation, roughness analyses, and quantitative description of terrestrial surfaces

    NASA Astrophysics Data System (ADS)

    Hensleigh, J.; Buscombe, D.; Wheaton, J. M.; Brasington, J.; Welcker, C. W.; Anderson, K.

    2015-12-01

    The increasing use of high-resolution topography (HRT) constructed from point clouds obtained from technologies such as LiDAR, SoNAR, SAR, SfM, and a variety of range-imaging techniques has created a demand for custom analytical tools and software for point cloud decimation (data thinning and gridding) and spatially explicit statistical analysis of terrestrial surfaces. We will present a number of analytical and computational tools designed to quantify surface roughness and texture directly from point clouds in a variety of ways (using spatial- and frequency-domain statistics). TopCAT (Topographic Point Cloud Analysis Toolkit; Brasington et al., 2012) and PySESA (Python program for Spatially Explicit Spectral Analysis) both work by applying a small moving window to (x,y,z) data to calculate a suite of (spatial and spectral domain) statistics, which are then spatially referenced on a regular (x,y) grid at a user-defined resolution. Collectively, these tools facilitate quantitative description of surfaces and may allow, for example, fully automated texture characterization and segmentation, roughness and grain size calculation, and feature detection and classification, on very large point clouds with great computational efficiency. Using tools such as these, it may be possible to detect geomorphic change in surfaces which have undergone minimal elevation difference, for example deflation surfaces which have coarsened but undergone no net elevation change, or surfaces which have eroded and accreted, leaving behind a different textural surface expression than before. The functionalities of the two toolboxes are illustrated with example high-resolution bathymetric point cloud data collected with a multibeam echosounder, and topographic data collected with LiDAR.
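
    A minimal version of the moving-window idea, assuming a gridded window and plane-detrended roughness; this is an illustrative sketch, not the toolkits' actual API.

      import numpy as np

      def roughness_grid(xyz, cell=1.0):
          """xyz: (N, 3) point cloud; returns {(i, j): detrended std of z}
          on a regular grid of the given cell size (assumed parameters)."""
          out = {}
          ij = np.floor(xyz[:, :2] / cell).astype(int)
          for key in {tuple(k) for k in ij}:
              pts = xyz[(ij == key).all(axis=1)]
              if len(pts) < 4:
                  continue          # too few points to detrend reliably
              # Remove a best-fit plane, then take the residual std as roughness.
              A = np.c_[pts[:, 0], pts[:, 1], np.ones(len(pts))]
              coeff, *_ = np.linalg.lstsq(A, pts[:, 2], rcond=None)
              out[key] = np.std(pts[:, 2] - A @ coeff)
          return out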

  7. Design optimization of the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Milman, Mark H.; Salama, M.; Wette, M.; Chu, Cheng-Chih

    1993-01-01

    Increasingly complex spacecraft will benefit from integrated design and optimization of structural, optical, and control subsystems. Integrated design optimization will allow designers to make tradeoffs in objectives and constraints across these subsystems. The location, number, and types of passive and active devices distributed along the structure can have a dramatic impact on overall system performance. In addition, the manner in which structural mass is distributed can also serve as an effective mechanism for attenuating disturbance transmission between source and sensitive system components. This paper presents recent experience using optimization tools that have been developed for addressing some of these issues on a challenging testbed design problem. This particular testbed is one of a series of testbeds at the Jet Propulsion Laboratory under the sponsorship of the NASA Control Structure Interaction (CSI) Program to demonstrate nanometer level optical pathlength control on a flexible truss structure that emulates a spaceborne interferometer.

  8. Laser Metrology in the Micro-Arcsecond Metrology Testbed

    NASA Technical Reports Server (NTRS)

    An, Xin; Marx, D.; Goullioud, Renaud; Zhao, Feng

    2004-01-01

    The Space Interferometry Mission (SIM), scheduled for launch in 2009, is a space-borne visible-light stellar interferometer capable of micro-arcsecond-level astrometry. The Micro-Arcsecond Metrology testbed (MAM) is the ground-based testbed that incorporates all the functionalities of SIM minus the telescope, for mission-enabling technology development and verification. MAM employs a laser heterodyne metrology system using the Sub-Aperture Vertex-to-Vertex (SAVV) concept. In this paper, we describe the development and modification of the SAVV metrology launchers and the metrology instrument electronics, precision alignments and pointing control, the location of cyclic error sources in the MAM testbed and methods to mitigate the cyclic errors, as well as the performance under the MAM performance metrics.

  9. High Contrast Imaging Testbed for the Terrestrial Planet Finder Coronagraph

    NASA Technical Reports Server (NTRS)

    Lowmman, Andrew E.; Trauger, John T.; Gordon, Brian; Green, Joseph J.; Moody, Dwight; Niessner, Albert F.; Shi, Fang

    2004-01-01

    The Terrestrial Planet Finder (TPF) mission is planning to launch a visible coronagraphic space telescope in 2014. To achieve TPF science goals, the coronagraph must have extreme levels of wavefront correction (less than 1 Angstrom rms over controllable spatial frequencies) and stability to get the necessary suppression of diffracted starlight (approximately 10^-10 contrast at an angular separation of approximately 4 λ/D). TPF Coronagraph's primary platform for experimentation is the High Contrast Imaging Testbed, which will provide laboratory validation of key technologies as well as demonstration of a flight-traceable approach to implementation. Precision wavefront control in the testbed is provided by a high-actuator-density deformable mirror. Diffracted light control is achieved through use of occulting or apodizing masks and stops. Contrast measurements will establish the technical feasibility of TPF requirements, while model and error budget validation will demonstrate implementation viability. This paper describes the current testbed design, development approach, and recent experimental results.

  10. A Testbed for Deploying Distributed State Estimation in Power Grid

    SciTech Connect

    Jin, Shuangshuang; Chen, Yousu; Rice, Mark J.; Liu, Yan; Gorton, Ian

    2012-07-22

    With the increasing demand, scale, and data volume of power systems, fast distributed applications are becoming more important in power system operation and control. This paper proposes a testbed for evaluating power system distributed applications, considering data exchange among distributed areas. A high-performance computing (HPC) version of distributed state estimation is implemented and used as an example distributed application. The IEEE 118-bus system is used to deploy the parallel distributed state estimation, and the MeDICi middleware is used for data communication. The performance of the testbed demonstrates its capability to evaluate parallel distributed state estimation by leveraging the HPC paradigm. This testbed can also be applied to evaluate other distributed applications.
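
    The per-area computation at the core of such state estimation is a weighted least-squares solve; the tiny numerical sketch below uses an invented linearised measurement model, not the IEEE 118-bus system.

      import numpy as np

      H = np.array([[1.0, 0.0],            # linearised measurement model (assumed)
                    [1.0, -1.0],
                    [0.0, 1.0]])
      W = np.diag([100.0, 50.0, 100.0])    # weights = 1 / sigma^2
      z = np.array([1.02, 0.05, 0.97])     # measurements (illustrative)

      # Normal equations: (H^T W H) x = H^T W z gives the state estimate x.
      x = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)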

  11. Spherical Air Bearing testbed for nanosatellite attitude control development

    NASA Astrophysics Data System (ADS)

    Ustrzycki, Tyler

    Spherical Air Bearing systems have been used as a test bed for attitude control systems for many decades. With the advancements of nanosatellite technologies as a platform for scientific missions, there is an increased demand for comprehensive, pre-launch testing of nanosatellites. Several spherical air bearing systems have been developed for larger satellite applications and add too much parasitic mass to be applicable for nanosatellite applications. This thesis details the design and validation of a Nanosatellite Three Axis Attitude Control Testbed. The testbed consists of the physical design of the system, a complete electronics system, and validation of the testbed using low-cost reaction wheels as actuators. The design of the air bearing platform includes a manual balancing system to align the centre of gravity with the centre of rotation. The electronics system is intended to measure the attitude of the platform and control the actuator system. Validation is achieved through a controlled slew maneuver of the air bearing platform.

  12. Experimental Test-Bed for Intelligent Passive Array Research

    NASA Technical Reports Server (NTRS)

    Solano, Wanda M.; Torres, Miguel; David, Sunil; Isom, Adam; Cotto, Jose; Sharaiha, Samer

    2004-01-01

    This document describes the test-bed designed for the investigation of passive direction finding, recognition, and classification of speech and sound sources using sensor arrays. The test-bed forms the experimental basis of the Intelligent Small-Scale Spatial Direction Finder (ISS-SDF) project, aimed at furthering digital signal processing and intelligent sensor capabilities of sensor array technology in applications such as rocket engine diagnostics, sensor health prognostics, and structural anomaly detection. This form of intelligent sensor technology has potential for significant impact on NASA exploration, earth science and propulsion test capabilities. The test-bed consists of microphone arrays, power and signal distribution modules, web-based data acquisition, wireless Ethernet, modeling, simulation and visualization software tools. The Acoustic Sensor Array Modeler I (ASAM I) is used for studying steering capabilities of acoustic arrays and testing DSP techniques. Spatial sound distribution visualization is modeled using the Acoustic Sphere Analysis and Visualization (ASAV-I) tool.

  14. Telescience testbed pilot program, volume 2: Program results

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was conducted, aimed at developing the experience base to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume, part of a three-volume set containing the results of the TTPP, presents the integrated results. Background on the program and highlights of the program results are provided, and the various testbed experiments and the programmatic approach are summarized. The results are first summarized on a discipline-by-discipline basis, highlighting the lessons learned for each discipline; they are then integrated across disciplines, summarizing the lessons learned overall.

  15. Experiences with the JPL telerobot testbed: Issues and insights

    NASA Technical Reports Server (NTRS)

    Stone, Henry W.; Balaram, Bob; Beahan, John

    1989-01-01

    The Jet Propulsion Laboratory's (JPL) Telerobot Testbed is an integrated robotic testbed used to develop, implement, and evaluate the performance of advanced concepts in autonomous, tele-autonomous, and tele-operated control of robotic manipulators. Using the Telerobot Testbed, researchers demonstrated several of the capabilities and technological advances in the control and integration of robotic systems which have been under development at JPL for several years. In particular, the Telerobot Testbed was recently employed to perform a nearly completely automated, end-to-end, satellite grapple and repair sequence. The task of integrating existing as well as new concepts in robot control into the Telerobot Testbed has been a very difficult and time-consuming one. Now that researchers have completed the first major milestone (i.e., the end-to-end demonstration), it is important to reflect back upon the experience and to collect the knowledge that has been gained so that improvements can be made to the existing system. It is also believed that these experiences are of value to others in the robotics community. Therefore, the primary objective here will be to use the Telerobot Testbed as a case study to identify real problems and technological gaps which exist in the areas of robotics and, in particular, systems integration. Such problems have surely hindered the development of what could reasonably be called an intelligent robot. In addition to identifying such problems, researchers briefly discuss what approaches have been taken to resolve them or, in several cases, to circumvent them until better approaches can be developed.

  16. Performance evaluation of multi-stratum resources optimization with network functions virtualization for cloud-based radio over optical fiber networks.

    PubMed

    Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young

    2016-04-18

    Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, which allows services to be accommodated in optical networks. In view of this, this study extends that work to consider the multi-dimensional resource optimization of radio, optical, and BBU processing in the 5G age. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance the responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical, and BBU resources effectively to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated based on the MSRO architecture, in terms of resource occupation rate and path provisioning latency, compared with other provisioning schemes.

  17. The Living With a Star Space Environment Testbed Program

    NASA Technical Reports Server (NTRS)

    Barth, Janet; LaBel, Kenneth; Day, John H. (Technical Monitor)

    2001-01-01

    NASA has initiated the Living With a Star (LWS) Program to develop the scientific understanding needed to address the aspects of the connected Sun-Earth system that affect life and society. The program architecture includes science missions, theory and modeling, and Space Environment Testbeds (SET). This paper discusses the Space Environment Testbeds. The goal of the SET program is to improve the engineering approach to accommodate and/or mitigate the effects of solar variability on spacecraft design and operations. The SET program will infuse new technologies into the space programs through the collection of data in space and the subsequent design and validation of technologies. Examples of these technologies are cited and discussed.

  18. CSM Testbed Development and Large-Scale Structural Applications

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Gillian, R. E.; Mccleary, Susan L.; Lotts, C. G.; Poole, E. L.; Overman, A. L.; Macy, S. C.

    1989-01-01

    A research activity called Computational Structural Mechanics (CSM) conducted at the NASA Langley Research Center is described. This activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM Testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM Testbed methods development environment is presented and some new numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  19. Telescience testbed pilot program, volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was initiated to develop the experience base needed to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the 3-volume set, each of which contains the results of the TTPP, is the executive summary.

  20. UltraSciencenet: High- Performance Network Research Test-Bed

    SciTech Connect

    Rao, Nageswara S; Wing, William R; Poole, Stephen W; Hicks, Susan Elaine; DeNap, Frank A; Carter, Steven M; Wu, Qishi

    2009-04-01

    The high-performance networking requirements of next-generation large-scale applications belong to two broad classes: (a) high bandwidths, typically multiples of 10 Gbps, to support bulk data transfers, and (b) stable bandwidths, typically at much lower rates, to support computational steering, remote visualization, and remote control of instrumentation. Current Internet technologies, however, are severely limited in meeting these demands because such bulk bandwidths are available only in the backbone, and stable control channels are hard to realize over shared connections. The UltraScience Net (USN) facilitates the development of such technologies by providing dynamic, cross-country dedicated 10 Gbps channels for large data transfers, and 150 Mbps channels for interactive and control operations. Contributions of the USN project are two-fold: (a) Infrastructure technologies for a network experimental facility: USN developed and/or demonstrated a number of infrastructure technologies needed for a national-scale network experimental facility. Compared to the Internet, USN's data plane is different in that it can be partitioned into isolated layer-1 or layer-2 connections, and its control plane is different in the ability of users and applications to set up and tear down channels as needed. Its design required several new components, including a Virtual Private Network infrastructure, a bandwidth and channel scheduler, and a dynamic signaling daemon. The control plane employs a centralized scheduler to compute the channel allocations and a signaling daemon to generate configuration signals to switches. In a nutshell, USN demonstrated the ability to build and operate a stable national-scale switched network. (b) Structured network research experiments: a number of network research experiments have been conducted on USN that cannot be easily supported over existing network facilities, including test-beds and production networks. It settled an open matter by demonstrating
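
    To illustrate the advance-reservation idea behind a bandwidth and channel scheduler, here is a minimal Python sketch that admits a channel request only if every link on the path can carry the requested bandwidth for the whole time window. The link names, capacities, and the conservative admission test are invented for illustration and are not USN's actual scheduler.

        # Toy advance-reservation admission control for dedicated channels.
        def fits(resv, capacity, path, start, end, bw):
            """True if 'bw' fits on every link of 'path' during [start, end)."""
            for link in path:
                used = sum(b for (s, e, b) in resv.get(link, [])
                           if s < end and start < e)  # conservative: sum all overlaps
                if used + bw > capacity[link]:
                    return False
            return True

        def reserve(resv, capacity, path, start, end, bw):
            if not fits(resv, capacity, path, start, end, bw):
                return False
            for link in path:
                resv.setdefault(link, []).append((start, end, bw))
            return True

        capacity = {"ORNL-CHI": 10_000, "CHI-SEA": 10_000}  # Mbps, hypothetical links
        resv = {}
        print(reserve(resv, capacity, ["ORNL-CHI", "CHI-SEA"], 0, 3600, 10_000))  # True
        print(reserve(resv, capacity, ["ORNL-CHI"], 1800, 5400, 150))             # False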

  1. Open source IPSEC software in manned and unmanned space missions

    NASA Astrophysics Data System (ADS)

    Edwards, Jacob

    Network security is a major topic of research because cyber attackers pose a threat to national security. Securing ground-space communications for NASA missions is important because attackers could endanger mission success and human lives. This thesis describes how an open source IPsec software package was used to create a secure and reliable channel for ground-space communications. A cost-efficient, reproducible hardware testbed was also created to simulate ground-space communications. The testbed enables simulation of low-bandwidth and high-latency communications links, making it possible to study how the open source IPsec software reacts to these network constraints. Test cases were built that allowed for validation of the testbed and the open source IPsec software. The test cases also simulate using an IPsec connection from mission control ground routers to points of interest in outer space. The open source IPsec software that was tested did not meet all the requirements, and software changes were suggested to meet them.
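
    One common way to emulate such links on commodity hardware is Linux traffic control. The sketch below applies netem delay, loss, and a rate limit to an interface; it assumes a Linux host with iproute2 and root privileges, and the interface name and numbers are examples only, not the configuration used in the thesis.

        # Emulate a low-bandwidth, high-latency space link on a Linux interface.
        import subprocess

        def emulate_space_link(dev="eth1", delay_ms=600, rate_kbit=256, loss_pct=0.5):
            """Apply netem delay/loss plus a rate limit to 'dev' (requires root)."""
            subprocess.run(["tc", "qdisc", "replace", "dev", dev, "root", "netem",
                            "delay", f"{delay_ms}ms",
                            "loss", f"{loss_pct}%",
                            "rate", f"{rate_kbit}kbit"], check=True)

        def clear_link(dev="eth1"):
            subprocess.run(["tc", "qdisc", "del", "dev", dev, "root"], check=True)

        if __name__ == "__main__":
            emulate_space_link()  # roughly GEO-like latency on a narrow channel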

  2. An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors

    PubMed Central

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications, including domotics (domestic robotics), environmental monitoring and intelligent cities, among others. Despite increasing interest in the academic and industrial communities, experimental tools for the evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors commonly used in cooperative perception research and applications, which present a high degree of heterogeneity in technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches ranging from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper. PMID:22247679

  3. An integrated testbed for cooperative perception with heterogeneous mobile and static sensors.

    PubMed

    Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal

    2011-01-01

    Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications, including domotics (domestic robotics), environmental monitoring and intelligent cities, among others. Despite increasing interest in the academic and industrial communities, experimental tools for the evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors commonly used in cooperative perception research and applications, which present a high degree of heterogeneity in technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting approaches ranging from totally distributed to centralized. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper.
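
    As a toy illustration of the kind of bidirectional robot-WSN exchange such an integration protocol enables, the sketch below sends a JSON-encoded sensor reading from a WSN node to a robot over UDP. The message fields, address, and port are invented; the CONET testbed's actual protocol is not reproduced here.

        # Hypothetical WSN-node side of a robot<->WSN message exchange.
        import json, socket

        ROBOT_ADDR = ("192.168.1.10", 9999)  # example robot endpoint

        def send_reading(sock, node_id, magnitude, value):
            msg = {"src": node_id, "type": "measurement",
                   "magnitude": magnitude, "value": value}
            sock.sendto(json.dumps(msg).encode(), ROBOT_ADDR)

        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        send_reading(sock, node_id=7, magnitude="temperature_C", value=21.4)
        # The robot side would symmetrically send {"type": "command", ...} back.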

  4. Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Mount, Frances; Carreon, Patricia; Torney, Susan E.

    2001-01-01

    The Engineering and Mission Operations Directorates at NASA Johnson Space Center are combining laboratories and expertise to establish the Human Centered Autonomous and Assistant Systems Testbed for Exploration Operations. This is a testbed for human centered design, development and evaluation of intelligent autonomous and assistant systems that will be needed for human exploration and development of space. This project will improve human-centered analysis, design and evaluation methods for developing intelligent software. This software will support human-machine cognitive and collaborative activities in future interplanetary work environments where distributed computer and human agents cooperate. We are developing and evaluating prototype intelligent systems for distributed multi-agent mixed-initiative operations. The primary target domain is control of life support systems in a planetary base. Technical approaches will be evaluated for use during extended manned tests in the target domain, the Bioregenerative Advanced Life Support Systems Test Complex (BIO-Plex). A spinoff target domain is the International Space Station (ISS) Mission Control Center (MCC). Products of this project include human-centered intelligent software technology, innovative human interface designs, and human-centered software development processes, methods and products. The testbed uses adjustable autonomy software and life support systems simulation models from the Adjustable Autonomy Testbed, to represent operations on the remote planet. Ground operations prototypes and concepts will be evaluated in the Exploration Planning and Operations Center (ExPOC) and Jupiter Facility.

  5. Integrated microfluidic test-bed for energy conversion devices.

    PubMed

    Modestino, Miguel A; Diaz-Botia, Camilo A; Haussener, Sophia; Gomez-Sjoberg, Rafael; Ager, Joel W; Segalman, Rachel A

    2013-05-21

    Energy conversion devices require the parallel functionality of a variety of components for efficient operation. We present a versatile microfluidic test-bed for facile testing of integrated catalysis and mass transport components for energy conversion via water electrolysis. This system can be readily extended to solar-fuels generators and fuel-cell devices.

  6. Extending the Information Commons: From Instructional Testbed to Internet2

    ERIC Educational Resources Information Center

    Beagle, Donald

    2002-01-01

    The author's conceptualization of an Information Commons (IC) is revisited and elaborated in reaction to Bailey and Tierney's article. The IC's role as testbed for instructional support and knowledge discovery is explored, and progress on pertinent research is reviewed. Prospects for media-rich learning environments relate the IC to the…

  7. In-Space Networking On NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Brooks, David; Eddy, Wesley M.; Clark, Gilbert J., III; Johnson, Sandra K.

    2016-01-01

    The NASA Space Communications and Navigation (SCaN) Testbed, an external payload onboard the International Space Station, is equipped with three software defined radios (SDRs) and a programmable flight computer. The purpose of the Testbed is to conduct in-space research in the areas of communication, navigation, and networking in support of NASA missions and communication infrastructure. Multiple reprogrammable elements in the end-to-end system, along with several communication paths and a semi-operational environment, provide a unique opportunity to explore networking concepts and protocols envisioned for the future Solar System Internet (SSI). This paper provides a general description of the system's design and the networking protocols implemented and characterized on the testbed, including Encapsulation, IP over CCSDS, and Delay-Tolerant Networking (DTN). Due to the research nature of the implementation, flexibility and robustness are considered in the design to enable expansion toward future adaptive and cognitive techniques. Following a detailed design discussion, lessons learned and suggestions for future missions and communication infrastructure elements are provided, along with plans for the evolving research on the SCaN Testbed as it moves towards a more adaptive, autonomous system.
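
    To give a flavor of what "IP over CCSDS" encapsulation means, the sketch below prepends a toy encapsulation header to an IP datagram before it would be handed to a link-layer frame. The bit layout is deliberately simplified and the field values are illustrative only; the real services are specified in CCSDS 133.1-B and CCSDS 702.1-B, which this sketch does not reproduce.

        # Toy CCSDS-style encapsulation of an IP datagram (NOT flight-accurate).
        import struct

        def encap(ip_datagram: bytes, protocol_id: int = 2) -> bytes:
            """Prepend a simplified 3-byte header: flags byte + 16-bit total length."""
            version = 0b111                       # illustrative version field
            len_of_len = 0b10                     # pretend: 2-byte length field
            first = (version << 5) | ((protocol_id & 0x7) << 2) | len_of_len
            total_len = 3 + len(ip_datagram)      # header + payload, in bytes
            return struct.pack(">BH", first, total_len) + ip_datagram

        frame = encap(bytes.fromhex("4500"))      # would wrap a real IPv4 datagram
        print(frame.hex())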

  8. Asynchronous Message Passing in the JPL Flight System Testbed

    NASA Technical Reports Server (NTRS)

    Burleigh, Scott

    1996-01-01

    The flight mission simulation software in the Jet Propulsion Laboratory's Flight System Testbed (FST) is a heterogeneous, distributed system that is built on an interprocess communication model of asynchronous message passing rather than remote procedure calls (RPCs). The reasoning behind this design decision is discussed, as is the mechanism used to implement it.
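
    For readers unfamiliar with the distinction, the minimal Python sketch below shows the essence of asynchronous message passing: the sender enqueues a message and continues immediately, never blocking on a reply as an RPC caller would. The names are illustrative and this is not the FST's actual mechanism.

        # Fire-and-forget messaging between two threads via a queue.
        import queue, threading

        inbox = queue.Queue()

        def subsystem():
            while True:
                msg = inbox.get()
                if msg is None:          # shutdown sentinel
                    break
                print("subsystem handling:", msg)

        worker = threading.Thread(target=subsystem)
        worker.start()
        inbox.put({"cmd": "set_mode", "mode": "safe"})  # sender does not wait
        inbox.put(None)
        worker.join()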

  9. Survey of Two-Way Cable Television Testbeds.

    ERIC Educational Resources Information Center

    Cable Television Information Center, Washington, DC.

    Surveys of 10 two-way interactive cable experiments indicate that little is happening in this field. The location of the testbeds, the names and addresses of both the parent company and its local subsidiary are included in the survey together with a description of each project. A brief note on the subscriber's right to privacy concludes this short…

  10. Cloud Computing

    SciTech Connect

    Pete Beckman and Ian Foster

    2009-12-04

    Chicago Matters: Beyond Burnham (WTTW). Chicago has become a world center of "cloud computing." Argonne experts Pete Beckman and Ian Foster explain what "cloud computing" is and how you probably already use it on a daily basis.

  11. Developmental Cryogenic Active Telescope Testbed, a Wavefront Sensing and Control Testbed for the Next Generation Space Telescope

    NASA Technical Reports Server (NTRS)

    Leboeuf, Claudia M.; Davila, Pamela S.; Redding, David C.; Morell, Armando; Lowman, Andrew E.; Wilson, Mark E.; Young, Eric W.; Pacini, Linda K.; Coulter, Dan R.

    1998-01-01

    As part of the technology validation strategy of the next generation space telescope (NGST), a system testbed is being developed at GSFC, in partnership with JPL and Marshall Space Flight Center (MSFC), which will include all of the component functions envisioned in an NGST active optical system. The system will include an actively controlled, segmented primary mirror; actively controlled secondary, deformable, and fast steering mirrors; wavefront sensing optics; wavefront control algorithms; a telescope simulator module; and an interferometric wavefront sensor for use in comparing the final wavefronts obtained from different tests. The developmental cryogenic active telescope testbed (DCATT) will be implemented in three phases. Phase 1 will focus on operating the testbed at ambient temperature. During Phase 2, a cryocapable segmented telescope will be developed and cooled to cryogenic temperature to investigate the impact on the ability to correct the wavefront and stabilize the image. In Phase 3, it is planned to incorporate industry-developed flight-like components, such as figure-controlled mirror segments, cryogenic low-hold-power actuators, or different wavefront sensing and control hardware or software. A very important element of the program is the development and subsequent validation of the integrated multidisciplinary models. The Phase 1 testbed objectives, plans, configuration, and design are discussed.

  12. Trusted computing strengthens cloud authentication.

    PubMed

    Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM); companies encounter IAM security challenges as they adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has been simulated in a .NET environment and is evaluated and analyzed in three ways: security analysis, simulation, and the BLP confidentiality model.

  13. Trusted Computing Strengthens Cloud Authentication

    PubMed Central

    2014-01-01

    Cloud computing is a new generation of technology designed to provide commercial necessities, solve IT management issues, and run the appropriate applications. Another entry on the list of cloud functions that has traditionally been handled internally is Identity Access Management (IAM); companies encounter IAM security challenges as they adopt more cloud technologies. Trusted multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are promising technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to counter identity theft in the cloud. The proposed model has been simulated in a .NET environment and is evaluated and analyzed in three ways: security analysis, simulation, and the BLP confidentiality model. PMID:24701149

  14. Cloud-Based NoSQL Open Database of Pulmonary Nodules for Computer-Aided Lung Cancer Diagnosis and Reproducible Research.

    PubMed

    Ferreira Junior, José Raniery; Oliveira, Marcelo Costa; de Azevedo-Marques, Paulo Mazzoncini

    2016-12-01

    Lung cancer is the leading cause of cancer-related deaths in the world, and its main manifestation is pulmonary nodules. Detection and classification of pulmonary nodules are challenging tasks that must be done by qualified specialists, but image interpretation errors make those tasks difficult. In order to aid radiologists in those hard tasks, it is important to integrate computer-based tools with the lesion detection, pathology diagnosis, and image interpretation processes. However, computer-aided diagnosis research faces the problem of not having enough shared medical reference data for the development, testing, and evaluation of computational methods for diagnosis. To help minimize this problem, this paper presents a public nonrelational, document-oriented, cloud-based database of pulmonary nodules characterized by 3D texture attributes, identified by experienced radiologists and classified according to nine different subjective characteristics by the same specialists. Our goal in developing this database is to improve computer-aided lung cancer diagnosis and pulmonary nodule detection and classification research through the deployment of this database in a cloud Database as a Service framework. Pulmonary nodule data were provided by the Lung Image Database Consortium and Image Database Resource Initiative (LIDC-IDRI), image descriptors were acquired by a volumetric texture analysis, and the database schema was developed using a document-oriented Not only Structured Query Language (NoSQL) approach. The proposed database currently holds 379 exams, 838 nodules, and 8,237 images, of which 4,029 are CT scans and 4,208 are manually segmented nodules, and it is allocated in a MongoDB instance on a cloud infrastructure.
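
    A document-oriented schema of this kind is straightforward to exercise with pymongo. The sketch below inserts one hypothetical nodule document and runs a query; the field names and values are invented for illustration and do not reproduce the paper's actual document layout.

        # Store and query a toy nodule document in MongoDB.
        from pymongo import MongoClient

        client = MongoClient("mongodb://localhost:27017")  # or a cloud DBaaS URI
        db = client["nodule_db"]

        db.nodules.insert_one({
            "exam_id": "LIDC-0001",
            "nodule_id": 1,
            "subjective": {"malignancy": 3, "spiculation": 2},  # 2 of the 9 ratings
            "texture_3d": [0.82, 0.11, 0.07],                   # toy attribute vector
        })
        print(db.nodules.count_documents({"subjective.malignancy": {"$gte": 3}}))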

  15. Real-time video streaming in mobile cloud over heterogeneous wireless networks

    NASA Astrophysics Data System (ADS)

    Abdallah-Saleh, Saleh; Wang, Qi; Grecos, Christos

    2012-06-01

    Recently, the concept of Mobile Cloud Computing (MCC) has been proposed to offload the resource requirements in computational capabilities, storage and security from mobile devices into the cloud. Internet video applications such as real-time streaming are expected to be ubiquitously deployed and supported over the cloud for mobile users, who typically encounter a range of wireless networks of diverse radio access technologies during their roaming. However, real-time video streaming for mobile cloud users across heterogeneous wireless networks presents multiple challenges. The network-layer quality of service (QoS) provision to support high-quality mobile video delivery in this demanding scenario remains an open research question, and this in turn affects the application-level visual quality and impedes mobile users' perceived quality of experience (QoE). In this paper, we devise a framework to support real-time video streaming in this new mobile video networking paradigm and evaluate the performance of the proposed framework empirically through a lab-based yet realistic testing platform. One particular issue we focus on is the effect of users' mobility on the QoS of video streaming over the cloud. We design and implement a hybrid platform comprising a test-bed and an emulator, on which our concepts of mobile cloud computing, video streaming and heterogeneous wireless networks are implemented and integrated to allow the testing of our framework. As representative heterogeneous wireless networks, the popular WLAN (Wi-Fi) and MAN (WiMAX) networks are incorporated in order to evaluate the effects of handovers between these different radio access technologies. The H.264/AVC (Advanced Video Coding) standard is employed for real-time video streaming from a server to mobile users (client nodes) in the networks. Mobility support is introduced to enable a continuous streaming experience for a mobile user across the heterogeneous wireless network. Real-time video stream packets

  16. Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.

    PubMed

    Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James

    2012-06-01

    Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.
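
    In practice, the CloudMan-launched Galaxy instance can also be driven programmatically. The sketch below uses BioBlend, the Python client for the Galaxy API; the URL, API key, and file name are placeholders, and it assumes a Galaxy server is already running on the cloud infrastructure.

        # Drive a running Galaxy instance through its API with BioBlend.
        from bioblend.galaxy import GalaxyInstance

        gi = GalaxyInstance(url="http://my-cloudman-galaxy.example.org",
                            key="YOUR_API_KEY")              # placeholders
        hist = gi.histories.create_history(name="cloud-demo")
        gi.tools.upload_file("reads.fastq", hist["id"])      # stage input data
        print([h["name"] for h in gi.histories.get_histories()])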

  17. Quality Assessment and Accessibility Applications of Crowdsourced Geospatial Data: A Report on the Development and Extension of the George Mason University Geocrowdsourcing Testbed

    DTIC Science & Technology

    2014-09-01

    Indexed excerpts from the report include figure captions ("Routing map, using Esri's API for JavaScript and OpenStreetMap base data" and "Jessica Fayne's microfiber bike map"), a footnoted citation ("Visual Impairment: A Comparison of Spatial Displays for Route Guidance," Journal of Visual Impairment & Blindness 99, no. 4 (2005): 219), and a passage noting that characteristics of the reports in the GMU Geocrowdsourcing Testbed are determined from a comparison of the contributor's position estimate.

  18. The Gemini Planet Imager Coronagraph Testbed Preliminary Performance Results

    NASA Astrophysics Data System (ADS)

    Roberts, Robin

    2010-01-01

    The Gemini Planet Imager (GPI) is a new science instrument being developed and slated for first light in early 2011 on the twin 8-m Gemini telescopes. Operating in the near infrared, this ground-based, extreme Adaptive Optics (ExAO) coronagraphic instrument will provide the ability to detect, characterize and analyze young (< 2 Gyr), self-luminous, extrasolar planets with brightness contrast ratios ≤ 10^-7 relative to their parent star. The coronagraph subsystem includes a pupil apodization, a hard-edged focal plane mask and a Lyot stop. Preliminary results indicate that the testbed is performing at very high contrast, having achieved broadband (H-band) contrasts below 10^-6 at separations > 5λ/D. Fraunhofer and Fresnel propagation modeling were used to analyze the testbed results.

  19. The advanced orbiting systems testbed program: Results to date

    NASA Technical Reports Server (NTRS)

    Newsome, Penny A.; Otranto, John F.

    1993-01-01

    The Consultative Committee for Space Data Systems Recommendations for Packet Telemetry and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. Central to the AOST Program are the development of an end-to-end Testbed and its use in a comprehensive testing program. Other Program activities include flight-qualifiable component development, supporting studies, and knowledge dissemination. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations. The results presented in this paper include architectural issues, a draft proposed standardized test suite and flight-qualifiable components.

  20. Telescience testbed pilot program, volume 3: Experiment summaries

    NASA Technical Reports Server (NTRS)

    Leiner, Barry M.

    1989-01-01

    Space Station Freedom and its associated labs, coupled with the availability of new computing and communications technologies, have the potential for significantly enhancing scientific research. A Telescience Testbed Pilot Program (TTPP) was initiated to develop the experience base needed to deal with issues in the design of the future information system of the Space Station era. The testbeds represented four scientific disciplines (astronomy and astrophysics, earth sciences, life sciences, and microgravity sciences) and studied issues in payload design, operation, and data analysis. This volume of the 3-volume set, each of which contains the results of the TTPP, presents summaries of the experiments. One such experiment involved the evaluation of the current Internet for file and image transfer between SIRTF instrument teams; the main issue addressed was current network response times.

  1. A MIMO-OFDM Testbed for Wireless Local Area Networks

    NASA Astrophysics Data System (ADS)

    Fàbregas, Albert Guilléni; Guillaud, Maxime; Slock, Dirk TM; Caire, Giuseppe; Gosse, Karine; Rouquette, Stéphanie; Dias, Alexandre Ribeiro; Bernardin, Philippe; Miet, Xavier; Conrat, Jean-Marc; Toutain, Yann; Peden, Alain; Li, Zaiqing

    2006-12-01

    We describe the design steps and final implementation of a MIMO OFDM prototype platform developed to enhance the performance of wireless LAN standards such as HiperLAN/2 and 802.11, using multiple transmit and multiple receive antennas. We first describe the channel measurement campaign used to characterize the indoor operational propagation environment, and analyze the influence of the channel on code design through a ray-tracing channel simulator. We also comment on some antenna and RF issues which are of importance for the final realization of the testbed. Multiple coding, decoding, and channel estimation strategies are discussed and their respective performance-complexity trade-offs are evaluated over the realistic channel obtained from the propagation studies. Finally, we present the design methodology, including cross-validation of the Matlab, C++, and VHDL components, and the final demonstrator architecture. We highlight the increased measured performance of the MIMO testbed over the single-antenna system.
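
    The heart of any such OFDM platform is the modulation chain. The minimal numpy sketch below maps bits to QPSK symbols, transforms them to the time domain with an IFFT, and prepends a cyclic prefix; a 64-subcarrier layout is used for illustration, and the parameters are not the prototype's actual configuration.

        # One OFDM symbol: QPSK mapping -> IFFT -> cyclic prefix (and back).
        import numpy as np

        N_FFT, CP = 64, 16
        rng = np.random.default_rng(0)

        bits = rng.integers(0, 2, size=2 * N_FFT)
        qpsk = ((1 - 2 * bits[0::2]) + 1j * (1 - 2 * bits[1::2])) / np.sqrt(2)

        time_sym = np.fft.ifft(qpsk, N_FFT)                 # to time domain
        tx = np.concatenate([time_sym[-CP:], time_sym])     # add cyclic prefix

        rx = np.fft.fft(tx[CP:], N_FFT)                     # ideal channel: drop CP, FFT
        assert np.allclose(rx, qpsk)                        # symbols recovered exactly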

  2. Amplitude variations on the Extreme Adaptive Optics testbed

    SciTech Connect

    Evans, J; Thomas, S; Dillon, D; Gavel, D; Phillion, D; Macintosh, B

    2007-08-14

    High-contrast adaptive optics systems, such as those needed to image extrasolar planets, are known to require excellent wavefront control and diffraction suppression. At the Laboratory for Adaptive Optics on the Extreme Adaptive Optics (ExAO) testbed, we have already demonstrated wavefront control of better than 1 nm rms within controllable spatial frequencies. Corresponding contrast measurements, however, are limited by amplitude variations, including those introduced by the micro-electrical-mechanical-systems (MEMS) deformable mirror. Results from experimental measurements and wave-optics simulations of amplitude variations on the ExAO testbed are presented. We find systematic intensity variations of about 2% rms, and intensity variations with the MEMS of 6%. Some errors are introduced by phase and amplitude mixing because the MEMS is not conjugate to the pupil, but independent measurements of MEMS reflectivity suggest that some error is introduced by small non-uniformities in the reflectivity.

  3. Collaboration in a Wireless Grid Innovation Testbed by Virtual Consortium

    NASA Astrophysics Data System (ADS)

    Treglia, Joseph; Ramnarine-Rieks, Angela; McKnight, Lee

    This paper describes the formation of the Wireless Grid Innovation Testbed (WGiT) coordinated by a virtual consortium involving academic and non-academic entities. Syracuse University and Virginia Tech are primary university partners with several other academic, government, and corporate partners. Objectives include: 1) coordinating knowledge sharing, 2) defining key parameters for wireless grids network applications, 3) dynamically connecting wired and wireless devices, content and users, 4) linking to VT-CORNET, Virginia Tech Cognitive Radio Network Testbed, 5) forming ad hoc networks or grids of mobile and fixed devices without a dedicated server, 6) deepening understanding of wireless grid application, device, network, user and market behavior through academic, trade and popular publications including online media, 7) identifying policy that may enable evaluated innovations to enter US and international markets and 8) implementation and evaluation of the international virtual collaborative process.

  4. Summary of parallel session I: grid testbeds and applications

    NASA Astrophysics Data System (ADS)

    Olson, D. L.

    2003-04-01

    This paper is a summary of talks presented at ACAT 2002 in parallel session I on grid testbeds and applications. There were 12 presentations on this topic that show a lot of enthusiasm and hard work by many people in bringing physics applications onto the grid. There are encouraging success stories and also a clear view that the middleware has a way to go until it is as robust, reliable and complete as we would like it to be.

  5. Summary of Parallel Session I: Grid testbeds and applications

    SciTech Connect

    Olson, Douglas L.

    2002-10-10

    This paper is a summary of talks presented at ACAT 2002 in parallel session I on grid testbeds and applications. There were 12 presentations on this topic that show a lot of enthusiasm and hard work by many people in bringing physics applications onto the grid. There are encouraging success stories and also a clear view that the middleware has a way to go until it is as robust, reliable and complete as we would like it to be.

  6. Sensing and Navigation System for a Multiple-AUV Testbed

    DTIC Science & Technology

    2002-09-30

    Indexed excerpts: As part of the larger testbed development, three "grouper" vehicles have been designed and constructed ... technology for relative position/heading measurements of neighboring vehicles ... two vision systems are currently being implemented in a grouper vehicle ... tested on a grouper vehicle and expected to considerably improve the positioning system, in an effort to facilitate controller development.

  7. Advanced Unmanned Search System (AUSS) Testbed. Search Demonstration Testing

    DTIC Science & Technology

    1992-11-01

    Report front matter (OCR fragments): Advanced Unmanned Search System (AUSS) Testbed, Search Demonstration Testing, by J. Walton; Naval Command, Control and Ocean Surveillance Center, RDT&E Division, San Diego, California 92152-5000; J. D. Fontana, CAPT, USN, Commanding Officer; R. T. Shearer. Indexed headings include Objectives, Test Area, and Vehicle Configuration.

  8. Helix Project Testbed: Towards the Self-Regenerative Incorruptible Enterprise

    DTIC Science & Technology

    2011-09-01

    Indexed excerpts: ... browsers and other clients, web and application servers, will become prevalent throughout the enterprise. We have set up the testbed so that researchers ... content at the server side. However, different browsers may parse the same Web content differently, partly in an attempt to tolerate or auto-correct ... The mobile Internet devices were used to develop Web page sanitization and other policy enforcement in Web browsers and to evaluate their ...

  9. Software Testbed for Developing and Evaluating Integrated Autonomous Systems

    DTIC Science & Technology

    2015-03-01

    Publication front matter (fragments): "Software Testbed for Developing and Evaluating Integrated Autonomous Systems," James Ong, Emilio Remolina, Axel Prompt; Stottler Henke Associates, Inc., 1670 S. Amphlett Blvd., Suite 310, San Mateo, CA 94402. Indexed references include Ong, J., E. Remolina, D. E. Smith, and M. S. Boddy (2013), "A Visual Integrated Development Environment for Automated Planning."

  10. Remotely Accessible Testbed for Software Defined Radio Development

    NASA Technical Reports Server (NTRS)

    Lux, James P.; Lang, Minh; Peters, Kenneth J.; Taylor, Gregory H.

    2012-01-01

    Previous development testbeds have assumed that the developer was physically present in front of the hardware being used. No provision for remote operation of basic functions (power on/off or reset) was made, because the developer/operator was sitting in front of the hardware, and could just push the button manually. In this innovation, a completely remotely accessible testbed has been created, with all diagnostic equipment and tools set up for remote access, and using standardized interfaces so that failed equipment can be quickly replaced. In this testbed, over 95% of the operating hours were used for testing without the developer being physically present. The testbed includes a pair of personal computers, one running Linux and one running Windows. A variety of peripherals is connected via Ethernet and USB (universal serial bus) interfaces. A private internal Ethernet is used to connect to test instruments and other devices, so that the sole connection to the outside world is via the two PCs. An important design consideration was that all of the instruments and interfaces used stable, long-lived industry standards, such as Ethernet, USB, and GPIB (general purpose interface bus). There are no plug-in cards for the two PCs, so there are no problems with finding replacement computers with matching interfaces, device drivers, and installation. The only thing unique to the two PCs is the locally developed software, which is not specific to computer or operating system version. If a device (including one of the computers) were to fail or become unavailable (e.g., a test instrument needed to be recalibrated), replacing it is a straightforward process with a standard, off-the-shelf device.
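
    Remote instrument access of the kind described is commonly scripted. The sketch below uses PyVISA to discover and query an instrument over GPIB; the resource address is an example, and a VISA backend (e.g., pyvisa-py) is assumed to be installed. It illustrates the approach rather than the testbed's locally developed software.

        # Query an instrument's identity over GPIB with PyVISA.
        import pyvisa

        rm = pyvisa.ResourceManager()
        print(rm.list_resources())                   # discover attached instruments
        inst = rm.open_resource("GPIB0::12::INSTR")  # example GPIB address
        print(inst.query("*IDN?"))                   # standard SCPI identification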

  11. Optical modeling in Testbed Environment for Space Situational Awareness (TESSA).

    PubMed

    Nikolaev, Sergei

    2011-08-01

    We describe optical systems modeling in the Testbed Environment for Space Situational Awareness (TESSA) simulator. We begin by presenting a brief outline of the overall TESSA architecture and focus on components for modeling optical sensors. Both image generation and image processing stages are described in detail, highlighting the differences in modeling ground- and space-based sensors. We conclude by outlining the applicability domains for the TESSA simulator, including potential real-life scenarios.

  12. Planning and reasoning in the JPL telerobot testbed

    NASA Technical Reports Server (NTRS)

    Peters, Stephen; Mittman, David; Collins, Carol; Omeara, Jacquie; Rokey, Mark

    1990-01-01

    The Telerobot Interactive Planning System is developed to serve as the highest autonomous-control level of the Telerobot Testbed. A recent prototype is described which integrates an operator interface for supervisory control, a task planner supporting disassembly and re-assembly operations, and a spatial planner for collision-free manipulator motion through the workspace. Each of these components is described in detail. Descriptions of the technical problem, approach, and lessons learned are included.

  13. Development and experimentation of an eye/brain/task testbed

    NASA Technical Reports Server (NTRS)

    Harrington, Nora; Villarreal, James

    1987-01-01

    The principal objective is to develop a laboratory testbed that will provide a unique capability to elicit, control, record, and analyze the relationships among operator task loading, operator eye movement, and operator brain wave data in a computer system environment. The ramifications of an integrated eye/brain monitor for the man-machine interface are staggering. The success of such a system would benefit users in the space and defense communities, paraplegics, and operators who must monitor monotonous displays (nuclear power plants, air defense, etc.).

  14. Chowkidar: A Health Monitor for Wireless Sensor Network Testbeds

    DTIC Science & Technology

    2006-02-01

    Indexed excerpts: Testbed users prefer to use healthy devices and like to know if there are any failures during their experiments. Based on our experience with Kansei, a large WSN testbed ... Chowkidar with Kansei, including feedback from both testbed users and administrators who have found Chowkidar to be a useful tool for improving the ... can speed WSN development by providing a supporting infrastructure to run, configure and monitor experiments.

  15. Cyber security analysis testbed : combining real, emulation, and simulation.

    SciTech Connect

    Villamarin, Charles H.; Eldridge, John M.; Van Leeuwen, Brian P.; Urias, Vincent E.

    2010-07-01

    Cyber security analysis tools are necessary to evaluate the security, reliability, and resilience of networked information systems against cyber attack. It is common practice in modern cyber security analysis to separately utilize real systems of computers, routers, switches, firewalls, computer emulations (e.g., virtual machines) and simulation models to analyze the interplay between cyber threats and safeguards. In contrast, Sandia National Laboratories has developed novel methods to combine these evaluation platforms into a hybrid testbed that combines real, emulated, and simulated components. The combination of real, emulated, and simulated components enables the analysis of security features and components of a networked information system. When performing cyber security analysis on a system of interest, it is critical to realistically represent the subject security components in high fidelity. In some experiments, the security component may be the actual hardware and software with all the surrounding components represented in simulation or with surrogate devices. Sandia National Laboratories has developed a cyber testbed that combines modeling and simulation capabilities with virtual machines and real devices to represent, in varying fidelity, secure networked information system architectures and devices. Using this capability, secure networked information system architectures can be represented in our testbed on a single, unified computing platform. This provides an 'experiment-in-a-box' capability. The result is rapidly-produced, large-scale, relatively low-cost, multi-fidelity representations of networked information systems. These representations enable analysts to quickly investigate cyber threats and test protection approaches and configurations.

  16. The Northrop Grumman External Occulter Testbed: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Lo, Amy; Glassman, T.; Lillie, C.

    2007-05-01

    We have built a subscale testbed to demonstrate and validate the performance of the New Worlds Observer (NWO), a terrestrial-planet-finder external-occulter mission concept. The external occulter concept allows observations of nearby exo-Earths using two spacecraft: one carrying an occulter that is tens of meters in diameter and the other carrying a generic space telescope. The occulter is completely opaque, resembling a flower, with petals having a hypergaussian profile that enables 10^-10 intensity suppression of stars that potentially harbor terrestrial planets. The baseline flight NWO system has a 30-meter occulter flying 30,000 km in front of a 4-meter-class telescope. Testing the flight configuration on the ground is not feasible, so we have matched the Fresnel number of the flight configuration (approximately 10) using a subscale occulter. Our testbed consists of an 80-meter-long evacuated tube, with a high-precision occulter in the center of the tube. The occulter is 4 cm in diameter, manufactured with ¼-micron metrological accuracy and less than 2-micron tip truncation. This mimics a 30-meter occulter with millimeter figure accuracy and less than a centimeter of tip truncation. Our testbed is an evolving experiment, and we report here the first, preliminary results using a single-wavelength laser (532 nm) as the source.
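
    The scaling rests on matching the Fresnel number N = r^2 / (lambda * z), where r is the occulter radius, lambda the wavelength, and z the occulter-telescope separation. The quick check below assumes the subscale occulter sits mid-tube (about 40 m from the detector) and uses the 532 nm laser wavelength for both cases; the flight system would actually operate over a broad band, so the numbers are rough.

        # Fresnel-number check for flight vs. testbed geometries (illustrative).
        wavelength = 532e-9                              # m

        N_flight = 15.0**2 / (wavelength * 30_000e3)     # 30 m occulter at 30,000 km
        N_testbed = 0.02**2 / (wavelength * 40.0)        # 4 cm occulter at ~40 m

        print(round(N_flight, 1), round(N_testbed, 1))   # both of order 10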

  17. Visible Nulling Coronagraphy Testbed Development for Exoplanet Detection

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Woodruff, Robert A.; Vasudevan, Gopal; Thompson, Patrick; Chen, Andrew; Petrone, Peter; Booth, Andrew; Madison, Timothy; Bolcar, Matthew; Noecker, M. Charley; Kendrick, Stephen; Melnick, Gary; Tolls, Volker

    2010-01-01

    Three of the recently completed NASA Astrophysics Strategic Mission Concept (ASMC) studies addressed the feasibility of using a Visible Nulling Coronagraph (VNC) as the prime instrument for exoplanet science. The VNC approach is one of the few that works with filled, segmented, and sparse or diluted-aperture telescope systems, and thus spans the space of potential ASMC exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies and has developed an incremental sequence of VNC testbeds to advance this approach and the technologies associated with it. Herein we report on the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under high-bandwidth closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones at sequentially higher contrasts of 10^8, 10^9 and 10^10 at an inner working angle of 2*lambda/D, ultimately culminating in spectrally broadband (>20%) high-contrast imaging. Each of the milestones, one per year, is traceable to one or more of the ASMC studies. The VNT uses a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. Discussed are the optical configuration, laboratory results, critical technologies, and the null sensing and control approach.

  18. The Mini-Mast CSI testbed: Lessons learned

    NASA Technical Reports Server (NTRS)

    Tanner, Sharon E.; Belvin, W. Keith; Horta, Lucas G.; Pappa, R. S.

    1993-01-01

    The Mini-Mast testbed was one of the first large scale Controls-Structure-Interaction (CSI) systems used to evaluate state-of-the-art methodology in flexible structure control. Now that all the testing at Langley Research Center has been completed, a look back is warranted to evaluate the program. This paper describes some of the experiences and technology development studies by NASA, university, and industry investigators. Lessons learned are presented from three categories: the testbed development, control methods, and the operation of a guest investigator program. It is shown how structural safety margins provided a realistic environment to simulate on-orbit CSI research, even though they also reduced the research flexibility afforded to investigators. The limited dynamic coupling between the bending and torsion modes of the cantilevered test article resulted in highly successful SISO and MIMO controllers. However, until accurate models were obtained for the torque wheel actuators, sensors, filters, and the structure itself, most controllers were unstable. Controls research from this testbed should be applicable to cantilevered appendages of future large space structures.

  19. Development and Evaluation of a Stochastic Cloud-radiation Parameterization

    NASA Astrophysics Data System (ADS)

    Veron, D. E.; Secora, J.; Foster, M.

    2004-12-01

    Previous studies have shown that a stochastic cloud-radiation model accurately represents domain-averaged shortwave fluxes when compared to observations. Using continuously sampled cloud property observations from the three Atmospheric Radiation Measurement (ARM) Program Clouds and Radiation Testbed (CART) sites, we run a multiple-layer stochastic model and compare its results to those of the single-layer version of the model used in previous studies. In addition, we compare both to plane-parallel model output and independent observations. We will use these results to develop a shortwave cloud-radiation parameterization that incorporates the influence of the stochastic approach on the calculated radiative fluxes. Initial results of using this parameterization in a single-column model will be shown.
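
    The reason a stochastic treatment changes the fluxes can be seen in one line: for a heterogeneous cloud field, the average of the transmittance is not the transmittance of the average cloud. The numpy sketch below illustrates this plane-parallel bias with an invented, skewed optical-depth distribution; the numbers are purely illustrative.

        # mean(exp(-tau)) != exp(-mean(tau)): the plane-parallel bias.
        import numpy as np

        rng = np.random.default_rng(1)
        tau = rng.gamma(shape=0.8, scale=5.0, size=100_000)  # skewed optical depths

        T_stochastic = np.exp(-tau).mean()      # domain-averaged transmittance
        T_plane_parallel = np.exp(-tau.mean())  # transmittance of the mean cloud

        print(f"stochastic {T_stochastic:.3f}  plane-parallel {T_plane_parallel:.3f}")
        # Jensen's inequality guarantees the plane-parallel value is smaller.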

  20. Development of Liquid Propulsion Systems Testbed at MSFC

    NASA Technical Reports Server (NTRS)

    Alexander, Reginald; Nelson, Graham

    2016-01-01

    As NASA, the Department of Defense and the aerospace industry in general strive to develop capabilities to explore near-Earth, Cis-lunar and deep space, the need to create more cost effective techniques of propulsion system design, manufacturing and test is imperative in the current budget constrained environment. The physics of space exploration have not changed, but the manner in which systems are developed and certified needs to change if there is going to be any hope of designing and building the high performance liquid propulsion systems necessary to deliver crew and cargo to the further reaches of space. To further the objective of developing these systems, the Marshall Space Flight Center is currently in the process of formulating a Liquid Propulsion Systems testbed, which will enable rapid integration of components to be tested and assessed for performance in integrated systems. The manifestation of this testbed is a breadboard engine configuration (BBE) with facility support for consumables and/or other components as needed. The goal of the facility is to test NASA developed elements, but can be used to test articles developed by other government agencies, industry or academia. Joint government/private partnership is likely the approach that will be required to enable efficient propulsion system development. MSFC has recently tested its own additively manufactured liquid hydrogen pump, injector, and valves in a BBE hot firing. It is rapidly building toward testing the pump and a new CH4 injector in the BBE configuration to demonstrate a 22,000 lbf, pump-fed LO2/LCH4 engine for the Mars lander or in-space transportation. The value of having this BBE testbed is that as components are developed they may be easily integrated in the testbed and tested. MSFC is striving to enhance its liquid propulsion system development capability. Rapid design, analysis, build and test will be critical to fielding the next high thrust rocket engine. With the maturity of the

  1. Sensor Networking Testbed with IEEE 1451 Compatibility and Network Performance Monitoring

    NASA Technical Reports Server (NTRS)

    Gurkan, Deniz; Yuan, X.; Benhaddou, D.; Figueroa, F.; Morris, Jonathan

    2007-01-01

    The design and implementation of a testbed for testing and verifying IEEE 1451-compatible sensor systems with network performance monitoring is of significant importance. Measurement of performance parameters, together with the implementation of decision support systems, will enhance the understanding of sensor systems with plug-and-play capabilities. The paper presents the design aspects of such a testbed environment, under development at the University of Houston in collaboration with the NASA Stennis Space Center Smart Sensor System Testbed (SSST).
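
    The plug-and-play capability in IEEE 1451 comes from each transducer carrying a Transducer Electronic Data Sheet (TEDS) that the host reads to self-configure. The sketch below models that idea with a plain dictionary; the fields are simplified stand-ins and do not follow the standard's binary TEDS encoding.

        # Toy TEDS-driven self-description of a sensor channel.
        TEDS = {
            "manufacturer_id": 42,
            "model": "PX-100",
            "channel": {"quantity": "pressure", "units": "kPa",
                        "range": (0.0, 500.0)},
        }

        def describe(teds):
            ch = teds["channel"]
            lo, hi = ch["range"]
            return f"{teds['model']}: {ch['quantity']} sensor, {lo}-{hi} {ch['units']}"

        print(describe(TEDS))  # a host configures itself from this metadata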

  2. The computational structural mechanics testbed generic structural-element processor manual

    NASA Technical Reports Server (NTRS)

    Stanley, Gary M.; Nour-Omid, Shahram

    1990-01-01

    The usage and development of structural finite element processors based on the CSM Testbed's Generic Element Processor (GEP) template is documented. By convention, such processors have names of the form ESi, where i is an integer. This manual is therefore intended for both Testbed users who wish to invoke ES processors during the course of a structural analysis, and Testbed developers who wish to construct new element processors (or modify existing ones).

  3. Comparisons of cloud cover estimates and cloud fraction profiles from ARM's cloud-detecting instruments and GOES-8 data

    SciTech Connect

    Krueger, S K; Rodriguez, D

    1999-05-07

    The DOE's Atmospheric Radiation Measurement (ARM) Program employs both upward- and downward-looking remote-sensing instruments to measure the horizontal and vertical distributions of clouds across its Southern Great Plains (SGP) site. No single instrument is capable of completely determining these distributions over the scales of interest to ARM's Single Column Modeling (SCM) and Instantaneous Radiative Flux (IRF) groups; these groups embody the primary strategies through which ARM expects to achieve its objectives of developing and testing cloud formation parameterizations (USDOE, 1996). Collectively, however, the data from ARM's cloud-detecting instruments offer the potential for such a three-dimensional characterization. Data intercomparisons, like the ones illustrated in this paper, are steps in this direction. Examples of some initial comparisons, involving satellite, millimeter cloud radar, whole sky imager and ceilometer data, are provided herein, and it is expected that many of the lessons learned can later be adapted to cloud data at the Boundary and Extended Facilities. Principally, we are concerned about: (1) the accuracy of various estimates of cloud properties at a single point, or within a thin vertical column, above the Central Facility (CF) over time, and (2) the accuracy of various estimates of cloud properties over the Cloud and Radiation Testbed (CART) site, which can then be reduced to single, representative profiles over time. In the former case, the results are usable in the IRF and SCM strategies; in the latter case, they satisfy SCM needs specifically. The Whole Sky Imager (WSI) and ceilometer data used in one study were collected at the SGP CF between October 1 and December 31, 1996 (Shields et al., 1990). This three-month period, corresponding to the first set of WSI data released by ARM's Experiment Center, was sufficiently long to reveal important trends (Rodriguez, 1998).

  4. Wavefront Amplitude Variation of TPF's High Contrast Imaging Testbed: Modeling and Experiment

    NASA Technical Reports Server (NTRS)

    Shi, Fang; Lowman, Andrew E.; Moody, Dwight C.; Niessner, Albert F.; Trauger, John T.

    2005-01-01

    Knowledge of wavefront amplitude is as important as knowledge of phase for a coronagraphic high-contrast imaging system. Efforts have been made to understand the various contributions to the amplitude variation in the Terrestrial Planet Finder's (TPF) High Contrast Imaging Testbed (HCIT). Modeling of the HCIT with as-built mirror surfaces has shown an amplitude variation of 1.3% due to phase-amplitude mixing in the testbed's front-end optics. Experimental measurements on the testbed show an amplitude variation of about 2.5%, with the testbed's illumination pattern making a major contribution to the low-order amplitude variation.

  5. Spacelab system analysis: A study of the Marshall Avionics System Testbed (MAST)

    NASA Technical Reports Server (NTRS)

    Ingels, Frank M.; Owens, John K.; Daniel, Steven P.; Ahmad, F.; Couvillion, W.

    1988-01-01

    An analysis of the Marshall Avionics Systems Testbed (MAST) communications requirements is presented. The average offered load for typical nodes is estimated. Suitable local area networks are determined.

  6. Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test S

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Communications, Navigation, and Network Reconfigurable Test-bed Flight Hardware Compatibility Test Sets and Networks Integration Management Office Testing for the Tracking and Data Relay Satellite System

  7. Cloud Modeling

    NASA Technical Reports Server (NTRS)

    Tao, Wei-Kuo; Moncrieff, Mitchell; Einaud, Franco (Technical Monitor)

    2001-01-01

    Numerical cloud models have been developed and applied extensively to study cloud-scale and mesoscale processes during the past four decades. The distinctive aspect of these cloud models is their ability to treat explicitly (or resolve) cloud-scale dynamics. This requires the cloud models to be formulated from the non-hydrostatic equations of motion that explicitly include the vertical acceleration terms, since the vertical and horizontal scales of convection are similar. Such models are also necessary in order to allow gravity waves, such as those triggered by clouds, to be resolved explicitly. In contrast, the hydrostatic approximation, usually applied in global or regional models, does not allow the presence of such gravity waves. In addition, the availability of exponentially increasing computer capabilities has resulted in time integrations increasing from hours to days, domain grid boxes (points) increasing from fewer than 2,000 to more than 2,500,000 grid points with 500- to 1,000-m resolution, and 3-D models becoming increasingly prevalent. The cloud-resolving model is now at a stage where it can provide reasonably accurate statistical information about the sub-grid, cloud-resolving processes that are poorly parameterized in climate models and numerical prediction models.

  8. Cloud Control

    ERIC Educational Resources Information Center

    Weinstein, Margery

    2012-01-01

    Your learning curriculum needs a new technological platform, but you don't have the expertise or IT equipment to pull it off in-house. The answer is a learning system that exists online, "in the cloud," where learners can access it anywhere, anytime. For trainers, cloud-based coursework often means greater ease of instruction resulting in greater…

  9. Complex Clouds

    Atmospheric Science Data Center

    2013-04-16

    ... The complex structure and beauty of polar clouds are highlighted by these images acquired ... corner, the edge of the Antarctic coastline and some sea ice can be seen through some thin, high cirrus clouds. The right-hand panel ...

  10. Cloud Control

    ERIC Educational Resources Information Center

    Ramaswami, Rama; Raths, David; Schaffhauser, Dian; Skelly, Jennifer

    2011-01-01

    For many IT shops, the cloud offers an opportunity not only to improve operations but also to align themselves more closely with their schools' strategic goals. The cloud is not a plug-and-play proposition, however--it is a complex, evolving landscape that demands one's full attention. Security, privacy, contracts, and contingency planning are all…

  11. Cloud Cover

    ERIC Educational Resources Information Center

    Schaffhauser, Dian

    2012-01-01

    This article features a major statewide initiative in North Carolina that is showing how a consortium model can minimize risks for districts and help them exploit the advantages of cloud computing. Edgecombe County Public Schools in Tarboro, North Carolina, intends to exploit a major cloud initiative being refined in the state and involving every…

  12. Arctic Clouds

    Atmospheric Science Data Center

    2013-04-19

    ... Stratus clouds are common in the Arctic during the summer months, and are important modulators of ... from MISR's two most obliquely forward-viewing cameras. The cold, stable air causes the clouds to persist in stratified layers, and this ...

  13. The Role of Standards in Cloud-Computing Interoperability

    DTIC Science & Technology

    2012-10-01

    ... Ahronovitz 2010, Harding 2010, Badger 2011, Kundra 2011]. Risks of vendor lock-in include reduced negotiation power in reaction to price increases and ... use cases classified into three groups: cloud management, cloud interoperability, and cloud security [Badger 2010]. These use cases are listed below [Badger 2010]: Cloud Management Use Cases: Open an Account; Close an Account; Terminate an Account; Copy Data Objects into a Cloud; Copy ...

  14. COMPARISON OF MILLIMETER-WAVE CLOUD RADAR MEASUREMENTS FOR THE FALL 1997 CLOUD IOP

    SciTech Connect

    SEKELSKY,S.M.; LI,L.; GALLOWAY,J.; MCINTOSH,R.E.; MILLER,M.A.; CLOTHIAUX,E.E.; HAIMOV,S.; MACE,G.; SASSEN,K.

    1998-03-23

    One of the primary objectives of the Fall 1997 IOP was to intercompare Ka-band (35 GHz) and W-band (95 GHz) cloud radar observations and verify system calibrations. During September 1997, several cloud radars were deployed at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site, including the full-time-operation 35 GHz CART Millimeter-wave Cloud Radar (MMCR) (Moran, 1997), the University of Massachusetts (UMass) single-antenna 33 GHz/95 GHz Cloud Profiling Radar System (CPRS) (Sekelsky, 1996), the 95 GHz Wyoming Cloud Radar (WCR) flown on the University of Wyoming King Air (Galloway, 1996), the University of Utah 95 GHz radar, and the dual-antenna Pennsylvania State University 94 GHz radar (Clothiaux, 1995). In this paper the authors discuss several issues relevant to the comparison of ground-based radars, including the detection and filtering of insect returns. Preliminary comparisons of ground-based Ka-band radar reflectivity data and comparisons with airborne radar reflectivity measurements are also presented.

  15. Delivering Unidata Technology via the Cloud

    NASA Astrophysics Data System (ADS)

    Fisher, Ward; Oxelson Ganter, Jennifer

    2016-04-01

    Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e., a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a Docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running Docker-based technologies in the cloud, as well as providing user management. In this work we give an overview of these three open-source technologies and the value they offer to our community.
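
    As a minimal sketch of the containerized-deployment pattern described above, the following uses the Docker SDK for Python to start a container and expose its web interface to a browser. The image name, container name, and port are hypothetical placeholders, not Unidata's published artifacts:

        import docker  # Docker SDK for Python (pip install docker)

        # Connect to the local Docker daemon.
        client = docker.from_env()

        # Start a containerized application in the background, mapping an
        # in-container web port to the host so a browser can reach it.
        # "example/cloudidv-demo" and port 8080 are illustrative only.
        container = client.containers.run(
            "example/cloudidv-demo",
            detach=True,
            ports={"8080/tcp": 8080},
            name="cloudidv-demo",
        )
        print(f"{container.name}: {container.status} -> http://localhost:8080")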

  16. Space construction: an experimental testbed to develop enabling technologies

    NASA Astrophysics Data System (ADS)

    Schubert, Heidi C.; How, Jonathan P.

    1997-12-01

    This paper discusses a new testbed developed at the Stanford Aerospace Robotics Laboratory (ARL) to address some of the key issues associated with semi-autonomous construction in a hazardous environment like space. The new testbed consists of a large two-link manipulator carrying two smaller two-link arms. This macro/mini combination was developed to be representative of actual space manipulators, such as the SSRMS/SPDM planned for the Space Station. This new testbed will allow us to investigate several key issues associated with space construction, including teleoperation versus supervised autonomy, dexterous control of a robot with flexibility, and construction with multiple robots. A supervised autonomy approach has several advantages over the traditional teleoperation mode, including operation with time delay, smart control of a redundant manipulator, and improved contact control. To mimic the dynamics found in space manipulators, the main arm was designed to include joint flexibility. The arm operates in 2-D, with the end-point floating on air-bearing. This setup allows cooperation with existing free-flying robots in the ARL. This paper reports the first experiments with the arm which explore the advantages of moving from teleoperation or human-in-the-loop control to the human supervisory or task-level control. A simple task, such as capturing a satellite-like object floating on the table, is attempted first with the human directly driving the end-point and second with the human directing the robot at a task-level. Initial experimental results of these two control approaches are presented and compared.

  17. Automating NEURON Simulation Deployment in Cloud Resources.

    PubMed

    Stockton, David B; Santamaria, Fidel

    2017-01-01

    Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software, NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
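
    The "simple common interface" over local servers, HPC, and clouds that the authors describe can be illustrated with a small abstraction layer. This is a hypothetical sketch of the design idea only, not the NeuroManager API:

        from abc import ABC, abstractmethod
        import subprocess

        class ComputeResource(ABC):
            """Common interface over local servers, HPC queues, and clouds."""

            @abstractmethod
            def submit(self, command: str) -> subprocess.Popen:
                """Start a simulation and return a handle for waiting on it."""

        class LocalServer(ComputeResource):
            def submit(self, command: str) -> subprocess.Popen:
                # Run the simulator directly on this machine.
                return subprocess.Popen(command, shell=True)

        # A cloud-backed subclass would first provision an instance (e.g.,
        # via the OpenStack or EC2 APIs) and then submit over SSH; the
        # caller's loop below is unchanged either way.
        backends: list[ComputeResource] = [LocalServer()]
        jobs = [b.submit("echo run NEURON model here") for b in backends]
        for job in jobs:
            job.wait()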

  18. The Advanced Orbiting Systems Testbed Program: Results to date

    NASA Technical Reports Server (NTRS)

    Otranto, John F.; Newsome, Penny A.

    1994-01-01

    The Consultative Committee for Space Data Systems (CCSDS) Recommendations for Packet Telemetry (PT) and Advanced Orbiting Systems (AOS) propose standard solutions to data handling problems common to many types of space missions. The Recommendations address only space/ground and space/space data handling systems. Goddard Space Flight Center's (GSFC's) AOS Testbed (AOST) Program was initiated to better understand the Recommendations and their impact on real-world systems, and to examine the extended domain of ground/ground data handling systems. The results and products of the Program will reduce the uncertainties associated with the development of operational space and ground systems that implement the Recommendations.

  19. Vehicle-network development on a communications-network testbed

    NASA Astrophysics Data System (ADS)

    Rapanotti, John L.

    2006-05-01

    Light armoured vehicles will rely on sensors, on-board computing, and digital wireless communications to achieve improved performance and survivability. Constrained by the need for low-latency response to threats, individual vehicles will share sensory information with other platoon vehicles, benefiting from a flexible, dynamic, self-adapting network environment. As sensor and computing capability increases, network communications will become saturated. To understand the operational requirements for these future vehicle networks, the High Capacity Technical Communications Network (HCTCN) Low Bandwidth Testbed (LBTB) has been developed to provide a simulated environment for the selected radios and the candidate database and transmission protocols. These concepts and this approach to network communications are discussed in the paper.

  20. Gas dispersion with induced airflow in mobile olfaction testbed

    NASA Astrophysics Data System (ADS)

    Mamduh, S. M.; Kamarudin, K.; Visvanathan, R.; Yeon, A. S. A.; Shakaff, A. Y. M.; Zakaria, A.; Kamarudin, L. M.; Abdullah, A. H.

    2017-03-01

    The unpredictable nature of gas dispersion is a well-known issue in mobile olfaction. As roboticists tend to depend on simulations and try to recreate environmental conditions in such simulations, an accurate representation of the gas plume is needed. Current model-based simulations may not be able to capture the time-varying and unpredictable nature of gas distribution accurately. This paper presents a real-time gas distribution dataset collected in a mobile olfaction testbed, which captures the time-varying nature of a gas plume.

  1. The Living With a Star Space Environment Testbed Experiments

    NASA Technical Reports Server (NTRS)

    Xapsos, Michael A.

    2014-01-01

    The focus of the Living With a Star (LWS) Space Environment Testbed (SET) program is to improve the performance of hardware in the space radiation environment. The program has developed a payload for the Air Force Research Laboratory (AFRL) Demonstration and Science Experiments (DSX) spacecraft that is scheduled for launch in August 2015 on the SpaceX Falcon Heavy rocket. The primary structure of DSX is an Evolved Expendable Launch Vehicle (EELV) Secondary Payload Adapter (ESPA) ring. DSX will be in a Medium Earth Orbit (MEO). This oral presentation will describe the SET payload.

  2. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    The National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR) Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data. Extensive software had to be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also includes ground software, which is a key element for both commanding the payload and displaying data created by the payload. The verification of

  3. The CSM testbed matrix processors internal logic and dataflow descriptions

    NASA Technical Reports Server (NTRS)

    Regelbrugge, Marc E.; Wright, Mary A.

    1988-01-01

    This report constitutes the final report for subtask 1 of Task 5 of NASA Contract NAS1-18444, Computational Structural Mechanics (CSM) Research. This report contains a detailed description of the coded workings of selected CSM Testbed matrix processors (i.e., TOPO, K, INV, SSOL) and of the arithmetic utility processor AUS. These processors and the current sparse matrix data structures are studied and documented. Items examined include: details of the data structures, interdependence of data structures, data-blocking logic in the data structures, processor data flow and architecture, and processor algorithmic logic flow.

  4. The Living With a Star Program Space Environment Testbed

    NASA Technical Reports Server (NTRS)

    Barth, Janet; Day, John H. (Technical Monitor)

    2001-01-01

    This viewgraph presentation describes the objective, approach, and scope of the Living With a Star (LWS) program at the Marshall Space Flight Center. Scientists involved in the project seek to refine the understanding of space weather and the role of solar variability in terrestrial climate change. Research and the development of improved analytic methods have led to increased predictive capabilities and the improvement of environment specification models. Specifically, the Space Environment Testbed (SET) project of LWS is responsible for the implementation of improved engineering approaches to observing solar effects on climate change. This responsibility includes technology development, ground test protocol development, and the development of a technology application model/engineering tool.

  5. Phoenix Missile Hypersonic Testbed (PMHT): Project Concept Overview

    NASA Technical Reports Server (NTRS)

    Jones, Thomas P.

    2007-01-01

    An overview of research into a low-cost hypersonic flight test capability to increase the amount of hypersonic flight data and help bridge the large developmental gap between ground testing/analysis and major flight demonstrator X-planes is provided. The major objectives included: developing an air-launched missile booster research testbed; accurately delivering research payloads through programmable guidance to hypersonic test conditions; low cost; a high flight rate (a minimum of two flights per year); and utilizing surplus air-launched missiles and NASA aircraft.

  6. Software Testbed for Developing and Evaluating Integrated Autonomous Subsystems

    NASA Technical Reports Server (NTRS)

    Ong, James; Remolina, Emilio; Prompt, Axel; Robinson, Peter; Sweet, Adam; Nishikawa, David

    2015-01-01

    To implement fault tolerant autonomy in future space systems, it will be necessary to integrate planning, adaptive control, and state estimation subsystems. However, integrating these subsystems is difficult, time-consuming, and error-prone. This paper describes Intelliface/ADAPT, a software testbed that helps researchers develop and test alternative strategies for integrating planning, execution, and diagnosis subsystems more quickly and easily. The testbed's architecture, graphical data displays, and implementations of the integrated subsystems support easy plug-and-play of alternate components to support research and development in fault-tolerant control of autonomous vehicles and operations support systems. Intelliface/ADAPT controls NASA's Advanced Diagnostics and Prognostics Testbed (ADAPT), which comprises batteries, electrical loads (fans, pumps, and lights), relays, circuit breakers, inverters, and sensors. During plan execution, an experimenter can inject faults into the ADAPT testbed by tripping circuit breakers, changing fan speed settings, and closing valves to restrict fluid flow. The diagnostic subsystem, based on NASA's Hybrid Diagnosis Engine (HyDE), detects and isolates these faults to determine the new state of the plant, ADAPT. Intelliface/ADAPT then updates its model of the ADAPT system's resources and determines whether the current plan can be executed using the reduced resources. If not, the planning subsystem generates a new plan that reschedules tasks, reconfigures ADAPT, and reassigns the use of ADAPT resources as needed to work around the fault. The resource model, planning domain model, and planning goals are expressed using NASA's Action Notation Modeling Language (ANML). Parts of the ANML model are generated automatically, and other parts are constructed by hand using the Planning Model Integrated Development Environment, a visual Eclipse-based IDE that accelerates ANML model development. Because native ANML planners are currently

  7. EMERGE - ESnet/MREN Regional Science Grid Experimental NGI Testbed

    SciTech Connect

    Mambretti, Joe; DeFanti, Tom; Brown, Maxine

    2001-07-31

    This document is the final report on the EMERGE Science Grid testbed research project from the perspective of the International Center for Advanced Internet Research (iCAIR) at Northwestern University, which was a subcontractor to this UIC project. This report is a compilation of information gathered from a variety of materials related to this project produced by multiple EMERGE participants, especially those at Electronic Visualization Lab (EVL) at the University of Illinois at Chicago (UIC), Argonne National Lab and iCAIR. The EMERGE Science Grid project was managed by Tom DeFanti, PI from EVL at UIC.

  8. CCPP-ARM Parameterization Testbed Model Forecast Data

    DOE Data Explorer

    Klein, Stephen

    2008-01-15

    This dataset contains the NCAR CAM3 (Collins et al., 2004) and GFDL AM2 (GFDL GAMDT, 2004) forecast data at locations close to the ARM research sites. These data are generated from a series of multi-day forecasts in which both CAM3 and AM2 are initialized at 00Z every day with the ECMWF reanalysis data (ERA-40) for the years 1997 and 2000, and initialized with both the NASA DAO Reanalyses and the NCEP GDAS data for the year 2004. The DOE CCPP-ARM Parameterization Testbed (CAPT) project assesses climate models using numerical weather prediction techniques in conjunction with high quality field measurements (e.g., ARM data).

  9. Performance of the PARCS Testbed Cesium Fountain Frequency Standard

    NASA Technical Reports Server (NTRS)

    Enzer, Daphna G.; Klipstein, William M.

    2004-01-01

    A cesium fountain frequency standard has been developed as a ground testbed for the PARCS (Primary Atomic Reference Clock in Space) experiment, an experiment intended to fly on the International Space Station. We report on the performance of the fountain and describe some of the implementations motivated in large part by flight considerations, but of relevance for ground fountains. In particular, we report on a new technique for delivering cooling and trapping laser beams to the atom collection region, in which a given beam is recirculated three times, effectively providing much more optical power than traditional configurations. Allan deviations down to 10 have been achieved with this method.
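
    For context, the Allan deviation cited in fountain-clock work is the standard two-sample measure of fractional-frequency stability. Its usual definition (general background, not taken from the paper) is

        \[ \sigma_y(\tau) = \sqrt{\tfrac{1}{2}\,\big\langle (\bar{y}_{k+1} - \bar{y}_k)^2 \big\rangle }, \]

    where \(\bar{y}_k\) is the k-th consecutive average of the fractional frequency over an interval of duration \(\tau\).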

  10. Observational evidence linking precipitation and mesoscale cloud fraction in the southeast Pacific

    NASA Astrophysics Data System (ADS)

    Rapp, Anita D.

    2016-07-01

    Precipitation has been hypothesized to play an important role in the transition of low clouds from closed to open cell cumulus in regions of large-scale subsidence. A synthesis of A-Train satellite measurements is used to examine the relationship between precipitation and mesoscale cloud fraction across a transition region in the southeastern Pacific. Low cloud pixels are identified in 4 years of CloudSat/CALIPSO observations, and the along-track mean cloud fraction within 2.5-500 km surrounding the clouds is calculated. Results show that cloud fraction decreases more rapidly in areas surrounding precipitating clouds than around nonprecipitating clouds. The closed-to-open cell transition region appears especially sensitive, with the surrounding mesoscale cloud fraction decreasing 30% faster in the presence of precipitation compared to nonprecipitating clouds. There is also a dependence on precipitation rate and cloud liquid water path (LWP), with higher rain rates or lower LWP showing larger decreases in surrounding cloud fraction.

  11. PORT: A Testbed Paradigm for On-line Digital Archive Development.

    ERIC Educational Resources Information Center

    Keeler, Mary; Kloesel, Christian

    1997-01-01

    Discusses the Peirce On-line Resource Testbed (PORT), a digital archive of primary data. Highlights include knowledge processing testbeds for digital resource development; Peirce's pragmatism in operation; PORT and knowledge processing; obstacles to archive access; and PORT as a paradigm for critical control in knowledge processing. (AEF)

  12. A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry

    DTIC Science & Technology

    2014-05-29

    ... development, implementation, and testing/verification of algorithms for airborne telemetry applications. This testbed utilizes both SOQPSK and OFDM for its modulation ... Shaped Offset Quadrature Phase Shift Keying (SOQPSK), Orthogonal Frequency Division Multiplexing (OFDM), Bit Error Rate (BER) ...

  13. 77 FR 18793 - Spectrum Sharing Innovation Test-Bed Pilot Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-28

    ... National Telecommunications and Information Administration Spectrum Sharing Innovation Test-Bed Pilot... conduct in Phase II/III of the Spectrum Sharing Innovation Test-Bed pilot program to assess whether... Communications Commission (FCC) and other federal agencies, established a Spectrum Sharing Innovation...

  14. Development of a flexible test-bed for robotics, telemanipulation and servicing research

    NASA Technical Reports Server (NTRS)

    Davies, Barry F.

    1989-01-01

    The development of a flexible operation test-bed, based around a commercially available ASEA industrial robot, is described. The test-bed was designed to investigate fundamental human factors issues concerned with the unique problems of robotic manipulation in the hostile environment of space.

  15. Cloud Formation

    NASA Astrophysics Data System (ADS)

    Graham, Mark Talmage

    2004-05-01

    Cloud formation is crucial to the heritage of modern physics, and there is a rich literature on this important topic. In 1927, Charles T.R. Wilson was awarded the Nobel Prize in physics for applications of the cloud chamber [2]. Wilson was inspired to study cloud formation after working at a meteorological observatory on top of the highest mountain in Scotland, Ben Nevis, and testified near the end of his life, "The whole of my scientific work undoubtedly developed from the experiments I was led to make by what I saw during my fortnight on Ben Nevis in September 1894" [3]. To form clouds, Wilson used the sudden expansion of humid air [4]. Any structure the cloud may have is spoiled by turbulence in the sudden expansion, but in 1912 Wilson got ion tracks to show up by using strobe photography of the chamber immediately upon expansion [5]. In the interim, Millikan's study in 1909 of the formation of cloud droplets around individual ions was the first in which the electron charge was isolated. This study led to his famous oil drop experiment [6]. To Millikan, as to Wilson, meteorology and physics were professionally indistinct. With his meteorological physics expertise, in WWI Millikan commanded perhaps the first meteorological observation and forecasting team essential to military operation in history [7]. But even during peacetime meteorology is so much of a concern to everyone that a regular news segment is dedicated to it. Weather is the universal conversation topic, and life on land could not exist as we know it without clouds. One wonders, then, why cloud formation is never covered in physics texts.

  16. The Wide-Field Imaging Interferometry Testbed: Enabling Techniques for High Angular Resolution Astronomy

    NASA Technical Reports Server (NTRS)

    Rinehart, S. A.; Armstrong, T.; Frey, Bradley J.; Jung, J.; Kirk, J.; Leisawitz, David T.; Leviton, Douglas B.; Lyon, R.; Maher, Stephen; Martino, Anthony J.; Pauls, T.

    2007-01-01

    The Wide-Field Imaging Interferometry Testbed (WIIT) was designed to develop techniques for wide-field-of-view imaging interferometry using "double-Fourier" methods. These techniques will be important for a wide range of future space-based interferometry missions. We have provided simple demonstrations of the methodology already, and continuing development of the testbed will lead to higher data rates, improved data quality, and refined algorithms for image reconstruction. At present, the testbed effort includes five lines of development: automation of the testbed, operation in an improved environment, acquisition of large high-quality datasets, development of image reconstruction algorithms, and analytical modeling of the testbed. We discuss the progress made towards the first four of these goals; the analytical modeling is discussed in a separate paper within this conference.

  17. High performance testbed for four-beam infrared interferometric nulling and exoplanet detection.

    PubMed

    Martin, Stefan; Booth, Andrew; Liewer, Kurt; Raouf, Nasrat; Loya, Frank; Tang, Hong

    2012-06-10

    Technology development for a space-based infrared nulling interferometer capable of earthlike exoplanet detection and characterization started in earnest in the last 10 years. At the Jet Propulsion Laboratory, the planet detection testbed was developed to demonstrate the principal components of the beam combiner train for a high performance four-beam nulling interferometer. Early in the development of the testbed, the importance of "instability noise" for nulling interferometer sensitivity was recognized, and the four-beam testbed would produce this noise, allowing investigation of methods for mitigating this noise source. The testbed contains the required features of a four-beam combiner for a space interferometer and performs at a level matching that needed for the space mission. This paper describes in detail the design, functions, and controls of the testbed.

  18. Development of optical packet and circuit integrated ring network testbed.

    PubMed

    Furukawa, Hideaki; Harai, Hiroaki; Miyazawa, Takaya; Shinada, Satoshi; Kawasaki, Wataru; Wada, Naoya

    2011-12-12

    We developed novel integrated optical packet and circuit switch-node equipment. Compared with our previous equipment, a polarization-independent 4 × 4 semiconductor optical amplifier switch subsystem, gain-controlled optical amplifiers, and one 100 Gbps optical packet transponder and seven 10 Gbps optical path transponders with 10 Gigabit Ethernet (10GbE) client interfaces were newly installed in the present system. The switch and amplifiers can provide more stable operation, without equipment adjustments, under the frequent polarization rotations and dynamic packet-rate changes of optical packets. We constructed an optical packet and circuit integrated ring network testbed consisting of two switch nodes for accelerating network development, and we demonstrated 66 km fiber transmission and switching operation of multiplexed 14-wavelength 10 Gbps optical paths and 100 Gbps optical packets encapsulating 10GbE frames. Error-free (frame error rate < 1×10⁻⁴) operation was achieved with optical packets of various packet lengths and packet rates, and stable operation of the network testbed was confirmed. In addition, 4K uncompressed video streaming over OPS links was successfully demonstrated.

  19. Airborne Subscale Transport Aircraft Research Testbed: Aircraft Model Development

    NASA Technical Reports Server (NTRS)

    Jordan, Thomas L.; Langford, William M.; Hill, Jeffrey S.

    2005-01-01

    The Airborne Subscale Transport Aircraft Research (AirSTAR) testbed being developed at NASA Langley Research Center is an experimental flight test capability for research experiments pertaining to dynamics modeling and control beyond the normal flight envelope. An integral part of that testbed is a 5.5% dynamically scaled, generic transport aircraft. This remotely piloted vehicle (RPV) is powered by twin turbine engines and includes a collection of sensors, actuators, navigation, and telemetry systems. The downlink for the plane includes over 70 data channels, plus video, at rates up to 250 Hz. Uplink commands for aircraft control include over 30 data channels. The dynamic scaling requirement, which includes dimensional, weight, inertial, actuator, and data rate scaling, presents distinctive challenges in both the mechanical and electrical design of the aircraft. Discussion of these requirements and their implications for the development of the aircraft, along with risk mitigation strategies and training exercises, is included here. Also described are the first training (non-research) flights of the airframe. Additional papers address the development of a mobile operations station and an emulation and integration laboratory.
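
    As background on what dynamic scaling entails, subscale flight models are conventionally Froude-scaled. With geometric scale factor n (here n = 0.055) and air-density ratio \(\rho_m/\rho_f\) between model and full-scale flight altitudes, the usual relations are (standard scaling laws, not values from the paper):

        \[ L_m = n\,L_f, \qquad V_m = \sqrt{n}\,V_f, \qquad t_m = \sqrt{n}\,t_f, \qquad m_m = n^3\,\frac{\rho_m}{\rho_f}\,m_f, \qquad I_m = n^5\,\frac{\rho_m}{\rho_f}\,I_f. \]

    The \(\sqrt{n}\) compression of the time scale helps explain the unusually high sensor and data-rate requirements noted above: model-scale motions unfold several times faster than their full-scale counterparts.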

  20. A Battery Certification Testbed for Small Satellite Missions

    NASA Technical Reports Server (NTRS)

    Cameron, Zachary; Kulkarni, Chetan S.; Luna, Ali Guarneros; Goebel, Kai; Poll, Scott

    2015-01-01

    A battery pack consisting of standard cylindrical 18650 lithium-ion cells has been chosen for small satellite missions based on previous flight heritage and compliance with NASA battery safety requirements. However, for batteries that transit through the International Space Station (ISS), additional certification tests are required for individual cells as well as the battery packs. In this manuscript, we discuss the development of generalized testbeds for testing and certifying different types of batteries critical to small satellite missions. Test procedures developed and executed for this certification effort include: a detailed physical inspection before and after experiments; electrical cycling characterization at the cell and pack levels; battery-pack overcharge, over-discharge, external short testing; battery-pack vacuum leak and vibration testing. The overall goals of these certification procedures are to conform to requirements set forth by the agency and identify unique safety hazards. The testbeds, procedures, and experimental results are discussed for batteries chosen for small satellite missions to be launched from the ISS.

  1. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2012-01-01

    NASA's Space Communication and Navigation (SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform-provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  2. STRS Radio Service Software for NASA's SCaN Testbed

    NASA Technical Reports Server (NTRS)

    Mortensen, Dale J.; Bishop, Daniel Wayne; Chelmins, David T.

    2013-01-01

    NASA's Space Communication and Navigation(SCaN) Testbed was launched to the International Space Station in 2012. The objective is to promote new software defined radio technologies and associated software application reuse, enabled by this first flight of NASA's Space Telecommunications Radio System (STRS) architecture standard. Pre-launch testing with the testbed's software defined radios was performed as part of system integration. Radio services for the JPL SDR were developed during system integration to allow the waveform application to operate properly in the space environment, especially considering thermal effects. These services include receiver gain control, frequency offset, IQ modulator balance, and transmit level control. Development, integration, and environmental testing of the radio services will be described. The added software allows the waveform application to operate properly in the space environment, and can be reused by future experimenters testing different waveform applications. Integrating such services with the platform provided STRS operating environment will attract more users, and these services are candidates for interface standardization via STRS.

  3. An Overview of NASA's Subsonic Research Aircraft Testbed (SCRAT)

    NASA Technical Reports Server (NTRS)

    Baumann, Ethan; Hernandez, Joe; Ruhf, John C.

    2013-01-01

    The National Aeronautics and Space Administration (NASA) Dryden Flight Research Center acquired a Gulfstream III (GIII) aircraft to serve as a testbed for aeronautics flight research experiments. The aircraft is referred to as SCRAT, which stands for SubsoniC Research Aircraft Testbed. The aircraft's mission is to perform aeronautics research; more specifically, to raise the Technology Readiness Level (TRL) of advanced technologies through flight demonstrations, and to gather high-quality research data suitable for verifying the technologies and validating design and analysis tools. The SCRAT has the ability to conduct a range of flight research experiments throughout a transport-class aircraft's flight envelope. Experiments ranging from flight-testing of a new aircraft system or sensor to those requiring structural and aerodynamic modifications to the aircraft can be accomplished. The aircraft has been modified to include an instrumentation system and sensors necessary to conduct flight research experiments, along with a telemetry capability. An instrumentation power distribution system was installed to accommodate the instrumentation system and future experiments. An engineering simulation of the SCRAT has been developed to aid in integrating research experiments. A series of baseline aircraft characterization flights has been flown that gathered flight data to aid in developing and integrating future research experiments. This paper describes the SCRAT's research systems and capabilities.

  4. Characterization of Vegetation using the UC Davis Remote Sensing Testbed

    NASA Astrophysics Data System (ADS)

    Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.

    2006-12-01

    Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight, and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides the rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines, and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, an inertial measurement unit, a 3-axis electronic compass, and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere, and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces, as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
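
    The NDVI mentioned above is computed per pixel from the calibrated red and near-infrared channels. A minimal NumPy sketch (array names are illustrative, not from the paper):

        import numpy as np

        def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
            """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

            Inputs are calibrated reflectances from the camera's NIR and red
            channels; output lies in [-1, 1] and is higher over vegetation.
            """
            nir = nir.astype(np.float64)
            red = red.astype(np.float64)
            denom = nir + red
            # Guard against division by zero over dark pixels.
            return np.where(denom > 0, (nir - red) / denom, 0.0)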

  5. Function-based integration strategy for an agile manufacturing testbed

    NASA Astrophysics Data System (ADS)

    Park, Hisup

    1997-01-01

    This paper describes an integration strategy for plug-and-play software based on functional descriptions of the software modules. The functional descriptions identify explicitly the role of each module with respect to the overall system. They define the critical dependencies that affect the individual modules and thus affect the behavior of the system. The specified roles, dependencies, and behavioral constraints are then incorporated in a group of shared objects that are distributed over a network. These objects may be interchanged with others without disrupting the system, so long as the replacements meet the interface and functional requirements. In this paper, we propose a framework for modeling the behavior of plug-and-play software modules that will be used to (1) design and predict the outcome of the integration, (2) generate the interface and functional requirements of individual modules, and (3) form a dynamic foundation for applying interchangeable software modules. This strategy is described in the context of the development of an agile manufacturing testbed. The testbed represents a collection of production cells for machining operations, supported by a network of software modules or agents for planning, fabrication, and inspection. A process definition layer holds the functional descriptions of the software modules. A network of distributed objects interact with one another over the Internet and comprise the plug-compatible software nodes that execute these functions. The paper explores the technical and operational ramifications of using the functional description framework to organize and coordinate the distributed object modules.
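
    The central idea, modules that declare their function and dependencies and remain interchangeable so long as replacements meet the same requirements, can be sketched in a few lines of Python. Names here are hypothetical; the paper's actual framework used distributed objects over a network:

        from typing import Callable

        # Registry mapping each declared functional role to its provider.
        REGISTRY: dict[str, Callable] = {}

        def provides(role: str, requires: tuple[str, ...] = ()):
            """Register a module under a role after checking that every role
            it depends on is already satisfied (a plug-and-play check)."""
            def wrap(fn: Callable) -> Callable:
                missing = [r for r in requires if r not in REGISTRY]
                if missing:
                    raise RuntimeError(f"{role}: unmet dependencies {missing}")
                REGISTRY[role] = fn
                return fn
            return wrap

        @provides("planning")
        def plan_job(order):                 # any planner exposing the same
            return ["machine", "inspect"]    # interface can replace this one

        @provides("inspection", requires=("planning",))
        def inspect_part(part):
            return {"part": part, "pass": True}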

  6. Benchmarking Diagnostic Algorithms on an Electrical Power System Testbed

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia, David; Wright, Stephanie

    2009-01-01

    Diagnostic algorithms (DAs) are key to enabling automated health management. These algorithms are designed to detect and isolate anomalies of either a component or the whole system based on observations received from sensors. In recent years a wide range of algorithms, both model-based and data-driven, have been developed to increase autonomy and improve system reliability and affordability. However, the lack of support to perform systematic benchmarking of these algorithms continues to create barriers for effective development and deployment of diagnostic technologies. In this paper, we present our efforts to benchmark a set of DAs on a common platform using a framework that was developed to evaluate and compare various performance metrics for diagnostic technologies. The diagnosed system is an electrical power system, namely the Advanced Diagnostics and Prognostics Testbed (ADAPT) developed and located at the NASA Ames Research Center. The paper presents the fundamentals of the benchmarking framework, the ADAPT system, description of faults and data sets, the metrics used for evaluation, and an in-depth analysis of benchmarking results obtained from testing ten diagnostic algorithms on the ADAPT electrical power system testbed.

  7. Astronomy In The Cloud: Using Mapreduce For Image Coaddition

    NASA Astrophysics Data System (ADS)

    Wiley, Keith; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-01-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection, classification, and moving object tracking. Since such studies require the highest quality data, methods such as image coaddition, i.e., registration, stacking, and mosaicing, will be critical to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, e.g., asteroids, or transient objects, e.g., supernovae, these datastreams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this paper we focus on its popular open-source implementation called Hadoop. In the Hadoop framework, the data is partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multi-terabyte imaging dataset provides a good testbed for algorithm development since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image coaddition to the MapReduce framework. Then we describe a number of optimizations to our basic approach and report experimental results comparing their performance. This work is funded by
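
    The crux of casting coaddition as MapReduce is to key each image (or image section) by the sky tile it overlaps and then stack per tile in the reducer. The pure-Python sketch below shows only this data flow, with registration and warping omitted; the authors' actual pipeline ran on Hadoop, and all names here are illustrative:

        from collections import defaultdict
        import numpy as np

        def map_phase(images):
            """Emit (sky_tile, pixel_array) for each tile an image overlaps."""
            for tile_id, pixels in images:   # registration/warping omitted
                yield tile_id, pixels

        def reduce_phase(keyed_records):
            """Coadd all pixel arrays that landed on the same sky tile."""
            groups = defaultdict(list)
            for tile_id, pixels in keyed_records:
                groups[tile_id].append(pixels)
            # Per-pixel mean over each stack (a weighted mean in practice).
            return {t: np.mean(np.stack(p), axis=0) for t, p in groups.items()}

        # Toy run: three 2x2 "images", two falling on the same tile.
        imgs = [("t1", np.ones((2, 2))), ("t1", 3 * np.ones((2, 2))),
                ("t2", np.zeros((2, 2)))]
        print(reduce_phase(map_phase(imgs))["t1"])  # 2.0 everywhere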

  8. Aerosol-cloud interactions in ship tracks using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Yi-Chun; Christensen, Matthew W.; Diner, David J.; Garay, Michael J.

    2015-04-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multiangle Imaging Spectroradiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds for different cloud types and environmental conditions. By taking advantage of the high spatial resolution multiangle observations available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of the cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were utilized. Under open cell cloud structure the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+38%), cloud top height (+13%), and cloud albedo (+49%) for open cell clouds, whereas for closed cell clouds, little change in cloud properties was observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed was used to derive cloud top divergence. Statistically averaging the results from the identified plume segments to reduce random noise, we found evidence of cloud top divergence in the ship-polluted clouds, whereas the nearby unpolluted clouds showed cloud top convergence, providing observational evidence of a change in local mesoscale circulation associated with enhanced aerosols. Furthermore, open cell polluted clouds revealed stronger cloud top divergence as compared to closed cell clouds, consistent with different dynamical mechanisms driving their responses. These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling

  9. Development of a hybrid cloud parameterization for general circulation models

    SciTech Connect

    Kao, C.Y.J.; Kristjansson, J.E.; Langley, D.L.

    1995-04-01

    We have developed a cloud package with state-of-the-art physical schemes that can parameterize low-level stratus or stratocumulus, penetrative cumulus, and high-level cirrus. Such parameterizations will improve cloud simulations in general circulation models (GCMs). The principal tools in this development are the physically based Arakawa-Schubert scheme for convective clouds and the Sundqvist scheme for layered, nonconvective clouds. The term "hybrid" addresses the fact that the generation of high-altitude layered clouds can be associated with preexisting convective clouds. Overall, the cloud parameterization package developed should better determine cloud heating and drying effects in the thermodynamic budget, realistic precipitation patterns, cloud coverage and liquid/ice water content for radiation purposes, and the cloud-induced transport and turbulent diffusion of atmospheric trace gases.
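
    For reference, the Sundqvist scheme named above is usually associated with a diagnostic cloud fraction that grows with relative humidity beyond a threshold. The widely cited form (the standard Sundqvist formula, not reproduced from this report) is

        \[ C = 1 - \sqrt{\frac{1 - RH}{1 - RH_c}}, \qquad RH \ge RH_c, \]

    with C = 0 below the threshold humidity \(RH_c\).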

  10. Space Station technology testbed: 2010 deep space transport

    NASA Technical Reports Server (NTRS)

    Holt, Alan C.

    1993-01-01

    A space station in a crew-tended or permanently crewed configuration will provide major R&D opportunities for innovative technology and materials development and advanced space systems testing. A space station should be designed with the basic infrastructure elements required to grow into a major systems technology testbed. This space-based technology testbed can and should be used to support the development of technologies required to expand our utilization of near-Earth space, the Moon, and the Earth-to-Jupiter region of the Solar System. Space station support of advanced technology and materials development will result in new techniques for high-priority scientific research and the knowledge and R&D base needed for the development of major new commercial product thrusts. To illustrate the technology testbed potential of a space station and to point the way to a bold, innovative approach to advanced space systems development, a hypothetical deep space transport development and test plan is described. Key deep space transport R&D activities are described that would lead to the readiness certification of an advanced, reusable interplanetary transport capable of supporting eight or more crewmembers. With the support of a focused and highly motivated, multi-agency ground R&D program, a deep space transport of this type could be assembled and tested by 2010. Key R&D activities on a space station would include: (1) experimental research investigating the microgravity-assisted restructuring of micro-engineered materials (to develop and verify the in-space and in-situ 'tuning' of materials for use in debris and radiation shielding and other protective systems), (2) exposure of micro-engineered materials to the space environment for passive and operational performance tests (to develop in-situ maintenance and repair techniques and to support the development, enhancement, and implementation of protective systems, data and bio-processing systems, and virtual reality and

  11. SPHERES: Design of a Formation Flying Testbed for ISS

    NASA Astrophysics Data System (ADS)

    Sell, S. W.; Chen, S. E.

    2002-01-01

    The SPHERES (Synchronized Position Hold Engage and Reorient Experimental Satellites) payload is an innovative formation-flying spacecraft testbed currently being developed for use internally aboard the International Space Station (ISS). The purpose of the testbed is to provide a cost-effective, long-duration, replenishable, and easily reconfigurable platform with representative dynamics for the development and validation of metrology, formation flying, and autonomy algorithms. The testbed components consist of three 8-inch-diameter free-flying "satellites," five ultrasound beacons, and an ISS laptop workstation. Each satellite is self-contained with on-board battery power, cold-gas propulsion (CO2), and processing systems. Satellites use two packs of eight standard AA batteries for approximately 90 minutes of lifetime, while beacons, powered by a single AA battery, last the duration of the mission. The propulsion system uses pressurized carbon dioxide gas, stored in replaceable tanks, distributed through an adjustable regulator and associated tubing to twelve thrusters located on the faces of the satellites. A Texas Instruments C6701 DSP handles control algorithm data while an FPGA manages all sensor data, timing, and communication processes on the satellite. All three satellites communicate with each other and with the controlling laptop via a wireless RF link. Five ultrasound beacons, located around a predetermined work area, transmit ultrasound signals that are received by each satellite. The system effectively acts as a pseudo-GPS system, allowing the satellites to determine position and attitude and to navigate within the test arena. The payload hardware is predominantly Commercial Off The Shelf (COTS) products, with the exception of custom electronics boards, selected propulsion system adaptors, and beacon and satellite structural elements. Operationally, SPHERES will run in short-duration test sessions with approximately two weeks between each session. During
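
    The pseudo-GPS fix described above amounts to multilateration: solve for the receiver position whose distances to the known beacon locations best match the ultrasound time-of-flight ranges. A minimal least-squares sketch, with beacon geometry and ranges invented purely for illustration:

        import numpy as np
        from scipy.optimize import least_squares

        # Known beacon positions in the work volume (metres; illustrative).
        beacons = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0],
                            [0.0, 2.0, 0.0], [0.0, 0.0, 2.0], [2.0, 2.0, 2.0]])

        # Ranges from ultrasound time of flight: r = speed_of_sound * t.
        # These example values correspond to a true position of (1, 1, 1).
        ranges = np.full(5, np.sqrt(3.0))

        def residuals(p):
            """Predicted beacon distances minus measured ranges."""
            return np.linalg.norm(beacons - p, axis=1) - ranges

        fix = least_squares(residuals, x0=np.array([0.5, 0.5, 0.5]))
        print(fix.x)  # ~[1. 1. 1.]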

  12. CLOUD CHEMISTRY.

    SciTech Connect

    SCHWARTZ,S.E.

    2001-03-01

    Clouds present substantial concentrations of liquid-phase water, which can potentially serve as a medium for dissolution and reaction of atmospheric gases. The important precursors of acid deposition, SO₂ and the nitrogen oxides NO and NO₂, are only sparingly soluble in clouds without further oxidation to sulfuric and nitric acids. In the case of SO₂, aqueous-phase reaction with hydrogen peroxide, and to a lesser extent ozone, is identified as the important process leading to this oxidation, and methods have been described by which to evaluate the rates of these reactions. The limited solubility of the nitrogen oxides precludes significant aqueous-phase reaction of these species, but gas-phase reactions in clouds can be important, especially at night.
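
    The dominant aqueous pathway referred to above is conventionally written as the net oxidation of dissolved S(IV) to S(VI) by hydrogen peroxide (a standard textbook net reaction, shown here for orientation, not reproduced from the report):

        \[ \mathrm{HSO_3^-\,(aq) + H_2O_2\,(aq) \longrightarrow SO_4^{2-} + H^+ + H_2O} \]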

  13. Neptune's clouds

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The bright cirrus-like clouds of Neptune change rapidly, often forming and dissipating over periods of several to tens of hours. In this sequence Voyager 2 observed cloud evolution in the region around the Great Dark Spot (GDS). The surprisingly rapid changes that occur between the panels show that in this region Neptune's weather is perhaps as dynamic and variable as that of the Earth. However, the scale is immense by our standards -- the Earth and the GDS are of similar size -- and in Neptune's frigid atmosphere, where temperatures are as low as 55 kelvins (-360 degrees Fahrenheit), the cirrus clouds are composed of frozen methane rather than Earth's crystals of water ice. The Voyager Mission is conducted by JPL for NASA's Office of Space Science and Applications.

  14. Our World: Cool Clouds

    NASA Video Gallery

    Learn how clouds are formed and watch an experiment to make a cloud using liquid nitrogen. Find out how scientists classify clouds according to their altitude and how clouds reflect and absorb light...

  15. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Mike; Munson, Mike; Teate, George

    2006-01-01

    A new testbed for hypersonic flight research is proposed. Known as the Phoenix air-launched small missile (ALSM) flight testbed, it was conceived to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of two unique and very capable flight assets: the United States Navy Phoenix AIM-54 long-range, guided air-to-air missile and the NASA Dryden F-15B testbed airplane. The U.S. Navy retirement of the Phoenix AIM-54 missiles from fleet operation has presented an excellent opportunity for converting this valuable flight asset into a new flight testbed. This cost-effective new platform will fill an existing gap in the test and evaluation of current and future hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform. When launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will be valuable for the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite small-payload air-launched space boosters.

  16. Intelligent Elements for the ISHM Testbed and Prototypes (ITP) Project

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Park, Han; Schwabacher, Mark; Watson, Michael; Mackey, Ryan; Fijany, Amir; Trevino, Luis; Weir, John

    2005-01-01

    Deep-space manned missions will require advanced automated health assessment capabilities. Requirements such as in-space assembly, long dormant periods and limited accessibility during flight, present significant challenges that should be addressed through Integrated System Health Management (ISHM). The ISHM approach will provide safety and reliability coverage for a complete system over its entire life cycle by determining and integrating health status and performance information from the subsystem and component levels. This paper will focus on the potential advanced diagnostic elements that will provide intelligent assessment of the subsystem health and the planned implementation of these elements in the ISHM Testbed and Prototypes (ITP) Project under the NASA Exploration Systems Research and Technology program.

  17. Test applications for heterogeneous real-time network testbed

    SciTech Connect

    Mines, R.F.; Knightly, E.W.

    1994-07-01

    This paper investigates several applications for a heterogeneous real-time network testbed. The network is heterogeneous in terms of network devices, technologies, protocols, and algorithms. The network is real-time in that its services can provide per-connection, end-to-end performance guarantees. Although different parts of the network use different algorithms, all components have the mechanisms necessary to provide performance guarantees: admission control and priority scheduling. Three applications for this network are described in this paper: a video conferencing tool, a tool for combustion modeling using distributed computing, and an MPEG video archival system. Each has minimum performance requirements that must be provided by the network. By analyzing these applications, we provide insights into the traffic characteristics and performance requirements of practical real-time workloads.
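
    The two mechanisms named above are easy to make concrete. Below is a minimal, illustrative sketch (all class and parameter names are invented, not the paper's code) of per-connection admission control against a bandwidth budget combined with strict-priority packet scheduling:

    ```python
    # Illustrative sketch (invented names/values, not the paper's code) of the
    # two mechanisms every component provides: admission control against a
    # bandwidth budget, and priority scheduling of admitted traffic.
    import heapq
    from dataclasses import dataclass, field

    @dataclass(order=True)
    class Packet:
        priority: int                 # lower value = served first
        seq: int                      # tie-breaker: FIFO within a priority class
        payload: bytes = field(compare=False, default=b"")

    class RealTimeLink:
        def __init__(self, capacity_mbps):
            self.capacity = capacity_mbps   # total schedulable bandwidth
            self.reserved = 0.0             # promised to admitted connections
            self.queue = []
            self._seq = 0

        def admit(self, requested_mbps):
            """Admission control: accept a connection only if its guarantee can
            be honored alongside every previously admitted connection."""
            if self.reserved + requested_mbps > self.capacity:
                return False
            self.reserved += requested_mbps
            return True

        def enqueue(self, priority, payload):
            heapq.heappush(self.queue, Packet(priority, self._seq, payload))
            self._seq += 1

        def transmit(self):
            """Priority scheduling: always serve the highest-priority packet."""
            return heapq.heappop(self.queue) if self.queue else None

    link = RealTimeLink(capacity_mbps=100.0)
    assert link.admit(40.0) and link.admit(40.0)   # two video flows fit
    assert not link.admit(40.0)                    # a third would break guarantees
    ```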

  18. X-ray Pulsar Navigation Algorithms and Testbed for SEXTANT

    NASA Technical Reports Server (NTRS)

    Winternitz, Luke M. B.; Hasouneh, Monther A.; Mitchell, Jason W.; Valdez, Jennifer E.; Price, Samuel R.; Semper, Sean R.; Yu, Wayne H.; Ray, Paul S.; Wood, Kent S.; Arzoumanian, Zaven; Gendreau, Keith C.

    2015-01-01

    The Station Explorer for X-ray Timing and Navigation Technology (SEXTANT) is a NASA-funded technology demonstration. SEXTANT will, for the first time, demonstrate real-time, on-board X-ray Pulsar-based Navigation (XNAV), a significant milestone in the quest to establish a GPS-like navigation capability available throughout our Solar System and beyond. This paper describes the basic design of the SEXTANT system with a focus on core models and algorithms, and the design and continued development of the GSFC X-ray Navigation Laboratory Testbed (GXLT) with its dynamic pulsar emulation capability. We also present early results from GXLT modeling of the combined NICER X-ray timing instrument hardware and SEXTANT flight software algorithms.
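
    As background for how XNAV works, the sketch below (illustrative only; not SEXTANT flight code, and the pulsar geometry is invented) shows the core geometric idea: pulse time-of-arrival residuals, scaled by the speed of light, measure the spacecraft position error projected along each pulsar's line of sight, and observing several pulsars allows a least-squares recovery of the full 3-D correction:

    ```python
    # Illustrative XNAV geometry (not SEXTANT flight code; directions and the
    # position error are invented). A TOA residual times c is the position
    # error projected on the pulsar's line of sight.
    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    n = np.array([[1.0, 0.0, 0.0],        # unit vectors toward three pulsars
                  [0.0, 1.0, 0.0],
                  [0.577, 0.577, 0.577]])
    n /= np.linalg.norm(n, axis=1, keepdims=True)

    true_error = np.array([120.0, -40.0, 75.0])   # metres, unknown to the filter
    toa_residuals = (n @ true_error) / C          # seconds: what XNAV measures

    # Least-squares solve of n @ delta_r = c * residuals
    delta_r, *_ = np.linalg.lstsq(n, C * toa_residuals, rcond=None)
    print(delta_r)   # recovers ~[120, -40, 75] m
    ```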

  19. Simulation to Flight Test for a UAV Controls Testbed

    NASA Technical Reports Server (NTRS)

    Motter, Mark A.; Logan, Michael J.; French, Michael L.; Guerreiro, Nelson M.

    2006-01-01

    The NASA Flying Controls Testbed (FLiC) is a relatively small and inexpensive unmanned aerial vehicle developed specifically to test highly experimental flight control approaches. The most recent version of the FLiC is configured with 16 independent aileron segments, supports the implementation of C-coded experimental controllers, and is capable of fully autonomous flight from takeoff roll to landing, including flight test maneuvers. The test vehicle is basically a modified Army target drone, AN/FQM-117B, developed as part of a collaboration between the Aviation Applied Technology Directorate (AATD) at Fort Eustis, Virginia and NASA Langley Research Center. Several vehicles have been constructed and collectively have flown over 600 successful test flights, including a fully autonomous demonstration at the Association of Unmanned Vehicle Systems International (AUVSI) UAV Demo 2005. Simulations based on wind tunnel data are being used to further develop advanced controllers for implementation and flight test.

  20. A Simulation Testbed for Airborne Merging and Spacing

    NASA Technical Reports Server (NTRS)

    Santos, Michel; Manikonda, Vikram; Feinberg, Art; Lohr, Gary

    2008-01-01

    The key innovation in this effort is the development of a simulation testbed for airborne merging and spacing (AM&S). We focus on concepts related to airports with super-dense operations, where new runway configurations (e.g., parallel runways), sequencing, merging, and spacing are among the concepts considered, and on modeling and simulating complementary airborne and ground systems for AM&S to increase the efficiency and capacity of these high-density terminal areas. From a ground systems perspective, a scheduling decision support tool generates arrival sequences and spacing requirements that are fed to the AM&S system operating on the flight deck. We enhanced NASA's Airspace Concept Evaluation Systems (ACES) software to model and simulate AM&S concepts and algorithms.

  1. The computational structural mechanics testbed architecture. Volume 3: The interface

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the third of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 3 describes the CLIP-processor interface and related topics. It is intended only for processor developers.

  2. Experimental Testbed for the Study of Hydrodynamic Issues in Supernovae

    SciTech Connect

    Robey, H F; Kane, J O; Remington, B A; Drake, R P; Hurricane, O A; Louis, H; Wallace, R J; Knauer, J; Keiter, P; Arnett, D

    2000-10-09

    More than a decade after the explosion of SN 1987A, unresolved discrepancies still remain in attempts to numerically simulate the mixing processes initiated by the passage of a very strong shock through the layered structure of the progenitor star. Numerically computed velocities of the radioactive ⁵⁶Ni and ⁵⁶Co, produced by shock-induced explosive burning within the silicon layer for example, are still more than 50% too low compared with the measured velocities. In order to resolve such discrepancies between observation and simulation, an experimental testbed has been designed on the Omega Laser for the study of hydrodynamic issues of importance to supernovae (SNe). In this paper, we present results from a series of scaled laboratory experiments designed to isolate and explore several issues in the hydrodynamics of SN explosions. The results of the experiments are compared with numerical simulations and are generally found to be in reasonable agreement.

  3. Phase retrieval algorithm for JWST Flight and Testbed Telescope

    NASA Astrophysics Data System (ADS)

    Dean, Bruce H.; Aronstein, David L.; Smith, J. Scott; Shiri, Ron; Acton, D. Scott

    2006-06-01

    An image-based wavefront sensing and control algorithm for the James Webb Space Telescope (JWST) is presented. The algorithm heritage is discussed in addition to implications for algorithm performance dictated by NASA's Technology Readiness Level (TRL) 6. The algorithm uses feedback through an adaptive diversity function to avoid the need for phase-unwrapping post-processing steps. Algorithm results are demonstrated using JWST Testbed Telescope (TBT) commissioning data and the accuracy is assessed by comparison with interferometer results on a multi-wave phase aberration. Strategies for minimizing aliasing artifacts in the recovered phase are presented and orthogonal basis functions are implemented for representing wavefronts in irregular hexagonal apertures. Algorithm implementation on a parallel cluster of high-speed digital signal processors (DSPs) is also discussed.
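
    For readers unfamiliar with image-based wavefront sensing, the toy sketch below illustrates a classical Misell-style two-plane (focus-diverse) phase retrieval iteration. It is a simplified stand-in, not the JWST algorithm, which adds an adaptive diversity function and feedback; the function names and test aberration are invented:

    ```python
    # Toy Misell-style, two-plane (focus-diverse) phase retrieval: a simplified
    # stand-in for the JWST algorithm described above. Names and the synthetic
    # test case are invented for the example.
    import numpy as np

    def defocus_phase(npix, waves):
        """Quadratic phase screen: `waves` of defocus across the array."""
        y, x = np.mgrid[-1:1:1j*npix, -1:1:1j*npix]
        return 2 * np.pi * waves * (x**2 + y**2)

    def retrieve_phase(amp, I_focus, I_defocus, dphi, iters=200):
        """amp: known pupil amplitude; I_*: measured PSF intensities;
        dphi: known diversity (defocus) phase. Returns the pupil phase."""
        phase = np.zeros_like(amp)
        for _ in range(iters):
            for I, extra in ((I_focus, 0.0), (I_defocus, dphi)):
                field = amp * np.exp(1j * (phase + extra))
                psf = np.fft.fft2(field)
                psf = np.sqrt(I) * np.exp(1j * np.angle(psf))  # impose measured amplitude
                phase = np.angle(np.fft.ifft2(psf)) - extra    # keep the computed phase
        return phase * (amp > 0)

    # Synthetic usage: build the two intensities from a known phase, then recover
    npix = 64
    y, x = np.mgrid[-1:1:1j*npix, -1:1:1j*npix]
    amp = ((x**2 + y**2) <= 1.0).astype(float)
    dphi = defocus_phase(npix, waves=0.5)
    truth = 0.3 * np.sin(3 * np.pi * x) * amp
    I_f = np.abs(np.fft.fft2(amp * np.exp(1j * truth)))**2
    I_d = np.abs(np.fft.fft2(amp * np.exp(1j * (truth + dphi))))**2
    est = retrieve_phase(amp, I_f, I_d, dphi)  # ~truth inside the pupil, wrapped
    ```

    Because the recovered phase is wrapped to (-pi, pi], large aberrations would ordinarily need a phase-unwrapping pass; adapting the diversity function inside the loop, as the abstract describes, is one way to avoid that post-processing step.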

  4. SIM Interferometer Testbed (SCDU) Status and Recent Results

    NASA Technical Reports Server (NTRS)

    Nemati, Bijan; An, Xin; Goullioud, Renaud; Shao, Michael; Shen, Tsae-Pyng; Wehmeier, Udo J.; Weilert, Mark A.; Wang, Xu; Werne, Thomas A.; Wu, Janet P.; Zhai, Chengxing

    2010-01-01

    SIM Lite is a space-borne stellar interferometer capable of searching for Earth-size planets in the habitable zones of nearby stars. This search will require measurement of astrometric angles with sub-microarcsecond accuracy and of optical pathlength differences to 1 picometer by the end of the five-year mission. One of the most significant technical risks in achieving this level of accuracy comes from systematic errors that arise from spectral differences between candidate stars and nearby reference stars. The Spectral Calibration Development Unit (SCDU), in operation since 2007, has been used to explore this effect and to demonstrate performance meeting SIM goals. In this paper we present the status of this testbed and recent results.

  5. Active structural subsystem of the OISI interferometry testbed

    NASA Astrophysics Data System (ADS)

    Döngi, Frank; Johann, Ulrich; Szerdahelyi, Laszlo

    1999-12-01

    An adaptive truss structure has been realized for active vibration damping within a laboratory testbed for future spaceborne optical and infra-red interferometers. The active elements are based on piezoelectric sensors and actuators. The paper first surveys configuration scenarios for space interferometers that aim at nanometre accuracy of optical pathlengths. It then focuses on the function of active structural control. For the laboratory truss, practical design considerations as well as analytical approaches for modelling and system identification, placement of active elements and design of active damping control are discussed in detail. Experimental results of the active damping performance achieved with integral feedback of strut force signals are compared with analytical predictions. The combined effects of active damping and passive vibration isolation are presented, and conclusions are drawn regarding further activities towards nanometre stabilization of optical pathlengths.
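
    The damping law described, integral feedback of a collocated strut force signal, can be sketched in a few lines. The single-strut, single-mode toy model below (all parameter values invented, not the testbed's) shows the vibration decaying once the piezo actuator integrates the sensed force:

    ```python
    # Toy single-strut, single-mode model of integral force feedback (IFF),
    # the active damping law described above; all parameter values invented.
    m, k = 1.0, 1.0e4          # modal mass (kg) and strut stiffness (N/m)
    g, dt = 50.0, 1.0e-4       # IFF gain (1/s) and integration step (s)
    x, v, u = 1.0e-3, 0.0, 0.0 # mass position (m), velocity, actuator extension

    for _ in range(20000):                  # simulate 2 s
        F = k * (x - u)        # force sensed by the collocated strut sensor
        u += (g / k) * F * dt  # IFF: actuator rate proportional to sensed force
        v += (-F / m) * dt     # Newton's law for the modal mass
        x += v * dt            # semi-implicit Euler keeps the oscillator stable

    print(abs(v), abs(k * (x - u)))  # both ~0: the vibration has been damped out
    ```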

  6. The magic crayon: an object definition and volume calculation testbed

    NASA Astrophysics Data System (ADS)

    Beard, David V.; Faith, R. E.; Eberly, David H.; Pizer, Stephen M.; Kurak, Charles; Johnston, Richard E.

    1993-09-01

    Rapid, accurate definition and volume calculation of anatomical objects is essential for effective CT and MR diagnosis. Absolute volumes often signal abnormalities, while relative volumes--such as a change in tumor size--can provide critical information on the effectiveness of radiation therapy. To this end, we have developed the 'magic crayon' (MC) anatomical object visualization, object definition, and volume calculation tool as a follow-on to UNC's Image Hierarchy Editor (IHE) and Image Hierarchy Visualizer (IHV). This paper presents the magic crayon system, detailing interaction, implementation, and preliminary observer studies. MC has several features: (1) it uses a number of 3D visualization methods to rapidly visualize an anatomical object; (2) it can serve as a testbed for various object definition algorithms; (3) it serves as a testbed allowing the comparative evaluation of various volume calculation methods, including pixel counting and Dr. David Eberly's divergence method.
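
    Of the volume calculation methods mentioned, pixel (voxel) counting is the simplest: once the object is defined as a binary mask on the image grid, its volume is the number of voxels in the mask times the physical volume of one voxel. A hedged sketch (function name and test object are invented):

    ```python
    # Sketch of the pixel/voxel-counting volume method named above:
    # volume = (voxels in mask) x (physical volume of one voxel).
    import numpy as np

    def mask_volume_ml(mask, spacing_mm=(1.0, 1.0, 1.0)):
        """mask: 3-D boolean array; spacing_mm: voxel edge lengths (dz, dy, dx)."""
        voxel_mm3 = float(np.prod(spacing_mm))
        return mask.sum() * voxel_mm3 / 1000.0    # 1 mL = 1000 mm^3

    # A 10 mm radius sphere on a 0.5 mm grid: expect 4/3*pi*10^3 ~ 4.19 mL
    z, y, x = np.mgrid[-20:20:81j, -20:20:81j, -20:20:81j]
    sphere = (x**2 + y**2 + z**2) <= 10.0**2
    print(mask_volume_ml(sphere, spacing_mm=(0.5, 0.5, 0.5)))
    ```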

  7. The computational structural mechanics testbed architecture. Volume 2: Directives

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1989-01-01

    This is the second of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language (CLAMP), the command language interpreter (CLIP), and the data manager (GAL). Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 2 describes the CLIP directives in detail. It is intended for intermediate and advanced users.

  8. The computational structural mechanics testbed architecture. Volume 1: The language

    NASA Technical Reports Server (NTRS)

    Felippa, Carlos A.

    1988-01-01

    This is the first of a set of five volumes which describe the software architecture for the Computational Structural Mechanics Testbed. Derived from NICE, an integrated software system developed at Lockheed Palo Alto Research Laboratory, the architecture is composed of the command language CLAMP, the command language interpreter CLIP, and the data manager GAL. Volumes 1, 2, and 3 (NASA CR's 178384, 178385, and 178386, respectively) describe CLAMP and CLIP, and the CLIP-processor interface. Volumes 4 and 5 (NASA CR's 178387 and 178388, respectively) describe GAL and its low-level I/O. CLAMP, an acronym for Command Language for Applied Mechanics Processors, is designed to control the flow of execution of processors written for NICE. Volume 1 presents the basic elements of the CLAMP language and is intended for all users.

  9. Telescience testbed: Operational support functions for biomedical experiments

    NASA Astrophysics Data System (ADS)

    Yamashita, Masamichi; Watanabe, Satoru; Shoji, Takatoshi; Clarke, Andrew H.; Suzuki, Hiroyuki; Yanagihara, Dai

    A telescience testbed exercise was conducted to study the methodology of space biomedicine under the simulated constraints imposed on space experiments. The experimental subject selected for this testbedding was an elaborate animal surgery with electrophysiological measurements conducted by an onboard operator. The standing potential in the ampulla of the pigeon's semicircular canal was measured during gravitational and caloric stimulation. A principal investigator, isolated from the operation site, participated in the experiment interactively via telecommunication links. Reliability analysis was applied to all layers of the experimentation, including the design of experimental objectives and operational procedures. Engineering and technological aspects of telescience are discussed in terms of reliability to assure the quality of the science. The feasibility of robotics was examined for supportive functions that reduce the workload of the onboard operator.

  10. The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER)

    SciTech Connect

    Siranosian, Antranik Antonio; Schembri, Philip Edward; Miller, Nathan Andrew

    2016-06-02

    The Benchmark Extensible Tractable Testbed Engineering Resource (BETTER) is proposed as a family of modular test bodies that are intended to support engineering capability development by helping to identify weaknesses and needs. Weapon systems, subassemblies, and components are often complex and difficult to test and analyze, resulting in low confidence and high uncertainties in experimental and simulated results. The complexities make it difficult to distinguish between inherent uncertainties and errors due to insufficient capabilities. BETTER test bodies will first use simplified geometries and materials such that testing, data collection, modeling and simulation can be accomplished with high confidence and low uncertainty. Modifications and combinations of simple and well-characterized BETTER test bodies can then be used to increase complexity in order to reproduce relevant mechanics and identify weaknesses. BETTER can provide both immediate and long-term improvements in testing and simulation capabilities. This document presents the motivation, concept, benefits and examples for BETTER.

  11. Photovoltaic Engineering Testbed Designed for Calibrating Photovoltaic Devices in Space

    NASA Technical Reports Server (NTRS)

    Landis, Geoffrey A.

    2002-01-01

    Accurate prediction of the performance of solar arrays in space requires that the cells be tested in comparison with a space-flown standard. Recognizing that improvements in future solar cell technology will require an ever-increasing fidelity of standards, the Photovoltaics and Space Environment Branch at the NASA Glenn Research Center, in collaboration with the Ohio Aerospace Institute, designed a prototype facility to allow routine calibration, measurement, and qualification of solar cells on the International Space Station, and then the return of the cells to Earth for laboratory use. For solar cell testing, the Photovoltaic Engineering Testbed (PET) site provides a true air-mass-zero (AM0) solar spectrum. This allows solar cells to be accurately calibrated using the full spectrum of the Sun.

  12. An Overview of Research Activity at the Launch Systems Testbed

    NASA Technical Reports Server (NTRS)

    Vu, Bruce; Kandula, Max

    2003-01-01

    This paper summarizes the acoustic testing and analysis activities at the Launch System Testbed (LST) of Kennedy Space Center (KSC). A major goal is to develop passive methods of mitigation of sound from rocket exhaust jets with ducted systems devoid of traditional water injection. Current testing efforts are concerned with the launch-induced vibroacoustic behavior of scaled exhaust jets. Numerical simulations are also developed to study the sound propagation from supersonic jets in free air and through enclosed ducts. Scaling laws accounting for the effects of important parameters such as jet Mach number, jet velocity, and jet temperature on the far-field noise are investigated in order to deduce full-scale environment from small-scale tests.

  13. Vacuum Nuller Testbed Performance, Characterization and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, R. G.; Clampin, M.; Petrone, P.; Mallik, U.; Madison, T.; Bolcar, M.; Noecker, C.; Kendrick, S.; Helmbrecht, M. A.

    2011-01-01

    The Visible Nulling Coronagraph (VNC) can detect and characterize exoplanets with filled, segmented, and sparse aperture telescopes, thereby spanning the choice of future internal coronagraph exoplanet missions. NASA/Goddard Space Flight Center (GSFC) has developed a Vacuum Nuller Testbed (VNT) to advance this approach, and to assess and advance the technologies needed to realize a VNC as a flight instrument. The VNT is an ultra-stable testbed operating at 15 Hz in vacuum. It consists of a Mach-Zehnder nulling interferometer, modified with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror (DM), a coherent fiber bundle, and achromatic phase shifters. The two output channels are imaged with a vacuum photon-counting camera and a conventional camera. Error sensing and feedback to the DM and delay line, with control algorithms, are implemented in a real-time architecture. The inherent advantage of the VNC is that it is its own interferometer and directly controls its errors by exploiting images from the bright and dark channels simultaneously. Conservation of energy requires that the sum total of the photon counts be conserved independent of the VNC state. Thus the sensing and control bandwidth is limited by the target star's throughput, with the net effect that the higher bandwidth offloads stressing stability tolerances within the telescope. We report our recent progress with the VNT towards achieving an incremental sequence of contrast milestones of 10(exp 8), 10(exp 9), and 10(exp 10), respectively, at inner working angles approaching 2λ/D. The optics, laboratory results, technologies, and null control are discussed, and evidence is shown that the milestones have been achieved.

  14. Modular algorithm concept evaluation tool (MACET) sensor fusion algorithm testbed

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Williams, Bradford D.; Talele, Sunjay E.; Amphay, Sengvieng A.

    1995-07-01

    Target acquisition in a high-clutter environment, in all weather, at any time of day represents a much-needed capability for the air-to-surface strike mission. A considerable amount of the research at the Armament Directorate at Wright Laboratory, Advanced Guidance Division WL/MNG, has been devoted to exploring various seeker technologies, including multi-spectral sensor fusion, that may yield a cost-efficient system with these capabilities. Critical elements of any such seeker are the autonomous target acquisition and tracking algorithms. These algorithms allow the weapon system to operate independently and accurately in realistic battlefield scenarios. In order to assess the performance of the multi-spectral sensor fusion algorithms being produced as part of the seeker technology development programs, the Munition Processing Technology Branch of WL/MN is developing an algorithm testbed. This testbed consists of the Irma signature prediction model; data analysis workstations, such as the TABILS Analysis and Management System (TAMS); and the Modular Algorithm Concept Evaluation Tool (MACET) algorithm workstation. All three of these components are being enhanced to accommodate multi-spectral sensor fusion systems. MACET is being developed to provide a graphical-interface-driven simulation by which to quickly configure algorithm components and conduct performance evaluations. MACET is being developed incrementally, with each release providing an additional channel of operation. To date MACET 1.0, a passive IR algorithm environment, has been delivered. The second release, MACET 1.1, is presented in this paper using the MMW/IR data from the Advanced Autonomous Dual Mode Seeker (AADMS) captive flight demonstration. Once completed, the delivered software from past algorithm development efforts will be converted to the MACET library format, thereby providing an on-line database of the algorithm research conducted to date.

  15. Cloud Front

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Context image for PIA02171 (Cloud Front).

    These clouds formed in the south polar region. The faintness of the cloud system likely indicates that these are mainly ice clouds, with relatively little dust content.

    Image information: VIS instrument. Latitude 86.7S, Longitude 212.3E. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  16. Cloud Arcs

    Atmospheric Science Data Center

    2013-04-19

    ... a sinking motion elsewhere, are very common, the degree of organization exhibited here is relatively rare, as the wind field at different altitudes usually disrupts such patterns. The degree of self organization of this cloud image, whereby three or four such circular events ...

  17. Thin Clouds

    Atmospheric Science Data Center

    2013-04-18

    ... one of a new generation of instruments flying aboard the NASA Earth Observing System's Terra satellite, views Earth with nine cameras ... of thin cirrus minutes after MISR imaged the cloud from space. At the same time, another NASA high-altitude jet, the WB-57, flew right ...

  18. Independent Technology Assessment within the Federation of Earth Science Information Partners (ESIP) Testbed

    NASA Astrophysics Data System (ADS)

    Burgess, A. B.; Robinson, E.; Graybeal, J.

    2015-12-01

    The Federation of Earth Science Information Partners (ESIP) is a community of science, data, and information technology practitioners. ESIP's mission is to support the networking and data dissemination needs of our members and the global community. We do this by linking the functional sectors of education, observation, research, and application with the ultimate use of Earth science. Among the services provided to ESIP members is the Testbed: a collaborative forum for the development of technology standards, services, protocols, and best practices. ESIP has partnered with the NASA Advanced Information Systems Technology (AIST) program to integrate independent assessment of Technology Readiness Level (TRL) into the ESIP Testbed. In this presentation we will 1) demonstrate TRL assessment in the ESIP Testbed using three AIST projects, 2) discuss challenges and insights into creating an independent validation/verification framework, and 3) outline the versatility of the ESIP Testbed as applied to other technology projects.

  19. A Real-Time Testbed for Satellite and Terrestrial Communications Experimentation and Development

    NASA Technical Reports Server (NTRS)

    Angkasa, K.; Hamkins, J.; Jao, J.; Lay, N.; Satorius, E.; Zevallos, A.

    1997-01-01

    This paper describes a programmable DSP-based testbed that is employed in the development and evaluation of blind demodulation algorithms to be used in wireless satellite or terrestrial communications systems. The testbed employs a graphical user interface (GUI) to provide independent, real-time control of modulator, channel, and demodulator parameters, and also affords real-time observation of various diagnostic signals such as carrier and timing recovery and decoder metrics. This interactive flexibility enables an operator to tailor the testbed parameters and environment to investigate the performance of any arbitrary communications system and channel model. Furthermore, a variety of digital and analog interfaces allow the testbed to be used either as a stand-alone digital modulator or receiver, thereby extending its experimental utility from the laboratory to the field.

  20. Preliminary Design of a Galactic Cosmic Ray Shielding Materials Testbed for the International Space Station

    NASA Technical Reports Server (NTRS)

    Gaier, James R.; Berkebile, Stephen; Sechkar, Edward A.; Panko, Scott R.

    2012-01-01

    The preliminary design of a testbed to evaluate the effectiveness of galactic cosmic ray (GCR) shielding materials, the MISSE Radiation Shielding Testbed (MRSMAT) is presented. The intent is to mount the testbed on the Materials International Space Station Experiment-X (MISSE-X) which is to be mounted on the International Space Station (ISS) in 2016. A key feature is the ability to simultaneously test nine samples, including standards, which are 5.25 cm thick. This thickness will enable most samples to have an areal density greater than 5 g/sq cm. It features a novel and compact GCR telescope which will be able to distinguish which cosmic rays have penetrated which shielding material, and will be able to evaluate the dose transmitted through the shield. The testbed could play a pivotal role in the development and qualification of new cosmic ray shielding technologies.

  1. Carrier Plus: A Sensor Payload for Living With a Star Space Environment Testbed (LWS/SET)

    NASA Technical Reports Server (NTRS)

    Marshall, Cheryl; Moss, Steven; Howard, Regan; LaBel, Kenneth; Grycewicz, Tom; Barth, Janet; Brewer, Dana

    2003-01-01

    The paper discusses the following: 1. Living With a Star (LWS) program: Space Environment Testbed (SET); the natural space environment. 2. Carrier Plus: goals and benefits. 3. On-orbit sensor measurements. 4. Carrier Plus architecture. 5. Participation in Carrier Plus.

  2. Testbed for extended-scene Shack-Hartmann and phase retrieval wavefront sensing

    NASA Technical Reports Server (NTRS)

    Morgan, Rhonda M.; Ohara, Catherine M.; Green, Joseph J.; Roberts, Jennifer; Sidick, Erkin; Shcheglov, Kirill

    2005-01-01

    We have implemented a testbed to demonstrate wavefront sensing and control on an extended scene using Shack-Hartmann and MGS phase retrieval simultaneously. This dual approach allows for both high sensitivity and high dynamic range wavefront sensing.
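
    As an illustration of the Shack-Hartmann half of this approach (a generic textbook sketch, not the testbed's code): local wavefront slopes are estimated from the displacement of each lenslet's spot. On an extended scene the displacement is typically found by correlating subimages against a reference, but a plain centroid version conveys the idea:

    ```python
    # Generic Shack-Hartmann slope estimation (textbook sketch, not the
    # testbed's code). Each lenslet subimage yields a spot displacement
    # proportional to the local wavefront slope.
    import numpy as np

    def spot_slopes(frame, n_sub):
        """frame: square detector image; n_sub: lenslets per side.
        Returns (n_sub, n_sub, 2) centroid offsets from subaperture centers."""
        w = frame.shape[0] // n_sub
        idx = np.arange(w) - (w - 1) / 2.0          # pixel offsets from center
        slopes = np.zeros((n_sub, n_sub, 2))
        for i in range(n_sub):
            for j in range(n_sub):
                sub = frame[i*w:(i+1)*w, j*w:(j+1)*w].astype(float)
                total = sub.sum() or 1.0            # guard against empty subimages
                slopes[i, j, 0] = (sub.sum(axis=1) @ idx) / total  # y offset
                slopes[i, j, 1] = (sub.sum(axis=0) @ idx) / total  # x offset
        return slopes
    ```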

  3. Precision truss structures from concept to hardware reality: application to the Micro-Precision Interferometer Testbed

    NASA Astrophysics Data System (ADS)

    Sword, Lee F.; Carne, Thomas G.

    1993-09-01

    This paper describes the development of the truss structure at the Jet Propulsion Laboratory that forms the backbone of JPL's Micro-Precision Interferometer (MPI) Testbed. The MPI Testbed is the third generation of Control Structure Interaction (CSI) testbeds constructed by JPL aimed at developing and validating control concepts. The MPI testbed is essentially a space-based Michelson interferometer suspended in a ground-based laboratory. This instrument, mounted to the flexible truss, requires nanometer-level precision alignment and positioning of its optical elements to achieve its science objectives. A layered control architecture, utilizing isolation, structural control, and active optical control technologies, allows the system to meet its vibration attenuation goals. Success of the structural control design, which involves replacement of truss struts with active and/or passive elements, depends heavily on high-fidelity models of the structure to evaluate strut placement locations. The first step in obtaining an accurate structure model is to build a structure which is linear.

  4. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

    A full-scale, two-story, two-by-one bay, steel braced-frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings, and fire sprinkler systems. The frame can be configured to perform as an elastic or an inelastic system, to generate large floor accelerations or large inter-story drifts, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and the installed nonstructural systems are investigated.

  5. Modeling of clouds and radiation for developing parameterizations for general circulation models. Annual report, 1995

    SciTech Connect

    Toon, O.B.; Westphal, D.L.

    1996-07-01

    We have used a hierarchy of numerical models for cirrus and stratus clouds and for radiative transfer to improve the reliability of general circulation models. Our detailed cloud microphysical model includes all of the physical processes believed to control the lifecycles of liquid and ice clouds in the troposphere. We have worked on specific GCM parameterizations for the radiative properties of cirrus clouds, making use of a mesoscale model as the test-bed for the parameterizations. We have also modeled cirrus cloud properties with a detailed cloud physics model to better understand how the radiatively important properties of cirrus are controlled by their environment. We have used another cloud microphysics model to investigate the interactions between aerosols and clouds. This work is some of the first to follow the details of the interactions between aerosols and cloud droplets, and it has shown some unexpected relations between clouds and aerosols. We have also used line-by-line radiative transfer calculations, verified with ARM data, to derive GCM parameterizations.

  6. Validation of the CERTS Microgrid Concept: The CEC/CERTS Microgrid Testbed

    SciTech Connect

    Nichols, David K.; Stevens, John; Lasseter, Robert H.; Eto, Joseph H.

    2006-06-01

    The development of test plans to validate the CERTS Microgrid concept is discussed, including the status of a testbed. Increased application of Distributed Energy Resources on the distribution system has the potential to improve performance, lower operational costs, and create value. Microgrids have the potential to deliver these high-value benefits. This presentation will focus on operational characteristics of the CERTS microgrid, the partners in the project, and the status of the CEC/CERTS microgrid testbed. Index Terms: Distributed Generation, Distributed Resource, Islanding, Microgrid, Microturbine.

  7. A high-resolution, four-band SAR testbed with real-time image formation

    SciTech Connect

    Walker, B.; Sander, G.; Thompson, M.; Burns, B.; Fellerhoff, R.; Dubbert, D.

    1996-03-01

    This paper describes the Twin-Otter SAR Testbed developed at Sandia National Laboratories. This SAR is a flexible, adaptable testbed capable of operation on four frequency bands: Ka, Ku, X, and VHF/UHF bands. The SAR features real-time image formation at fine resolution in spotlight and stripmap modes. High-quality images are formed in real time using the overlapped subaperture (OSA) image-formation and phase gradient autofocus (PGA) algorithms.
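
    For reference, the PGA algorithm named above follows a standard sequence: center-shift the dominant scatterer in each range bin, estimate the phase-error gradient across azimuth by summing over range bins, integrate, detrend, and correct. The sketch below is a compact textbook rendition (the usual windowing step is omitted for brevity), not Sandia's implementation:

    ```python
    # Compact textbook rendition of phase gradient autofocus (PGA); not
    # Sandia's code. `img` is a complex SAR image, azimuth along axis 1.
    import numpy as np

    def pga_iteration(img):
        n_rng, n_az = img.shape
        # 1. Center-shift the brightest scatterer of each range bin to azimuth 0.
        shifted = np.empty_like(img)
        for i in range(n_rng):
            shifted[i] = np.roll(img[i], -int(np.argmax(np.abs(img[i]))))
        # 2. Go to the azimuth phase-history domain.
        G = np.fft.fft(shifted, axis=1)
        # 3. Estimate the phase-error gradient, summing over range bins.
        dphi = np.angle(np.sum(np.conj(G[:, :-1]) * G[:, 1:], axis=0))
        # 4. Integrate and remove the linear trend (a linear phase is only a shift).
        phi = np.concatenate(([0.0], np.cumsum(dphi)))
        m = np.arange(n_az)
        phi -= np.polyval(np.polyfit(m, phi, 1), m)
        # 5. Apply the conjugate correction to the original image's phase history.
        return np.fft.ifft(np.fft.fft(img, axis=1) * np.exp(-1j * phi), axis=1)
    ```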

  8. Crew-integration and Automation Testbed (CAT)Program Overview and RUX06 Introduction

    DTIC Science & Technology

    2006-09-20

    Crew-integration and Automation Testbed (CAT) Program Overview and RUX06 Introduction, 26-27 July 2006. Patrick Nunez, Terry Tierney, Brian Novak... Capstone CAT experiment: evaluate effectiveness of the CAT program in improving the performance and/or reducing the workload for a mounted ...

  9. Transmitting SPIHT Compressed ECG Data Over a Next-Generation Mobile Telecardiology Testbed

    DTIC Science & Technology

    2007-11-02

    ...formance of the testbed for the compressed ECG data segments selected from the MIT-BIH arrhythmia database is evaluated in terms of BER (bit error... the MIT/BIH arrhythmia database are transmitted. Finally, a conclusion is given in Section IV. II. NEXT-GENERATION MOBILE TELECARDIOLOGY TESTBED... After the source is compressed by the SPIHT, the resultant bits are encoded frame-by-frame and the frame length is 10 ms. First, it is attached by CRC (Cyclic...

  10. Closing the contrast gap between testbed and model prediction with WFIRST-CGI shaped pupil coronagraph

    NASA Astrophysics Data System (ADS)

    Zhou, Hanying; Nemati, Bijan; Krist, John; Cady, Eric; Prada, Camilo M.; Kern, Brian; Poberezhskiy, Ilya

    2016-07-01

    JPL has recently passed an important milestone in its technology development for a proposed NASA WFIRST mission coronagraph: demonstration of better than 1x10(exp -8) contrast over a broad bandwidth (10%) on both the shaped pupil coronagraph (SPC) and hybrid Lyot coronagraph (HLC) testbeds with the WFIRST obscuration pattern. Challenges remain, however, in the technology readiness for the proposed mission. One is the discrepancy between the contrasts achieved on the testbeds and their corresponding model predictions. A series of testbed diagnoses and modeling activities were planned and carried out on the SPC testbed in order to close the gap. A very useful tool we developed was a derived "measured" testbed wavefront control Jacobian matrix that could be compared with the model-predicted "control" version used to generate the high-contrast dark-hole region in the image plane. The difference between these two is an estimate of the error in the control Jacobian. When the control matrix, which includes both amplitude and phase, was modified to reproduce the error, the simulated performance closely matched the SPC testbed behavior in both contrast floor and contrast convergence speed. This is a step closer toward model validation for high-contrast coronagraphs. Further Jacobian analysis and modeling provided clues to the possible sources of the mismatch: DM misregistration, and testbed optical wavefront error (WFE) together with the deformable mirror (DM) setting that corrects this WFE. These analyses suggest that a high-contrast coronagraph has a tight tolerance on the accuracy of its control Jacobian. Modifications to both the testbed control model and the prediction model are being implemented, and future work is discussed.
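
    The control step that the "measured" versus "control" Jacobian comparison refers to can be summarized generically: a measured Jacobian is built by poking each actuator and differencing the sensed field, and each control iteration stacks the estimated dark-hole electric field into a real vector and solves a regularized least-squares problem for the deformable-mirror update. The sketch below is generic electric-field-conjugation-style control under that framing, not JPL's code:

    ```python
    # Generic electric-field-conjugation-style control step (a sketch under
    # stated assumptions, not JPL's code). The placeholder Jacobian and field
    # below are random stand-ins for testbed data.
    import numpy as np

    def dm_update(J, efield, reg):
        """J: (2*Npix, Nact) real Jacobian with rows [Re(E); Im(E)] per actuator
        poke; efield: (Npix,) complex dark-hole field estimate; reg: Tikhonov
        regularization weight. Returns the actuator command update."""
        e = np.concatenate([efield.real, efield.imag])
        JtJ = J.T @ J
        # Solve (J^T J + reg*I) da = -J^T e, driving the dark-hole field to zero.
        return np.linalg.solve(JtJ + reg * np.eye(JtJ.shape[0]), -J.T @ e)

    rng = np.random.default_rng(0)
    J = rng.normal(size=(200, 40))                        # placeholder Jacobian
    E = rng.normal(size=100) + 1j * rng.normal(size=100)  # placeholder field
    print(dm_update(J, E, reg=1e-2).shape)                # -> (40,)
    ```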

  11. Experimental Studies in a Reconfigurable C4 Test-bed for Network Enabled Capability

    DTIC Science & Technology

    2006-06-01

    ...Cross, Dr R. Houghton, and Mr R. McMaster, Defence Technology Centre for Human Factors Integration (DTC HFI), BITlab, School of Engineering and Design... studies into NEC by the Human Factors Integration Defence Technology Centre (HFI-DTC). DEVELOPMENT OF THE TESTBED: In brief, the C4 test-bed...

  12. Large-scale structural analysis: The structural analyst, the CSM Testbed and the NAS System

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Mccleary, Susan L.; Macy, Steven C.; Aminpour, Mohammad A.

    1989-01-01

    The Computational Structural Mechanics (CSM) activity is developing advanced structural analysis and computational methods that exploit high-performance computers. Methods are developed in the framework of the CSM testbed software system and applied to representative complex structural analysis problems from the aerospace industry. An overview of the CSM testbed methods development environment is presented and some numerical methods developed on a CRAY-2 are described. Selected application studies performed on the NAS CRAY-2 are also summarized.

  13. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds

    PubMed Central

    Frank, Jared A.; Brill, Anthony; Kapila, Vikram

    2016-01-01

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability. PMID:27556464
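
    The feedback-control side of such platforms is conventionally a discrete-time loop of the kind sketched below (a generic PID controller with invented gains and rates; the paper's specific controllers are not reproduced here). Sensing-to-actuation latency on the phone is the main threat to the stability margins the abstract mentions:

    ```python
    # Generic discrete PID loop of the sort a mounted smartphone could run;
    # illustrative gains and rates, not the paper's controllers.
    class PID:
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral = 0.0
            self.prev_err = 0.0

        def update(self, setpoint, measured):
            err = setpoint - measured
            self.integral += err * self.dt               # accumulate error
            deriv = (err - self.prev_err) / self.dt      # finite-difference rate
            self.prev_err = err
            return self.kp * err + self.ki * self.integral + self.kd * deriv

    pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.02)       # 50 Hz sensor/update rate
    cmd = pid.update(setpoint=90.0, measured=87.5)   # e.g., motor angle in degrees
    print(cmd)
    ```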

  14. New Air-Launched Small Missile (ALSM) Flight Testbed for Hypersonic Systems

    NASA Technical Reports Server (NTRS)

    Bui, Trong T.; Lux, David P.; Stenger, Michael T.; Munson, Michael J.; Teate, George F.

    2007-01-01

    The Phoenix Air-Launched Small Missile (ALSM) flight testbed was conceived and is proposed to help address the lack of quick-turnaround and cost-effective hypersonic flight research capabilities. The Phoenix ALSM testbed results from utilization of the United States Navy Phoenix AIM-54 (Hughes Aircraft Company, now Raytheon Company, Waltham, Massachusetts) long-range, guided air-to-air missile and the National Aeronautics and Space Administration (NASA) Dryden Flight Research Center (Edwards, California) F-15B (McDonnell Douglas, now the Boeing Company, Chicago, Illinois) testbed airplane. The retirement of the Phoenix AIM-54 missiles from fleet operation has presented an opportunity for converting this flight asset into a new flight testbed. This cost-effective new platform will fill the gap in the test and evaluation of hypersonic systems for flight Mach numbers ranging from 3 to 5. Preliminary studies indicate that the Phoenix missile is a highly capable platform; when launched from a high-performance airplane, the guided Phoenix missile can boost research payloads to low hypersonic Mach numbers, enabling flight research in the supersonic-to-hypersonic transitional flight envelope. Experience gained from developing and operating the Phoenix ALSM testbed will assist the development and operation of future higher-performance ALSM flight testbeds as well as responsive microsatellite-small-payload air-launched space boosters.

  15. Mounted Smartphones as Measurement and Control Platforms for Motor-Based Laboratory Test-Beds.

    PubMed

    Frank, Jared A; Brill, Anthony; Kapila, Vikram

    2016-08-20

    Laboratory education in science and engineering often entails the use of test-beds equipped with costly peripherals for sensing, acquisition, storage, processing, and control of physical behavior. However, costly peripherals are no longer necessary to obtain precise measurements and achieve stable feedback control of test-beds. With smartphones performing diverse sensing and processing tasks, this study examines the feasibility of mounting smartphones directly to test-beds to exploit their embedded hardware and software in the measurement and control of the test-beds. This approach is a first step towards replacing laboratory-grade peripherals with more compact and affordable smartphone-based platforms, whose interactive user interfaces can engender wider participation and engagement from learners. Demonstrative cases are presented in which the sensing, computation, control, and user interaction with three motor-based test-beds are handled by a mounted smartphone. Results of experiments and simulations are used to validate the feasibility of mounted smartphones as measurement and feedback control platforms for motor-based laboratory test-beds, report the measurement precision and closed-loop performance achieved with such platforms, and address challenges in the development of platforms to maintain system stability.

  16. Micro-Precision Interferometer Testbed: end-to-end system integration of control structure interaction technologies

    NASA Astrophysics Data System (ADS)

    Neat, Gregory W.; Sword, Lee F.; Hines, Braden E.; Calvet, Robert J.

    1993-09-01

    This paper describes the overall design and planned phased delivery of the ground-based Micro-Precision Interferometer (MPI) Testbed. The testbed is a half-scale replica of a future space-based interferometer containing all the spacecraft subsystems necessary to perform an astrometric measurement. Appropriately sized reaction wheels will regulate the testbed attitude as well as provide a flight-like disturbance source. The optical system will consist of two complete Michelson interferometers. Successful interferometric measurements require controlling the positional stabilities of these optical elements to the nanometer level. The primary objective of the testbed is to perform a system integration of Control Structure Interaction (CSI) technologies necessary to demonstrate the end-to-end operation of a space-based interferometer, ultimately proving to flight mission planners that the necessary control technology exists to meet the challenging requirements of future space-based interferometry missions. These technologies form a multi-layered vibration attenuation architecture to achieve the necessary quiet environment. This three-layered methodology blends disturbance isolation, structural quieting, and active optical control techniques. The paper describes all the testbed subsystems in this end-to-end ground-based system as well as the present capabilities of the evolving testbed.

  17. Southern Clouds

    NASA Technical Reports Server (NTRS)

    2005-01-01

    Context image for PIA03026 (Southern Clouds).

    This image shows a system of clouds just off the margin of the South Polar cap. Taken during the summer season, these clouds contain both water-ice and dust.

    Image information: VIS instrument. Latitude 80.2S, Longitude 57.6E. 17 meter/pixel resolution.

  18. Linear Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Context image for PIA03667 (Linear Clouds).

    These clouds are located near the edge of the south polar region. The cloud tops are the puffy white features in the bottom half of the image.

    Image information: VIS instrument. Latitude 80.1S, Longitude 52.1E. 17 meter/pixel resolution.

  19. Designing an autonomous helicopter testbed: From conception through implementation

    NASA Astrophysics Data System (ADS)

    Garcia, Richard D.

    Miniature Unmanned Aerial Vehicles (UAVs) are currently being researched for a wide range of tasks, including search and rescue, surveillance, reconnaissance, traffic monitoring, fire detection, pipe and electrical line inspection, and border patrol, to name only a few of the application domains. Although small/miniature UAVs, including both Vertical Takeoff and Landing (VTOL) vehicles and small helicopters, have shown great potential in both civilian and military domains, their development, integration, prototyping, and field testing are limited to only a handful of university labs. For VTOL-type aircraft the number is less than fifteen worldwide! This lack of development is due both to the extensive time and cost required to design, integrate, and test a fully operational prototype and to the shortcomings of published materials in fully describing how to design and build a "complete" and "operational" prototype system. This dissertation overcomes existing barriers and limitations by describing and presenting in great detail every technical aspect of designing and integrating a small UAV helicopter, including the onboard navigation controller, capable of fully autonomous takeoff, waypoint navigation, and landing. The presented research goes beyond previous works by designing the system as a testbed vehicle. This design aims to provide a general framework that will not only allow researchers the ability to supplement the system with new technologies but will also allow researchers to add innovation to the vehicle itself. Examples include modification or replacement of controllers, updated filtering and fusion techniques, addition or replacement of sensors, vision algorithms, operating system (OS) changes or replacements, and platform modification or replacement. This is supported by the testbed's design to not only adhere to the technology it currently utilizes but to be general enough to adhere to a multitude of

  20. Cloud Interactions

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Released 1 July 2004 The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during mid-spring near the North Pole. The linear water-ice clouds are now regional in extent and often interact with neighboring cloud systems, as seen in this image. The bottom of the image shows how such interaction can destroy the linear cloud forms. While the surface is still visible through most of the clouds, there is evidence that dust is also starting to enter the atmosphere.

    Image information: VIS instrument. Latitude 68.4, Longitude 258.8 East (101.2 West). 38 meter/pixel resolution.

  1. Aerosol-Cloud Interactions in Ship Tracks Using Terra MODIS/MISR

    NASA Astrophysics Data System (ADS)

    Chen, Y. C.; Christensen, M.; Diner, D. J.; Garay, M. J.; Nelson, D. L.

    2014-12-01

    Simultaneous ship track observations from Terra Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging SpectroRadiometer (MISR) have been compiled to investigate how ship-injected aerosols affect marine warm boundary layer clouds under different cloud types and environmental conditions. Taking advantage of the high spatial resolution multiangle observations uniquely available from MISR, we utilized the retrieved cloud albedo, cloud top height, and cloud motion vectors to examine the cloud property responses in ship-polluted and nearby unpolluted clouds. The strength of cloud albedo response to increased aerosol level is primarily dependent on cloud cell structure, dryness of the free troposphere, and boundary layer depth, corroborating a previous study by Chen et al. (2012) where A-Train satellite data were applied. Under open cell cloud structure, the cloud properties are more susceptible to aerosol perturbations as compared to closed cells. Aerosol plumes caused an increase in liquid water amount (+27%), cloud top height (+11%), and cloud albedo (+40%) for open cell clouds, whereas under closed cell clouds, little changes in cloud properties were observed. Further capitalizing on MISR's unique capabilities, the MISR cross-track cloud speed has been used to derive cloud top divergence. Statistically averaging the results from many plume segments to reduce random noise, we have found that in ship-polluted clouds there is stronger cloud top divergence, and in nearby unpolluted clouds, convergence occurs and leads to downdrafts, providing observational evidence for cloud top entrainment feedback. These results suggest that detailed cloud responses, classified by cloud type and environmental conditions, must be accounted for in global climate modeling studies to reduce uncertainties of aerosol indirect forcing. Reference: Chen, Y.-C. et al. Occurrence of lower cloud albedo in ship tracks, Atmos. Chem. Phys., 12, 8223-8235, doi:10.5194/acp-12

  2. A price- and-time-slot-negotiation mechanism for Cloud service reservations.

    PubMed

    Son, Seokho; Sim, Kwang Mong

    2012-06-01

    When making reservations for Cloud services, consumers and providers need to establish service-level agreements through negotiation. Whereas it is essential for both a consumer and a provider to reach an agreement on the price of a service and when to use the service, to date, there is little or no negotiation support for both price and time-slot negotiations (PTNs) for Cloud service reservations. This paper presents a multi-issue negotiation mechanism to facilitate the following: 1) PTNs between Cloud agents and 2) tradeoff between price and time-slot utilities. Unlike many existing negotiation mechanisms in which a negotiation agent can only make one proposal at a time, agents in this work are designed to concurrently make multiple proposals in a negotiation round that generate the same aggregated utility, differing only in terms of individual price and time-slot utilities. Another novelty of this work is formulating a novel time-slot utility function that characterizes preferences for different time slots. These ideas are implemented in an agent-based Cloud testbed. Using the testbed, experiments were carried out to compare this work with related approaches. Empirical results show that PTN agents reach faster agreements and achieve higher utilities than other related approaches. A case study was carried out to demonstrate the application of the PTN mechanism for pricing Cloud resources.
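
    The paper's central idea -- several concurrent proposals in one round, all with the same aggregated utility but different price/time-slot splits -- can be illustrated with a small sketch (the weights and utility functions below are invented for the example, not the authors' formulation):

    ```python
    # Toy illustration (not the authors' mechanism): one negotiation round
    # yields several proposals with equal aggregated utility, trading price
    # utility against time-slot utility. Weights and utility forms invented.
    W_PRICE, W_TIME = 0.6, 0.4

    def price_utility(price, p_min=10.0, p_max=50.0):
        return (p_max - price) / (p_max - p_min)       # consumer: cheaper is better

    def price_for_utility(u, p_min=10.0, p_max=50.0):
        return p_max - u * (p_max - p_min)             # invert price_utility

    def proposals(aggregate_u, slot_utils):
        """slot_utils: {slot: time-slot utility in [0,1]}. Returns one proposal
        per slot whose weighted utility sum equals `aggregate_u`."""
        out = []
        for slot, ut in slot_utils.items():
            up = (aggregate_u - W_TIME * ut) / W_PRICE  # required price utility
            if 0.0 <= up <= 1.0:                        # drop infeasible slots
                out.append((slot, round(price_for_utility(up), 2)))
        return out

    # Preferred slots carry higher time-slot utility, so the agent can concede
    # more on price there and still reach the same aggregate utility.
    print(proposals(0.7, {"9am": 1.0, "noon": 0.6, "5pm": 0.2}))
    ```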

  3. Web Solutions Inspire Cloud Computing Software

    NASA Technical Reports Server (NTRS)

    2013-01-01

    An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.

  4. Feedback in Clouds II: UV photoionization and the first supernova in a massive cloud

    NASA Astrophysics Data System (ADS)

    Geen, Sam; Hennebelle, Patrick; Tremblin, Pascal; Rosdahl, Joakim

    2016-12-01

    Molecular cloud structure is regulated by stellar feedback in various forms. Two of the most important feedback processes are UV photoionization and supernovae from massive stars. However, the precise response of the cloud to these processes, and the interaction between them, remains an open question. In particular, we wish to know under which conditions the cloud can be dispersed by feedback, which, in turn, can give us hints as to how feedback regulates the star formation inside the cloud. We perform a suite of radiative magnetohydrodynamic simulations of a 10^5 solar mass cloud with embedded sources of ionizing radiation and supernovae, including multiple supernovae and a hypernova model. A UV source corresponding to 10 per cent of the mass of the cloud is required to disperse the cloud, suggesting that the star formation efficiency should be of the order of 10 per cent. A single supernova is unable to significantly affect the evolution of the cloud. However, energetic hypernovae and multiple supernovae are able to add significant quantities of momentum to the cloud, approximately 10^43 g cm s^-1 of momentum per 10^51 erg of supernova energy. We argue that supernovae alone are unable to regulate star formation in molecular clouds. We stress the importance of ram pressure from turbulence in regulating feedback in molecular clouds.
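
    The quoted momentum coupling can be put in physical terms: dividing twice the energy by the momentum gives the effective velocity scale at which the energy-to-momentum conversion balances. A quick check of the arithmetic (cgs units, values from the abstract):

    ```python
    # Sanity arithmetic for the quoted coupling: ~1e43 g cm/s of momentum
    # injected per 1e51 erg of supernova energy (values from the abstract).
    E = 1e51                    # erg = g cm^2 s^-2
    p = 1e43                    # g cm s^-1
    v_eff = 2.0 * E / p         # cm/s; velocity at which E = (1/2) p v
    print(v_eff / 1e5, "km/s")  # -> 2000 km/s
    ```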

  5. Spectral Dependence of MODIS Cloud Droplet Effective Radius Retrievals for Marine Boundary Layer Clouds

    NASA Technical Reports Server (NTRS)

    Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung

    2014-01-01

    Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.

  6. Estimating Cloud Cover

    ERIC Educational Resources Information Center

    Moseley, Christine

    2007-01-01

    The purpose of this activity was to help students understand the percentage of cloud cover and make more accurate cloud cover observations. Students estimated the percentage of cloud cover represented by simulated clouds and assigned a cloud cover classification to those simulations. (Contains 2 notes and 3 tables.)

  7. Assessing the Performance of Computationally Simple and Complex Representations of Aerosol Processes using a Testbed Methodology

    NASA Astrophysics Data System (ADS)

    Fast, J. D.; Ma, P.; Easter, R. C.; Liu, X.; Zaveri, R. A.; Rasch, P.

    2012-12-01

    Predictions of aerosol radiative forcing in climate models still contain large uncertainties, resulting from a poor understanding of certain aerosol processes, the level of complexity of aerosol processes represented in models, and the ability of models to account for sub-grid scale variability of aerosols and the processes affecting them. In addition, comparing the performance and computational efficiency of new aerosol process modules used in various studies is problematic because different studies often employ different grid configurations, meteorology, trace gas chemistry, and emissions that affect the temporal and spatial evolution of aerosols. To address this issue, we have developed an Aerosol Modeling Testbed (AMT) to systematically and objectively evaluate aerosol process modules. The AMT consists of the modular Weather Research and Forecasting (WRF) model, a series of testbed cases for which extensive in situ and remote sensing measurements of meteorological, trace gas, and aerosol properties are available, and a suite of tools to evaluate the performance of meteorological, chemical, and aerosol process modules. WRF contains various parameterizations of meteorological, chemical, and aerosol processes and includes interactive aerosol-cloud-radiation treatments similar to those employed by climate models. In addition, the physics suite from a global climate model, the Community Atmosphere Model version 5 (CAM5), has also been ported to WRF so that these parameterizations can be tested at various spatial scales and compared directly with field campaign data and other parameterizations commonly used by the mesoscale modeling community. In this study, we evaluate simple and complex treatments of the aerosol size distribution and secondary organic aerosols using the AMT and measurements collected during three field campaigns: the Megacities Initiative Local and Global Observations (MILAGRO) campaign conducted in the vicinity of Mexico City during March 2006, ...

  8. Priority scheme planning for the robust SSM/PMAD testbed

    NASA Astrophysics Data System (ADS)

    Elges, Michael R.; Ashworth, Barry R.

    1991-01-01

    When the priorities of manually controlled resources are mixed with those of autonomously controlled resources, the space station module power management and distribution (SSM/PMAD) environment requires cooperating expert-system interaction between the planning function and the priority manager. The elements and interactions of the SSM/PMAD planning and priority management functions are presented, and the way these elements cooperate toward common goals is described. In the SSM/PMAD testbed these actions are guided by a system planning function, KANT, which has insight into the executing system and its automated database. First, the user must be given access to all information that may affect the desired outcome. Second, the fault manager element, FRAMES, must be informed of the change so that correct diagnoses and operations take place if and when faults occur. Third, some element must act as mediator for the selection of resources and actions to be added or removed at the user's request; this is performed by the priority manager, LPLMS. Lastly, the scheduling mechanism, MAESTRO, must provide future schedules adhering to the user-modified resource base.
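
    The four-step sequence above is essentially a mediator protocol. A hypothetical sketch follows; the element names (KANT, FRAMES, LPLMS, MAESTRO) come from the abstract, but every class and method here is invented for illustration:

      # Hypothetical stand-ins for the SSM/PMAD elements named above;
      # none of these interfaces come from the actual testbed software.
      class Kant:                                    # planning / user insight
          def show_state(self, user): print(f"state shown to {user}")

      class Frames:                                  # fault manager
          def notify(self, req): print(f"FRAMES informed of: {req}")

      class Lplms:                                   # priority manager / mediator
          def mediate(self, req): return f"grant:{req}"

      class Maestro:                                 # scheduler
          def replan(self, decision): return [decision, "updated future schedule"]

      def handle_request(req, user, kant, frames, lplms, maestro):
          kant.show_state(user)                      # 1. give the user full visibility
          frames.notify(req)                         # 2. keep fault diagnosis consistent
          decision = lplms.mediate(req)              # 3. arbitrate the resource change
          return maestro.replan(decision)            # 4. emit an updated schedule

      print(handle_request("add load-3", "crew", Kant(), Frames(), Lplms(), Maestro()))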

  9. An integrated dexterous robotic testbed for space applications

    NASA Technical Reports Server (NTRS)

    Li, Larry C.; Nguyen, Hai; Sauer, Edward

    1992-01-01

    An integrated dexterous robotic system was developed as a testbed to evaluate various robotics technologies for advanced space applications. The system configuration consisted of a Utah/MIT Dexterous Hand, a PUMA 562 arm, a stereo vision system, and a multiprocessing computer control system. In addition to these major subsystems, a proximity sensing system was integrated with the Utah/MIT Hand to provide the capability for non-contact sensing of a nearby object. A high-speed fiber-optic link was used to transmit digitized proximity sensor signals back to the multiprocessing control system. The hardware system was designed to satisfy the requirements for both teleoperated and autonomous operations. The software system was designed to exploit parallel processing capability, pursue functional modularity, incorporate artificial intelligence for robot control, allow high-level symbolic robot commands, maximize reusable code, minimize compilation requirements, and provide an interactive application development and debugging environment for the end users. An overview of the system hardware and software configurations is presented, and the implementation of subsystem functions is discussed.

  10. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
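
    The modularization idea lends itself to a compact illustration; the sketch below (not RML itself) composes per-component reliabilities into a system figure, assuming independent failures:

      # Sketch of modular reliability composition, assuming independence.
      def series(rels):        # system works only if every module works
          out = 1.0
          for r in rels:
              out *= r
          return out

      def parallel(rels):      # system works if at least one module works
          fail = 1.0
          for r in rels:
              fail *= (1.0 - r)
          return 1.0 - fail

      # e.g. two redundant processors feeding one bus:
      print(series([parallel([0.95, 0.95]), 0.999]))   # ~0.9965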

  11. TORCH Computational Reference Kernels - A Testbed for Computer Science Research

    SciTech Connect

    Kaiser, Alex; Williams, Samuel Webb; Madduri, Kamesh; Ibrahim, Khaled; Bailey, David H.; Demmel, James W.; Strohmaier, Erich

    2010-12-02

    For decades, computer scientists have sought guidance on how to evolve architectures, languages, and programming models in order to improve application performance, efficiency, and productivity. Unfortunately, without overarching advice about future directions in these areas, individual guidance is inferred from the existing software/hardware ecosystem, and each discipline often conducts its research independently, assuming all other technologies remain fixed. In today's rapidly evolving world of on-chip parallelism, isolated and iterative improvements to performance may miss superior solutions in the same way gradient descent optimization techniques may get stuck in local minima. To combat this, we present TORCH: A Testbed for Optimization ResearCH. These computational reference kernels define the core problems of interest in scientific computing without mandating a specific language, algorithm, programming model, or implementation. To complement the kernel (problem) definitions, we provide a set of algorithmically-expressed verification tests that can be used to verify that a hardware/software co-designed solution produces an acceptable answer. Finally, to provide some illumination as to how researchers have implemented solutions to these problems in the past, we provide a set of reference implementations in C and MATLAB.
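
    The verification idea can be illustrated with a toy kernel; the example below is hypothetical, not one of the actual TORCH kernels, but shows how a test can accept any implementation whose answer is acceptable without mandating an algorithm:

      # Accept any linear-solve implementation whose residual is small.
      import numpy as np

      def verify_linear_solve(A, b, x, tol=1e-8):
          residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
          return residual < tol

      A = np.array([[4.0, 1.0], [1.0, 3.0]])
      b = np.array([1.0, 2.0])
      x = np.linalg.solve(A, b)             # any candidate implementation
      print(verify_linear_solve(A, b, x))   # True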

  12. Adaptive Signal Processing Testbed application software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

    The Adaptive Signal Processing Testbed (ASPT) application software is a set of programs that provide general data acquisition and minimal processing functions on live digital data. The data are obtained from a digital input interface whose data source is the DAR4000 digital quadrature receiver that receives a phase shift keying signal at 21.4 MHz intermediate frequency. The data acquisition software is used to acquire raw unprocessed data from the DAR4000 and store it on disk in the Sun workstation based ASPT. File processing utilities are available to convert the stored files for analysis. The data evaluation software is used for the following functions: acquisition of data from the DAR4000, conversion to IEEE format, and storage to disk; acquisition of data from the DAR4000, power spectrum estimation, and on-line plotting on the graphics screen; and processing of disk file data, power spectrum estimation, and display and/or storage to disk in the new format. A user's guide is provided that describes the acquisition and evaluation programs along with how to acquire, evaluate, and use the data.
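
    The power-spectrum-estimation step can be sketched as follows; the sample rate and test tone are assumptions, and the DAR4000 interface itself is not modeled:

      # Welch power spectrum of complex quadrature samples (illustrative).
      import numpy as np
      from scipy.signal import welch

      rng = np.random.default_rng(0)
      fs = 1.0e6                                    # assumed sample rate, Hz
      t = np.arange(16384) / fs
      iq = np.exp(2j * np.pi * 50e3 * t)            # stand-in quadrature data
      iq += 0.1 * (rng.standard_normal(t.size) + 1j * rng.standard_normal(t.size))

      f, pxx = welch(iq, fs=fs, nperseg=1024, return_onesided=False)
      print(f[np.argmax(pxx)])                      # peak near 50 kHz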

  13. Adaptive Signal Processing Testbed signal excision software: User's manual

    NASA Astrophysics Data System (ADS)

    Parliament, Hugh A.

    1992-05-01

    The Adaptive Signal Processing Testbed (ASPT) signal excision software is a set of programs that provide real-time processing functions for the excision of interfering tones from a live spread-spectrum signal as well as off-line functions for the analysis of the effectiveness of the excision technique. The processing functions provided by the ASPT signal excision software are real-time adaptive filtering of live data, storage to disk, and file sorting and conversion. The main off-line analysis function is bit error determination. The purpose of the software is to measure the effectiveness of an adaptive filtering algorithm to suppress interfering or jamming signals in a spread spectrum signal environment. A user manual for the software is provided, containing information on the different software components available to perform signal excision experiments: the real-time excision software, excision host program, file processing utilities, and despreading and bit error rate determination software. In addition, information is presented describing the excision algorithm implemented, the real-time processing framework, the steps required to add algorithms to the system, the processing functions used in despreading, and description of command sequences for post-run analysis of the data.
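
    The excision concept (predict the narrowband interferer from past samples and subtract it, leaving the noise-like spread-spectrum signal) can be sketched with a simple LMS adaptive predictor; this is illustrative only, not the ASPT algorithm:

      # LMS tone-excision sketch: a sinusoidal jammer is predictable from
      # past samples, the spread-spectrum chips are not, so the prediction
      # error retains the desired signal while suppressing the tone.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 8000
      chips = np.sign(rng.standard_normal(n))        # noise-like spread spectrum
      jammer = 3.0 * np.cos(2 * np.pi * 0.12 * np.arange(n))
      x = chips + jammer

      taps, mu, delay = 32, 1e-4, 1
      w = np.zeros(taps)
      out = np.zeros(n)
      for i in range(taps + delay, n):
          ref = x[i - delay - taps:i - delay][::-1]  # delayed reference input
          out[i] = x[i] - w @ ref                    # excised output
          w += 2 * mu * out[i] * ref                 # LMS weight update

      print(np.std(out[-1000:] - chips[-1000:]))     # residual interference, << 3.0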

  15. A satellite orbital testbed for SATCOM using mobile robots

    NASA Astrophysics Data System (ADS)

    Shen, Dan; Lu, Wenjie; Wang, Zhonghai; Jia, Bin; Wang, Gang; Wang, Tao; Chen, Genshe; Blasch, Erik; Pham, Khanh

    2016-05-01

    This paper develops and evaluates a satellite orbital testbed (SOT) for satellite communications (SATCOM). SOT can emulate a 3D satellite orbit using omni-wheeled robots and a robotic arm. The 3D motion of a satellite is partitioned into movements in the equatorial plane and up-down motions in the vertical plane. The former are emulated by omni-wheeled robots, while the up-down motions are performed by a stepper-motor-controlled ball along a rod (the robotic arm), which is attached to the robot. The emulated satellite positions feed a measurement model, whose outputs are used to perform multiple-space-object tracking; the tracking results then drive maneuver detection and collision alerting. Satellite maneuver commands are translated into robot and robotic arm commands. In SATCOM, the effects of jamming depend on the range and angles of the satellite transponder relative to the jamming satellite. We extend the SOT to include USRP transceivers; in the extended SOT, the relative ranges and angles are implemented using the omni-wheeled robots and robotic arms.
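
    The motion partition described above amounts to a simple coordinate mapping; a sketch follows, with the lab scale factor a made-up parameter:

      # Map an emulated satellite position to robot floor coordinates plus
      # arm (ball) height; the scale factor is a hypothetical lab parameter.
      def satellite_to_testbed(pos_km, scale_m_per_km=1e-4):
          x, y, z = pos_km
          robot_xy = (x * scale_m_per_km, y * scale_m_per_km)  # equatorial plane
          arm_height = z * scale_m_per_km                      # up-down motion
          return robot_xy, arm_height

      print(satellite_to_testbed((7000.0, 0.0, 350.0)))  # ((0.7, 0.0), 0.035)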

  16. Wavefront Control Toolbox for James Webb Space Telescope Testbed

    NASA Technical Reports Server (NTRS)

    Shiri, Ron; Aronstein, David L.; Smith, Jeffery Scott; Dean, Bruce H.; Sabatke, Erin

    2007-01-01

    We have developed a Matlab toolbox for wavefront control of optical systems. We have applied this toolbox to optical models of the James Webb Space Telescope (JWST) in general and to the JWST Testbed Telescope (TBT) in particular, implementing both unconstrained and constrained wavefront optimization to correct for possible misalignments present on the segmented primary mirror or the monolithic secondary mirror. The optical models are implemented in the Zemax optical design program, and information is exchanged between Matlab and Zemax via the Dynamic Data Exchange (DDE) interface. The model configuration is managed using the XML protocol. The optimization algorithm uses influence functions for each adjustable degree of freedom of the optical model. Iterative and non-iterative algorithms have been developed that converge to a local minimum of the root-mean-square (rms) wavefront error using a singular value decomposition of the control matrix of influence functions. The toolkit is highly modular and allows the user to choose control strategies for the degrees of freedom to be adjusted on a given iteration and the wavefront convergence criterion. As the influence functions are nonlinear over the control parameter space, the toolkit also allows for trade-offs between the frequency of updating the local influence functions and execution speed. The functionality of the toolbox and the validity of the underlying algorithms have been verified through extensive simulations.
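
    The SVD step can be sketched generically (this is the standard pseudo-inverse control update, not the toolbox's code): stack the influence functions as columns of a control matrix, truncate small singular values so noisy modes are not amplified, and solve for the actuator commands that minimize the rms wavefront error:

      import numpy as np

      def control_update(A, wavefront, sv_cutoff=1e-3):
          # A: wavefront response per unit move of each degree of freedom
          U, s, Vt = np.linalg.svd(A, full_matrices=False)
          s_inv = np.where(s > sv_cutoff * s[0], 1.0 / s, 0.0)  # truncate small modes
          return -(Vt.T * s_inv) @ (U.T @ wavefront)            # commanded moves

      rng = np.random.default_rng(1)
      A = rng.standard_normal((200, 12))        # 200 wavefront samples, 12 DOFs
      wf = A @ rng.standard_normal(12)          # wavefront due to misalignment
      print(np.linalg.norm(A @ control_update(A, wf) + wf))     # ~0 residual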

  17. A Test-Bed Configuration: Toward an Autonomous System

    NASA Astrophysics Data System (ADS)

    Ocaña, F.; Castillo, M.; Uranga, E.; Ponz, J. D.; TBT Consortium

    2015-09-01

    In the context of the Space Situational Awareness (SSA) program of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed Telescope Test Bed (TBT), is being developed under ESA's General Studies and Technology Programme and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. In order to fulfill all the security requirements of the TBT project, an autonomous emergency system (AES) is foreseen to monitor the control system. The AES will remotely monitor the health of the observing system and the internal and external environment, and will incorporate both autonomous and interactive actuators to force the protection of the system (i.e., emergency dome closeout).

  18. NN-SITE: A remote monitoring testbed facility

    SciTech Connect

    Kadner, S.; White, R.; Roman, W.; Sheely, K.; Puckett, J.; Ystesund, K.

    1997-08-01

    DOE, Aquila Technologies, LANL and SNL recently launched collaborative efforts to create a Non-Proliferation Network Systems Integration and Test (NN-Site, pronounced N-Site) facility. NN-Site will focus on wide area, local area, and local operating level network connectivity, including Internet access. This facility will provide thorough and cost-effective integration, testing and development of information connectivity among diverse operating systems and network topologies prior to full-scale deployment. In concentrating on instrument interconnectivity, tamper indication, and data collection and review, NN-Site will facilitate the efforts of equipment providers and system integrators in deploying systems that will meet nuclear non-proliferation and safeguards objectives. The following discusses the objectives of ongoing remote monitoring efforts, as well as the prevalent policy concerns. An in-depth discussion of the Non-Proliferation Network Systems Integration and Test facility (NN-Site) illuminates the role that this testbed facility can play in meeting the objectives of remote monitoring efforts, and its potential contribution in promoting eventual acceptance of remote monitoring systems in facilities worldwide.

  19. SPHERES tethered formation flight testbed: application to NASA's SPECS mission

    NASA Astrophysics Data System (ADS)

    Chung, Soon-Jo; Kong, Edmund M.; Miller, David W.

    2005-08-01

    This paper elaborates on the theory and experimental validation of formation flight control for future space-borne tethered interferometers. The nonlinear equations of motion of a multi-vehicle tethered spacecraft system are derived using Lagrange's equations and a decoupling method. The preliminary analysis predicts unstable dynamics depending on the direction of the tether motor. The controllability analysis indicates that both array resizing and spin-up are fully controllable only by the reaction wheels and the tether motor, thereby eliminating the need for thrusters. Linear and nonlinear decentralized control techniques have been implemented in the tethered SPHERES testbed and tested at NASA MSFC's flat-floor facility using two- and three-SPHERES configurations. Nonlinear control using a feedback linearization technique performed successfully in both the two-SPHERES in-line configuration and the three-SPHERES triangular configuration while varying the tether length. A relative metrology system, combining ultrasound metrology and inertial sensors with a decentralized nonlinear estimator, was developed to provide the necessary state information.

  20. Development of a Testbed for Distributed Satellite Command and Control

    NASA Astrophysics Data System (ADS)

    Zetocha, Paul; Brito, Margarita

    2002-01-01

    At the Air Force Research Laboratory's Space Vehicles Directorate we are investigating and developing architectures for commanding and controlling a cluster of cooperating satellites through prototype development for the TechSat-21 program. The objective of this paper is to describe a distributed satellite testbed that is currently under development and to summarize near term prototypes being implemented for cluster command and control. To design, develop, and test our architecture we are using eight PowerPC 750 VME-based single board computers, representing eight satellites. Each of these computers is hosting the OSE(TM) real-time operating system from Enea Systems. At the core of our on-board cluster manager is ObjectAgent. ObjectAgent is an agent-based object-oriented framework for flight systems, which is particularly suitable for distributed applications. In order to handle communication with the ground as well as to assist with the cluster management we are using the Spacecraft Command Language (SCL). SCL is also at the centerpiece of our ground control station and handles cluster commanding, telemetry decommutation, state-of-health monitoring, and Fault Detection, Isolation, and Resolution (FDIR). For planning and scheduling activities we are currently using ASPEN from NASA/JPL. This paper will describe each of the above components in detail and then present the prototypes being implemented.

  1. Extrasolar Planetary Imaging Coronagraph: Visible Nulling Coronagraph Testbed Results

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.

    2008-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Discovery mission to image and characterize extrasolar giant planets in orbits with semi-major axes between 2 and 10 AU. EPIC will provide insights into the physical nature of a variety of planets in other solar systems, complementing radial velocity (RV) and astrometric planet searches. It will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine orbital inclinations and masses, characterize the atmospheres around A and F stars, and observe the inner spatial structure and colors of inner Spitzer-selected debris disks. EPIC would be launched to a heliocentric Earth-trailing drift-away orbit, with a 3-year mission lifetime (5-year goal), and will revisit planets at least three times at intervals of 9 months. The starlight suppression approach consists of a visible nulling coronagraph (VNC) that enables high order starlight suppression in broadband light. To demonstrate the VNC approach and advance its technology readiness, the NASA Goddard Space Flight Center and Lockheed-Martin have developed a laboratory VNC and have demonstrated white light nulling. We will discuss our ongoing VNC work and show the latest results from the VNC testbed.

  2. Finite Element Modeling of the NASA Langley Aluminum Testbed Cylinder

    NASA Technical Reports Server (NTRS)

    Grosveld, Ferdinand W.; Pritchard, Joselyn I.; Buehrle, Ralph D.; Pappa, Richard S.

    2002-01-01

    The NASA Langley Aluminum Testbed Cylinder (ATC) was designed to serve as a universal structure for evaluating structural acoustic codes, modeling techniques and optimization methods used in the prediction of aircraft interior noise. Finite element models were developed for the components of the ATC based on the geometric, structural and material properties of the physical test structure. Numerically predicted modal frequencies for the longitudinal stringer, ring frame and dome component models, and six assembled ATC configurations were compared with experimental modal survey data. The finite element models were updated and refined, using physical parameters, to increase correlation with the measured modal data. Excellent agreement, within an average of 1.5% to 2.9%, was obtained between the predicted and measured modal frequencies of the stringer, frame and dome components. The predictions for the modal frequencies of the assembled component Configurations I through V were within an average of 2.9% to 9.1%. Finite element modal analyses were performed for comparison with 3 psi and 6 psi internal pressurization conditions in Configuration VI. The modal frequencies were predicted by applying differential stiffness to the elements with pressure loading and creating reduced matrices for beam elements with offsets inside external superelements. The average disagreement between the measured and predicted differences for the 0 psi and 6 psi internal pressure conditions was less than 0.5%. Comparably good agreement was obtained for the differences between the 0 psi and 3 psi measured and predicted internal pressure conditions.

  3. Aerospace Engineering Systems and the Advanced Design Technologies Testbed Experience

    NASA Technical Reports Server (NTRS)

    VanDalsem, William R.; Livingston, Mary E.; Melton, John E.; Torres, Francisco J.; Stremel, Paul M.

    1999-01-01

    Continuous improvement of aerospace product development processes is a driving requirement across much of the aerospace community. As up to 90% of the cost of an aerospace product is committed during the first 10% of the development cycle, there is a strong emphasis on capturing, creating, and communicating better information (both requirements and performance) early in the product development process. The community has responded by pursuing the development of computer-based systems designed to enhance the decision-making capabilities of product development individuals and teams. Recently, the historical foci on sharing the geometrical representation and on configuration management are being augmented: 1) Physics-based analysis tools for filling the design space database; 2) Distributed computational resources to reduce response time and cost; 3) Web-based technologies to relieve machine-dependence; and 4) Artificial intelligence technologies to accelerate processes and reduce process variability. The Advanced Design Technologies Testbed (ADTT) activity at NASA Ames Research Center was initiated to study the strengths and weaknesses of the technologies supporting each of these trends, as well as the overall impact of the combination of these trends on a product development event. Lessons learned and recommendations for future activities are reported.

  4. High Vertically Resolved Atmospheric and Surface/Cloud Parameters Retrieved with Infrared Atmospheric Sounding Interferometer (IASI)

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Liu, Xu; Larar, Allen M.; Smith, WIlliam L.; Taylor, Jonathan P.; Schluessel, Peter; Strow, L. Larrabee; Mango, Stephen A.

    2008-01-01

    The Joint Airborne IASI Validation Experiment (JAIVEx) was conducted during April 2007 mainly for validation of the IASI on the MetOp satellite. IASI possesses an ultra-spectral resolution of 0.25 cm^-1 and a spectral coverage from 645 to 2760 cm^-1. Ultra-spectral resolution infrared spectral radiances obtained from near-nadir observations provide atmospheric, surface, and cloud property information. An advanced retrieval algorithm with a fast radiative transfer model, applicable to both cloud-free and clouded atmospheres, is used for atmospheric profile and cloud parameter retrieval. This physical inversion scheme has been developed to deal with cloudy as well as cloud-free radiances observed with ultraspectral infrared sounders and to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression retrieval; the solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals are achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). Preliminary retrievals of atmospheric soundings, surface properties, and cloud optical/microphysical properties with the IASI observations are obtained and presented. These retrievals will be further inter-compared with those obtained from airborne FTS systems, such as the NPOESS Airborne Sounder Testbed - Interferometer (NAST-I), dedicated dropsondes, radiosondes, and ground-based Raman lidar.
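
    For reference, physical retrieval schemes of this kind typically minimize the standard one-dimensional variational cost function (the textbook form; the paper's exact formulation may differ in detail):

      J(\mathbf{x}) = (\mathbf{x}-\mathbf{x}_b)^{\mathrm{T}}\,\mathbf{B}^{-1}\,(\mathbf{x}-\mathbf{x}_b)
                      + [\mathbf{y}-H(\mathbf{x})]^{\mathrm{T}}\,\mathbf{R}^{-1}\,[\mathbf{y}-H(\mathbf{x})]

    where \mathbf{x} is the state vector (temperature, moisture, surface and cloud parameters), \mathbf{x}_b the background from the eigenvector regression, \mathbf{y} the observed radiances, H the fast radiative transfer model, and \mathbf{B} and \mathbf{R} the background- and observation-error covariance matrices; the iteration described above accounts for the non-linearity of H.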

  5. Testing Cloud Microphysics Parameterizations in NCAR CAM5 with ISDAC and M-PACE Observations

    SciTech Connect

    Liu, Xiaohong; Xie, Shaocheng; Boyle, James; Klein, Stephen A.; Shi, Xiangjun; Wang, Zhien; Lin, Wuyin; Ghan, Steven J.; Earle, Michael; Liu, Peter; Zelenyuk, Alla

    2011-12-24

    Arctic clouds simulated by the NCAR Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic Spring and Fall seasons performed under the Cloud-Associated Parameterizations Testbed (CAPT) framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary layer mixed-phase stratocumulus, and multilayer or deep frontal clouds. However, for low-level clouds, the model significantly underestimates the observed cloud liquid water content in both seasons and cloud fraction in the Spring season. As a result, CAM5 significantly underestimates the surface downward longwave (LW) radiative fluxes by 20-40 W m^-2. The model with a new ice nucleation parameterization moderately improves the model simulations by increasing cloud liquid water content in mixed-phase clouds through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen (WBF) process. The CAM5 single column model testing shows that a change in the homogeneous freezing temperature of rain to form snow from -5 °C to -40 °C has a substantial impact on the modeled liquid water content through the slowing down of liquid and rain-related processes. In contrast, collections of cloud ice by snow and cloud liquid by rain are of minor importance for single-layer boundary layer mixed-phase clouds in the Arctic.

  6. Testing cloud microphysics parameterizations in NCAR CAM5 with ISDAC and M-PACE observations

    SciTech Connect

    Liu X.; Lin W.; Xie, S.; Boyle, J.; Klein, S. A.; Shi, X.; Wang, Z.; Ghan, S. J.; Earle, M.; Liu, P. S. K.; Zelenyuk, A.

    2011-12-24

    Arctic clouds simulated by the National Center for Atmospheric Research (NCAR) Community Atmospheric Model version 5 (CAM5) are evaluated with observations from the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Indirect and Semi-Direct Aerosol Campaign (ISDAC) and Mixed-Phase Arctic Cloud Experiment (M-PACE), which were conducted at its North Slope of Alaska site in April 2008 and October 2004, respectively. Model forecasts for the Arctic spring and fall seasons performed under the Cloud-Associated Parameterizations Testbed framework generally reproduce the spatial distributions of cloud fraction for single-layer boundary-layer mixed-phase stratocumulus and multilayer or deep frontal clouds. However, for low-level stratocumulus, the model significantly underestimates the observed cloud liquid water content in both seasons. As a result, CAM5 significantly underestimates the surface downward longwave radiative fluxes by 20-40 W m^-2. Introducing a new ice nucleation parameterization slightly improves the model performance for low-level mixed-phase clouds by increasing cloud liquid water content through the reduction of the conversion rate from cloud liquid to ice by the Wegener-Bergeron-Findeisen process. The CAM5 single-column model testing shows that changing the instantaneous freezing temperature of rain to form snow from -5 °C to -40 °C causes a large increase in modeled cloud liquid water content through the slowing down of cloud liquid and rain-related processes (e.g., autoconversion of cloud liquid to rain). The underestimation of aerosol concentrations in CAM5 in the Arctic also plays an important role in the low bias of cloud liquid water in the single-layer mixed-phase clouds. In addition, numerical issues related to the coupling of model physics and time stepping in CAM5 are responsible for the model biases and will be explored in future studies.

  7. Corona-producing ice clouds: a case study of a cold mid-latitude cirrus layer.

    PubMed

    Sassen, K; Mace, G G; Hallett, J; Poellot, M R

    1998-03-20

    A high (14.0-km), cold (-71.0 degrees C) cirrus cloud was studied by ground-based polarization lidar and millimeter radar and aircraft probes on the night of 19 April 1994 from the Cloud and Radiation Testbed site in northern Oklahoma. A rare cirrus cloud lunar corona was generated by this 1-2-km-deep cloud, thus providing an opportunity to measure the composition in situ, which had previously been assumed only on the basis of lidar depolarization data and simple diffraction theory for spheres. In this case, corona ring analysis indicated an effective particle diameter of ~22 μm. A variety of in situ data corroborates the approximate ice-particle size derived from the passive retrieval method, especially near the cloud top, where impacted cloud samples show simple solid crystals. The homogeneous freezing of sulfuric acid droplets of stratospheric origin is assumed to be the dominant ice-particle nucleation mode acting in corona-producing cirrus clouds. It is speculated that this process results in a previously unrecognized mode of acid-contaminated ice-particle growth and that such small-particle cold cirrus clouds are potentially a radiatively distinct type of cloud.
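
    A minimal worked example of the "simple diffraction theory" mentioned above (the standard Airy-pattern relation; the paper's retrieval may be more elaborate): the angular radius of the first corona ring produced by scatterers of diameter d is

      \sin\theta_1 \approx 1.22\,\frac{\lambda}{d}

    so for moonlight at \lambda \approx 0.55 μm and d \approx 22 μm, \theta_1 \approx 1.22 \times 0.55/22 \approx 0.031 rad \approx 1.7°. Measuring the ring radius therefore inverts directly to an effective particle size.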

  8. Martian Clouds

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Released 28 June 2004. The atmosphere of Mars is a dynamic system. Water-ice clouds, fog, and hazes can make imaging the surface from space difficult. Dust storms can grow from local disturbances to global sizes, through which imaging is impossible. Seasonal temperature changes are the usual drivers in cloud and dust storm development and growth.

    Eons of atmospheric dust storm activity have left their mark on the surface of Mars. Dust carried aloft by the wind has settled out on every available surface; sand dunes have been created and moved by centuries of wind; and the effect of continual sand-blasting has modified many regions of Mars, creating yardangs and other unusual surface forms.

    This image was acquired during early spring near the North Pole. The linear 'ripples' are transparent water-ice clouds; this linear form is typical for polar clouds. The black regions on the margins of this image are areas of saturation caused by the buildup of scattered light from the bright polar material during the long image exposure.

    Image information: VIS instrument. Latitude 68.1, Longitude 147.9 East (212.1 West). 38 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  9. Description of the SSF PMAD DC testbed control system data acquisition function

    NASA Technical Reports Server (NTRS)

    Baez, Anastacio N.; Mackin, Michael; Wright, Theodore

    1992-01-01

    The NASA LeRC in Cleveland, Ohio has completed the development and integration of a Power Management and Distribution (PMAD) DC Testbed. This testbed is a reduced-scale representation of the end-to-end, sources-to-loads, Space Station Freedom Electrical Power System (SSF EPS). This unique facility is being used to demonstrate DC power generation and distribution, power management and control, and system operation techniques considered to be prime candidates for Space Station Freedom. A key capability of the testbed is its ability to be configured to address system-level issues in support of critical SSF program design milestones. Electrical power system control and operation issues like source control, source regulation, system fault protection, end-to-end system stability, health monitoring, resource allocation, and resource management are being evaluated in the testbed. The SSF EPS control functional allocation between on-board computers and ground-based systems is evolving. Initially, ground-based systems will perform the bulk of power system control and operation. The EPS control system is required to continuously monitor and determine the current state of the power system. The DC Testbed Control System consists of standard controllers arranged in a hierarchical and distributed architecture. These controllers provide all the monitoring and control functions for the DC Testbed Electrical Power System. Higher-level controllers include the Power Management Controller, Load Management Controller, Operator Interface System, and a network of computer systems that perform some of the SSF ground-based control center operations. The lower-level controllers include Main Bus Switch Controllers and Photovoltaic Controllers. Power system status information is periodically provided to the higher-level controllers to perform system control and operation. The data acquisition function of the control system is distributed among the various levels of the hierarchy.

  10. The Fourier-Kelvin Stellar Interferometer (FKSI): A Progress Report and Preliminary Results from Our Laboratory Testbed

    NASA Technical Reports Server (NTRS)

    Berry, Richard; Rajagopa, J.; Danchi, W. C.; Allen, R. J.; Benford, D. J.; Deming, D.; Gezari, D. Y.; Kuchner, M.; Leisawitz, D. T.; Linfield, R.

    2005-01-01

    The Fourier-Kelvin Stellar Interferometer (FKSI) is a mission concept for an imaging and nulling interferometer for the near-infrared to mid-infrared spectral region (3-8 microns). FKSI is conceived as a scientific and technological pathfinder to TPF/DARWIN as well as SPIRIT, SPECS, and SAFIR. It will also be a high angular resolution system complementary to JWST. The scientific emphasis of the mission is on the evolution of protostellar systems, from just after the collapse of the precursor molecular cloud core, through the formation of the disk surrounding the protostar, the formation of planets in the disk, and eventual dispersal of the disk material. FKSI will also search for brown dwarfs and Jupiter mass and smaller planets, and could also play a very powerful role in the investigation of the structure of active galactic nuclei and extra-galactic star formation. We report additional studies of the imaging capabilities of the FKSI with various configurations of two to five telescopes, studies of the capabilities of FKSI assuming an increase in long wavelength response to 10 or 12 microns (depending on availability of detectors), and preliminary results from our nulling testbed.

  11. Sensor System Performance Evaluation and Benefits from the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I)

    NASA Technical Reports Server (NTRS)

    Larar, A.; Zhou, D.; Smith, W.

    2009-01-01

    Advanced satellite sensors are tasked with improving global-scale measurements of the Earth's atmosphere, clouds, and surface to enable enhancements in weather prediction, climate monitoring, and environmental change detection. Validation of the entire measurement system is crucial to achieving this goal and thus maximizing research and operational utility of resultant data. Field campaigns employing satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft are an essential part of this validation task. The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Airborne Sounder Testbed-Interferometer (NAST-I) has been a fundamental contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This paper focuses on some of the challenges associated with validating advanced atmospheric sounders and the benefits obtained from employing airborne interferometers such as the NAST-I. Select results from underflights of the Aqua Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) obtained during recent field campaigns will be presented.

  12. Crater Clouds

    NASA Technical Reports Server (NTRS)

    2006-01-01

    Context image for PIA06085: Crater Clouds

    The crater on the right side of this image is affecting the local wind regime. Note the bright line of clouds streaming off the north rim of the crater.

    Image information: VIS instrument. Latitude -78.8, Longitude 320.0 East. 17 meter/pixel resolution.

    Note: this THEMIS visual image has not been radiometrically nor geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track direction to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

  13. AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments

    NASA Astrophysics Data System (ADS)

    Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.

    2015-09-01

    AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on CloudStack, an open-source platform, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across mainland China. Users can use and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system. Each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.

  14. Pattern recognition of satellite cloud imagery for improved weather prediction

    NASA Technical Reports Server (NTRS)

    Gautier, Catherine; Somerville, Richard C. J.; Volfson, Leonid B.

    1986-01-01

    The major accomplishment was the successful development of a method for extracting time derivative information from geostationary meteorological satellite imagery. This research is a proof-of-concept study which demonstrates the feasibility of using pattern recognition techniques and a statistical cloud classification method to estimate time rate of change of large-scale meteorological fields from remote sensing data. The cloud classification methodology is based on typical shape function analysis of parameter sets characterizing the cloud fields. The three specific technical objectives, all of which were successfully achieved, are as follows: develop and test a cloud classification technique based on pattern recognition methods, suitable for the analysis of visible and infrared geostationary satellite VISSR imagery; develop and test a methodology for intercomparing successive images using the cloud classification technique, so as to obtain estimates of the time rate of change of meteorological fields; and implement this technique in a testbed system incorporating an interactive graphics terminal to determine the feasibility of extracting time derivative information suitable for comparison with numerical weather prediction products.

  15. Cloud-Top Entrainment in Stratocumulus Clouds

    NASA Astrophysics Data System (ADS)

    Mellado, Juan Pedro

    2017-01-01

    Cloud entrainment, the mixing between cloudy and clear air at the boundary of clouds, constitutes one paradigm for the relevance of small scales in the Earth system: By regulating cloud lifetimes, meter- and submeter-scale processes at cloud boundaries can influence planetary-scale properties. Understanding cloud entrainment is difficult given the complexity and diversity of the associated phenomena, which include turbulence entrainment within a stratified medium, convective instabilities driven by radiative and evaporative cooling, shear instabilities, and cloud microphysics. Obtaining accurate data at the required small scales is also challenging, for both simulations and measurements. During the past few decades, however, high-resolution simulations and measurements have greatly advanced our understanding of the main mechanisms controlling cloud entrainment. This article reviews some of these advances, focusing on stratocumulus clouds, and indicates remaining challenges.

  16. Advances in the TRIDEC Cloud

    NASA Astrophysics Data System (ADS)

    Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven

    2016-04-01

    The TRIDEC Cloud is a platform that merges several complementary cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPUs), for web-mapping of hazard-specific geospatial data, and for serving the relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The platform offers a modern web-based graphical user interface so that operators in warning centres and stakeholders of other involved parties (e.g. CPAs, ministries) need only a standard web browser to access a full-fledged early warning and information system with unique interactive features such as Cloud Messages and Shared Maps. Furthermore, the TRIDEC Cloud can be accessed in different modes, e.g. the monitoring mode, which provides the important functionality required to act in a real event, and the exercise-and-training mode, which enables training and exercises with virtual scenarios re-played by a scenario player. The software system architecture and open interfaces facilitate global coverage, so that the system is applicable to any region in the world, and allow the integration of different sensor systems as well as of other hazard types and use cases beyond tsunami early warning. Current advances of the TRIDEC Cloud platform will be summarized in this presentation.

  17. T-Check in System-of-Systems Technologies: Cloud Computing

    DTIC Science & Technology

    2010-09-01

    Provides developers with tools to build their own cloud computing infrastructures [3tera 2010]. Eucalyptus Systems: Provides an open-source ... for cloud computing [Eucalyptus 2010]. The National Institute of Standards and Technology (NIST) defines two additional types of cloud ... Conclusions and Open Questions: Cloud computing is in essence an economic model—a different way to acquire and manage ...

  18. !CHAOS: A cloud of controls

    NASA Astrophysics Data System (ADS)

    Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.

    2016-01-01

    This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private Cloud Computing infrastructure devoted to accelerator control systems and large experiments of High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (Italian Ministry of Research and Education) and develops a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private Cloud infrastructures based on OpenStack.

  19. Statistical Analyses of Satellite Cloud Object Data From CERES. Part 4; Boundary-layer Cloud Objects During 1998 El Nino

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce A.; Parker, Lindsay

    2006-01-01

    Three boundary-layer cloud object types, stratus, stratocumulus and cumulus, that occurred over the Pacific Ocean during January-August 1998, are identified from the CERES (Clouds and the Earth's Radiant Energy System) single scanner footprint (SSF) data from the TRMM (Tropical Rainfall Measuring Mission) satellite. This study emphasizes the differences and similarities in the characteristics of each cloud-object type between the tropical and subtropical regions, among different size categories, and among small geographic areas. Both the frequencies of occurrence and the statistical distributions of cloud physical properties are analyzed. In terms of frequency of occurrence, stratocumulus clouds dominate the entire boundary-layer cloud population in all regions and among all size categories. Stratus clouds are more prevalent in the subtropics and near the coastal regions, while cumulus clouds are relatively prevalent over open ocean and the equatorial regions, particularly within the small size categories. The largest size category of stratus cloud objects occurs more frequently in the subtropics than in the tropics and has a much larger average size than its cumulus and stratocumulus counterparts. Each of the three cloud object types exhibits small differences in the statistical distributions of cloud optical depth, liquid water path, TOA albedo and perhaps cloud-top height, but large differences in those of cloud-top temperature and OLR between the tropics and subtropics. Differences in the sea surface temperature (SST) distributions between the tropics and subtropics influence some of the cloud macrophysical properties, but cloud microphysical properties and albedo for each cloud object type are likely determined by (local) boundary-layer dynamics and structures. Systematic variations of cloud optical depth, TOA albedo, cloud-top height, OLR and SST with cloud object sizes are pronounced for the stratocumulus and stratus types, which are related to systematic ...

  20. Emulating JWST Exoplanet Transit Observations in a Testbed laboratory experiment

    NASA Astrophysics Data System (ADS)

    Touli, D.; Beichman, C. A.; Vasisht, G.; Smith, R.; Krist, J. E.

    2014-12-01

    The transit technique is used for the detection and characterization of exoplanets. The combination of transit and radial velocity (RV) measurements gives information about a planet's radius and mass, respectively, leading to an estimate of the planet's density (Borucki et al. 2011) and therefore to its composition and evolutionary history. Transit spectroscopy can provide information on atmospheric composition and structure (Fortney et al. 2013). Spectroscopic observations of individual planets have revealed atomic and molecular species such as H2O, CO2 and CH4 in the atmospheres of planets orbiting bright stars, e.g. Deming et al. (2013). Transit observations require extremely precise photometry. For instance, a Jupiter transit results in a 1% brightness decrease of a solar-type star, while the Earth causes only a 0.0084% decrease (84 ppm). Spectroscopic measurements require still greater precision, <30 ppm. The Precision Projector Laboratory (PPL) is a collaboration between the Jet Propulsion Laboratory (JPL) and the California Institute of Technology (Caltech) to characterize and validate detectors through emulation of science images. At PPL we have developed a testbed to project simulated spectra and other images onto a HgCdTe array in order to assess precision photometry for transits, weak lensing, etc., for concepts like JWST, WFIRST, and EUCLID. In our controlled laboratory experiment, the goal is to demonstrate the ability to extract weak transit spectra as expected for NIRCam, NIRISS and NIRSpec. Two lamps of variable intensity, along with spectral line and photometric simulation masks, emulate the signals from a star only, from a planet only, and finally from a combination of a planet + star. Three masks have been used to simulate spectra in monochromatic light. These masks, which are fabricated at JPL, have a length of 1000 pixels and widths of 2 pixels, 10 pixels and 1 pixel, corresponding respectively to the JWST instruments noted above.
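
    The quoted depths follow from the standard transit-depth relation; a quick check (using IAU nominal radii) is:

      # Transit depth ~ (R_planet / R_star)^2 for a central transit.
      R_SUN, R_JUP, R_EARTH = 695_700.0, 69_911.0, 6_371.0       # km
      print(f"Jupiter: {(R_JUP / R_SUN) ** 2:.4f}")              # ~0.0101, i.e. ~1%
      print(f"Earth:   {(R_EARTH / R_SUN) ** 2 * 1e6:.0f} ppm")  # ~84 ppm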

  1. Ensemble Prediction System Development for Hydrometeorological Testbed (HMT) Application

    NASA Astrophysics Data System (ADS)

    Jankov, I.; Albers, S.; Yuan, H.; Wharton, L.; Toth, Z.; Schneider, T.; White, A.; Ralph, M.

    2010-09-01

    Significant precipitation events in California during the winter season are often caused by land-falling "atmospheric rivers" associated with extratropical cyclones from the Pacific Ocean. Atmospheric rivers are narrow, elongated plumes of enhanced water vapor transport over the Pacific and Atlantic oceans that can extend from the tropics and subtropics into the extratropics. Large values of integrated water vapor are advected within the warm sector of extratropical cyclones immediately ahead of polar cold fronts, although the source of these vapor plumes can originate in the tropics beyond the cyclone warm sector. When an atmospheric river makes landfall on the coast of California, the northwest-to-southeast orientation of the Sierra Mountain chain exerts orographic forcing on the southwesterly low-level flow in the warm sector of approaching extratropical cyclones. As a result, sustained precipitation is typically enhanced and modified by the complex terrain. This has major hydrological consequences. The National Oceanic and Atmospheric Administration (NOAA) has established the Hydrometeorological Testbed (HMT) to design and support a series of field and numerical modeling experiments to better understand and forecast precipitation in the Central Valley. The main role of the Forecast Application Branch (NOAA/ESRL/GSD) in HMT has been in supporting the real-time numerical forecasts as well as research activities targeting better understanding and improvement of Quantitative Precipitation Forecasts (QPF). For this purpose, an ensemble modeling system has been developed. The ensemble system consists of mixed dynamic cores, mixed physics and mixed lateral boundary conditions. Details of the ensemble settings and performance will be presented at the conference.

  2. Laboratory Spacecraft Data Processing and Instrument Autonomy: AOSAT as Testbed

    NASA Astrophysics Data System (ADS)

    Lightholder, Jack; Asphaug, Erik; Thangavelautham, Jekan

    2015-11-01

    Recent advances in small spacecraft allow for their use as orbiting microgravity laboratories (e.g. Asphaug and Thangavelautham, LPSC 2014) that will produce substantial amounts of data. Power, bandwidth, and processing constraints impose limitations on the number of operations which can be performed on these data as well as on the data volume the spacecraft can downlink. We show that instrument autonomy and machine learning techniques can intelligently conduct data reduction and downlink queueing to meet data storage and downlink limitations. As small spacecraft laboratory capabilities increase, we must find techniques to increase instrument autonomy and spacecraft scientific decision making. The Asteroid Origins Satellite (AOSAT) CubeSat centrifuge will act as a testbed for further proving these techniques. Lightweight algorithms, such as connected-components analysis, centroid tracking, K-means clustering, edge detection, convex hull analysis, and intelligent cropping routines, can be coupled with traditional packet compression routines to reduce data transfer per image as well as provide a first-order filtering of which data are most relevant to downlink. This intelligent queueing provides timelier downlink of scientifically relevant data while reducing the amount of irrelevant downlinked data. The resulting algorithms allow scientists to throttle the amount of data downlinked based on initial experimental results. The data downlink pipeline, prioritized for scientific relevance based on incorporated scientific objectives, can continue from the spacecraft until the data are no longer fruitful. Coupled with data compression and cropping strategies at the data packet level, bandwidth reductions exceeding 40% can be achieved while still downlinking the data deemed most relevant in a double-blind study between scientist and algorithm. Applications of this technology allow for the incorporation of instrumentation which produces significant data volumes on small spacecraft.
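
    A minimal sketch of the score-and-queue idea described above, using connected-components analysis to rank frames before downlink (the scoring heuristic, threshold, and budget are illustrative, not AOSAT flight software):

        import numpy as np
        from scipy import ndimage

        def science_score(image, threshold=0.5):
            """Crude relevance score: number and extent of bright connected regions."""
            mask = image > threshold
            labels, n = ndimage.label(mask)
            return 0.0 if n == 0 else n + mask.sum() / mask.size

        def downlink_queue(frames, budget):
            """Rank frames by score; keep only what fits the downlink budget."""
            ranked = sorted(frames, key=science_score, reverse=True)
            return ranked[:budget]

        frames = [np.random.rand(64, 64) for _ in range(20)]
        queued = downlink_queue(frames, budget=5)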

  3. Results from SIM's Thermo-Opto-Mechanical (TOM3) Testbed

    NASA Technical Reports Server (NTRS)

    Goullioud, Renaud; Lindensmith, C. A.; Hahn, I.

    2006-01-01

    Future space-based optical interferometers, such as the Space Interferometry Mission PlanetQuest (SIM), require thermal stability of the optical wavefront at the level of picometers in order to produce astrometric data at the micro-arcsecond level. In SIM, the internal path of the interferometer will be measured with a small metrology beam, whereas the starlight fringe position is estimated from a large concentric annular beam. To achieve the micro-arcsecond observation goal for SIM, it is necessary to maintain the optical path difference between the central and outer annulus portions of the wavefront of the front-end telescope optics to a few tens of picometers. The Thermo-Opto-Mechanical testbed (TOM3) was developed at the Jet Propulsion Laboratory to measure thermally induced optical deformations of a full-size flight-like beam compressor and siderostat, the two largest optics on SIM, in flight-like thermal environments. A Common Path Heterodyne Interferometer (COPHI) developed at JPL was used as the metrology sensor for the fine optical path difference measurement. The system was integrated inside a large vacuum chamber in order to mitigate atmospheric and thermal disturbances. The siderostat was installed in a temperature-controlled thermal shroud inside the vacuum chamber, creating a flight-like thermal environment. Detailed thermal and structural models of the test articles (siderostat and compressor) were also developed for model prediction and correlation of the thermal deformations. Experimental data show that the test articles achieve the thermal stability SIM requires and agree well with the model predictions.

  4. Earthbound Unmanned Autonomous Vehicles (UAVS) As Planetary Science Testbeds

    NASA Astrophysics Data System (ADS)

    Pieri, D. C.; Bland, G.; Diaz, J. A.; Fladeland, M. M.

    2014-12-01

    Recent advances in the technology of unmanned vehicles have greatly expanded the range of contemplated terrestrial operational environments for their use, including aerial, surface, and submarine. The advances have been most pronounced in the areas of autonomy, miniaturization, durability, standardization, and ease of operation, most notably (especially in the popular press) for airborne vehicles. Of course, for a wide range of planetary venues, autonomy, at high cost in both money and risk, has always been a requirement. Most recently, missions to Mars have also featured an unprecedented degree of mobility. Combining the traditional planetary surface deployment operational and science imperatives with emerging, very accessible, and relatively economical small UAV platforms on Earth can provide flexible, rugged, self-directed test-bed platforms for landed instruments and strategies that will ultimately be directed elsewhere, and, in the process, provide valuable Earth science data. While the most direct transfer of technology from terrestrial to planetary venues is perhaps for bodies with atmospheres (and oceans), with appropriate technology and strategy accommodations, single and networked UAVs can be designed to operate on even airless bodies, under a variety of gravities. In this presentation, we present and use results and lessons learned from our recent earth-bound UAV volcano deployments, as well as our future plans for such, to conceptualize a range of planetary and small-body missions. We gratefully acknowledge the assistance of students and colleagues at our home institutions, and the government of Costa Rica, without which our UAV deployments would not have been possible. This work was carried out, in part, at the Jet Propulsion Laboratory of the California Institute of Technology under contract to NASA.

  5. Graphical interface between the CIRSSE testbed and CimStation software with MCS/CTOS

    NASA Technical Reports Server (NTRS)

    Hron, Anna B.

    1992-01-01

    This research is concerned with developing a graphical simulation of the testbed at the Center for Intelligent Robotic Systems for Space Exploration (CIRSSE) and the interface which allows for communication between the two. Such an interface is useful in telerobotic operations and as a functional interaction tool for testbed users. Creating a simulated model of a real-world system inevitably generates calibration discrepancies between the two. This thesis gives a brief overview of the work done to date in the area of workcell representation and communication, describes the development of the CIRSSE interface, and gives a direction for future work in the area of system calibration. The CimStation software used for development of this interface is a highly versatile robotic workcell simulation package which has been programmed for this application with a scale graphical model of the testbed and supporting interface menu code. A need for this tool has been identified for path previewing, as a window on teleoperation, and for calibration of simulated versus real-world models. The interface allows information (i.e., joint angles) generated by CimStation to be sent as motion goal positions to the testbed robots. As an option, joint angle information generated by supporting testbed algorithms (i.e., TG, collision avoidance) can be piped through CimStation as a visual preview of the path.

  6. The Development of a Smart Distribution Grid Testbed for Integrated Information Management Systems

    SciTech Connect

    Lu, Ning; Du, Pengwei; Paulson, Patrick R.; Greitzer, Frank L.; Guo, Xinxin; Hadley, Mark D.

    2011-07-28

    This paper presents a smart distribution grid testbed to test or compare designs of integrated information management systems (I2MSs). An I2MS extracts and synthesizes information from a wide range of data sources to detect abnormal system behaviors, identify possible causes, assess the system status, and provide grid operators with response suggestions. The objective of the testbed is to provide a modeling environment with sufficient data sources for the I2MS design. The testbed includes five information layers and a physical layer; it generates multi-layer chronological data based on actual measurement playbacks or simulated data sets produced by the physical layer. The testbed models random hardware failures, human errors, extreme weather events, and deliberate tampering attempts to allow users to evaluate the performance of different I2MS designs. Initial results of I2MS performance tests showed that the testbed created a close-to-real-world environment that allowed key performance metrics of the I2MS to be evaluated.
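
    A minimal sketch of how such multi-layer chronological data with injected anomalies might be generated for exercising an I2MS design (the layer count, event types, and rates are illustrative, not the testbed's actual configuration):

        import random

        EVENTS = ["hardware_failure", "human_error", "extreme_weather", "tampering"]

        def timeline(n_steps, event_rate=0.01, n_layers=5, seed=0):
            """Yield (t, per-layer readings, injected event or None) for playback."""
            rng = random.Random(seed)
            for t in range(n_steps):
                readings = {f"layer_{k}": rng.gauss(100.0, 5.0) for k in range(n_layers)}
                event = rng.choice(EVENTS) if rng.random() < event_rate else None
                if event:  # corrupt one layer so the I2MS has something to detect
                    readings[f"layer_{rng.randrange(n_layers)}"] *= rng.uniform(0.2, 1.8)
                yield t, readings, event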

  7. Preliminary results of the LLNL airborne experimental test-bed SAR system

    SciTech Connect

    Miller, M.G.; Mullenhoff, C.J.; Kiefer, R.D.; Brase, J.M.; Wieting, M.G.; Berry, G.L.; Jones, H.E.

    1996-01-16

    The Imaging and Detection Program (IDP) within Laser Programs at Lawrence Livermore National Laboratory (LLNL), in cooperation with the Hughes Aircraft Company, has developed a versatile, high-performance, airborne experimental test-bed (AETB) capability. The test-bed has been developed for a wide range of research and development experimental applications, including radar and radiometry plus, with additional aircraft modifications, optical systems. The airborne test-bed capability has been developed within a Douglas EA-3B Skywarrior jet aircraft provided and flown by Hughes Aircraft Company. The current test-bed payload consists of an X-band radar system, a high-speed data acquisition system, and a real-time processing capability. The medium-power radar system is configured to operate in a high-resolution, synthetic aperture radar (SAR) mode and is highly configurable in terms of waveforms, PRF, bandwidth, etc. Antennas are mounted on a 2-axis gimbal in the belly radome of the aircraft, which provides pointing and stabilization. Aircraft position and antenna attitude are derived from a dedicated navigational system and provided to the real-time SAR image processor for instant image reconstruction and analysis. This paper presents a further description of the test-bed and payload subsystems plus preliminary results of SAR imagery.
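
    For a pulse-compressed SAR of this kind, slant-range resolution follows directly from the transmitted bandwidth, delta_r = c / (2B). A small worked example (the 100 MHz bandwidth is illustrative; the AETB's actual waveform parameters are configurable and not given above):

        C = 3.0e8  # speed of light, m/s

        def slant_range_resolution(bandwidth_hz):
            """Range resolution of a pulse-compressed radar: c / (2B)."""
            return C / (2.0 * bandwidth_hz)

        print(slant_range_resolution(100e6))  # 1.5 m at 100 MHz chirp bandwidth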

  8. Conceptual Design and Cost Estimate of a Subsonic NASA Testbed Vehicle (NTV) for Aeronautics Research

    NASA Technical Reports Server (NTRS)

    Nickol, Craig L.; Frederic, Peter

    2013-01-01

    A conceptual design and cost estimate for a subsonic flight research vehicle designed to support NASA's Environmentally Responsible Aviation (ERA) project goals is presented. To investigate the technical and economic feasibility of modifying an existing aircraft, a highly modified Boeing 717 was developed for maturation of technologies supporting the three ERA project goals of reduced fuel burn, noise, and emissions. This modified 717 utilizes midfuselage-mounted modern high-bypass-ratio engines in conjunction with engine exhaust shielding structures to provide a low-noise testbed. The testbed also integrates a natural-laminar-flow wing section and active flow control for the vertical tail. An eight-year program plan was created to incrementally modify and test the vehicle, enabling the suite of technology benefits to be isolated and quantified. Based on the conceptual design and programmatic plan for this testbed vehicle, a full cost estimate of $526M was developed, representing then-year dollars at a 50% confidence level.

  9. Recent experiments conducted with the Wide-field imaging interferometry testbed (WIIT)

    NASA Astrophysics Data System (ADS)

    Leisawitz, David T.; Juanola-Parramon, Roser; Bolcar, Matthew; Fienup, James R.; Iacchetta, Alexander S.; Maher, Stephen F.; Rinehart, Stephen A.

    2016-08-01

    The Wide-field Imaging Interferometry Testbed (WIIT) was developed at NASA's Goddard Space Flight Center to demonstrate and explore the practical limitations inherent in wide field-of-view "double Fourier" (spatio-spectral) interferometry. The testbed delivers high-quality interferometric data and is capable of observing spatially and spectrally complex hyperspectral test scenes. Although WIIT operates at visible wavelengths, by design the data are representative of those from a space-based far-infrared observatory. We used WIIT to observe a calibrated, independently characterized test scene of modest spatial and spectral complexity, and an astronomically realistic test scene of much greater spatial and spectral complexity. This paper describes the experimental setup, summarizes the performance of the testbed, and presents representative data.

  10. Comparison of two matrix data structures for advanced CSM testbed applications

    NASA Technical Reports Server (NTRS)

    Regelbrugge, M. E.; Brogan, F. A.; Nour-Omid, B.; Rankin, C. C.; Wright, M. A.

    1989-01-01

    The first section describes data storage schemes presently used by the Computational Structural Mechanics (CSM) testbed sparse matrix facilities and similar skyline (profile) matrix facilities. The second section contains a discussion of certain features required for the implementation of particular advanced CSM algorithms, and how these features might be incorporated into the data storage schemes described previously. The third section presents recommendations, based on the discussions of the prior sections, for directing future CSM testbed development to provide necessary matrix facilities for advanced algorithm implementation and use. The objective is to lend insight into the matrix structures discussed and to help explain the process of evaluating alternative matrix data structures and utilities for subsequent use in the CSM testbed.
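
    For contrast, the two storage families discussed above can be sketched side by side: a general sparse row-compressed layout versus a skyline (profile) layout that stores each column from its first nonzero down to the diagonal. A minimal illustration, not the testbed's actual data structures:

        import numpy as np

        K = np.array([[4., 1., 0., 0.],
                      [1., 5., 2., 0.],
                      [0., 2., 6., 3.],
                      [0., 0., 3., 7.]])

        # Compressed sparse row: nonzero values, column indices, row pointers.
        vals, cols, rowptr = [], [], [0]
        for row in K:
            nz = np.nonzero(row)[0]
            vals.extend(row[nz]); cols.extend(nz)
            rowptr.append(len(vals))

        # Skyline (upper profile, by column): first nonzero down to the diagonal.
        sky, colptr = [], [0]
        for j in range(K.shape[0]):
            first = np.nonzero(K[:j + 1, j])[0][0]
            sky.extend(K[first:j + 1, j])
            colptr.append(len(sky))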

  11. Towards an autonomous telescope system: the Test-Bed Telescope project

    NASA Astrophysics Data System (ADS)

    Racero, E.; Ocaña, F.; Ponz, D.; the TBT Consortium

    2015-05-01

    In the context of the Space Situational Awareness (SSA) programme of ESA, it is foreseen to deploy several large robotic telescopes in remote locations to provide surveillance and tracking services for man-made as well as natural near-Earth objects (NEOs). The present project, termed the Test-Bed Telescope (TBT), is being developed under ESA's General Studies and Technology Programme and shall implement a test-bed for the validation of an autonomous optical observing system in a realistic scenario, consisting of two telescopes located in Spain and Australia, to collect representative test data for precursor NEO services. It is foreseen that this test-bed environment will be used to validate future prototype software systems as well as to evaluate remote monitoring and control techniques. The test-bed system will be capable of delivering astrometric and photometric data of the observed objects in near real time. This contribution describes the current status of the project.

  12. Definition study for variable cycle engine testbed engine and associated test program

    NASA Technical Reports Server (NTRS)

    Vdoviak, J. W.

    1978-01-01

    The product/study double bypass variable cycle engine (VCE) was updated to incorporate recent improvements. The effect of these improvements on mission range and noise levels was determined. This engine design was then compared with current existing high-technology core engines in order to define a subscale testbed configuration that simulated many of the critical technology features of the product/study VCE. Detailed preliminary program plans were then developed for the design, fabrication, and static test of the selected testbed engine configuration. These plans included estimated costs and schedules for the detail design, fabrication and test of the testbed engine and the definition of a test program, test plan, schedule, instrumentation, and test stand requirements.

  13. SPHERES tethered formation flight testbed: advancements in enabling NASA's SPECS mission

    NASA Astrophysics Data System (ADS)

    Chung, Soon-Jo; Adams, Danielle; Saenz-Otero, Alvar; Kong, Edmund; Miller, David W.; Leisawitz, David; Lorenzini, Enrico; Sell, Steve

    2006-06-01

    This paper reports on efforts to control a tethered formation flight spacecraft array for NASA's SPECS mission using the SPHERES test-bed developed by the MIT Space Systems Laboratory. Specifically, advances in methodology and experimental results realized since the 2005 SPIE paper are emphasized. These include a new test-bed setup with a reaction wheel assembly, a novel relative attitude measurement system using force-torque sensors, and modeling of non-ideal tethers to account for tether vibration modes. The nonlinear equations of motion of multi-vehicle tethered spacecraft with elastic flexible tethers are derived from Lagrange's equations. The controllability analysis indicates that both array resizing and spin-up are fully controllable by the reaction wheels and the tether motor, thereby reducing thruster fuel consumption. Based upon this analysis, linear and nonlinear controllers have been successfully implemented on the tethered SPHERES testbed and tested at NASA MSFC's flat floor facility using two- and three-SPHERES configurations.

  14. The hypercluster: A parallel processing test-bed architecture for computational mechanics applications

    NASA Technical Reports Server (NTRS)

    Blech, Richard A.

    1987-01-01

    The development of numerical methods and software tools for parallel processors can be aided through the use of a hardware test-bed. The test-bed architecture must be flexible enough to support investigations into architecture-algorithm interactions. One way to implement a test-bed is to use a commercial parallel processor. Unfortunately, most commercial parallel processors are fixed in their interconnection and/or processor architecture. In this paper, we describe a modified n-cube architecture, called the hypercluster, which is a superset of many other processor and interconnection architectures. The hypercluster is intended to support research into parallel processing of computational fluid and structural mechanics problems, which may require a number of different architectural configurations. An example of how a typical partial differential equation solution algorithm maps onto the hypercluster is given.

  15. Multivesicular Assemblies as Real-World Testbeds for Embryogenic Evolutionary Systems

    NASA Astrophysics Data System (ADS)

    Hadorn, Maik; Eggenberger Hotz, Peter

    Embryogenic evolution emulates cell-like entities in silico to obtain more powerful methods for complex evolutionary tasks. As simulations have to abstract from the biological model, implicit information hidden in its physics is lost. Here, we propose to use cell-like entities as a real-world in vitro testbed. In analogy to evolutionary robotics, where solutions evolved in simulation may be tested in the real world at macroscale, the proposed vesicular testbed would do the same for embryogenic evolutionary tasks at mesoscale. As a first step towards a vesicular testbed emulating growth, cell division, and cell differentiation, we present a modified vesicle production method providing custom-tailored chemical cargo, and a novel self-assembly procedure providing vesicle aggregates of programmable composition.

  16. Flight Testing of Guidance, Navigation and Control Systems on the Mighty Eagle Robotic Lander Testbed

    NASA Technical Reports Server (NTRS)

    Hannan, Mike; Rickman, Doug; Chavers, Greg; Adam, Jason; Becker, Chris; Eliser, Joshua; Gunter, Dan; Kennedy, Logan; O'Leary, Patrick

    2015-01-01

    During 2011, a series of progressively more challenging flight tests of the Mighty Eagle autonomous terrestrial lander testbed were conducted, primarily to validate the GNC system for a proposed lunar lander. With the successful completion of this GNC validation objective, the opportunity existed to utilize the Mighty Eagle as a flying testbed for a variety of technologies. In 2012, an Autonomous Rendezvous and Capture (AR&C) algorithm was implemented in flight software and demonstrated in a series of flight tests. Also in 2012, a hazard avoidance system was developed and flight tested on the Mighty Eagle. Additionally, GNC algorithms from Moon Express and a MEMS IMU were tested in 2012. All of the testing described herein was above and beyond the original charter for the Mighty Eagle. In addition to being an excellent testbed for a wide variety of systems, the Mighty Eagle also provided a great learning opportunity for many engineers and technicians to work a flight program.

  17. An Unattended Cloud-Profiling Radar for Use in Climate Research.

    NASA Astrophysics Data System (ADS)

    Moran, Kenneth P.; Martner, Brooks E.; Post, M. J.; Kropfli, Robert A.; Welsh, David C.; Widener, Kevin B.

    1998-03-01

    A new millimeter-wave cloud radar (MMCR) has been designed to provide detailed, long-term observations of nonprecipitating and weakly precipitating clouds at Cloud and Radiation Testbed (CART) sites of the Department of Energy's Atmospheric Radiation Measurement (ARM) program. Scientific requirements included excellent sensitivity and vertical resolution to detect weak and thin multiple layers of ice and liquid water clouds over the sites, and long-term, unattended operation in remote locales. In response to these requirements, the innovative radar design features a vertically pointing, single-polarization, Doppler system operating at 35 GHz (Ka band). It uses a low-peak-power transmitter for long-term reliability, and a high-gain antenna and pulse-compressed waveforms to maximize sensitivity and resolution. The radar uses the same kind of signal processor as that used in commercial wind profilers. The first MMCR began operations at the CART site in northern Oklahoma in late 1996 and has operated continuously there for thousands of hours. It routinely provides remarkably detailed images of the ever-changing cloud structure and kinematics over this densely instrumented site. Examples of the data are presented. The radar measurements will greatly improve quantitative documentation of cloud conditions over the CART sites and will bolster ARM research to understand how clouds impact climate through their effects on radiative transfer. Millimeter-wave radars such as the MMCR also have potential applications in the fields of aviation weather, weather modification, and basic cloud physics research.
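
    The low-peak-power design works because pulse compression recovers sensitivity in proportion to the waveform's time-bandwidth product. A small worked example (the numbers are illustrative, not the MMCR's actual waveform):

        import math

        def compression_gain_db(pulse_width_s, bandwidth_hz):
            """SNR gain of pulse compression ~ 10*log10(time-bandwidth product)."""
            return 10.0 * math.log10(pulse_width_s * bandwidth_hz)

        print(compression_gain_db(20e-6, 5e6))  # ~20 dB over an uncoded pulse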

  18. Chapter 25: Cloud-Resolving Modeling: ARM and the GCSS Story

    NASA Technical Reports Server (NTRS)

    Krueger, Steven K.; Morrison, Hugh; Fridlind, Ann M.

    2016-01-01

    The Global Energy and Water Cycle Experiment (GEWEX) Cloud System Study (GCSS) was created in 1992. As described by Browning et al., "The focus of GCSS is on cloud systems spanning the mesoscale rather than on individual clouds. Observations from field programs will be used to develop and validate the cloud-resolving models, which in turn will be used as test-beds to develop the parameterizations for the large-scale models." The most important activities that GCSS promoted were the following: (1) identify key questions about cloud systems relating to parameterization issues and suggest approaches to address them, and (2) organize model intercomparison studies relevant to cloud parameterization. Four cloud system types were chosen for GCSS to study: boundary layer, cirrus, frontal, and deep precipitating convective. A working group (WG) was formed for each of the cloud system types. The WGs organized model intercomparison studies and meetings to present results of the intercomparisons. The first such intercomparison study took place in 1994.

  19. The Open University Opens.

    ERIC Educational Resources Information Center

    Tunstall, Jeremy, Ed.

    Conceived by the British Labour Government in the 1960s, the Open University was viewed as a way to extend higher education to Britain's working class, but enrollment figures in classes that represent traditional academic disciplines show that the student population is predominantly middle class. Bringing education into the home presents numerous…

  20. Virtual Pipeline System Testbed to Optimize the U.S. Natural Gas Transmission Pipeline System

    SciTech Connect

    Kirby S. Chapman; Prakash Krishniswami; Virg Wallentine; Mohammed Abbaspour; Revathi Ranganathan; Ravi Addanki; Jeet Sengupta; Liubo Chen

    2005-06-01

    The goal of this project is to develop a Virtual Pipeline System Testbed (VPST) for natural gas transmission. This study uses a fully implicit finite difference method to analyze transient, nonisothermal compressible gas flow through a gas pipeline system. The inertia term of the momentum equation is included in the analysis. The testbed simulates compressor stations, the pipe that connects these compressor stations, the supply sources, and the end-user demand markets. Each compressor station is described by identifying the make, model, and number of engines, gas turbines, and compressors. System operators and engineers can analyze the impact of system changes on the dynamic deliverability of gas and on the environment.
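
    In standard form, the one-dimensional equations behind such a simulation are conservation of mass and momentum for compressible pipe flow (this is the textbook formulation; the VPST's exact discretization is not given above):

        \frac{\partial \rho}{\partial t} + \frac{\partial (\rho u)}{\partial x} = 0,
        \qquad
        \frac{\partial (\rho u)}{\partial t} + \frac{\partial (\rho u^2)}{\partial x}
          + \frac{\partial p}{\partial x}
          + \frac{f \rho u \lvert u \rvert}{2D}
          + \rho g \sin\theta = 0

    where rho is density, u velocity, p pressure, f the Darcy friction factor, D the pipe diameter, and theta the pipe inclination; the first two terms of the momentum equation are the inertia terms often dropped in quasi-steady models.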

  1. SAVA 3: A testbed for integration and control of visual processes

    NASA Technical Reports Server (NTRS)

    Crowley, James L.; Christensen, Henrik

    1994-01-01

    The development of an experimental test-bed to investigate the integration and control of perception in a continuously operating vision system is described. The test-bed integrates a 12-axis robotic stereo camera head mounted on a mobile robot, dedicated computer boards for real-time image acquisition and processing, and a distributed system for image description. The architecture was designed to: (1) be continuously operating; (2) integrate software contributions from geographically dispersed laboratories; (3) integrate description of the environment with 2D measurements, 3D models, and recognition of objects; (4) be capable of supporting diverse experiments in gaze control, visual servoing, navigation, and object surveillance; and (5) be dynamically reconfigurable.

  2. The Segmented Aperture Interferometric Nulling Testbed (SAINT) I: overview and air-side system description

    NASA Astrophysics Data System (ADS)

    Hicks, Brian A.; Lyon, Richard G.; Petrone, Peter; Ballard, Marlin; Bolcar, Matthew R.; Bolognese, Jeff; Clampin, Mark; Dogoda, Peter; Dworzanski, Daniel; Helmbrecht, Michael A.; Koca, Corina; Shiri, Ron

    2016-07-01

    This work presents an overview of the Segmented Aperture Interferometric Nulling Testbed (SAINT), a project that will pair an actively-controlled macro-scale segmented mirror with the Visible Nulling Coronagraph (VNC). SAINT will incorporate the VNC's demonstrated wavefront sensing and control system to refine and quantify end-to-end high-contrast starlight suppression performance. This pathfinder testbed will be used as a tool to study and refine approaches to mitigating instabilities and complex diffraction expected from future large segmented aperture telescopes.

  3. Regenerative Fuel Cell System Testbed Program for Government and Commercial Applications

    NASA Technical Reports Server (NTRS)

    1996-01-01

    NASA Lewis Research Center's Electrochemical Technology Branch has led a multiagency effort to design, fabricate, and operate a regenerative fuel cell (RFC) system testbed. Key objectives of this program are to evaluate, characterize, and demonstrate fully integrated RFC's for space, military, and commercial applications. The Lewis-led team is implementing the program through a unique international coalition that encompasses both Government and industry participants. Construction of the 25-kW RFC testbed at the NASA facility at Edwards Air Force Base was completed in January 1995, and the system has been operational since that time.

  4. Jovian clouds and haze

    NASA Astrophysics Data System (ADS)

    West, Robert A.; Baines, Kevin H.; Friedson, A. James; Banfield, Don; Ragent, Boris; Taylor, Fred W.

    Tropospheric clouds: thermochemical equilibrium theory and cloud microphysical theory, condensate cloud microphysics, tropospheric cloud and haze distribution (observations), results from the Galileo probe experiments, Galileo NIMS observations and results, Galileo SSI observations and results, recent analyses of ground-based and HST data; Tropospheric clouds and haze, optical and physical properties: particle composition, particle optical properties, size and shape, chromophores; Stratospheric haze: particle distribution, optical properties, size and shape, particle formation.

  5. The Oort cloud

    NASA Technical Reports Server (NTRS)

    Marochnik, Leonid S.; Mukhin, Lev M.; Sagdeev, Roald Z.

    1991-01-01

    Views of the large-scale structure of the solar system, consisting of the Sun, the nine planets and their satellites, changed when Oort demonstrated that a gigantic cloud of comets (the Oort cloud) is located on the periphery of the solar system. The following subject areas are covered: (1) the Oort cloud's mass; (2) Hill's cloud mass; (3) angular momentum distribution in the solar system; and (4) the cometary cloud around other stars.

  6. Towards Autonomous Operations of the Robonaut 2 Humanoid Robotic Testbed

    NASA Technical Reports Server (NTRS)

    Badger, Julia; Nguyen, Vienny; Mehling, Joshua; Hambuchen, Kimberly; Diftler, Myron; Luna, Ryan; Baker, William; Joyce, Charles

    2016-01-01

    The Robonaut project has been conducting research in robotics technology on board the International Space Station (ISS) since 2012. Recently, the original upper body humanoid robot was upgraded by the addition of two climbing manipulators ("legs"), more capable processors, and new sensors, as shown in Figure 1. While Robonaut 2 (R2) has been working through checkout exercises on orbit following the upgrade, technology development on the ground has continued to advance. Through the Active Reduced Gravity Offload System (ARGOS), the Robonaut team has been able to develop technologies that will enable full operation of the robotic testbed on orbit using similar robots located at the Johnson Space Center. Once these technologies have been vetted in this way, they will be implemented and tested on the R2 unit on board the ISS. The goal of this work is to create a fully-featured robotics research platform on board the ISS to increase the technology readiness level of technologies that will aid in future exploration missions. Technology development has thus far followed two main paths, autonomous climbing and efficient tool manipulation. Central to both technologies has been the incorporation of a human robotic interaction paradigm that involves the visualization of sensory and pre-planned command data with models of the robot and its environment. Figure 2 shows screenshots of these interactive tools, built in rviz, that are used to develop and implement these technologies on R2. Robonaut 2 is designed to move along the handrails and seat track around the US lab inside the ISS. This is difficult for many reasons, namely the environment is cluttered and constrained, the robot has many degrees of freedom (DOF) it can utilize for climbing, and remote commanding for precision tasks such as grasping handrails is time-consuming and difficult. Because of this, it is important to develop the technologies needed to allow the robot to reach operator-specified positions as

  7. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    NASA Technical Reports Server (NTRS)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using the available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and to perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and
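
    The core of such a testability analysis is often expressed with a dependency (D-) matrix mapping tests to the faults they can observe; detection and isolation metrics then fall out directly. A minimal sketch of that bookkeeping (a toy matrix, not TEAMS-Designer's actual algorithm):

        import numpy as np

        # D[i, j] = 1 if test i detects fault j (toy dependency matrix).
        D = np.array([[1, 1, 0, 0],
                      [0, 1, 1, 0],
                      [0, 0, 1, 0]])

        detected = D.any(axis=0)              # fault fires at least one test
        detection_coverage = detected.mean()  # here 3 of 4 faults: 0.75

        # Faults are isolable from one another when their test signatures differ.
        signatures = {tuple(col) for col, d in zip(D.T, detected) if d}
        print(detection_coverage, len(signatures))  # unique signatures -> isolable classes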

  8. The Bermuda Testbed Mooring and Emerging Technologies for Interdisciplinary Research

    NASA Astrophysics Data System (ADS)

    Dickey, T. D.

    2001-12-01

    The Bermuda Testbed Mooring (BTM) program provides the oceanographic community with a deep-water platform for testing new instrumentation. Scientific studies also utilize data collected from the BTM, particularly in conjunction with the U.S. JGOFS Bermuda Atlantic Time-series Study (BATS) program. Additionally, the BTM has been used for groundtruthing of satellite ocean color imager (SeaWiFS) data. The mooring is located about 80 km southeast of Bermuda. Surface instruments have collected meteorological and spectral radiometric data from the buoy tower and measurements at depth have included: currents, temperature, bio-optical, chemical, and acoustical variables. The BTM captures a broad dynamic range of oceanic variability (minutes to years). Key results include: 1. Data obtained during passages of cold-core eddies have been used to estimate the role of such features on new production and carbon flux to the deep ocean. One of the observed features contained the greatest values of chlorophyll observed during the decade of observations at the site (based on BATS historical data base). The measurements provide high frequency, long-term data, which can be used for a) detailed studies of a variety of physical, chemical, bio-optical, and ecological processes on time scales from minutes to years, b) contextual information for many other observations made near the BTM/BATS sites, c) evaluation of undersampling/aliasing effects, and d) developing/testing models. 2. The dynamics of the upper ocean have been observed during transient re-stratification events and during passages of hurricanes and other intense storms. These observations are unique and the subject of ongoing modeling efforts. 3. BTM papers have provided new insights concerning bio-optical variability on short (minutes to day) time scales and have proven valuable for ocean color satellite groundtruthing. 4. During the BTM project, several new sensors and systems have been tested by U.S. and international groups

  9. Virtual infrastructure management in private and hybrid clouds.

    SciTech Connect

    Sotomayor, B.; Montero, R. S.; Llorente, I. M.; Foster, I.; Mathematics and Computer Science; Univ. of Chicago; Univ. Complutense of Madrid

    2009-01-01

    One of the many definitions of "cloud" is that of an infrastructure-as-a-service (IaaS) system, in which IT infrastructure is deployed in a provider's data center as virtual machines. With the growing popularity of IaaS clouds, tools and technologies are emerging that can transform an organization's existing infrastructure into a private or hybrid cloud. OpenNebula is an open-source virtual infrastructure manager that deploys virtualized services on both a local pool of resources and external IaaS clouds. Haizea, a resource lease manager, can act as a scheduling back end for OpenNebula, providing features not found in other cloud software or virtualization-based data center management software.
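
    For flavor, deploying a virtualized service through OpenNebula starts from a short VM template along these lines (a minimal illustrative template with placeholder names and values; Haizea lease attributes would be added per its documentation):

        # Minimal illustrative OpenNebula VM template (placeholder values)
        NAME   = "web-frontend"
        CPU    = 1
        MEMORY = 1024                        # MB
        DISK   = [ IMAGE = "ubuntu-base" ]
        NIC    = [ NETWORK = "private-lan" ]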

  10. Ice Clouds

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Thick water-ice clouds almost completely obscure the surface in Vastitas Borealis.

    Note: this THEMIS visual image has not been radiometrically or geometrically calibrated for this preliminary release. An empirical correction has been performed to remove instrumental effects. A linear shift has been applied in the cross-track and down-track directions to approximate spacecraft and planetary motion. Fully calibrated and geometrically projected images will be released through the Planetary Data System in accordance with Project policies at a later time.

    NASA's Jet Propulsion Laboratory manages the 2001 Mars Odyssey mission for NASA's Office of Space Science, Washington, D.C. The Thermal Emission Imaging System (THEMIS) was developed by Arizona State University, Tempe, in collaboration with Raytheon Santa Barbara Remote Sensing. The THEMIS investigation is led by Dr. Philip Christensen at Arizona State University. Lockheed Martin Astronautics, Denver, is the prime contractor for the Odyssey project, and developed and built the orbiter. Mission operations are conducted jointly from Lockheed Martin and from JPL, a division of the California Institute of Technology in Pasadena.

    Image information: VIS instrument. Latitude 69.5, Longitude 283.6 East (76.4 West). 19 meter/pixel resolution.

  11. A Cloud Microphysics Model for the Gas Giant Planets

    NASA Astrophysics Data System (ADS)

    Palotai, Csaba J.; Le Beau, Raymond P.; Shankar, Ramanakumar; Flom, Abigail; Lashley, Jacob; McCabe, Tyler

    2016-10-01

    Recent studies have significantly increased the quality and number of observed meteorological features on the jovian planets, revealing banded cloud structures and discrete features. Our current understanding of the formation and decay of those clouds also shapes the conceptual models of the underlying atmospheric dynamics. The full interpretation of the new observational data set and the related theories requires modeling these features in a general circulation model (GCM). Here, we present details of our bulk cloud microphysics model, which was designed to simulate clouds in the Explicit Planetary Hybrid-Isentropic Coordinate (EPIC) GCM for the jovian planets. The cloud module includes a hydrological cycle for each condensable species, consisting of interactive vapor, cloud, and precipitation phases, and it accounts for latent heating and cooling throughout the transfer processes (Palotai and Dowling, 2008, Icarus, 194, 303-326). Previously, the self-organizing clouds in our simulations successfully reproduced the vertical and horizontal ammonia cloud structure in the vicinity of Jupiter's Great Red Spot and Oval BA (Palotai et al. 2014, Icarus, 232, 141-156). In our recent work, we extended this model to include water clouds on Jupiter and Saturn, ammonia clouds on Saturn, and methane clouds on Uranus and Neptune. Details of our cloud parameterization scheme, our initial results, and their comparison with observations will be shown. The latest version of the EPIC model is available as open-source software from NASA's PDS Atmospheres Node.
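
    The heart of any bulk scheme of this kind is the vapor-cloud transfer with its associated latent heating. A minimal single-species saturation-adjustment sketch (constants are water-like round numbers; a real scheme, including EPIC's, iterates because the saturation mixing ratio depends on temperature):

        L_OVER_CP = 2.5e6 / 1004.0  # latent heat / heat capacity, K per (kg/kg)

        def adjust(q_vap, q_cld, q_sat, temp):
            """Condense supersaturated vapor, or evaporate cloud into subsaturated air."""
            excess = q_vap - q_sat
            transfer = excess if excess > 0.0 else max(excess, -q_cld)
            return (q_vap - transfer,              # vapor
                    q_cld + transfer,              # cloud condensate
                    temp + L_OVER_CP * transfer)   # warms on condensation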

  12. Astronomy in the Cloud: Using MapReduce for Image Co-Addition

    NASA Astrophysics Data System (ADS)

    Wiley, K.; Connolly, A.; Gardner, J.; Krughoff, S.; Balazinska, M.; Howe, B.; Kwon, Y.; Bu, Y.

    2011-03-01

    In the coming decade, astronomical surveys of the sky will generate tens of terabytes of images and detect hundreds of millions of sources every night. The study of these sources will involve computational challenges such as anomaly detection and classification and moving-object tracking. Since such studies benefit from the highest-quality data, methods such as image co-addition, i.e., astrometric registration followed by per-pixel summation, will be a critical preprocessing step prior to scientific investigation. With a requirement that these images be analyzed on a nightly basis to identify moving sources, such as potentially hazardous asteroids, or transient objects, such as supernovae, these data streams present many computational challenges. Given the quantity of data involved, the computational load of these problems can only be addressed by distributing the workload over a large number of nodes. However, the high data throughput demanded by these applications may present scalability challenges for certain storage architectures. One scalable data-processing method that has emerged in recent years is MapReduce, and in this article we focus on its popular open-source implementation, Hadoop. In the Hadoop framework, the data are partitioned among storage attached directly to worker nodes, and the processing workload is scheduled in parallel on the nodes that contain the required input data. A further motivation for using Hadoop is that it allows us to exploit cloud computing resources, i.e., platforms where Hadoop is offered as a service. We report on our experience of implementing a scalable image-processing pipeline for the SDSS imaging database using Hadoop. This multiterabyte imaging data set provides a good testbed for algorithm development, since its scope and structure approximate future surveys. First, we describe MapReduce and how we adapted image co-addition to the MapReduce framework. Then we describe a number of optimizations to our basic approach
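
    Co-addition maps naturally onto the two MapReduce phases: the mapper bins each input pixel by its registered sky coordinate, and the reducer combines all samples landing in each output pixel. A structural sketch with plain functions standing in for the Hadoop streaming API (the WCS projection and dict-based image are placeholders):

        from collections import defaultdict

        def mapper(image, wcs):
            """Emit (sky_pixel, flux) pairs after astrometric registration."""
            for (x, y), flux in image.items():
                yield wcs(x, y), flux      # wcs: detector coords -> common sky grid

        def reducer(pairs):
            """Per-pixel combination over all overlapping exposures."""
            acc = defaultdict(list)
            for key, flux in pairs:
                acc[key].append(flux)
            return {key: sum(v) / len(v) for key, v in acc.items()}

        # coadd = reducer(p for im in images for p in mapper(im, wcs))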

  13. Frequency and causes of failed MODIS cloud property retrievals for liquid phase clouds over global oceans.

    PubMed

    Cho, Hyoun-Myoung; Zhang, Zhibo; Meyer, Kerry; Lebsock, Matthew; Platnick, Steven; Ackerman, Andrew S; Di Girolamo, Larry; C-Labonnote, Laurent; Cornet, Céline; Riedi, Jerome; Holz, Robert E

    2015-05-16

    Moderate Resolution Imaging Spectroradiometer (MODIS) retrieves cloud droplet effective radius (re) and optical thickness (τ) by projecting observed cloud reflectances onto a precomputed look-up table (LUT). When observations fall outside of the LUT, the retrieval is considered "failed" because no combination of τ and re within the LUT can explain the observed cloud reflectances. In this study, the frequency and potential causes of failed MODIS retrievals for marine liquid phase (MLP) clouds are analyzed based on 1 year of Aqua MODIS Collection 6 products and collocated CALIOP and CloudSat observations. The retrieval based on the 0.86 µm and 2.1 µm MODIS channel combination has an overall failure rate of about 16% (10% for the 0.86 µm and 3.7 µm combination). The failure rates are lower over stratocumulus regimes and higher over broken trade wind cumulus regimes. The leading type of failure is the "re too large" failure, accounting for 60%-85% of all failed retrievals. The rest are mostly due to "re too small" or τ retrieval failures. Enhanced retrieval failure rates are found when MLP cloud pixels are partially cloudy or have high subpixel inhomogeneity; are located at special Sun-satellite viewing geometries such as sunglint, large viewing or solar zenith angles, or cloudbow and glory angles; or are subject to cloud masking, cloud overlapping, and/or cloud phase retrieval issues. The majority (more than 84%) of failed retrievals along the CALIPSO track can be attributed to at least one of these potential causes. The collocated CloudSat radar reflectivity observations reveal that the remaining failed retrievals are often precipitating. It remains an open question whether the extremely large re values observed in these clouds are the consequence of true cloud microphysics or are artifacts not captured in this study.
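
    The failure taxonomy above amounts to bookkeeping on where a solution falls relative to the LUT bounds. A minimal sketch (the bounds shown follow Collection 6 conventions for liquid clouds, re in 4-30 microns and τ up to 158, but should be treated as illustrative):

        RE_MIN, RE_MAX = 4.0, 30.0  # liquid-phase effective radius bounds, microns
        TAU_MAX = 158.0             # optical thickness upper bound

        def classify(solution):
            """Label the (tau, re) projection of observed reflectances onto the LUT."""
            if solution is None:
                return "failed: reflectances outside the LUT entirely"
            tau, re = solution
            if re >= RE_MAX:
                return "failed: re too large"
            if re <= RE_MIN:
                return "failed: re too small"
            if tau >= TAU_MAX:
                return "failed: tau out of bounds"
            return "successful retrieval"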

  14. Three-Dimensional Space to Assess Cloud Interoperability

    DTIC Science & Technology

    2013-03-01

  15. Limits to Cloud Susceptibility

    NASA Technical Reports Server (NTRS)

    Coakley, James A., Jr.

    2002-01-01

    1-kilometer AVHRR observations of ship tracks in low-level clouds off the west coast of the U.S. were used to determine limits for the degree to which clouds might be altered by increases in anthropogenic aerosols. Hundreds of tracks were analyzed to determine whether the changes in droplet radii, visible optical depths, and cloud top altitudes that result from the influx of particles from underlying ships were consistent with expectations based on simple models for the indirect effect of aerosols. The models predict substantial increases in sunlight reflected by polluted clouds due to the increases in droplet numbers and cloud liquid water that result from the elevated particle concentrations. Contrary to the model predictions, the analysis of ship tracks revealed a 15-20% reduction in liquid water for the polluted clouds. Studies performed with a large-eddy cloud simulation model suggested that the shortfall in cloud liquid water found in the satellite observations might be attributed to the restriction that the 1-kilometer pixels be completely covered by either polluted or unpolluted cloud. The simulation model revealed that a substantial fraction of the indirect effect is caused by a horizontal redistribution of cloud water in the polluted clouds. Cloud-free gaps in polluted clouds fill in with cloud water while the cloud-free gaps in the surrounding unpolluted clouds remain cloud-free. By limiting the analysis to only overcast pixels, the current study failed to account for the gap-filling predicted by the simulation model. This finding and an analysis of the spatial variability of marine stratus suggest new ways to analyze ship tracks to determine the limit to which particle pollution will alter the amount of sunlight reflected by clouds.

  16. Radar Data Quality Control and Assimilation at the National Weather Radar Testbed (NWRT)

    DTIC Science & Technology

    2008-09-30

    Qin Xu, CIMMS, University of Oklahoma, 120 David L. Boren Blvd., Norman, OK 73072. The work is performed by project-supported research scientists at CIMMS, the University of Oklahoma. Collaborations between this project and the development of

  17. Radar Data Quality Control and Assimilation at the National Weather Radar Testbed (NWRT)

    DTIC Science & Technology

    2006-09-30

    Dr. Qin Xu, CIMMS, University of Oklahoma, 120 David L. Boren Blvd., Norman, OK 73072. The work is performed by project-supported research scientists at CIMMS, the University of Oklahoma. Collaborations between this project and the development of the NWRT phased array radar are coordinated by

  18. A Simulation Testbed for Adaptive Modulation and Coding in Airborne Telemetry (Brief)

    DTIC Science & Technology

    2014-10-01

    Keywords: spectrum, aeronautical telemetry, algorithm, bandwidth, attached sync marker (ASM), Integrated Enhanced Networked Telemetry (iNET), Shaped Offset… Conclusion and future work: developed a simulation testbed for aeronautical telemetry with various tunable parameters.

  19. Model-Based Diagnosis in a Power Distribution Test-Bed

    NASA Technical Reports Server (NTRS)

    Scarl, E.; McCall, K.

    1998-01-01

    The Rodon model-based diagnosis shell was applied to a breadboard test-bed, modeling an automated power distribution system. The constraint-based modeling paradigm and diagnostic algorithm were found to adequately represent the selected set of test scenarios.

  20. Vacuum Nuller Testbed (VNT) Performance, Characterization and Null Control: Progress Report

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.; Noecker, M. Charley; Kendrick, Stephen; Helmbrecht, Michael

    2011-01-01

    Herein we report on the development, sensing and control, and our first results with the Vacuum Nuller Testbed (VNT) to realize a Visible Nulling Coronagraph (VNC) for exoplanet coronagraphy. The VNC is one of the few approaches that works with filled, segmented, and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center (GSFC) has a well-established effort to develop VNC technologies, and has developed an incremental sequence of VNC testbeds to advance this approach and the enabling technologies associated with it. We discuss the continued development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We discuss the initial laboratory results, the optical configuration, critical technologies, and the null sensing and control approach.

  1. System identification and structural control on the JPL Phase B testbed

    NASA Technical Reports Server (NTRS)

    Chu, Cheng-Chih; Obrien, John F.; Lurie, Boris J.

    1993-01-01

    The primary objective of NASA's CSI program at JPL is to develop and demonstrate the CSI technology required to achieve high-precision structural stability on large, complex optical-class spacecraft. The focus mission for this work is an orbiting interferometer telescope. Toward the realization of such a mission, a series of evolutionary testbed structures are being constructed. JPL's CSI Phase B testbed is the second structure constructed in this series and is designed to study the pathlength control problem of the optical train of a stellar interferometer telescope mounted on a large flexible structure. A detailed description of this testbed can be found elsewhere. This paper describes our efforts in the first phase of active structural control experiments on the Phase B testbed, in which a single piezoelectric active member is used as an actuation device and the measurements include both colocated and noncolocated sensors. Our goal for this experiment is to demonstrate the feasibility of active structural control using both colocated and noncolocated measurements by means of successive control design and loop closing. More specifically, the colocated control loop was designed and closed first to provide good damping improvement over the frequency range of interest. The noncolocated controller was then designed with respect to the partially controlled structure to further improve the performance. Based on our approach, experimental closed-loop results have demonstrated significant performance improvement with excellent stability margins.

  2. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Bench-scale Testbed Design Report

    SciTech Connect

    Melin, Alexander M.; Kisner, Roger A.; Drira, Anis; Reed, Frederick K.

    2015-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to build due to restrictions on sensors and materials. As part of the Advanced Sensors and Instrumentation topic of the Department of Energy's Nuclear Energy Enabling Technologies crosscutting technology development program, this report details the design of a bench-scale embedded instrumentation and control testbed. The design goal of the bench-scale testbed is to build a reconfigurable system that can rapidly deploy and test advanced control algorithms in a hardware-in-the-loop setup. The bench-scale testbed will be designed as a fluid pump analog that uses active magnetic bearings to support the shaft. The testbed represents an application that would improve the efficiency and performance of high-temperature (700 °C) pumps for liquid salt reactors, which operate in an extreme environment and present many engineering challenges that can be overcome with embedded instrumentation and control. This report gives details of the mechanical design, electromagnetic design, geometry optimization, power electronics design, and initial control system design.

  3. Particle-In-Cell Multi-Algorithm Numerical Test-Bed

    NASA Astrophysics Data System (ADS)

    Meyers, M. D.; Yu, P.; Tableman, A.; Decyk, V. K.; Mori, W. B.

    2015-11-01

    We describe a numerical test-bed that allows for the direct comparison of different numerical simulation schemes using only a single code. It is built from the UPIC Framework, which is a set of codes and modules for constructing parallel PIC codes. In this test-bed code, Maxwell's equations are solved in Fourier space in two dimensions. One can readily examine the numerical properties of a real space finite difference scheme by including its operators' Fourier space representations in the Maxwell solver. The fields can be defined at the same location in a simulation cell or can be offset appropriately by half-cells, as in the Yee finite difference time domain scheme. This allows for the accurate comparison of numerical properties (dispersion relations, numerical stability, etc.) across finite difference schemes, or against the original spectral scheme. We have also included different options for the charge and current deposits, including a strict charge conserving current deposit. The test-bed also includes options for studying the analytic time domain scheme, which eliminates numerical dispersion errors in vacuum. We will show examples from the test-bed that illustrate how the properties of some numerical instabilities vary between different PIC algorithms. Work supported by the NSF grant ACI 1339893 and DOE grant DE-SC0008491.
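
    As an example of the comparisons such a test-bed enables, the numerical dispersion of the standard Yee scheme in 1-D vacuum can be checked directly against the exact relation omega = ck, which a spectral solver reproduces (a minimal sketch, not the UPIC test-bed itself):

        import numpy as np

        c, dx = 1.0, 1.0
        dt = 0.5 * dx / c                        # within the Courant limit
        k = np.linspace(1e-6, np.pi / dx, 200)   # resolvable wavenumbers

        # Yee/FDTD dispersion: sin(w dt/2) / (c dt) = sin(k dx/2) / dx
        w_yee = (2.0 / dt) * np.arcsin((c * dt / dx) * np.sin(k * dx / 2.0))
        w_exact = c * k

        print(np.max(np.abs(w_yee - w_exact) / w_exact))  # worst-case phase error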

  5. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes, various vehicle types, and various algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common, well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated, along with the rest of the test-bed tools, on a quadrotor, a fixed-wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.
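
    The simplex-based trimming mentioned above can be shown in miniature. This stand-alone sketch trims a toy longitudinal model with SciPy's standard (non-probabilistic) Nelder-Mead; the aerodynamic coefficients and required lift coefficient are invented, and the real work was done inside JSBSim:

        import numpy as np
        from scipy.optimize import minimize

        # Trim a toy longitudinal model: find angle of attack and elevator angle
        # that give the required lift with zero pitching moment. The coefficients
        # below are invented; the thesis does this against JSBSim's full model.
        def residual(p):
            alpha, de = p                        # radians
            CL = 0.2 + 5.5 * alpha + 0.4 * de
            Cm = 0.05 - 1.1 * alpha - 1.2 * de
            return (CL - 0.6) ** 2 + Cm ** 2     # zero exactly at trim

        res = minimize(residual, x0=[0.05, 0.0], method="Nelder-Mead")
        print("trim: alpha = %.4f rad, elevator = %.4f rad" % tuple(res.x))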

  6. James Webb Space Telescope Optical Simulation Testbed I: overview and first results

    NASA Astrophysics Data System (ADS)

    Perrin, Marshall D.; Soummer, Rémi; Choquet, Élodie; N'Diaye, Mamadou; Levecq, Olivier; Lajoie, Charles-Philippe; Ygouf, Marie; Leboulleux, Lucie; Egron, Sylvain; Anderson, Rachel; Long, Chris; Elliott, Erin; Hartig, George; Pueyo, Laurent; van der Marel, Roeland; Mountain, Matt

    2014-08-01

    The James Webb Space Telescope (JWST) Optical Simulation Testbed (JOST) is a tabletop workbench to study aspects of wavefront sensing and control for a segmented space telescope, including both commissioning and maintenance activities. JOST is complementary to existing optomechanical testbeds for JWST (e.g. the Ball Aerospace Testbed Telescope, TBT) given its compact scale and flexibility, ease of use, and colocation at the JWST Science & Operations Center. We have developed an optical design that reproduces the physics of JWST's three-mirror anastigmat using three aspheric lenses; it provides image quality similar to JWST's (80% Strehl ratio) over a field equivalent to a NIRCam module, but at HeNe wavelength. A segmented deformable mirror stands in for the segmented primary mirror and allows control of the 18 segments in piston, tip, and tilt, while the secondary can be controlled in tip, tilt, and x, y, z position. This will be sufficient to model many commissioning activities, to investigate field dependence and multiple field point sensing & control, to evaluate alternate sensing algorithms, and to develop contingency plans. Testbed data will also be usable for cross-checking of the WFS&C Software Subsystem, and for staff training and development during JWST's five- to ten-year mission.

  7. High Contrast Vacuum Nuller Testbed (VNT) Contrast, Performance and Null Control

    NASA Technical Reports Server (NTRS)

    Lyon, Richard G.; Clampin, Mark; Petrone, Peter; Mallik, Udayan; Madison, Timothy; Bolcar, Matthew R.

    2012-01-01

    Herein we report on our contrast assessment and the development, sensing, and control of the Vacuum Nuller Testbed to realize a Visible Nulling Coronagraph (VNC) for exoplanet detection and characterization. The VNC is one of the few approaches that works with filled, segmented, and sparse or diluted-aperture telescope systems. It thus spans a range of potential future NASA telescopes and could be flown as a separate instrument on such a future mission. NASA/Goddard Space Flight Center has an established effort to develop VNC technologies, and an incremental sequence of testbeds to advance this approach and its critical technologies. We discuss the development of the vacuum Visible Nulling Coronagraph testbed (VNT). The VNT is an ultra-stable, vibration-isolated testbed that operates under closed-loop control within a vacuum chamber. It will be used to achieve an incremental sequence of three visible-light nulling milestones with sequentially higher contrasts of 10^8, 10^9, and ideally 10^10 at an inner working angle of 2λ/D. The VNT is based on a modified Mach-Zehnder nulling interferometer, with a "W" configuration to accommodate a hex-packed MEMS-based deformable mirror, a coherent fiber bundle, and achromatic phase shifters. We discuss the laboratory results, optical configuration, critical technologies, and the null sensing and control approach.
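
    The contrast milestones translate directly into pathlength-stability requirements. A back-of-envelope sketch, assuming the standard small-error approximation that the null depth of an amplitude-matched two-beam nuller is roughly sigma_phi^2/4, and assuming a HeNe-like wavelength:

        import numpy as np

        # Small-error null depth of an amplitude-matched two-beam nuller:
        #   N ~ sigma_phi^2 / 4   (standard approximation, stated as an assumption)
        wavelength_nm = 633.0                     # HeNe-like, assumed
        for null_depth in (1e-8, 1e-9, 1e-10):    # the 10^8..10^10 contrast goals
            sigma_phi = 2.0 * np.sqrt(null_depth)             # radians
            opd_nm = sigma_phi * wavelength_nm / (2 * np.pi)  # RMS pathlength error
            print("null %.0e -> RMS OPD %.2g nm" % (null_depth, opd_nm))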

  8. INFORM Lab: a testbed for high-level information fusion and resource management

    NASA Astrophysics Data System (ADS)

    Valin, Pierre; Guitouni, Adel; Bossé, Eloi; Wehn, Hans; Happe, Jens

    2011-05-01

    DRDC Valcartier and MDA have created an advanced simulation testbed for the purpose of evaluating the effectiveness of Network Enabled Operations in a Coastal Wide Area Surveillance situation, with algorithms provided by several universities. This INFORM Lab testbed allows experimenting with high-level distributed information fusion, dynamic resource management, and configuration management, given multiple constraints on the resources and their communications networks. This paper describes the architecture of INFORM Lab, the essential concepts of goals and situation evidence, a selected set of algorithms for distributed information fusion and dynamic resource management, as well as auto-configurable information fusion architectures. The testbed provides general services which include a multilayer plug-and-play architecture and a general multi-agent framework based on John Boyd's OODA loop. The testbed's performance is demonstrated on two types of scenarios/vignettes: (1) cooperative search-and-rescue efforts, and (2) a noncooperative smuggling scenario involving many target ships and various methods of deceit. For each mission, an appropriate subset of Canadian airborne and naval platforms is dispatched to collect situation evidence, which is fused and then used to modify the platform trajectories for the most efficient collection of further situation evidence. These platforms are fusion nodes which obey a Command and Control node hierarchy.

  9. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    SciTech Connect

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows remote control, monitoring, and execution of experiments. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
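
    The data-centric idea is that nodes publish to and subscribe from named topics rather than addressing each other directly, which is what removes the single point of failure and admits dynamically joining peers. A minimal in-process illustration of the pattern (this is not the DDS API; real DDS adds networked discovery, QoS, and typed topics):

        from collections import defaultdict

        # Minimal in-process sketch of a data-centric "common data bus":
        # publishers and subscribers share named topics and never address each
        # other, so nodes can join, leave, or fail independently.
        class DataBus:
            def __init__(self):
                self._subs = defaultdict(list)

            def subscribe(self, topic, callback):
                self._subs[topic].append(callback)

            def publish(self, topic, sample):
                for cb in self._subs[topic]:
                    cb(sample)

        bus = DataBus()
        bus.subscribe("grid/voltage", lambda s: print("controller saw", s))
        bus.subscribe("grid/voltage", lambda s: print("logger saw", s))
        bus.publish("grid/voltage", {"bus_id": 7, "v_pu": 0.98})  # both peers get it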

  10. Diffraction-based analysis of tunnel size for a scaled external occulter testbed

    NASA Astrophysics Data System (ADS)

    Sirbu, Dan; Kasdin, N. Jeremy; Vanderbei, Robert J.

    2016-07-01

    For performance verification of an external occulter mask (also called a starshade), scaled testbeds have been developed to measure the suppression of the occulter shadow in the pupil plane and contrast in the image plane. For occulter experiments the scaling is typically performed by maintaining an equivalent Fresnel number. The original Princeton occulter testbed was oversized with respect to both input beam and shadow propagation to limit any diffraction effects due to finite testbed enclosure edges; however, to operate at realistic space-mission equivalent Fresnel numbers an extended testbed is currently under construction. With the longer propagation distances involved, diffraction effects due to the edge of the tunnel must now be considered in the experiment design. Here, we present a diffraction-based model of two separate tunnel effects. First, we consider the effect of tunnel-edge induced diffraction ringing upstream from the occulter mask. Second, we consider the diffraction effect due to clipping of the output shadow by the tunnel downstream from the occulter mask. These calculations are performed for a representative point design relevant to the new Princeton occulter experiment, but we also present an analytical relation that can be used for other propagation distances.
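
    Fresnel-number scaling, the design principle invoked above, fixes the lab mask size once the wavelength and available propagation distance are chosen. A sketch with illustrative numbers (not the Princeton testbed's actual design values):

        import numpy as np

        # Equivalent-Fresnel-number scaling: keep F = r^2 / (lambda * z) fixed
        # between flight and lab. All values below are illustrative.
        lam = 633e-9                        # laser wavelength (m), assumed
        r_flight, z_flight = 17.0, 37e6     # occulter radius (m), separation (m)
        F = r_flight ** 2 / (lam * z_flight)
        print("flight Fresnel number: %.1f" % F)

        z_lab = 75.0                        # available tunnel propagation (m)
        r_lab = np.sqrt(F * lam * z_lab)    # mask radius preserving F in the lab
        print("scaled lab mask radius: %.1f mm" % (1e3 * r_lab))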

  11. High-contrast imager for complex aperture telescopes (HiCAT): 1. testbed design

    NASA Astrophysics Data System (ADS)

    N'Diaye, Mamadou; Choquet, Elodie; Pueyo, Laurent; Elliot, Erin; Perrin, Marshall D.; Wallace, J. Kent; Groff, Tyler; Carlotti, Alexis; Mawet, Dimitri; Sheckells, Matt; Shaklan, Stuart; Macintosh, Bruce; Kasdin, N. Jeremy; Soummer, Rémi

    2013-09-01

    Searching for nearby habitable worlds with direct imaging and spectroscopy will require a telescope large enough to provide angular resolution and sensitivity to planets around a significant sample of stars. Segmented telescopes are a compelling option to obtain such large apertures. However, these telescope designs have a complex geometry (central obstruction, support structures, segmentation) that makes high-contrast imaging more challenging. We are developing a new high-contrast imaging testbed at STScI to provide an integrated solution for wavefront control and starlight suppression on complex aperture geometries. We present our approach for the testbed optical design, which defines the surface requirements for each mirror to minimize the amplitude-induced errors from the propagation of out-of-pupil surfaces. Our approach guarantees that the testbed will not be limited by these Fresnel propagation effects, but only by the aperture geometry. This approach involves iterations between classical ray-tracing optical design optimization, and end-to-end Fresnel propagation with wavefront control (e.g. Electric Field Conjugation / Stroke Minimization). The construction of the testbed is planned to start in late Fall 2013.

  12. Genetic Algorithm Phase Retrieval for the Systematic Image-Based Optical Alignment Testbed

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Steincamp, James; Taylor, Jaime

    2003-01-01

    A reduced surrogate, one point crossover genetic algorithm with random rank-based selection was used successfully to estimate the multiple phases of a segmented optical system modeled on the seven-mirror Systematic Image-Based Optical Alignment testbed located at NASA's Marshall Space Flight Center.
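
    A toy version of such an optimizer is sketched below: one-point crossover and rank-based selection recovering seven segment phases against a quadratic error metric. The metric, population size, and mutation scale are invented stand-ins for the image-based merit function actually used:

        import numpy as np

        rng = np.random.default_rng(0)
        true_phases = rng.uniform(-1, 1, 7)          # the unknown segment pistons

        def fitness(pop):                            # lower is better
            return ((pop - true_phases) ** 2).sum(axis=1)

        pop = rng.uniform(-1, 1, (40, 7))
        for gen in range(200):
            order = np.argsort(fitness(pop))         # rank individuals, best first
            ranks = np.empty(40)
            ranks[order] = np.arange(40)
            p = (40 - ranks) / (40 - ranks).sum()    # rank-based selection weights
            parents = pop[rng.choice(40, size=(40, 2), p=p)]
            cut = rng.integers(1, 7, size=40)        # one-point crossover locus
            mask = np.arange(7) < cut[:, None]
            pop = np.where(mask, parents[:, 0], parents[:, 1])
            pop += rng.normal(0.0, 0.02, pop.shape)  # small mutation
        best = pop[np.argmin(fitness(pop))]
        print("max residual phase error:", np.abs(best - true_phases).max())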

  13. Data distribution service-based interoperability framework for smart grid testbed infrastructure

    DOE PAGES

    Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.

    2016-03-02

    This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves the communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamic participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows remote control, monitoring, and execution of experiments. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).

  14. Human Exploration Spacecraft Testbed for Integration and Advancement (HESTIA)

    NASA Technical Reports Server (NTRS)

    Banker, Brian F.; Robinson, Travis

    2016-01-01

    The proposed paper will cover an ongoing effort named HESTIA (Human Exploration Spacecraft Testbed for Integration and Advancement), led at the National Aeronautics and Space Administration (NASA) Johnson Space Center (JSC) to promote a cross-subsystem approach to developing Mars-enabling technologies, with the ultimate goal of integrated system optimization. HESTIA also aims to develop the infrastructure required to rapidly test these highly integrated systems at a low cost. The initial focus is on the common fluids architecture required to enable human exploration of Mars, specifically between the life support and in-situ resource utilization (ISRU) subsystems. An overview of the advancements in integrated technologies, infrastructure, simulation, and modeling capabilities will be presented, as well as the results and findings of integrated testing. Due to the enormous mass gear-ratio required for human exploration beyond low-Earth orbit (for every 1 kg of payload landed on Mars, 226 kg is required on Earth), minimization of surface hardware and commodities is paramount. Hardware requirements can be minimized by reducing equipment that performs similar functions for different subsystems. If hardware could be developed that meets the requirements of both life support and ISRU, it could result in the reduction of primary hardware and/or reduction in spares. Minimization of commodities delivered to the surface of Mars can be achieved through the creation of higher-efficiency systems producing little to no undesired waste, such as a closed-loop life support subsystem. Where complete efficiency is impossible or impractical, makeup commodities could be manufactured via ISRU. Although utilization of ISRU products (oxygen and water) for crew consumption holds great promise for reducing demands on life support hardware, there exist concerns as to the purity and transportation of commodities. To date, ISRU has been focused on production rates and purities for

  15. Physically-Retrieving Cloud and Thermodynamic Parameters from Ultraspectral IR Measurements

    NASA Technical Reports Server (NTRS)

    Zhou, Daniel K.; Smith, William L., Sr.; Liu, Xu; Larar, Allen M.; Mango, Stephen A.; Huang, Hung-Lung

    2007-01-01

    A physical inversion scheme has been developed, dealing with cloudy as well as cloud-free radiance observed with ultraspectral infrared sounders, to simultaneously retrieve surface, atmospheric thermodynamic, and cloud microphysical parameters. A fast radiative transfer model, which applies to the clouded atmosphere, is used for atmospheric profile and cloud parameter retrieval. A one-dimensional (1-d) variational multi-variable inversion solution is used to improve an iterative background state defined by an eigenvector-regression-retrieval. The solution is iterated in order to account for non-linearity in the 1-d variational solution. It is shown that relatively accurate temperature and moisture retrievals can be achieved below optically thin clouds. For optically thick clouds, accurate temperature and moisture profiles down to cloud top level are obtained. For both optically thin and thick cloud situations, the cloud top height can be retrieved with relatively high accuracy (i.e., error < 1 km). NPOESS Airborne Sounder Testbed Interferometer (NAST-I) retrievals from the Atlantic-THORPEX Regional Campaign are compared with coincident observations obtained from dropsondes and the nadir-pointing Cloud Physics Lidar (CPL). This work was motivated by the need to obtain solutions for atmospheric soundings from infrared radiances observed for every individual field of view, regardless of cloud cover, from future ultraspectral geostationary satellite sounding instruments, such as the Geosynchronous Imaging Fourier Transform Spectrometer (GIFTS) and the Hyperspectral Environmental Suite (HES). However, this retrieval approach can also be applied to the ultraspectral sounding instruments to fly on Polar satellites, such as the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp satellite, the Cross-track Infrared Sounder (CrIS) on the NPOESS Preparatory Project and the following NPOESS series of satellites.
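
    The core of such a scheme is a regularized least-squares update applied at each iteration. A linear, toy-dimension sketch of one step (H, B, R, and all sizes are invented; the real retrieval linearizes a radiative transfer model):

        import numpy as np

        # One Gauss-Newton step of a 1-D variational retrieval:
        #   x_a = x_b + (B^-1 + H^T R^-1 H)^-1 H^T R^-1 (y - H x_b)
        rng = np.random.default_rng(1)
        n_state, n_obs = 5, 8                         # toy sizes
        H = rng.normal(size=(n_obs, n_state))         # linearized forward model
        B = 0.5 * np.eye(n_state)                     # background-error covariance
        R = 0.1 * np.eye(n_obs)                       # observation-error covariance
        x_b = np.zeros(n_state)                       # regression-style first guess
        x_true = rng.normal(size=n_state)
        y = H @ x_true + rng.normal(0.0, 0.1, n_obs)  # simulated radiances

        Binv, Rinv = np.linalg.inv(B), np.linalg.inv(R)
        A = Binv + H.T @ Rinv @ H                     # inverse analysis covariance
        x_a = x_b + np.linalg.solve(A, H.T @ Rinv @ (y - H @ x_b))
        print("background error:", np.linalg.norm(x_b - x_true))
        print("analysis error:  ", np.linalg.norm(x_a - x_true))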

  16. Development and testing of an aerosol-stratus cloud parameterization scheme for middle and high latitudes

    SciTech Connect

    Olsson, P.Q.; Meyers, M.P.; Kreidenweis, S.; Cotton, W.R.

    1996-04-01

    The aim of this new project is to develop an aerosol/cloud microphysics parameterization of mixed-phase stratus and boundary layer clouds. Our approach is to create, test, and implement a bulk-microphysics/aerosol model using data from Atmospheric Radiation Measurement (ARM) Cloud and Radiation Testbed (CART) sites and large-eddy simulation (LES) explicit bin-resolving aerosol/microphysics models. The primary objectives of this work are twofold. First, we need the prediction of number concentrations of activated aerosol which are transferred to the droplet spectrum, so that the aerosol population directly affects the cloud formation and microphysics. Second, we plan to couple the aerosol model to the gas and aqueous-chemistry module that will drive the aerosol formation and growth. We begin by exploring the feasibility of performing cloud-resolving simulations of Arctic stratus clouds over the North Slope CART site. These simulations using Colorado State University's regional atmospheric modeling system (RAMS) will be useful in designing the structure of the cloud-resolving model and in interpreting data acquired at the North Slope site.

  17. Atmospheric cloud physics laboratory project study

    NASA Technical Reports Server (NTRS)

    Schultz, W. E.; Stephen, L. A.; Usher, L. H.

    1976-01-01

    Engineering studies were performed for the Zero-G Cloud Physics Experiment liquid cooling and air pressure control systems. A total of four concepts for the liquid cooling system was evaluated, two of which were found to closely approach the systems requirements. Thermal insulation requirements, system hardware, and control sensor locations were established. The reservoir sizes and initial temperatures were defined as well as system power requirements. In the study of the pressure control system, fluid analyses by the Atmospheric Cloud Physics Laboratory were performed to determine flow characteristics of various orifice sizes, vacuum pump adequacy, and control systems performance. System parameters predicted in these analyses as a function of time include the following for various orifice sizes: (1) chamber and vacuum pump mass flow rates, (2) the number of valve openings or closures, (3) the maximum cloud chamber pressure deviation from the allowable, and (4) cloud chamber and accumulator pressure.

  18. Absorption of solar radiation in broken clouds

    SciTech Connect

    Zuev, V.E.; Titov, G.A.; Zhuravleva, T.B.

    1996-04-01

    It is recognized now that the plane-parallel model unsatisfactorily describes the transfer of radiation through broken clouds and that, consequently, the radiation codes of general circulation models (GCMs) must be refined. However, before any refinement in a GCM code is made, it is necessary to investigate the dependence of radiative characteristics on the effects caused by the random geometry of cloud fields. Such studies for mean fluxes of downwelling and upwelling solar radiation in the visible and near-infrared (IR) spectral range were performed by Zuev et al. In this work, we investigate the mean spectral and integrated absorption of solar radiation by broken clouds (in what follows, the term "mean" will be implied but not used, for convenience). To evaluate the potential effect of stochastic geometry, we will compare the absorption by cumulus (0.5 ≤ γ ≤ 2) to that by equivalent stratus (γ ≪ 1) clouds; here γ = H/D, where H is the cloud layer thickness and D the characteristic horizontal cloud size. The equivalent stratus clouds differ from cumulus only in the aspect ratio γ, all the other parameters coinciding.

  19. Cloud cover analysis with Arctic AVHRR data: 1. Cloud detection

    NASA Astrophysics Data System (ADS)

    Key, J.; Barry, R. G.

    1989-12-01

    Automated analyses of satellite radiance data have concentrated heavily on low and middle latitude situations. Some of the design objectives for the International Satellite Cloud Climatology Project (ISCCP) cloud detection procedure such as space and time contrasts are used in a basic algorithm from which a polar cloud detection algorithm is developed. This algorithm is applied to Arctic data for January and July conditions. Both advanced very high resolution radiometer (AVHRR) and scanning multichannel microwave radiometer (SMMR) data are utilized. Synthetic AVHRR and SMMR data for a 7-day analysis period are also generated to provide a data set with known characteristics on which to test and validate algorithms. Modifications to the basic algorithm for polar conditions include the use of SMMR and SMMR-derived data sets for the estimation of surface parameters, elimination of the spatial test for the warmest pixel, the use of AVHRR channels 1 (0.7 μm), 3 (3.7 μm), and 4 (11 μm) in the temporal tests and the final multispectral thresholding, and the use of surface class characteristic values when clear-sky values cannot be obtained. Additionally, the difference between channels 3 and 4 is included in the temporal test for the detection of optically thin cloud. Greatest improvement in computed cloud fraction is realized over snow and ice surfaces; over open water or snow-free land, all versions perform similarly. Since the inclusion of SMMR for surface analysis and additional spectral channels increases the computational burden, its use may be justified only over snow and ice-covered regions.

  20. Cloud Processed CCN Affect Cloud Microphysics

    NASA Astrophysics Data System (ADS)

    Hudson, J. G.; Noble, S. R., Jr.; Tabor, S. S.

    2015-12-01

    Variations in the bimodality/monomodality of CCN spectra (Hudson et al. 2015) exert opposite effects on cloud microphysics in two aircraft field projects. The figure shows two examples, droplet concentration, Nc, and drizzle liquid water content, Ld, against classification of CCN spectral modality. Low ratings go to balanced, separated bimodal spectra; high ratings go to single-mode spectra, with strictly monomodal spectra rated 8. Intermediate ratings go to merged modes, e.g., one mode a shoulder of another. Bimodality is caused by mass or hygroscopicity increases that go only to CCN that made activated cloud droplets. In the Ice in Clouds Experiment-Tropical (ICE-T), small cumuli with lower Nc and greater droplet mean diameters, MD, effective radii, re, spectral widths, σ, cloud liquid water contents, Lc, and Ld were closer to more bimodal (lower modal ratings) below-cloud CCN spectra, whereas clouds with higher Nc and smaller MD, re, σ, and Ld were closer to more monomodal CCN (higher modal ratings). In polluted stratus clouds of the MArine Stratus/Stratocumulus Experiment (MASE), clouds that had greater Nc and smaller MD, re, σ, Lc, and Ld were closer to more bimodal CCN spectra, whereas clouds with lower Nc and greater MD, re, σ, Lc, and Ld were closer to more monomodal CCN. These relationships are opposite because the dominant ICE-T cloud processing was coalescence, whereas chemical transformations (e.g., SO2 to SO4) were dominant in MASE. Coalescence reduces Nc and thus also CCN concentrations (NCCN) when droplets evaporate. In subsequent clouds the reduced competition increases MD and σ, which further enhance coalescence and drizzle. Chemical transformations do not change Nc, but added sulfate enhances droplet and CCN solubility. Thus, lower critical supersaturation (S) CCN can produce more cloud droplets in subsequent cloud cycles, especially for the low W and effective S of stratus. The increased competition reduces MD, re, and σ, which inhibit coalescence and thus reduce drizzle.

  1. SCDU Testbed Automated In-Situ Alignment, Data Acquisition and Analysis

    NASA Technical Reports Server (NTRS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-01-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  2. SCDU testbed automated in-situ alignment, data acquisition and analysis

    NASA Astrophysics Data System (ADS)

    Werne, Thomas A.; Wehmeier, Udo J.; Wu, Janet P.; An, Xin; Goullioud, Renaud; Nemati, Bijan; Shao, Michael; Shen, Tsae-Pyng J.; Wang, Xu; Weilert, Mark A.; Zhai, Chengxing

    2010-07-01

    In the course of fulfilling its mandate, the Spectral Calibration Development Unit (SCDU) testbed for SIM-Lite produces copious amounts of raw data. To effectively spend time attempting to understand the science driving the data, the team devised computerized automations to limit the time spent bringing the testbed to a healthy state and commanding it, and instead focus on analyzing the processed results. We developed a multi-layered scripting language that emphasized the scientific experiments we conducted, which drastically shortened our experiment scripts, improved their readability, and all-but-eliminated testbed operator errors. In addition to scientific experiment functions, we also developed a set of automated alignments that bring the testbed up to a well-aligned state with little more than the push of a button. These scripts were written in the scripting language, and in Matlab via an interface library, allowing all members of the team to augment the existing scripting language with complex analysis scripts. To keep track of these results, we created an easily-parseable state log in which we logged both the state of the testbed and relevant metadata. Finally, we designed a distributed processing system that allowed us to farm lengthy analyses to a collection of client computers which reported their results in a central log. Since these logs were parseable, we wrote query scripts that gave us an effortless way to compare results collected under different conditions. This paper serves as a case-study, detailing the motivating requirements for the decisions we made and explaining the implementation process.

  3. Embedded Sensors and Controls to Improve Component Performance and Reliability -- Loop-scale Testbed Design Report

    SciTech Connect

    Melin, Alexander M.; Kisner, Roger A.

    2016-09-01

    Embedded instrumentation and control systems that can operate in extreme environments are challenging to design and operate. Extreme environments limit the options for sensors and actuators and degrade their performance. Because sensors and actuators are necessary for feedback control, these limitations mean that designing embedded instrumentation and control systems for the challenging environments of nuclear reactors requires advanced technical solutions that are not available commercially. This report details the development of a testbed that will be used for cross-cutting embedded instrumentation and control research for nuclear power applications. This research is funded by the Department of Energy's Nuclear Energy Enabling Technology program's Advanced Sensors and Instrumentation topic. The design goal of the loop-scale testbed is to build a low-temperature pump that utilizes magnetic bearings and that will be incorporated into a water loop to test control system performance and self-sensing techniques. Specifically, this testbed will be used to analyze control system performance in response to nonlinear and cross-coupling fluid effects between the shaft axes of motion, rotordynamics and gyroscopic effects, and impeller disturbances. This testbed will also be used to characterize the performance losses when using self-sensing position measurement techniques. Active magnetic bearings are a technology that can reduce failures and maintenance costs in nuclear power plants. They are particularly relevant to liquid salt reactors that operate at high temperatures (700 °C). Pumps used in the extreme environment of liquid salt reactors present many engineering challenges that can be overcome with magnetic bearings and their associated embedded instrumentation and control. This report gives details of the mechanical design and electromagnetic design of the loop-scale embedded instrumentation and control testbed.

  4. A testbed for wide-field, high-resolution, gigapixel-class cameras

    NASA Astrophysics Data System (ADS)

    Kittle, David S.; Marks, Daniel L.; Son, Hui S.; Kim, Jungsang; Brady, David J.

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.
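
    The stage mathematics reduce, in the simplest case, to converting a desired incident direction into azimuth and elevation stage angles. A toy sketch of that conversion (the derivation in the paper also accounts for stage offsets and the fold mirror, which are ignored here):

        import numpy as np

        # Aim a collimated beam along a desired unit direction d by choosing
        # azimuth/elevation stage angles (toy geometry; no offsets or folds).
        d = np.array([0.30, 0.10, 0.95])
        d = d / np.linalg.norm(d)
        azimuth = np.arctan2(d[1], d[0])     # rotation about the vertical axis
        elevation = np.arcsin(d[2])          # tilt out of the horizontal plane
        print("az = %.2f deg, el = %.2f deg"
              % (np.degrees(azimuth), np.degrees(elevation)))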

  5. A testbed for wide-field, high-resolution, gigapixel-class cameras.

    PubMed

    Kittle, David S; Marks, Daniel L; Son, Hui S; Kim, Jungsang; Brady, David J

    2013-05-01

    The high resolution and wide field of view (FOV) of the AWARE (Advanced Wide FOV Architectures for Image Reconstruction and Exploitation) gigapixel class cameras present new challenges in calibration, mechanical testing, and optical performance evaluation. The AWARE system integrates an array of micro-cameras in a multiscale design to achieve gigapixel sampling at video rates. Alignment and optical testing of the micro-cameras is vital in compositing engines, which require pixel-level accurate mappings over the entire array of cameras. A testbed has been developed to automatically calibrate and measure the optical performance of the entire camera array. This testbed utilizes translation and rotation stages to project a ray into any micro-camera of the AWARE system. A spatial light modulator is projected through a telescope to form an arbitrary object space pattern at infinity. This collimated source is then reflected by an elevation stage mirror for pointing through the aperture of the objective into the micro-optics and eventually the detector of the micro-camera. Different targets can be projected with the spatial light modulator for measuring the modulation transfer function (MTF) of the system, fiducials in the overlap regions for registration and compositing, distortion mapping, illumination profiles, thermal stability, and focus calibration. The mathematics of the testbed mechanics are derived for finding the positions of the stages to achieve a particular incident angle into the camera, along with calibration steps for alignment of the camera and testbed coordinate axes. Measurement results for the AWARE-2 gigapixel camera are presented for MTF, focus calibration, illumination profile, fiducial mapping across the micro-camera for registration and distortion correction, thermal stability, and alignment of the camera on the testbed.

  6. Intercomparison of model simulations of mixed-phase clouds observed during the ARM Mixed-Phase Arctic Cloud Experiment. II: Multi-layered cloud

    SciTech Connect

    Morrison, H.; McCoy, Renata; Klein, Stephen A.; Xie, Shaocheng; Luo, Yali; Avramov, Alexander; Chen, Mingxuan; Cole, Jason N.; Falk, Michael; Foster, Mike; Del Genio, Anthony D.; Harrington, Jerry Y.; Hoose, Corinna; Khrairoutdinov, Marat; Larson, Vince; Liu, Xiaohong; McFarquhar, Greg; Poellot, M. R.; Von Salzen, Knut; Shipway, Ben; Shupe, Matthew D.; Sud, Yogesh C.; Turner, David D.; Veron, Dana; Walker, Gregory K.; Wang, Zhien; Wolf, Audrey; Xu, Kuan-Man; Yang, Fanglin; Zhang, G.

    2009-05-21

    Results are presented from an intercomparison of single-column and cloud resolving model simulations of a deep, multi-layered, mixed-phase cloud system observed during the ARM Mixed-Phase Arctic Cloud Experiment. This cloud system was associated with strong surface turbulent sensible and latent heat fluxes as cold air flowed over the open Arctic Ocean, combined with a low pressure system that supplied moisture at mid-level. The simulations, performed by 13 single-column and 4 cloud-resolving models, generally overestimate the liquid water path and strongly underestimate the ice water path, although there is a large spread among the models. This finding is in contrast with results for the single-layer, low-level mixed-phase stratocumulus case in Part I of this study, as well as previous studies of shallow mixed-phase Arctic clouds, that showed an underprediction of liquid water path. The overestimate of liquid water path and underestimate of ice water path occur primarily when deeper mixed-phase clouds extending into the mid-troposphere were observed. These results suggest important differences in the ability of models to simulate Arctic mixed-phase clouds that are deep and multi-layered versus shallow and single-layered. In general, the cloud-resolving models and models with a more sophisticated, two-moment treatment of the cloud microphysics produce a somewhat smaller liquid water path that is closer to observations. The cloud-resolving models also tend to produce a larger cloud fraction than the single column models. The liquid water path and especially the cloud fraction have a large impact on the cloud radiative forcing at the surface, which is dominated by the longwave flux for this case.

  7. Evaluation of Cloud Parameterizations in a High Resolution Atmospheric General Circulation Model Using ARM Data

    SciTech Connect

    Govindasamy, B; Duffy, P

    2002-04-12

    Typical state-of-the-art atmospheric general circulation models used in climate change studies have horizontal resolution of approximately 300 km. As computing power increases, many climate modeling groups are working toward enhancing the resolution of global models. An important issue that arises when the resolution of a model is changed is whether cloud and convective parameterizations, which were developed for use at coarser resolutions, will need to be reformulated or re-tuned. We propose to investigate this issue, and specifically cloud statistics, using ARM data. The data streams produced by highly instrumented sections of the Cloud and Radiation Testbeds (CART) of the ARM program will provide a significant aid in the evaluation of cloud and convection parameterization in high-resolution models. Recently, we have performed multiyear global-climate simulations at T170 and T239 resolutions, corresponding to grid cell sizes of 0.7° and 0.5° respectively, using the NCAR Community Climate Model. We have also performed a climate change simulation at T170. On the scales of a T42 grid cell (300 km) and larger, nearly all quantities we examined in the T170 simulation agree better with observations in terms of spatial patterns than do results in a comparable simulation at T42. Increasing the resolution to T239 brings significant further improvement. At T239, the high-resolution model grid cells approach the dimensions of the highly instrumented sections of ARM Cloud and Radiation Testbed (CART) sites. We propose to form a cloud climatology using ARM data for its CART sites and evaluate cloud statistics of the NCAR Community Atmosphere Model (CAM) at higher resolutions over those sites using this ARM cloud climatology. We will then modify the physical parameterizations of CAM for better agreement with ARM data. We will work closely with NCAR in modifying the parameters in cloud and convection parameterizations for the high-resolution model. Our proposal to evaluate the cloud
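
    The quoted grid cell sizes follow from the usual spectral-truncation-to-grid rule. A quick check, assuming quadratic Gaussian grids with nlon >= 3N + 1 rounded up to FFT-friendly sizes (actual model grids may differ slightly):

        # Spectral truncation TN -> grid spacing for a quadratic Gaussian grid:
        # nlon >= 3*N + 1, rounded up to an FFT-friendly size (assumed rule).
        for N, nlon in ((42, 128), (170, 512), (239, 720)):
            print("T%-3d -> %4d longitudes -> %.2f deg cells" % (N, nlon, 360.0 / nlon))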

  8. Ground- and aircraft-based cirrus cloud measurements using lidar and high-spectral-resolution FTS during the AFWEX 2000 field campaign

    NASA Astrophysics Data System (ADS)

    DeSlover, Daniel H.; Turner, David; Whiteman, David N.; Smith, William L.

    2002-09-01

    The ARM-FIRE Water Vapor Experiment (AFWEX) was conducted during November-December 2000 at the Southern Great Plains (SGP) Cloud and Radiation Testbed (CART). A cirrus event which occurred on 7-8 December was analyzed using ground- and aircraft-based measurements. The ground-based Atmospheric Emitted Radiance Interferometer (AERI) and the NPOESS Airborne Sounder Testbed-Interferometer (NAST-I) are high-spectral-resolution interferometers which measure downwelling and upwelling infrared radiation, respectively. Analysis between water vapor absorption lines within the 8-12 μm atmospheric window allows inversion of the radiative transfer equation to derive the cirrus cloud optical depth. These data will be compared to ground-based Raman lidar (GSFC and ARM) measurements of cirrus optical depth. The NAST-I measurements were conducted from the Proteus aircraft.

  9. Sensor Web Interoperability Testbed Results Incorporating Earth Observation Satellites

    NASA Technical Reports Server (NTRS)

    Frye, Stuart; Mandl, Daniel J.; Alameh, Nadine; Bambacus, Myra; Cappelaere, Pat; Falke, Stefan; Derezinski, Linda; Zhao, Piesheng

    2007-01-01

    This paper describes an Earth Observation Sensor Web scenario based on the Open Geospatial Consortium's Sensor Web Enablement and Web Services interoperability standards. The scenario demonstrates the application of standards in describing, discovering, accessing and tasking satellites and ground-based sensor installations in a sequence of analysis activities that deliver information required by decision makers in response to national, regional or local emergencies.

  10. Tropical thermostats and low cloud cover

    SciTech Connect

    Miller, R.L.

    1997-03-01

    The ability of subtropical stratus low cloud cover to moderate or amplify the tropical response to climate forcing, such as increased CO2, is considered. Cloud radiative forcing over the subtropics is parameterized using an empirical relation between stratus cloud cover and the difference in potential temperature between 700 mb (a level that is above the trade inversion) and the surface. This relation includes the empirical negative correlation between SST and low cloud cover and is potentially a positive feedback to climate forcing. Since potential temperature above the trade inversion varies in unison across the Tropics as a result of the large-scale circulation, and because moist convection relates tropospheric temperature within the convecting region to variations in surface temperature and moisture, the subtropical potential temperature at 700 mb depends upon surface conditions within the convecting region. As a result, subtropical stratus cloud cover and the associated feedback depend upon the entire tropical climate and not just the underlying SST. A simple tropical model is constructed, consisting of separate budgets of dry static energy and moisture for the convecting region (referred to as the "warm" pool) and the subtropical descending region (the "cold" pool). The cold pool is the location of stratus low clouds in the model. Dynamics is implicitly included through the assumption that temperature above the boundary layer is horizontally uniform as a result of the large-scale circulation. The tropopause and warm pool surface are shown to be connected by a moist adiabat in the limit of vanishingly narrow convective updrafts. Stratus low cloud cover is found to be a negative feedback, increasing in response to doubled CO2 and reducing the tropically averaged warming in comparison to the warming with low cloud cover held fixed. 72 refs., 13 figs., 2 tabs.

  11. The Climate-G testbed: towards a large scale data sharing environment for climate change

    NASA Astrophysics Data System (ADS)

    Aloisio, G.; Fiore, S.; Denvil, S.; Petitdidier, M.; Fox, P.; Schwichtenberg, H.; Blower, J.; Barbera, R.

    2009-04-01

    The Climate-G testbed provides an experimental large scale data environment for climate change addressing challenging data and metadata management issues. The main scope of Climate-G is to allow scientists to carry out geographical and cross-institutional climate data discovery, access, visualization and sharing. Climate-G is a multidisciplinary collaboration involving both climate and computer scientists and it currently involves several partners such as: Centro Euro-Mediterraneo per i Cambiamenti Climatici (CMCC), Institut Pierre-Simon Laplace (IPSL), Fraunhofer Institut für Algorithmen und Wissenschaftliches Rechnen (SCAI), National Center for Atmospheric Research (NCAR), University of Reading, University of Catania and University of Salento. To perform distributed metadata search and discovery, we adopted a CMCC metadata solution (which provides a high level of scalability, transparency, fault tolerance and autonomy) leveraging both on P2P and grid technologies (GRelC Data Access and Integration Service). Moreover, data are available through OPeNDAP/THREDDS services, Live Access Server as well as the OGC compliant Web Map Service and they can be downloaded, visualized, accessed into the proposed environment through the Climate-G Data Distribution Centre (DDC), the web gateway to the Climate-G digital library. The DDC is a data-grid portal allowing users to easily, securely and transparently perform search/discovery, metadata management, data access, data visualization, etc. Godiva2 (integrated into the DDC) displays 2D maps (and animations) and also exports maps for display on the Google Earth virtual globe. Presently, Climate-G publishes (through the DDC) about 2TB of data related to the ENSEMBLES project (also including distributed replicas of data) as well as to the IPCC AR4. The main results of the proposed work are: wide data access/sharing environment for climate change; P2P/grid metadata approach; production-level Climate-G DDC; high quality tools for

  12. FY 2011 Second Quarter: Demonstration of New Aerosol Measurement Verification Testbed for Present-Day Global Aerosol Simulations

    SciTech Connect

    Koch, D

    2011-03-20

    The regional-scale Weather Research and Forecasting (WRF) model is being used by a DOE Earth System Modeling (ESM) project titled “Improving the Characterization of Clouds, Aerosols and the Cryosphere in Climate Models” to evaluate the performance of atmospheric process modules that treat aerosols and aerosol radiative forcing in the Arctic. We are using a regional-scale modeling framework for three reasons: (1) It is easier to produce a useful comparison to observations with a high resolution model; (2) We can compare the behavior of the CAM parameterization suite with some of the more complex and computationally expensive parameterizations used in WRF; (3) We can explore the behavior of this parameterization suite at high resolution. Climate models like the Community Atmosphere Model version 5 (CAM5) being used within the Community Earth System Model (CESM) will not likely be run at mesoscale spatial resolutions (10–20 km) until 5–10 years from now. The performance of the current suite of physics modules in CAM5 at such resolutions is not known, and current computing resources do not permit high-resolution global simulations to be performed routinely. We are taking advantage of two tools recently developed under PNNL Laboratory Directed Research and Development (LDRD) projects for this activity. The first is the Aerosol Modeling Testbed (Fast et al., 2011b), a new computational framework designed to streamline the process of testing and evaluating aerosol process modules over a range of spatial and temporal scales. The second is the CAM5 suite of physics parameterizations that have been ported into WRF so that their performance and scale dependency can be quantified at mesoscale spatial resolutions (Gustafson et al., 2010; with more publications in preparation).

  13. Noctilucent Cloud Sightings

    NASA Video Gallery

    Polar Mesospheric Clouds form during each polar region's summer months in the coldest place in the atmosphere, 50 miles above Earth's surface. Noctilucent Clouds were first observed in 1885 by an a...

  14. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, client, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues which need to be addressed to ensure the success of Cloud computing in the future.

  15. Private Cloud Communities for Faculty and Students

    ERIC Educational Resources Information Center

    Tomal, Daniel R.; Grant, Cynthia

    2015-01-01

    Massive open online courses (MOOCs) and public and private cloud communities continue to flourish in the field of higher education. However, MOOCs have received criticism in recent years and offer little benefit to students already enrolled at an institution. This article advocates for the collaborative creation and use of institutional, program…

  16. Computer animation of clouds

    SciTech Connect

    Max, N.

    1994-01-28

    Computer animation of outdoor scenes is enhanced by realistic clouds. I will discuss several different modeling and rendering schemes for clouds, and show how they evolved in my animation work. These include transparency-textured clouds on a 2-D plane, smooth-shaded or textured 3-D cloud surfaces, and 3-D volume rendering. For the volume rendering, I will present various illumination schemes, including the density emitter, single scattering, and multiple scattering models.

  17. Comparing Point Clouds

    DTIC Science & Technology

    2004-04-01

    Point clouds are one of the most primitive and fundamental surface representations. A popular source of point clouds are three dimensional shape...acquisition devices such as laser range scanners. Another important field where point clouds are found is in the representation of high-dimensional...framework for comparing manifolds given by point clouds is presented in this paper. The underlying theory is based on Gromov-Hausdorff distances, leading

  18. Cloud Coverage and Height Distribution from the GLAS Polar Orbiting Lidar: Comparison to Passive Cloud Retrievals

    NASA Technical Reports Server (NTRS)

    Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.

    2004-01-01

    The Geoscience Laser Altimeter System (GLAS) began full on-orbit operations in September 2003. A main application of the two-wavelength GLAS lidar is highly accurate detection and profiling of global cloud cover. Initial analysis indicates that cloud and aerosol layers are consistently detected on a global basis down to cross-sections of 10^-6 per meter. Images of the lidar data dramatically and accurately show the vertical structure of cloud and aerosol to the limit of signal attenuation. The GLAS lidar has made the most accurate measurement of global cloud coverage and height to date. In addition to the calibrated lidar signal, GLAS data products include multi-level boundaries and optical depth of all transmissive layers. Processing includes a multi-variable separation of cloud and aerosol layers. An initial application of the data results is to compare monthly cloud means from several months of GLAS observations in 2003 to existing cloud climatologies from other satellite measurements. In some cases direct comparison to passive cloud retrievals is possible. A limitation of the lidar measurements is nadir-only sampling. However, monthly means exhibit reasonably good global statistics, and coverage results, at other than polar regions, compare well with other measurements but show significant differences in height distribution. For polar regions, where passive cloud retrievals are problematic and where orbit track density is greatest, the GLAS results are a particular advance in cloud cover information. Direct comparison to MODIS retrievals shows better than 90% agreement in cloud detection for daytime, but less than 60% at night. Height retrievals are in much less agreement. GLAS is a part of the NASA EOS project and data products are thus openly available to the science community (see http://glo.gsfc.nasa.gov).

  19. Cloud Computing Explained

    ERIC Educational Resources Information Center

    Metz, Rosalyn

    2010-01-01

    While many talk about the cloud, few actually understand it. Three organizations' definitions come to the forefront when defining the cloud: Gartner, Forrester, and the National Institutes of Standards and Technology (NIST). Although both Gartner and Forrester provide definitions of cloud computing, the NIST definition is concise and uses…

  20. Clouds in Planetary Atmospheres

    NASA Technical Reports Server (NTRS)

    West, R.

    1999-01-01

    In the terrestrial atmosphere, clouds are familiar as vast collections of small water drops or ice crystals suspended in the air. The study of clouds touches on many facets of atmospheric science. The chemistry of clouds is tied to the chemistry of the surrounding atmosphere.

  1. Security in the cloud.

    PubMed

    Degaspari, John

    2011-08-01

    As more provider organizations look to the cloud computing model, they face a host of security-related questions. What are the appropriate applications for the cloud, what is the best cloud model, and what do they need to know to choose the best vendor? Hospital CIOs and security experts weigh in.

  2. The Roles of Cloud Drop Effective Radius and LWP in Determining Rain Properties in Marine Stratocumulus

    SciTech Connect

    Rosenfeld, Daniel; Wang, Hailong; Rasch, Philip J.

    2012-07-04

    Numerical simulations described in previous studies showed that adding cloud condensation nuclei to marine stratocumulus can prevent their breakup from closed into open cells. Additional analyses of the same simulations show that the suppression of rain is well described in terms of cloud drop effective radius (re). Rain is initiated when re near cloud top is around 12-14 μm. Cloud water starts to get depleted when column-maximum rain intensity (Rmax) exceeds 0.1 mm h^-1. This happens when cloud-top re reaches 14 μm. Rmax is mostly less than 0.1 mm h^-1 at re < 14 μm, regardless of the cloud water path, but increases rapidly when re exceeds 14 μm. This is in agreement with recent aircraft observations and theoretical observations in convective clouds, so that the mechanism is not limited to describing marine stratocumulus. These results support the hypothesis that the onset of significant precipitation is determined by the number of nucleated cloud drops and the height (H) above cloud base within the cloud that is required for cloud drops to reach re of 14 μm. In turn, this can explain the conditions for initiation of significant drizzle and opening of closed cells, providing the basis for a simple parameterization for GCMs that unifies the representation of both precipitating and non-precipitating clouds as well as the transition between them. Furthermore, satellite global observations of cloud depth (from base to top) and cloud-top re can be used to derive and validate this parameterization.
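
    The 14 μm threshold lends itself to the simple parameterization the authors propose. A sketch of how effective radius depends on droplet number at fixed liquid water content, using the common approximation re ≈ k^(-1/3) rv with k ≈ 0.8; all numbers are illustrative, not from these simulations:

        import numpy as np

        rho_w = 1000.0                                   # kg m^-3

        def r_e(lwc_g_m3, n_d_cm3, k=0.8):
            """Effective radius (m) from LWC and droplet number concentration."""
            n_d = n_d_cm3 * 1e6                          # cm^-3 -> m^-3
            lwc = lwc_g_m3 * 1e-3                        # g m^-3 -> kg m^-3
            r_v = (3.0 * lwc / (4.0 * np.pi * rho_w * n_d)) ** (1.0 / 3.0)
            return r_v * k ** (-1.0 / 3.0)               # re ~ k^(-1/3) * rv

        for n_d in (50, 100, 400):                       # clean to polluted
            re_um = 1e6 * r_e(0.5, n_d)                  # at LWC = 0.5 g m^-3
            flag = "  <- drizzle likely (re >= 14 um)" if re_um >= 14 else ""
            print("Nd = %3d cm^-3 -> re = %4.1f um%s" % (n_d, re_um, flag))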

  3. Cloud microstructure studies

    NASA Technical Reports Server (NTRS)

    Blau, H. H., Jr.; Fowler, M. G.; Chang, D. T.; Ryan, R. T.

    1972-01-01

    Over two thousand individual cloud droplet size distributions were measured with an optical cloud particle spectrometer flown on the NASA Convair 990 aircraft. Representative droplet spectra and liquid water content, L (g m^-3), were obtained for oceanic stratiform and cumuliform clouds. For non-precipitating clouds, values of L range from 0.1 g m^-3 to 0.5 g m^-3; with precipitation, L is often greater than 1 g m^-3. Measurements were also made in a newly formed contrail and in cirrus clouds.

  4. Intercomparison of model simulations of mixed-phase clouds observed during the ARM Mixed-Phase Arctic Cloud Experiment. Part II: Multi-layered cloud

    SciTech Connect

    Morrison, H; McCoy, R B; Klein, S A; Xie, S; Luo, Y; Avramov, A; Chen, M; Cole, J; Falk, M; Foster, M; Genio, A D; Harrington, J; Hoose, C; Khairoutdinov, M; Larson, V; Liu, X; McFarquhar, G; Poellot, M; Shipway, B; Shupe, M; Sud, Y; Turner, D; Veron, D; Walker, G; Wang, Z; Wolf, A; Xu, K; Yang, F; Zhang, G

    2008-02-27

    Results are presented from an intercomparison of single-column and cloud-resolving model simulations of a deep, multi-layered, mixed-phase cloud system observed during the ARM Mixed-Phase Arctic Cloud Experiment. This cloud system was associated with strong surface turbulent sensible and latent heat fluxes as cold air flowed over the open Arctic Ocean, combined with a low pressure system that supplied moisture at mid-levels. The simulations, performed by 13 single-column and 4 cloud-resolving models, generally overestimate the liquid water path and strongly underestimate the ice water path, although there is a large spread among the models. This finding is in contrast with results for the single-layer, low-level mixed-phase stratocumulus case in Part I of this study, as well as with previous studies of shallow mixed-phase Arctic clouds, which showed an underprediction of liquid water path. The overestimate of liquid water path and underestimate of ice water path occur primarily during periods when deeper mixed-phase clouds extending into the mid-troposphere were observed. These results suggest important differences in the ability of models to simulate Arctic mixed-phase clouds that are deep and multi-layered versus shallow and single-layered. In general, models with a more sophisticated, two-moment treatment of the cloud microphysics produce a somewhat smaller liquid water path that is closer to observations. The cloud-resolving models tend to produce a larger cloud fraction than the single-column models. The liquid water path and especially the cloud fraction have a large impact on the cloud radiative forcing at the surface, which is dominated by the longwave flux for this case.

  5. Towards Efficient Scientific Data Management Using Cloud Storage

    NASA Technical Reports Server (NTRS)

    He, Qiming

    2013-01-01

    A software prototype allows users to back up and restore data to and from both public and private cloud storage, such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption) and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup. Parallel data processing utilities have also been developed by using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
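
    A minimal sketch of the backup scheme described above, combining content-hash-based incremental upload, compression, and encryption. The in-memory cloud_store dict stands in for any object store (S3 or Nebula); every name here is hypothetical and none of it is the prototype's actual API. The Fernet recipe from the Python cryptography package supplies the encryption.

      # Minimal sketch: compressed, encrypted, incremental backup. The dict
      # below stands in for an object store; names are hypothetical.
      import gzip, hashlib
      from cryptography.fernet import Fernet   # pip install cryptography

      fernet = Fernet(Fernet.generate_key())   # in practice, a user-held key
      cloud_store = {}                         # object-store stand-in
      seen = set()                             # content hashes already uploaded

      def backup(data: bytes, chunk_size: int = 1 << 20) -> None:
          """Upload only chunks whose content hash has not been stored before."""
          for i in range(0, len(data), chunk_size):
              chunk = data[i:i + chunk_size]
              digest = hashlib.sha256(chunk).hexdigest()
              if digest in seen:               # incremental: skip unchanged chunks
                  continue
              cloud_store[digest] = fernet.encrypt(gzip.compress(chunk))
              seen.add(digest)

      backup(b"example payload " * 100000)
      backup(b"example payload " * 100000)     # second run uploads nothing new
      print(f"{len(cloud_store)} object(s) stored")

    Compressing before encrypting matters: ciphertext is incompressible, so reversing the order would forfeit the bandwidth savings.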

  6. Aerosol-Cloud-Drizzle-Turbulence Interactions in Boundary Layer Clouds

    DTIC Science & Technology

    2012-09-30

    and cloud observations in trade wind cumulus clouds using the CIRPAS aircraft with the cloud radar was designed and carried out. The observational...gradients in cloud properties off the coast. Further from the South Florida area of fair-weather cumulus clouds (Jan. 2008) where clouds with both...marine and continental characteristics were observed. This was followed by a set of observations made in 2010 of cumulus clouds off of Barbados

  7. Study of Multi-Scale Cloud Processes Over the Tropical Western Pacific Using Cloud-Resolving Models Constrained by Satellite Data

    SciTech Connect

    Dudhia, Jimy

    2013-03-12

    TWP-ICE using satellite and ground-based observations.
    -- Perform numerical experiments using WRF to investigate how convection over tropical islands in the Maritime Continent interacts with large-scale circulation and affects convection in nearby regions.
    -- Evaluate and apply WRF as a testbed for GCM cloud parameterizations, utilizing the ability of WRF to run on multiple scales (from cloud resolving to global) to isolate resolution and physics issues from dynamical and model framework issues.
    Key products will be disseminated to the ARM and larger community through distribution of data archives, including model outputs from the data assimilation products and cloud resolving simulations, and publications.

  8. Comparison of satellite-derived and observer-based determinations of cloud cover amount at the SGP CART site

    SciTech Connect

    Liaw, Y.P.; Cook, D.R.; Sisterson, D.L.; Gao, W.

    1995-06-01

    Cloud-climate feedback is one of the most important factors in predicting the timing and magnitude of global climate change and its regional effects. Recent satellite measurements indicate that the global effects of clouds on solar and infrared radiation are large. The experimental objective of the Atmospheric Radiation Measurement (ARM) Program is to characterize, empirically, the radiative processes in the Earth's atmosphere with improved resolution and accuracy. Therefore, the effective treatment of cloud formation and cloud properties is crucial for reliable climate prediction. This study focuses on the analysis of cloud cover data for the ARM Southern Great Plains (SGP) Cloud and Radiation Testbed (CART) site central facility. The data set was obtained from the Advanced Very High Resolution Radiometer (AVHRR) on National Oceanic and Atmospheric Administration (NOAA) Satellites 11 and 12, and from cloud observations made by SGP CART site operators. Such an analysis provides a basis for future evaluations with whole-sky cameras and provides a means of assessing the reliability of surface-based observations of cloud cover at the SGP CART site.

  9. OpenTopography

    NASA Astrophysics Data System (ADS)

    Baru, C.; Arrowsmith, R.; Crosby, C.; Nandigam, V.; Phan, M.; Cowart, C.

    2012-04-01

    OpenTopography is a cyberinfrastructure-based facility for online access to high-resolution topography data and tools. The project is an outcome of the Geosciences Network (GEON) project, a US research effort that investigated the use of cyberinfrastructure to support research and education in the geosciences. OpenTopography provides online access to large LiDAR point cloud datasets along with services for processing these data. Users are able to generate custom DEMs by invoking DEM services provided by OpenTopography with custom parameter values. Users can track the progress of their jobs, and a private myOpenTopo area retains job information and job outputs. Data available at OpenTopography are provided by a variety of data acquisition groups under joint agreements and memoranda of understanding (MoU). These include national facilities such as the National Center for Airborne Laser Mapping, as well as local, state, and federal agencies. OpenTopography is also being designed as a hub for high-resolution topography resources. Datasets and services available at other locations can also be registered here, providing a "one-stop shop" for such information. We will describe the OpenTopography system architecture and its current set of features, including the service-oriented architecture, a job-tracking database, and social networking features. We will also describe several design and development activities underway to archive and publish datasets using digital object identifiers (DOIs); create a more flexible and scalable high-performance environment for processing of large datasets; extend support for satellite-based and terrestrial lidar as well as synthetic aperture radar (SAR) data; and create a "pluggable" infrastructure for third-party services. OpenTopography has successfully created a facility for sharing lidar data. In the next phase, we are developing a facility that will also enable equally easy and successful sharing of

  10. Exploring the nonlinear cloud and rain equation.

    PubMed

    Koren, Ilan; Tziperman, Eli; Feingold, Graham

    2017-01-01

    Marine stratocumulus cloud decks are regarded as the reflectors of the climate system, returning to space a significant part of the incoming solar radiation and thus cooling the atmosphere. Such clouds can exist in two stable modes, open and closed cells, for a wide range of environmental conditions. This emergent behavior of the system, and its sensitivity to aerosol and environmental properties, is captured by a set of nonlinear equations. Here, using linear stability analysis, we express the transition from a steady state to a limit cycle analytically, showing how it depends on the model parameters. We show that the control of the droplet concentration (N), the environmental carrying capacity (H0), and the cloud recovery parameter (τ) can be linked by a single nondimensional parameter (μ = N/(ατH0)), suggesting that for deeper clouds the transition from open (oscillating) to closed (stable fixed point) cells will occur at higher droplet concentrations (i.e., higher aerosol loading). The analytical calculations of the possible states, and how they are affected by changes in aerosol and environmental variables, provide an enhanced understanding of the complex interactions of clouds and rain.
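
    As a reading aid for the abstract's nondimensional parameter, the sketch below (plain Python, not the authors' code) computes μ = N/(ατH0) and classifies the predicted cell regime. The critical value MU_CRIT and all numerical inputs are illustrative placeholders; the paper derives the actual transition point from linear stability analysis.

      def mu(n_drops, alpha, tau, h0):
          """Nondimensional control parameter mu = N / (alpha * tau * H0)."""
          return n_drops / (alpha * tau * h0)

      MU_CRIT = 1.0   # hypothetical critical value; the paper derives the real one

      def regime(m):
          # Higher mu (more drops, shallower cloud) -> stable closed cells;
          # lower mu -> oscillating open cells, per the abstract.
          return ("closed cells (stable fixed point)" if m > MU_CRIT
                  else "open cells (limit cycle)")

      # Illustrative inputs: raising N flips the predicted regime.
      for n in (50.0, 200.0):
          m = mu(n, alpha=0.1, tau=2.0, h0=600.0)
          print(f"N = {n:5.0f}: mu = {m:.2f} -> {regime(m)}")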

  11. Testbed for development of a DSP-based signal processing subsystem for an Earth-orbiting radar scatterometer

    NASA Technical Reports Server (NTRS)

    Clark, Douglas J.; Lux, James P.; Shirbacheh, Mike

    2002-01-01

    A testbed for the evaluation of general-purpose digital signal processors (DSPs) in Earth-orbiting radar scatterometers is discussed. Because general-purpose DSPs represent a departure from previous radar signal processing techniques used on scatterometers, there was a need to demonstrate key elements of the system to verify feasibility for potential future scatterometer instruments. Construction of the testbed also facilitated identification of an appropriate software development environment and the skills mix necessary to perform the work.

  12. The experiment of the OPMDC performance in a 43-Gb/s RZ-DQPSK 1200km transmission testbed

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoguang; Weng, Xuan; Tian, Feng; Xi, Lixia; Xiong, Qianjin; Li, Xixiang; Zhang, Guangyong

    2010-12-01

    This paper reports on a prototype optical polarization mode dispersion compensator (OPMDC) built to compensate polarization mode dispersion (PMD) in fibers. The prototype was tested in a 43 Gb/s RZ-DQPSK system both in the back-to-back configuration and over a 1200 km transmission testbed. It showed good performance under fast state-of-polarization (SOP) and principal-state-of-polarization (PSP) rotation, differential group delay (DGD) variation, and moderate mechanical knocks to the testbed over a 12-hour period.

  13. Dual-wavelength millimeter-wave radar measurements of cirrus clouds

    SciTech Connect

    Sekelsky, S.M.; Firda, J.M.; McIntosh, R.E.

    1996-04-01

    In April 1994, the University of Massachusetts' 33-GHz/95-GHz Cloud Profiling Radar System (CPRS) participated in the multi-sensor Remote Cloud Sensing (RCS) Intensive Operation Period (IOP), which was conducted at the Southern Great Plains Cloud and Radiation Testbed (CART). During the 3-week experiment, CPRS measured a variety of cloud types and severe weather. In the context of global warming, the most significant measurements are dual-frequency observations of cirrus clouds, which may eventually be used to estimate ice crystal size and shape. Much of the cirrus data collected with CPRS show differences between 33-GHz and 95-GHz reflectivity measurements that are correlated with Doppler estimates of fall velocity. Because of the small range of reflectivity differences, a precise calibration of the radar is required, and differential attenuation must also be removed from the data. Depolarization, which is an indicator of crystal shape, was also observed in several clouds. In this abstract we present examples of Mie scattering from cirrus and estimates of differential attenuation due to water vapor and oxygen that were derived from CART radiosonde measurements.
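
    The core of the dual-frequency analysis can be stated compactly: the 33-GHz/95-GHz reflectivity difference carries a crystal-size (Mie) signal only after two-way gaseous attenuation is removed. A minimal sketch, with placeholder attenuation values standing in for the radiosonde-derived corrections described above:

      # Minimal sketch: dual-wavelength ratio (dB) with differential gaseous
      # attenuation removed. Attenuation inputs are placeholders; the paper
      # derived them from CART radiosonde soundings.

      def dwr_corrected(z33_dbz, z95_dbz, atten33_db, atten95_db):
          """Dual-wavelength ratio after adding back two-way gas attenuation."""
          return (z33_dbz + atten33_db) - (z95_dbz + atten95_db)

      # One range gate: 3 dB raw difference, 1.2 dB of it differential attenuation
      dwr = dwr_corrected(z33_dbz=-10.0, z95_dbz=-13.0,
                          atten33_db=0.3, atten95_db=1.5)
      print(f"Mie (size) signal: {dwr:.1f} dB")   # -> 1.8 dB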

  14. Urban Climate Resilience - Connecting climate models with decision support cyberinfrastructure using open standards

    NASA Astrophysics Data System (ADS)

    Bermudez, L. E.; Percivall, G.; Idol, T. A.

    2015-12-01

    Experts in climate modeling, remote sensing of the Earth, and cyberinfrastructure must work together in order to make climate predictions available to decision makers. Such experts and decision makers worked together in the Open Geospatial Consortium's (OGC) Testbed 11 to address a scenario of population displacement by coastal inundation due to predicted sea level rise. In the Policy Fact Sheet "Harnessing Climate Data to Boost Ecosystem & Water Resilience", issued by the White House Office of Science and Technology Policy (OSTP) in December 2014, OGC committed to increasing access to climate change information using open standards. In July 2015, the OGC Testbed 11 Urban Climate Resilience activity delivered on that commitment with open-standards-based support for climate-change preparedness. Using open standards such as the OGC Web Coverage Service and Web Processing Service and the NetCDF and GMLJP2 encoding standards, Testbed 11 deployed an interoperable high-resolution flood model to bring climate model outputs together with global change assessment models and other remote sensing data for decision support. Methods to confirm model predictions and to allow "what-if" scenarios included in-situ sensor webs and crowdsourcing. The scenario was exercised in two locations: the San Francisco Bay Area and Mozambique. The scenarios demonstrated the interoperation and capabilities of open geospatial specifications in supporting data services and processing services. The resulting High Resolution Flood Information System addressed access to and control of simulation models and high-resolution data in an open, worldwide, collaborative Web environment. The scenarios examined the feasibility and capability of existing OGC geospatial Web service specifications in supporting the on-demand, dynamic serving of flood information from models with forecasting capacity. Results of this testbed included the identification of standards and best practices that help researchers and cities deal with climate-related issues.
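
    For concreteness, a WCS 2.0 GetCoverage request of the kind such a flood information system consumes can be issued in a few lines of Python. The endpoint URL and coverage identifier below are hypothetical; the key-value parameters follow the standard OGC WCS 2.0 convention.

      # Minimal sketch: fetch a NetCDF subset over the San Francisco Bay window
      # from a hypothetical WCS 2.0 endpoint.
      import requests

      ENDPOINT = "https://example.org/wcs"   # hypothetical service endpoint

      params = {
          "service": "WCS",
          "version": "2.0.1",
          "request": "GetCoverage",
          "coverageId": "sea_level_projection_2100",               # hypothetical
          "subset": ["Lat(37.4,38.0)", "Long(-122.6,-121.8)"],     # SF Bay window
          "format": "application/x-netcdf",                        # NetCDF output
      }
      resp = requests.get(ENDPOINT, params=params, timeout=60)
      resp.raise_for_status()
      with open("coverage.nc", "wb") as f:
          f.write(resp.content)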

  15. NASA's flight-technological development program - A 650 Mbps laser communications testbed

    NASA Technical Reports Server (NTRS)

    Hayden, W. L.; Fitzmaurice, M. W.; Nace, D. A.; Lokerson, D. C.; Minott, P. O.; Chapman, W. W.

    1991-01-01

    A 650 Mbps laser communications testbed under construction for the development of flight-qualifiable hardware suitable for near-term operation on geosynchronous-to-geosynchronous crosslink missions is presented. The program's primary purpose is to develop and optimize the subsystems unique to laser communications. Requirements for the testbed experiments are to optimize the acquisition process, to fully simulate the long range (up to 21,000 km) and fine-tracking characteristics of two narrow-beam laser communications terminals, and to fully test communications performance, including average and burst bit error rates, effects of laser diode coalignment, degradation due to internal and external stray light, and the impact of drifts in the optical components.
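
    A rough link-budget sketch shows why simulating the 21,000 km range matters. The aperture, wavelength, power, and efficiency values below are illustrative assumptions, not program numbers; only the free-space relations are standard.

      # Minimal sketch: optical crosslink budget at the 21,000 km range the
      # testbed simulates. All numeric inputs are illustrative assumptions.
      import math

      H_PLANCK, C_LIGHT = 6.626e-34, 3.0e8

      def received_power(p_tx, d_tx, d_rx, wavelength, range_m, eff=0.25):
          """Pr = Pt * Gt * Gr * (lambda / (4 pi R))^2 * optical efficiencies."""
          g_tx = (math.pi * d_tx / wavelength) ** 2   # diffraction-limited gain
          g_rx = (math.pi * d_rx / wavelength) ** 2
          space_loss = (wavelength / (4.0 * math.pi * range_m)) ** 2
          return p_tx * g_tx * g_rx * space_loss * eff

      pr = received_power(p_tx=0.5, d_tx=0.2, d_rx=0.2,
                          wavelength=850e-9, range_m=21.0e6)
      photons_per_bit = pr / (H_PLANCK * C_LIGHT / 850e-9) / 650e6  # at 650 Mbps
      print(f"Pr = {pr:.2e} W  (~{photons_per_bit:.0f} photons/bit)")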

  16. The Langley Research Center CSI phase-0 evolutionary model testbed-design and experimental results

    NASA Technical Reports Server (NTRS)

    Belvin, W. K.; Horta, Lucas G.; Elliott, K. B.

    1991-01-01

    A testbed for the development of Controls Structures Interaction (CSI) technology is described. The design philosophy, capabilities, and early experimental results are presented to introduce some of the ongoing CSI research at NASA-Langley. The testbed, referred to as the Phase 0 version of the CSI Evolutionary Model (CEM), is the first stage of model complexity designed to show the benefits of CSI technology and to identify weaknesses in current capabilities. Early closed-loop test results have shown that non-model-based controllers can provide an order-of-magnitude increase in damping in the first few flexible vibration modes. Model-based controllers for higher performance will need to be robust to model uncertainty, as verified by system identification tests. Data are presented showing that finite element model frequency predictions differ from those obtained from tests. Plans are also presented for evolution of the CEM to study integrated controller and structure design as well as multiple payload dynamics.
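
    The order-of-magnitude damping result from non-model-based control can be illustrated on a single flexible mode: rate feedback u = -g * xdot adds directly to the modal damping coefficient, with no structural model required. The modal values below are illustrative, not CEM parameters.

      # Minimal sketch: rate feedback on one lightly damped flexible mode,
      # m*x'' + c*x' + k*x = u with u = -g*x'. Numbers are illustrative.
      import math

      m, k = 1.0, (2.0 * math.pi * 1.5) ** 2   # unit modal mass, ~1.5 Hz mode
      zeta_open = 0.002                        # typical lightly damped structure
      c = 2.0 * zeta_open * math.sqrt(k * m)   # open-loop modal damping

      g = 9.0 * c                              # rate-feedback gain
      zeta_closed = (c + g) / (2.0 * math.sqrt(k * m))
      print(f"open-loop zeta = {zeta_open:.4f}, "
            f"closed-loop zeta = {zeta_closed:.4f}")  # 10x increase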

  17. An investigation of DTNS2D for use as an incompressible turbulence modelling test-bed

    NASA Technical Reports Server (NTRS)

    Steffen, Christopher J., Jr.

    1992-01-01

    This paper documents an investigation of a two-dimensional, incompressible Navier-Stokes solver for use as a test-bed for turbulence modelling. DTNS2D is the code under consideration for use at the Center for Modelling of Turbulence and Transition (CMOTT). This code was created by Gorski at the David Taylor Research Center and incorporates the pseudo-compressibility method. Two laminar benchmark flows are used to measure the performance and implementation of the method. The classical solution of the Blasius boundary layer is used to validate the flat plate flow, while experimental data are incorporated in the validation of the backward-facing step flow. Velocity profiles, convergence histories, and reattachment lengths are used to quantify these calculations. The organization and adaptability of the code are also examined in light of its role as a numerical test-bed.
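
    The Blasius benchmark is easy to reproduce independently of the solver, which is what makes it a good validation case. A minimal sketch (Python with SciPy, not the code under study) solves f''' + (1/2) f f'' = 0 with f(0) = f'(0) = 0 and f'(inf) = 1 by shooting on f''(0):

      # Minimal sketch: Blasius flat-plate similarity solution via shooting.
      from scipy.integrate import solve_ivp

      def rhs(eta, y):                      # y = [f, f', f'']
          return [y[1], y[2], -0.5 * y[0] * y[2]]

      def fprime_at_infinity(fpp0, eta_max=10.0):
          sol = solve_ivp(rhs, (0.0, eta_max), [0.0, 0.0, fpp0], rtol=1e-8)
          return sol.y[1, -1]               # f'(eta_max), should approach 1

      lo, hi = 0.1, 1.0                     # bisection bracket for f''(0)
      for _ in range(60):
          mid = 0.5 * (lo + hi)
          if fprime_at_infinity(mid) < 1.0:
              lo = mid
          else:
              hi = mid
      print(f"f''(0) = {0.5 * (lo + hi):.5f}")  # classical value ~0.33206

    Recovering the classical wall-shear value f''(0) ~ 0.33206 gives the reference against which the solver's flat-plate velocity profiles can be compared.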

  18. Towards an Experimental Testbed Facility for Cyber-Physical Security Research

    SciTech Connect

    Edgar, Thomas W.; Manz, David O.; Carroll, Thomas E.

    2012-01-07

    Cyber-Physical Systems (CPSs) are under great scrutiny due to large Smart Grid investments and recent high-profile security vulnerabilities and attacks. Research into improved security technologies, communication models, and emergent behavior is necessary to protect these systems from sophisticated adversaries and from new risks posed by the convergence of CPSs with IT equipment. However, cyber-physical security research is limited by the lack of access to universal cyber-physical testbed facilities that permit flexible, high-fidelity experiments. This paper presents a remotely configurable and community-accessible testbed design that integrates elements from the virtual, simulated, and physical environments. Fusing data between the three environments enables the creation of realistic and scalable environments where new functionality and ideas can be exercised. This novel design will enable the research community to analyze and evaluate the security of current environments and to design future, secure, cyber-physical technologies.

  19. The implementation of the Human Exploration Demonstration Project (HEDP), a systems technology testbed

    NASA Technical Reports Server (NTRS)

    Rosen, Robert; Korsmeyer, David J.

    1993-01-01

    The Human Exploration Demonstration Project (HEDP) is an ongoing task at NASA's Ames Research Center to address the advanced technology requirements necessary to implement an integrated working and living environment for a planetary surface habitat. The integrated environment consists of life support systems, physiological monitoring of the project crew, a virtual environment workstation, and centralized data acquisition and habitat systems health monitoring. The HEDP is an integrated technology demonstrator as well as an initial operational testbed. Several robotic systems are operational in a simulated planetary landscape external to the habitat environment to provide representative workloads for the crew. This paper describes the evolution of the HEDP from initial concept to operational project; the status of the HEDP after two years; the final facilities composing the HEDP; the project's role as a NASA Ames Research Center systems technology testbed; and the interim demonstration scenarios that were run to feature the developing technologies in 1993.

  20. System engineering techniques for establishing balanced design and performance guidelines for the advanced telerobotic testbed

    NASA Technical Reports Server (NTRS)

    Zimmerman, W. F.; Matijevic, J. R.

    1987-01-01

    Novel system engineering techniques have been developed and applied to establish structured design and performance objectives for the Telerobotics Testbed that reduce technical risk while still allowing the testbed to demonstrate an advancement in state-of-the-art robotic technologies. To establish the appropriate tradeoff structure and balance of technology performance against technical risk, an analytical database was developed that drew on: (1) automation/robot-technology availability projections, (2) typical or potential application mission task sets, (3) performance simulations, (4) project schedule constraints, and (5) project funding constraints. Design tradeoffs and configuration/performance iterations were conducted by comparing feasible technology/task-set configurations against schedule/budget constraints as well as original program target technology objectives. The final system configuration, task set, and technology set reflected a balanced advancement in state-of-the-art robotic technologies, while meeting programmatic objectives and schedule/cost constraints.