Research on OpenStack of open source cloud computing in colleges and universities’ computer room
NASA Astrophysics Data System (ADS)
Wang, Lei; Zhang, Dandan
2017-06-01
In recent years, cloud computing technology has developed rapidly, especially open source cloud computing. Open source cloud computing has attracted a large user base through the advantages of open source code and low cost, and has now reached large-scale promotion and application. In this paper, we first briefly introduce the main functions and architecture of the open source cloud computing tool OpenStack, and then discuss in depth the core problems of computer labs in colleges and universities. Building on this analysis, we describe the specific application and deployment of OpenStack in university computer rooms. The experimental results show that OpenStack can efficiently and conveniently deploy a cloud for a university computer room, with stable performance and good functional value.
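The paper's deployment specifics are not reproduced here, but the general pattern of provisioning a lab machine against an OpenStack cloud can be sketched with the openstacksdk Python client. This is a minimal illustration, not the authors' procedure; the cloud profile, image, flavor, and network names are hypothetical placeholders.

import openstack  # openstacksdk

# Credentials come from a clouds.yaml profile; "campus-lab" is hypothetical.
conn = openstack.connect(cloud="campus-lab")

image = conn.compute.find_image("ubuntu-20.04")    # hypothetical image name
flavor = conn.compute.find_flavor("m1.small")      # hypothetical flavor name
network = conn.network.find_network("lab-net")     # hypothetical network name

server = conn.compute.create_server(
    name="lab-workstation-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
# Block until the instance is ACTIVE before handing it to a student.
server = conn.compute.wait_for_server(server)
print(server.status)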
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are adopting cloud computing to better utilize computing resources by taking advantage of its scalability, cost reduction, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performance characteristics, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in the central processing unit (CPU), memory or I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Open Source Cloud-Based Technologies for BIM
NASA Astrophysics Data System (ADS)
Logothetis, S.; Karachaliou, E.; Valari, E.; Stylianidis, E.
2018-05-01
This paper presents a Cloud-based open source system for storing and processing data from a 3D survey approach. More specifically, we provide an online service for viewing, storing and analysing BIM. Cloud technologies were used to develop a web interface as a BIM data centre, which can handle large BIM data using a server. The server can be accessed by many users through various electronic devices anytime and anywhere, so they can view online 3D models using browsers. Nowadays, Cloud computing is engaged progressively in facilitating BIM-based collaboration between the multiple stakeholders and disciplinary groups of complicated Architectural, Engineering and Construction (AEC) projects. Moreover, the development of Open Source Software (OSS) has been growing rapidly and its use is becoming increasingly widespread. Although BIM and Cloud technologies are widely known and used, there is a lack of integrated open source Cloud-based platforms able to support all stages of BIM processes. The present research aims to create an open source Cloud-based BIM system that is able to handle geospatial data. In this effort, only open source tools are used: from the starting point of creating the 3D model with FreeCAD to its online presentation through BIMserver. Python plug-ins will be developed to link the two software packages and will be distributed freely to a large community of professionals. The research work is completed by benchmarking four Cloud-based BIM systems that present remarkable results: Autodesk BIM 360, BIMserver, Graphisoft BIMcloud and Onuma System.
The Role of Standards in Cloud-Computing Interoperability
2012-10-01
services are not shared outside the organization. CloudStack, Eucalyptus, HP, Microsoft, OpenStack, Ubuntu, and VMWare provide tools for building... center requirements • Developing usage models for cloud vendors • Independent IT consortium OpenStack http://www.openstack.org • Open-source... software for running private clouds • Currently consists of three core software projects: OpenStack Compute (Nova), OpenStack Object Storage (Swift
Processing UAV and Lidar Point Clouds in GRASS GIS
NASA Astrophysics Data System (ADS)
Petras, V.; Petrasova, A.; Jeziorska, J.; Mitasova, H.
2016-06-01
Today's methods of acquiring Earth surface data, namely lidar and unmanned aerial vehicle (UAV) imagery, non-selectively collect or generate large amounts of points. Point clouds from different sources vary in their properties, such as number of returns, density, or quality. We present a set of tools with applications for different types of point clouds obtained by a lidar scanner, the structure from motion (SfM) technique, and a low-cost 3D scanner. To take advantage of the vertical structure of multiple-return lidar point clouds, we demonstrate tools to process them using 3D raster techniques which allow, for example, the development of custom vegetation classification methods. Dense point clouds obtained from UAV imagery, often containing redundant points, can be decimated using various techniques before further processing, as sketched below. We implemented and compared several decimation techniques with regard to their performance and the resulting digital surface model (DSM). Finally, we describe the processing of a point cloud from a low-cost 3D scanner, namely the Microsoft Kinect, and its application for interaction with physical models. All the presented tools are open source and integrated in GRASS GIS, a multi-purpose open source GIS with remote sensing capabilities. The tools integrate with other open source projects, specifically the Point Data Abstraction Library (PDAL), the Point Cloud Library (PCL), and the OpenKinect libfreenect2 library, to benefit from the open source point cloud ecosystem. The implementation in GRASS GIS ensures long-term maintenance and reproducibility by the scientific community as well as by the original authors themselves.
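As a concrete illustration of the decimation step mentioned above, the following minimal sketch thins a dense UAV point cloud with the PDAL Python bindings before further processing (for example, import into GRASS GIS). The file names are hypothetical and this is not the authors' exact workflow.

import json
import pdal

# A PDAL pipeline: read a dense cloud, keep every 10th point, write it back out.
pipeline_def = {
    "pipeline": [
        "uav_points.laz",                            # hypothetical input file
        {"type": "filters.decimation", "step": 10},  # simple step decimation
        "uav_points_thin.laz",                       # hypothetical output file
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
count = pipeline.execute()  # returns the number of points processed
print(count, "points written after decimation")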
ProteoCloud: a full-featured open source proteomics cloud computing pipeline.
Muth, Thilo; Peters, Julian; Blackburn, Jonathan; Rapp, Erdmann; Martens, Lennart
2013-08-02
We here present the ProteoCloud pipeline, a freely available, full-featured cloud-based platform to perform computationally intensive, exhaustive searches in a cloud environment using five different peptide identification algorithms. ProteoCloud is entirely open source, and is built around an easy-to-use, cross-platform software client with a rich graphical user interface. This client allows full control of the number of cloud instances to initiate and of the spectra to assign for identification. It also enables the user to track progress, and to visualize and interpret the results in detail. Source code, binaries and documentation are all available at http://proteocloud.googlecode.com.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, Hyunwoo; Timm, Steven
We present a summary of how X.509 authentication and authorization are used with OpenNebula in FermiCloud. We also describe a history of why the X.509 authentication was needed in FermiCloud, and review X.509 authorization options, both internal and external to OpenNebula. We show how these options can be and have been used to successfully run scientific workflows on federated clouds, which include OpenNebula on FermiCloud and Amazon Web Services as well as other community clouds. We also outline federation options being used by other commercial and open-source clouds and cloud research projects.
Creating a Rackspace and NASA Nebula compatible cloud using the OpenStack project (Invited)
NASA Astrophysics Data System (ADS)
Clark, R.
2010-12-01
NASA and Rackspace have both contributed technology to OpenStack, which allows anyone to create a private Infrastructure as a Service (IaaS) cloud using open source software and commodity hardware. OpenStack is designed and developed completely in the open and with an open governance process. NASA donated Nova, which powers the compute portion of the NASA Nebula Cloud Computing Platform, and Rackspace donated Swift, which powers Rackspace Cloud Files. The project is now under continuous development by NASA, Rackspace, and hundreds of other participants. When you create a private cloud using OpenStack, you have the ability to easily interact with your private cloud, a government cloud, and an ecosystem of public cloud providers, using the same API.
NASA Astrophysics Data System (ADS)
Delipetrev, Blagoj
2016-04-01
Presently, most existing software is desktop-based and designed to work on a single computer, which is a major limitation in many ways, starting from limited processing power, storage, accessibility and availability. The only feasible solution lies in the web and the cloud. This abstract presents research and development of a cloud computing geospatial application for water resources based on free and open source software and open standards, using a hybrid public-private cloud deployment model running on two separate virtual machines (VMs). The first (VM1) runs on Amazon Web Services (AWS) and the second (VM2) runs on a Xen cloud platform. The cloud application is developed using free and open source software, open standards and prototype code, and presents a framework for developing specialized cloud geospatial applications that require only a web browser to use. This cloud application is the ultimate geospatial collaboration platform because multiple users across the globe with an internet connection and a browser can jointly model geospatial objects, enter attribute data and information, execute algorithms, and visualize results. The presented cloud application is available all the time, accessible from everywhere, scalable, works in a distributed computing environment, creates a real-time multiuser collaboration platform, uses interoperable programming languages and components, and is flexible in including additional components. The cloud geospatial application is implemented as a specialized water resources application with three web services: 1) data infrastructure (DI), 2) support for water resources modelling (WRM), and 3) user management. The web services run on the two VMs, which communicate over the internet to provide services to users. The application was tested on the Zletovica river basin case study with multiple concurrent users. The application is a state-of-the-art cloud geospatial collaboration platform. The presented solution is a prototype and can be used as a foundation for developing any specialized cloud geospatial application. Further research will focus on distributing the cloud application over additional VMs and testing the scalability and availability of the services.
2017-06-01
for GIFT Cloud, the web-based application version of the Generalized Intelligent Framework for Tutoring (GIFT). GIFT is a modular, open-source... external applications. GIFT is available to users with a GIFT Account at no cost. GIFT Cloud is an implementation of GIFT. This web-based application... section. Requirements for GIFT Cloud: GIFT Cloud is accessed via a web browser
Diagnosing turbulence for research aircraft safety using open source toolkits
NASA Astrophysics Data System (ADS)
Lang, T. J.; Guy, N.
Open source software toolkits have been developed and applied to diagnose in-cloud turbulence in the vicinity of Earth science research aircraft via analysis of ground-based Doppler radar data. Based on multiple retrospective analyses, these toolkits show promise for detecting significant turbulence well before cloud penetrations by research aircraft. A pilot study demonstrated the ability to provide mission scientists with turbulence estimates in near real time during an actual field campaign, and these toolkits are therefore recommended for use in future cloud-penetrating aircraft field campaigns.
NASA Astrophysics Data System (ADS)
Possner, A.; Wang, H.; Caldeira, K.; Wood, R.; Ackerman, T. P.
2017-12-01
Aerosol-cloud interactions (ACIs) in marine stratocumulus remain a significant source of uncertainty in constraining the cloud-radiative effect in a changing climate. Ship tracks are undoubted manifestations of ACIs embedded within stratocumulus cloud decks and have proven to be a useful framework for studying the effect of aerosol perturbations on cloud morphology and macrophysical, microphysical and cloud-radiative properties. However, so far most observational (Christensen et al. 2012, Chen et al. 2015) and numerical studies (Wang et al. 2011, Possner et al. 2015, Berner et al. 2015) have concentrated on ship tracks in shallow boundary layers with depths between 300 and 800 m, while most stratocumulus decks form in significantly deeper boundary layers (Muhlbauer et al. 2014). In this study we investigate the efficacy of aerosol perturbations in deep open- and closed-cell stratocumulus. Multi-day idealised cloud-resolving simulations are performed for the RF06 flight of the VOCALS-REx field campaign (Wood et al. 2011). During this flight, pockets of deep open and closed cells were observed in a 1410 m deep boundary layer. The efficacy of aerosol perturbations of varied concentrations and spatial gradients in altering the cloud micro- and macrophysical state and cloud-radiative effect is determined in both cloud regimes. Our simulations show that a continued point-source emission flux of 1.16x10^11 particles m^-2 s^-1 applied within a 300x300 m^2 grid box induces pronounced cloud cover changes in approximately a third of the simulated 80x80 km^2 domain, a weakening of the diurnal cycle in the open-cell regime, and a resulting increase in domain-mean cloud albedo of 0.2. Furthermore, we contrast the efficacy of equal-strength near-surface and above-cloud aerosol perturbations in altering the cloud state.
OpenID Connect as a security service in cloud-based medical imaging systems.
Ma, Weina; Sartipi, Kamran; Sharghigoorabi, Hassan; Koff, David; Bak, Peter
2016-04-01
The evolution of cloud computing is driving the next generation of medical imaging systems. However, privacy and security concerns have consistently been regarded as the major obstacles to the adoption of cloud computing in healthcare domains. OpenID Connect, which combines OpenID and OAuth, is an emerging representational state transfer (REST)-based federated identity solution. It is one of the most widely adopted open standards and may become the de facto standard for securing cloud computing and mobile applications; it has been called the "Kerberos of the cloud." We introduce OpenID Connect as an authentication and authorization service in cloud-based diagnostic imaging (DI) systems, and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure sharing of medical images among diagnostic imaging repositories (DI-r) and heterogeneous picture archiving and communication systems (PACS), as well as web-based and mobile clients, in the cloud ecosystem. The main objective is to use the open-source OpenID Connect single sign-on and authorization service in a user-centric manner, so that deploying DI-r and PACS to private or community clouds provides security levels equivalent to the traditional computing model.
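To make the OpenID Connect flow concrete, here is a minimal sketch of the token exchange step in the authorization code flow, the point at which a DI-r or PACS client trades a one-time code for identity and access tokens. The endpoint URL and client credentials are hypothetical placeholders, not values from the study.

import requests

TOKEN_ENDPOINT = "https://idp.example.org/oidc/token"  # hypothetical identity provider

resp = requests.post(
    TOKEN_ENDPOINT,
    data={
        "grant_type": "authorization_code",
        "code": "AUTH_CODE_FROM_REDIRECT",          # obtained via the browser redirect
        "redirect_uri": "https://pacs.example.org/callback",
        "client_id": "pacs-client",                 # hypothetical client registration
        "client_secret": "s3cret",
    },
)
tokens = resp.json()
id_token = tokens["id_token"]          # signed JWT asserting the user's identity
access_token = tokens["access_token"]  # authorizes calls to image-sharing APIs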
Genes2WordCloud: a quick way to identify biological themes from gene lists and free text.
Baroukh, Caroline; Jenkins, Sherry L; Dannenfelser, Ruth; Ma'ayan, Avi
2011-10-13
Word clouds recently emerged on the web as a solution for quickly summarizing text by maximizing the display of the most relevant terms about a specific topic in the minimum amount of space. As biologists are faced with a daunting amount of new research data commonly presented in textual formats, word clouds can be used to summarize and represent biological and/or biomedical content for various applications. Genes2WordCloud is a web application that enables users to quickly identify biological themes from gene lists and research-relevant text by constructing and displaying word clouds. It provides users with several different options and ideas for the sources that can be used to generate a word cloud. Different options for rendering and coloring the word clouds give users the flexibility to quickly generate customized word clouds of their choice. Genes2WordCloud is a word-cloud generator and viewer based on WordCram, implemented using Java, Processing, AJAX, MySQL, and PHP. Text is fetched from several sources and then processed to extract the most relevant terms, with weights computed from word frequencies. Genes2WordCloud is freely available for use online; it is open source software and is available for installation on any website, along with supporting documentation, at http://www.maayanlab.net/G2W. Genes2WordCloud provides a useful way to summarize and visualize large amounts of textual biological data or to find biological themes from several different sources. The open source availability of the software enables users to implement customized word clouds on their own websites and desktop applications.
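The core weighting step described above, ranking terms by frequency after removing common words, can be sketched in a few lines of Python. This is an illustrative toy, not Genes2WordCloud's actual implementation (which is Java-based), and the stop-word list is a tiny subset.

import re
from collections import Counter

STOP_WORDS = {"the", "a", "of", "and", "in", "to", "is", "for", "with"}

def term_weights(text, top_n=50):
    # Tokenize, drop stop words, and count occurrences; the counts
    # become the font-size weights when the cloud is rendered.
    tokens = re.findall(r"[a-z][a-z0-9-]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS)
    return counts.most_common(top_n)

print(term_weights("p53 regulates the cell cycle and apoptosis; p53 is a tumor suppressor"))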
Elastic Cloud Computing Infrastructures in the Open Cirrus Testbed Implemented via Eucalyptus
NASA Astrophysics Data System (ADS)
Baun, Christian; Kunze, Marcel
Cloud computing realizes the advantages and overcomes some restrictions of the grid computing paradigm. Elastic infrastructures can easily be created and managed by cloud users. In order to accelerate research on data center management and cloud services, the OpenCirrus(TM) research testbed has been started by HP, Intel and Yahoo!. Although commercial cloud offerings are proprietary, Open Source solutions exist in the field of IaaS with Eucalyptus, PaaS with AppScale, and at the applications layer with Hadoop MapReduce. This paper examines the I/O performance of cloud computing infrastructures implemented with Eucalyptus in contrast to Amazon S3.
Menu-driven cloud computing and resource sharing for R and Bioconductor.
Bolouri, Hamid; Dulepet, Rajiv; Angerman, Michael
2011-08-15
We report CRdata.org, a cloud-based, free, open-source web server for running analyses and sharing data and R scripts with others. In addition to using the free public service, CRdata users can launch their own private Amazon Elastic Compute Cloud (EC2) nodes and store private data and scripts on Amazon's Simple Storage Service (S3) with user-controlled access rights. All CRdata services are provided via point-and-click menus. CRdata is open-source and free under the permissive MIT License (opensource.org/licenses/mit-license.php). The source code is in Ruby (ruby-lang.org/en/) and available at github.com/seerdata/crdata. Contact: hbolouri@fhcrc.org.
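The kind of operation CRdata wraps behind its menus, launching a private EC2 node, looks roughly like the following boto3 sketch. The AMI ID, instance type, and key pair are hypothetical placeholders; CRdata's own Ruby implementation will differ.

import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical R/Bioconductor machine image
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=1,
    KeyName="my-keypair",             # hypothetical SSH key pair
)
instance = instances[0]
instance.wait_until_running()
instance.reload()                     # refresh attributes such as the DNS name
print(instance.public_dns_name)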
OpenID Connect as a security service in cloud-based diagnostic imaging systems
NASA Astrophysics Data System (ADS)
Ma, Weina; Sartipi, Kamran; Sharghi, Hassan; Koff, David; Bak, Peter
2015-03-01
The evolution of cloud computing is driving the next generation of diagnostic imaging (DI) systems. Cloud-based DI systems are able to deliver better services to patients without being constrained to their own physical facilities. However, privacy and security concerns have consistently been regarded as the major obstacle to the adoption of cloud computing in healthcare domains. Furthermore, the traditional computing models and interfaces employed by DI systems are not ready for accessing diagnostic images through mobile devices. RESTful technology is ideal for provisioning both mobile services and cloud computing. OpenID Connect, which combines OpenID and OAuth, is an emerging REST-based federated identity solution. It is one of the most promising open standards and may become the de facto standard for securing cloud computing and mobile applications; it has been regarded as the "Kerberos of the cloud." We introduce OpenID Connect as an identity and authentication service in cloud-based DI systems and propose enhancements that allow this technology to be incorporated within distributed enterprise environments. The objective of this study is to offer solutions for secure radiology image sharing among DI-r (Diagnostic Imaging Repository) and heterogeneous PACS (Picture Archiving and Communication Systems) as well as mobile clients in the cloud ecosystem. By using OpenID Connect as an open-source identity and authentication service, deploying DI-r and PACS to private or community clouds can attain a security level equivalent to the traditional computing model.
The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications
NASA Astrophysics Data System (ADS)
Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.
2016-12-01
The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day-to-day research. Yet with new opportunities there are always challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects: one is the limited bandwidth and latency of the communication link due to the geographical location of the resources; the other is the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components that directly address GIS-enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which is itself a NASA/Esri collaboration project. The MRF raster format has cloud-aware features that make it possible to build high-performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high-performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL and Esri software. Taken together, GDAL, MRF and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.
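A small sketch of the connectivity pattern this toolkit targets: GDAL's /vsicurl/ virtual file system reads only the byte ranges needed for a requested window, which is the style of access that cloud-aware formats such as MRF are designed to serve efficiently. The URL is a hypothetical placeholder.

from osgeo import gdal

gdal.UseExceptions()

# Open a cloud-hosted raster over HTTP; nothing is downloaded up front.
ds = gdal.Open("/vsicurl/https://data.example.org/imagery/scene.mrf")
band = ds.GetRasterBand(1)

# Read a single 512x512 window; range requests fetch just those bytes.
window = band.ReadAsArray(xoff=0, yoff=0, win_xsize=512, win_ysize=512)
print(window.shape, ds.RasterXSize, ds.RasterYSize)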
2012-05-01
NASA Nebula Platform • Cloud computing pilot program at NASA Ames • Integrates open-source components into seamless, self... Mission support • Education and public outreach (NASA Nebula, 2010) NSF Supported Cloud Research • Support for Cloud Computing in... Mell, P. & Grance, T. (2011). The NIST Definition of Cloud Computing. NIST Special Publication 800-145 • NASA Nebula (2010). Retrieved from
STORMSeq: an open-source, user-friendly pipeline for processing personal genomics data in the cloud.
Karczewski, Konrad J; Fernald, Guy Haskin; Martin, Alicia R; Snyder, Michael; Tatonetti, Nicholas P; Dudley, Joel T
2014-01-01
The increasing public availability of personal complete genome sequencing data has ushered in an era of democratized genomics. However, read mapping and variant calling software is constantly improving, and individuals with personal genomic data may prefer to customize and update their variant calls. Here, we describe STORMSeq (Scalable Tools for Open-Source Read Mapping), a graphical-interface cloud computing solution that does not require a parallel computing environment or extensive technical experience. This customizable and modular system performs read mapping, read cleaning, and variant calling and annotation. At present, STORMSeq costs approximately $2 and takes 5-10 hours to process a full exome sequence, and $30 and 3-8 days to process a whole genome sequence. We provide this open-access and open-source resource as a user-friendly interface in Amazon EC2.
Web Solutions Inspire Cloud Computing Software
NASA Technical Reports Server (NTRS)
2013-01-01
An effort at Ames Research Center to standardize NASA websites unexpectedly led to a breakthrough in open source cloud computing technology. With the help of Rackspace Inc. of San Antonio, Texas, the resulting product, OpenStack, has spurred the growth of an entire industry that is already employing hundreds of people and generating hundreds of millions in revenue.
Hybrid cloud: bridging of private and public cloud computing
NASA Astrophysics Data System (ADS)
Aryotejo, Guruh; Kristiyanto, Daniel Y.; Mufadhol
2018-05-01
Cloud computing has quickly emerged as a promising paradigm in recent years, especially for the business sector. Through cloud service providers, cloud computing is widely used by information technology (IT) startup companies to grow their business. However, most businesses' awareness of data security issues is low, since some cloud service providers (CSPs) could decrypt their data. The Hybrid Cloud Deployment Model (HCDM) is open source in character, which makes it one of the more secure cloud computing models, so HCDM may solve these data security issues. The objective of this study is to design, deploy and evaluate an HCDM as Infrastructure as a Service (IaaS). In the implementation process, the Metal as a Service (MAAS) engine was used as a base to build the actual server and nodes, followed by installation of the vsftpd application, which serves as the FTP server. For comparison with HCDM, a public cloud was adopted through a public cloud interface. As a result, the design and deployment of the HCDM were conducted successfully; in addition to having good security, the HCDM was able to transfer data significantly faster than the public cloud. To the best of our knowledge, the Hybrid Cloud Deployment Model is one of the more secure cloud computing models due to its open source character. Furthermore, this study will serve as a base for future studies of the Hybrid Cloud Deployment Model, which may be relevant for solving the big security issues of IT startup companies, especially in Indonesia.
DOT National Transportation Integrated Search
2016-11-17
The ETFOMM (Enhanced Transportation Flow Open Source Microscopic Model) Cloud Service (ECS) is a software product sponsored by the U.S. Department of Transportation in conjunction with the Microscopic Traffic Simulation Models and Software: An Op...
Cloud regimes as phase transitions
NASA Astrophysics Data System (ADS)
Stechmann, Samuel; Hottovy, Scott
2017-11-01
Clouds are repeatedly identified as a leading source of uncertainty in future climate predictions. Of particular importance are stratocumulus clouds, which can appear as either (i) closed cells that reflect solar radiation back to space or (ii) open cells that allow solar radiation to reach the Earth's surface. Here we show that these cloud regimes, open versus closed cells, fit the paradigm of a phase transition. In addition, this paradigm characterizes pockets of open cells (POCs) as the interface between the open- and closed-cell regimes, and it identifies shallow cumulus clouds as a regime of higher variability. This behavior can be understood using an idealized model for the dynamics of atmospheric water as a stochastic diffusion process, sketched below. Similar viewpoints of deep convection and self-organized criticality will also be discussed. With these new conceptual viewpoints, ideas from statistical mechanics could potentially be used to understand uncertainties related to clouds in the climate system and climate predictions. The research of S.N.S. is partially supported by a Sloan Research Fellowship, ONR Young Investigator Award N00014-12-1-0744, and ONR MURI Grant N00014-12-1-0912.
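The abstract does not give the model equations; as a minimal sketch of the kind of idealized stochastic diffusion process described, one might write column water q(x,t) as relaxing toward a mean state with spatial coupling and noise (the authors' exact formulation may differ):

% Sketch only; coefficients b (diffusion), tau (relaxation time), and
% sigma (noise amplitude) are generic, not values from the study.
\begin{equation}
  \mathrm{d}q = \left[\, b\,\nabla^{2} q \;-\; \frac{q - \bar{q}}{\tau} \,\right]\mathrm{d}t
  \;+\; \sigma\,\mathrm{d}W_t
\end{equation}

In this reading, open and closed cells correspond to low- and high-q phases, with POCs as the interface between them.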
plas.io: Open Source, Browser-based WebGL Point Cloud Visualization
NASA Astrophysics Data System (ADS)
Butler, H.; Finnegan, D. C.; Gadomski, P. J.; Verma, U. K.
2014-12-01
Point cloud data, in the form of Light Detection and Ranging (lidar), radar, or semi-global matching (SGM) image processing, are rapidly becoming a foundational data type for quantifying and characterizing geospatial processes. Visualization of these data, due to their overall volume and irregular arrangement, is often difficult. Technological advancements in web browsers, in the form of WebGL and HTML5, have made interactivity and visualization capabilities ubiquitously available that once existed only in desktop software. plas.io is an open source JavaScript application that provides point cloud visualization, exploitation, and compression features in a web-browser platform, reducing reliance on client-based desktop applications. The wide reach of WebGL and browser-based technologies means plas.io's capabilities can be delivered to a diverse list of devices, from phones and tablets to high-end workstations, with very little custom software development. These properties make plas.io an ideal open platform for researchers and software developers to communicate visualizations of complex and rich point cloud data on devices to which everyone has easy access.
Krintz, Chandra
2013-01-01
AppScale is an open source distributed software system that implements a cloud platform as a service (PaaS). AppScale makes cloud applications easy to deploy and scale over disparate cloud fabrics, implementing a set of APIs and architecture that also makes apps portable across the services they employ. AppScale is API-compatible with Google App Engine (GAE) and thus executes GAE applications on-premise or over other cloud infrastructures, without modification. PMID:23828721
CSNS computing environment based on OpenStack
NASA Astrophysics Data System (ADS)
Li, Yakang; Qi, Fazhi; Chen, Gang; Wang, Yanming; Hong, Jianshu
2017-10-01
Cloud computing allows for more flexible configuration of IT resources and optimized hardware utilization, and can provide computing services according to actual need. We are applying this computing mode to the China Spallation Neutron Source (CSNS) computing environment. Firstly, the CSNS experiment and its computing scenarios and requirements are introduced in this paper. Secondly, the design and practice of a cloud computing platform based on OpenStack are demonstrated from the aspects of the cloud computing system framework, network, storage and so on. Thirdly, some improvements we made to OpenStack are discussed further. Finally, the current status of the CSNS cloud computing environment is summarized at the end of this paper.
An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing
NASA Astrophysics Data System (ADS)
Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.
2015-07-01
Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight cloud computing container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts are submitted to the IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of host system resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook. Users can write complex data-processing code directly on the web, so they can design their own data-processing algorithms.
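The per-user container pattern described above can be sketched with the Docker SDK for Python. The image name, host paths, and resource limits are hypothetical placeholders, not the paper's configuration.

import docker

client = docker.from_env()

# Start an isolated analysis container for one user: a notebook server
# with read-only access to shared remote sensing data and a private
# read-write workspace, capped at 4 GB of memory.
container = client.containers.run(
    "example/ipython-gdal:latest",   # hypothetical analysis image
    detach=True,
    ports={"8888/tcp": 8888},        # expose the notebook server
    volumes={
        "/data/public": {"bind": "/data/public", "mode": "ro"},
        "/data/user42": {"bind": "/data/private", "mode": "rw"},
    },
    mem_limit="4g",
)
print(container.id)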
Dynamic VM Provisioning for TORQUE in a Cloud Environment
NASA Astrophysics Data System (ADS)
Zhang, S.; Boland, L.; Coddington, P.; Sevior, M.
2014-06-01
Cloud computing, in the form of Infrastructure-as-a-Service (IaaS), is attracting more interest from the commercial and educational sectors as a way to provide cost-effective computational infrastructure. It is an ideal platform for researchers who must share common resources but need to be able to scale up to massive computational requirements for specific periods of time. This paper presents the tools and techniques developed to allow the open source TORQUE distributed resource manager and Maui cluster scheduler to dynamically integrate OpenStack cloud resources into existing high-throughput computing clusters.
The Integration of CloudStack and OCCI/OpenNebula with DIRAC
NASA Astrophysics Data System (ADS)
Méndez Muñoz, Víctor; Fernández Albor, Víctor; Graciani Diaz, Ricardo; Casajús Ramo, Adriàn; Fernández Pena, Tomás; Merino Arévalo, Gonzalo; José Saborido Silva, Juan
2012-12-01
The increasing availability of cloud resources is emerging as a realistic alternative to the Grid as a paradigm for enabling scientific communities to access large distributed computing resources. The DIRAC framework for distributed computing provides an easy way to efficiently access resources from both systems. This paper explains the integration of DIRAC with two open-source cloud managers: OpenNebula (taking advantage of the OCCI standard) and CloudStack. These are computing tools to manage the complexity and heterogeneity of distributed data center infrastructures, allowing virtual clusters to be created on demand across public, private and hybrid clouds. This approach required developing an extension to the previous DIRAC Virtual Machine engine, which was developed for Amazon EC2, allowing connection with these new cloud managers. In the OpenNebula case, the development has been based on the CernVM Virtual Software Appliance with appropriate contextualization, while in the case of CloudStack, the infrastructure has been kept more general, permitting other virtual machine sources and operating systems to be used. In both cases, the CernVM File System has been used to facilitate software distribution to the computing nodes. With the resulting infrastructure, the cloud resources are transparent to the users through a friendly interface, such as the DIRAC Web Portal. The main purpose of this integration is to obtain a system that can manage cloud and grid resources at the same time. This particular feature pushes DIRAC towards a new conceptual denomination as interware, integrating different middleware. Users from different communities do not need to care about the installation of the standard software that is available at the nodes, nor about the operating system of the host machine, which is transparent to the user. This paper presents an analysis of the overhead of the virtual layer, with tests comparing the proposed approach with the existing Grid solution.
Delivering Unidata Technology via the Cloud
NASA Astrophysics Data System (ADS)
Fisher, Ward; Oxelson Ganter, Jennifer
2016-04-01
Over the last two years, Docker has emerged as the clear leader in open-source containerization. Containerization technology provides a means by which software can be pre-configured and packaged into a single unit, i.e. a container. This container can then be easily deployed either on local or remote systems. Containerization is particularly advantageous when moving software into the cloud, as it simplifies the process. Unidata is adopting containerization as part of our commitment to migrate our technologies to the cloud. We are using a two-pronged approach in this endeavor. In addition to migrating our data-portal services to a cloud environment, we are also exploring new and novel ways to use cloud-specific technology to serve our community. This effort has resulted in several new cloud/Docker-specific projects at Unidata: "CloudStream," "CloudIDV," and "CloudControl." CloudStream is a docker-based technology stack for bringing legacy desktop software to new computing environments, without the need to invest significant engineering/development resources. CloudStream helps make it easier to run existing software in a cloud environment via a technology called "Application Streaming." CloudIDV is a CloudStream-based implementation of the Unidata Integrated Data Viewer (IDV). CloudIDV serves as a practical example of application streaming, and demonstrates how traditional software can be easily accessed and controlled via a web browser. Finally, CloudControl is a web-based dashboard which provides administrative controls for running docker-based technologies in the cloud, as well as providing user management. In this work we will give an overview of these three open-source technologies and the value they offer to our community.
Dynamic Extension of a Virtualized Cluster by using Cloud Resources
NASA Astrophysics Data System (ADS)
Oberst, Oliver; Hauth, Thomas; Kernert, David; Riedel, Stephan; Quast, Günter
2012-12-01
The specific requirements concerning the software environment within the HEP community constrain the choice of resource providers for the outsourcing of computing infrastructure. The use of virtualization in HPC clusters and in the context of cloud resources is therefore a subject of recent developments in scientific computing. The dynamic virtualization of worker nodes in common batch systems provided by ViBatch serves each user with a dynamically virtualized subset of worker nodes on a local cluster. Now it can be transparently extended by the use of common open source cloud interfaces like OpenNebula or Eucalyptus, launching a subset of the virtual worker nodes within the cloud. This paper demonstrates how a dynamically virtualized computing cluster is combined with cloud resources by attaching remotely started virtual worker nodes to the local batch system.
NASA Astrophysics Data System (ADS)
Zacharek, M.; Delis, P.; Kedzierski, M.; Fryskowska, A.
2017-05-01
These studies were conducted using a non-metric digital camera and dense image matching algorithms as non-contact methods of creating monument documentation. In order to process the imagery, several open-source software packages and algorithms for generating a dense point cloud from images were executed. In the research, the OSM Bundler, VisualSFM software, and the web application ARC3D were used. Images obtained for each of the investigated objects were processed using those applications, and then dense point clouds and textured 3D models were created. As a result of post-processing, the obtained models were filtered and scaled. The research showed that even using open-source software it is possible to obtain accurate 3D models of structures (with an accuracy of a few centimeters), but for the purpose of documentation and conservation of cultural and historical heritage, such accuracy can be insufficient.
NASA Astrophysics Data System (ADS)
Kearns, E. J.
2017-12-01
NOAA's Big Data Project is conducting an experiment in the collaborative distribution of open government data to non-governmental cloud-based systems. Through Cooperative Research and Development Agreements signed in 2015 between NOAA and Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium, NOAA is distributing open government data to a wide community of potential users. There are a number of significant advantages related to the use of open data on commercial cloud platforms, but through this experiment NOAA is also discovering significant challenges for those stewarding and maintaining NOAA's data resources in support of users in the wider open data ecosystem. Among the challenges that will be discussed are: the need to provide effective interpretation of the data content to enable their use by data scientists from other expert communities; effective maintenance of Collaborators' open data stores through coordinated publication of new data and new versions of older data; the provenance and verification of open data as authentic NOAA-sourced data across multiple management boundaries and analytical tools; and keeping pace with the accelerating expectations of users with regard to improved quality control, data latency, availability, and discoverability. Suggested strategies to address these challenges will also be described.
NASA Astrophysics Data System (ADS)
Yu, Xiaoyuan; Yuan, Jian; Chen, Shi
2013-03-01
Cloud computing is one of the most popular topics in the IT industry and has recently been adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, a private cloud can be implemented in a private network and delivers some of the benefits of cloud computing without its pitfalls. This paper makes a comparison of typical open source platforms through which we can implement a private cloud. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also perform some performance estimation of cloud platform services and develop prototype software as cloud services.
Factors Influencing F/OSS Cloud Computing Software Product Success: A Quantitative Study
ERIC Educational Resources Information Center
Letort, D. Brian
2012-01-01
Cloud Computing introduces a new business operational model that allows an organization to shift information technology consumption from traditional capital expenditure to operational expenditure. This shift introduces challenges from both the adoption and creation vantage. This study evaluates factors that influence Free/Open Source Software…
NASA Astrophysics Data System (ADS)
Angius, S.; Bisegni, C.; Ciuffetti, P.; Di Pirro, G.; Foggetta, L. G.; Galletti, F.; Gargana, R.; Gioscio, E.; Maselli, D.; Mazzitelli, G.; Michelotti, A.; Orrù, R.; Pistoni, M.; Spagnoli, F.; Spigone, D.; Stecchi, A.; Tonto, T.; Tota, M. A.; Catani, L.; Di Giulio, C.; Salina, G.; Buzzi, P.; Checcucci, B.; Lubrano, P.; Piccini, M.; Fattibene, E.; Michelotto, M.; Cavallaro, S. R.; Diana, B. F.; Enrico, F.; Pulvirenti, S.
2016-01-01
This paper presents the !CHAOS open source project, which aims to develop a prototype of a national private cloud computing infrastructure devoted to accelerator control systems and large experiments in High Energy Physics (HEP). The !CHAOS project has been financed by MIUR (the Italian Ministry of Research and Education) and aims to develop a new concept of control system and data acquisition framework by providing, with a high level of abstraction, all the services needed for controlling and managing a large scientific, or non-scientific, infrastructure. A beta version of the !CHAOS infrastructure will be released at the end of December 2015 and will run on private cloud infrastructures based on OpenStack.
Mapping urban green open space in Bontang city using QGIS and cloud computing
NASA Astrophysics Data System (ADS)
Agus, F.; Ramadiani; Silalahi, W.; Armanda, A.; Kusnandar
2018-04-01
Digital mapping techniques are available freely and openly, so map-based application development is easier, faster and cheaper. The rapid development of cloud computing Geographic Information Systems (GIS) means such systems can help meet the community's needs for online geospatial information. The presence of urban Green Open Space (GOS) provides great benefits as an oxygen supplier and carbon-binding agent, and can contribute to the comfort and beauty of city life. This study aims to propose a GIS Cloud Computing (CC) platform application for mapping the GOS of Bontang City. The GIS-CC platform uses freely available, open-source base maps. The research used a survey method to collect GOS data obtained from the Bontang City Government, while application development used Quantum GIS-CC. The results section describes the existing GOS of Bontang City and the design of the GOS mapping application.
Di Tommaso, Paolo; Orobitg, Miquel; Guirado, Fernando; Cores, Fernado; Espinosa, Toni; Notredame, Cedric
2010-08-01
We present the first parallel implementation of the T-Coffee consistency-based multiple aligner. We benchmark it on the Amazon Elastic Compute Cloud (EC2) and show that the parallelization procedure is reasonably effective. We also conclude that for a web server with moderate usage (10K hits/month) the cloud provides a cost-effective alternative to in-house deployment. T-Coffee is a freeware open source package available from http://www.tcoffee.org/homepage.html
A Platform for Scalable Satellite and Geospatial Data Analysis
NASA Astrophysics Data System (ADS)
Beneke, C. M.; Skillman, S.; Warren, M. S.; Kelton, T.; Brumby, S. P.; Chartrand, R.; Mathis, M.
2017-12-01
At Descartes Labs, we use the commercial cloud to run global-scale machine learning applications over satellite imagery. We have processed over 5 Petabytes of public and commercial satellite imagery, including the full Landsat and Sentinel archives. By combining open-source tools with a FUSE-based filesystem for cloud storage, we have enabled a scalable compute platform that has demonstrated reading over 200 GB/s of satellite imagery into cloud compute nodes. In one application, we generated global 15m Landsat-8, 20m Sentinel-1, and 10m Sentinel-2 composites from 15 trillion pixels, using over 10,000 CPUs. We recently created a public open-source Python client library that can be used to query and access preprocessed public satellite imagery from within our platform, and made this platform available to researchers for non-commercial projects. In this session, we will describe how you can use the Descartes Labs Platform for rapid prototyping and scaling of geospatial analyses and demonstrate examples in land cover classification.
Towards Efficient Scientific Data Management Using Cloud Storage
NASA Technical Reports Server (NTRS)
He, Qiming
2013-01-01
A software prototype allows users to back up and restore data to/from both public and private cloud storage, such as Amazon's S3 and NASA's Nebula. Unlike other off-the-shelf tools, this software ensures user data security in the cloud (through encryption) and minimizes users' operating costs by using space- and bandwidth-efficient compression and incremental backup, as sketched below. Parallel data processing utilities have also been developed using massively scalable cloud computing in conjunction with cloud storage. One of the innovations in this software is using modified open source components to work with a private cloud like NASA Nebula. Another innovation is porting the complex backup-to-cloud software to embedded Linux, running on home networking devices, in order to benefit more users.
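The space-saving ideas above, re-uploading only changed files and compressing before transfer, can be sketched as follows. This is an illustrative sketch, not the prototype's code; the upload callable is a hypothetical stand-in for an S3 or Nebula client call, and the encryption step is omitted for brevity.

import gzip
import hashlib
import json
from pathlib import Path

MANIFEST = Path("manifest.json")  # content hashes recorded by the previous run

def sha256(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def incremental_backup(root: Path, upload) -> None:
    old = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    new = {}
    for f in root.rglob("*"):
        if not f.is_file():
            continue
        digest = sha256(f)
        new[str(f)] = digest
        if old.get(str(f)) != digest:                 # new or changed file only
            upload(str(f), gzip.compress(f.read_bytes()))
    MANIFEST.write_text(json.dumps(new))

# Example: "upload" just reports what would be sent and its compressed size.
incremental_backup(Path("documents"), upload=lambda key, blob: print(key, len(blob)))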
Self-Similar Spin Images for Point Cloud Matching
NASA Astrophysics Data System (ADS)
Pulido, Daniel
The rapid growth of Light Detection and Ranging (Lidar) technologies that collect, process, and disseminate 3D point clouds has allowed for increasingly accurate spatial modeling and analysis of the real world. Lidar sensors can generate massive 3D point clouds of a collection area that provide highly detailed spatial and radiometric information. However, a Lidar collection can be expensive and time consuming. Simultaneously, the growth of crowdsourced Web 2.0 data (e.g., Flickr, OpenStreetMap) has provided researchers with a wealth of freely available data sources covering a variety of geographic areas. Crowdsourced data can be of varying quality and density, and since it is typically volunteered rather than collected as part of a dedicated experiment, when and where the data is collected is arbitrary. Integrating these two sources of geoinformation can give researchers the ability to generate products and derive intelligence that mitigate their respective disadvantages and combine their advantages. Therefore, this research addresses the problem of fusing two point clouds from potentially different sources. Specifically, we consider two problems: scale matching and feature matching. Scale matching consists of computing feature metrics of each point cloud and analyzing their distributions to determine scale differences. Feature matching consists of defining local descriptors that are invariant to common dataset distortions (e.g., rotation and translation). Additionally, after matching, the point clouds can be registered and processed further (e.g., change detection). The objective of this research is to develop novel methods to fuse and enhance two point clouds from potentially disparate sources (e.g., Lidar and crowdsourced Web 2.0 datasets). The scope of this research is to investigate both scale and feature matching between two point clouds, with a specific focus on developing a novel local descriptor based on the concept of self-similarity to aid in the scale and feature matching steps. An open problem in fusion is how best to extract features from two point clouds and then perform feature-based matching. The proposed approach for this matching step is to use local self-similarity as an invariant measure to match features; in particular, we combine the concept of local self-similarity with a well-known feature descriptor, Spin Images, and thereby define "Self-Similar Spin Images". This approach is then extended to the case of matching two point clouds in very different coordinate systems (e.g., a geo-referenced Lidar point cloud and a stereo-image-derived point cloud without geo-referencing). Self-Similar Spin Images are again applied to this problem by introducing a "Self-Similar Keyscale" that matches the spatial scales of two point clouds. Another open problem is how best to detect changes in content between two point clouds. A method is proposed that finds changes between two point clouds by analyzing the order statistics of the nearest-neighbor distances between the two clouds, thereby defining the "Nearest Neighbor Order Statistic" method. Note that the well-known Hausdorff distance is a special case, being simply the maximum order statistic. Studying the entire histogram of these nearest-neighbor distances is therefore expected to yield a more robust method for detecting points that are present in one cloud but not the other. The approach is applied at multiple resolutions: changes detected at the coarsest level yield large missing targets, while finer levels yield smaller targets.
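As a rough illustration of the order-statistic idea (not the author's implementation), the Python sketch below computes the full sorted distribution of nearest-neighbor distances from one cloud to another: the directed Hausdorff distance falls out as the last element, and a hypothetical threshold flags candidate change points.

```python
# Hedged sketch of the "Nearest Neighbor Order Statistic" idea; the threshold and
# test data are illustrative assumptions, not values from the dissertation.
import numpy as np
from scipy.spatial import cKDTree

def nn_order_statistics(query_cloud, reference_cloud):
    """Sorted nearest-neighbor distances from every query point to the reference cloud."""
    dists, _ = cKDTree(reference_cloud).query(query_cloud)
    return np.sort(dists)

a = np.random.rand(1000, 3)                              # reference scene
b = np.vstack([a + 0.001 * np.random.randn(1000, 3),     # same scene, slightly perturbed
               np.random.rand(50, 3) + 2.0])             # plus 50 genuinely new points
stats = nn_order_statistics(b, a)
print("directed Hausdorff (maximum order statistic):", stats[-1])
print("fraction of points flagged as changed:", np.mean(stats > 0.05))
```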
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Li, C.; Wang, J.; Cui, C.; He, B.; Fan, D.; Yang, Y.; Chen, J.; Zhang, H.; Yu, C.; Xiao, J.; Wang, C.; Cao, Z.; Fan, Y.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Wang, J.; Yin, S.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Based on the open source software CloudStack, we set up the cloud computing environment for the AstroCloud project. It consists of five distributed nodes across mainland China. Users can use and analyze data in this cloud computing environment. Based on GlusterFS, we built a scalable cloud storage system: each user has a private space, which can be shared among different virtual machines and desktop systems. With this environment, astronomers can easily access astronomical data collected by different telescopes and data centers, and data producers can archive their datasets safely.
How NASA is Building a Petabyte Scale Geospatial Archive in the Cloud
NASA Technical Reports Server (NTRS)
Pilone, Dan; Quinn, Patrick; Jazayeri, Alireza; Baynes, Kathleen; Murphy, Kevin J.
2018-01-01
NASA's Earth Observing System Data and Information System (EOSDIS) is working towards a vision of a cloud-based, highly flexible ingest, archive, management, and distribution system for its ever-growing and evolving data holdings. This free and open source system, Cumulus, is emerging from its prototyping stages and is poised to make a huge impact on how NASA manages and disseminates its Earth science data. This talk outlines the motivation for this work, presents the achievements and hurdles of the past 18 months, and charts a course for the future expansion of Cumulus. We explore not just the technical but also the socio-technical challenges that we face in evolving a system of this magnitude into the cloud. The NASA EOSDIS archive currently holds nearly 30 PB and will grow to over 300 PB in the coming years. We presented progress on this effort at AWS re:Invent and the American Geophysical Union (AGU) Fall Meeting in 2017, and hope to have the opportunity to share with FOSS4G attendees information on the availability of the open sourced software and how NASA intends to make its Earth observing geospatial data freely available to the public in the cloud.
Leveraging Open Standards and Technologies to Enhance Community Access to Earth Science Lidar Data
NASA Astrophysics Data System (ADS)
Crosby, C. J.; Nandigam, V.; Krishnan, S.; Cowart, C.; Baru, C.; Arrowsmith, R.
2011-12-01
Lidar (Light Detection and Ranging) data, collected from space, airborne and terrestrial platforms, have emerged as an invaluable tool for a variety of Earth science applications ranging from ice sheet monitoring to modeling of earth surface processes. However, lidar data present a unique suite of challenges from the perspective of building cyberinfrastructure systems that enable the scientific community to access these valuable research datasets. Lidar data are typically characterized by millions to billions of individual measurements of x,y,z position plus attributes; these "raw" data are also often accompanied by derived raster products and are frequently terabytes in size. As lidar is a relatively new and rapidly evolving data collection technology, relevant open data standards and software projects are immature compared to those for other remote sensing platforms. The NSF-funded OpenTopography Facility project has developed an online lidar data access and processing system that co-locates data with on-demand processing tools to enable users to access both raw point cloud data as well as custom derived products and visualizations. OpenTopography is built on a Service Oriented Architecture (SOA) in which applications and data resources are deployed as standards-compliant (XML and SOAP) Web services with the open source Opal Toolkit. To develop the underlying applications for data access, filtering and conversion, and various processing tasks, OpenTopography has heavily leveraged existing open source software efforts for both lidar and raster data. Operating on the de facto LAS binary point cloud format (maintained by ASPRS), the open source libLAS and LASlib libraries provide OpenTopography's data ingestion, query and translation capabilities. Similarly, raster data manipulation is performed through a suite of services built on the Geospatial Data Abstraction Library (GDAL). OpenTopography has also developed its own algorithm for high-performance gridding of lidar point cloud data, Points2Grid, and has released the code as an open source project. An emerging conversation in which the lidar community and OpenTopography are actively engaged is the need for open, community-supported standards and metadata for both full-waveform and terrestrial (waveform and discrete return) lidar data. Further, given the immature nature of many lidar data archives and limited online access to public domain data, there is an opportunity to develop interoperable data catalogs based on an open standard such as the OGC CSW specification to facilitate discovery of and access to Earth-science-oriented lidar data.
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Technical Reports Server (NTRS)
Pham, Long; Chen, Aijun; Kempler, Steven; Lynnes, Christopher; Theobald, Michael; Asghar, Esfandiari; Campino, Jane; Vollmer, Bruce
2011-01-01
Cloud computing has been implemented in several commercial arenas. The NASA Nebula Cloud Computing platform is an Infrastructure as a Service (IaaS) built in 2008 at NASA Ames Research Center and in 2010 at GSFC. Nebula is an open source cloud platform intended to: (a) help NASA realize significant cost savings through efficient resource utilization, reduced energy consumption, and reduced labor costs; (b) provide an easier way for NASA scientists and researchers to efficiently explore and share large and complex data sets; and (c) allow customers to provision, manage, and decommission computing capabilities on an as-needed basis.
Scalable cloud without dedicated storage
NASA Astrophysics Data System (ADS)
Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.
2015-05-01
We present a prototype of a scalable computing cloud. It is intended to be deployed on a cluster without separate dedicated storage: the dedicated storage is replaced by distributed software storage, and all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources and improves the fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with relatively low initial and maintenance costs. The solution is built from open source components such as OpenStack and Ceph.
NASA Astrophysics Data System (ADS)
Lachat, E.; Landes, T.; Grussenmeyer, P.
2018-05-01
Terrestrial and airborne laser scanning, photogrammetry and, more generally, 3D recording techniques are used in a wide range of applications. After recording several individual 3D datasets known in local systems, one of the first crucial processing steps is the registration of these data into a common reference frame. To perform such a 3D transformation, commercial and open source software as well as programs from the academic community are available. Due to shortcomings in computation transparency and quality assessment in these solutions, it was decided to develop the open source algorithm presented in this paper. It is dedicated to the simultaneous registration of multiple point clouds as well as their georeferencing. The idea is to use this algorithm as a starting point for further implementations, involving the possibility of combining 3D data from different sources. In parallel with the presentation of the global registration methodology that has been employed, the aim of this paper is to compare the results achieved this way with the above-mentioned existing solutions. For this purpose, first results obtained with the proposed algorithm for the global registration of ten laser scanning point clouds are presented. An analysis of the quality criteria delivered by two selected software packages used in this study, together with a reflection on these criteria, completes the comparison of the obtained results. The final aim of this paper is to validate the efficiency of the proposed method through these comparisons.
Hybrid cloud and cluster computing paradigms for life science applications
2010-01-01
Background: Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel, data-intensive applications. However, they have limited applicability to some areas, such as data mining, because MapReduce has poor performance on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source iterative MapReduce system, Twister. Results: Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. Conclusions: The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life sciences applications. Methods: We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments. PMID:21210982
Hybrid cloud and cluster computing paradigms for life science applications.
Qiu, Judy; Ekanayake, Jaliya; Gunarathne, Thilina; Choi, Jong Youl; Bae, Seung-Hee; Li, Hui; Zhang, Bingjing; Wu, Tak-Lon; Ruan, Yang; Ekanayake, Saliya; Hughes, Adam; Fox, Geoffrey
2010-12-21
Clouds and MapReduce have shown themselves to be a broadly useful approach to scientific computing, especially for parallel, data-intensive applications. However, they have limited applicability to some areas, such as data mining, because MapReduce has poor performance on problems with the iterative structure present in the linear algebra that underlies much data analysis. Such problems can be run efficiently on clusters using MPI, leading to a hybrid cloud and cluster environment. This motivates the design and implementation of an open source iterative MapReduce system, Twister. Comparisons of Amazon, Azure, and traditional Linux and Windows environments on common applications have shown encouraging performance and usability in several important non-iterative cases. These are linked to MPI applications for the final stages of the data analysis. Further, we have released the open source Twister iterative MapReduce system and benchmarked it against basic MapReduce (Hadoop) and MPI in information retrieval and life sciences applications. The hybrid cloud (MapReduce) and cluster (MPI) approach offers an attractive production environment, while Twister promises a uniform programming environment for many life sciences applications. We used the commercial clouds Amazon and Azure and the NSF resource FutureGrid to perform detailed comparisons and evaluations of different approaches to data-intensive computing. Several applications were developed in MPI, MapReduce and Twister in these different environments.
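To see why iteration matters here, consider the sketch below: k-means is the textbook iterative case, where every iteration is one map step (point assignment) followed by one reduce step (centroid update). A framework that re-launches jobs and re-reads data each round, as plain Hadoop MapReduce does, pays that cost repeatedly; this is the overhead Twister is designed to avoid. The sketch is a plain-Python stand-in, not Twister's actual API.

```python
# Minimal sketch of the iterative MapReduce pattern, with k-means as the example.
import numpy as np

def kmeans_iterations(points, centroids, n_iter=10):
    for _ in range(n_iter):                               # one MapReduce round per iteration
        # "map": assign each point to its nearest centroid
        d2 = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        labels = d2.argmin(axis=1)
        # "reduce": recompute each centroid from the points assigned to it
        centroids = np.array([points[labels == k].mean(axis=0) if np.any(labels == k)
                              else centroids[k] for k in range(len(centroids))])
    return centroids

pts = np.vstack([np.random.randn(100, 2), np.random.randn(100, 2) + 5.0])
init = pts[np.random.choice(len(pts), 2, replace=False)]
print(kmeans_iterations(pts, init))
```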
Large-scale virtual screening on public cloud resources with Apache Spark.
Capuccini, Marco; Ahmed, Laeeq; Schaal, Wesley; Laure, Erwin; Spjuth, Ola
2017-01-01
Structure-based virtual screening is an in-silico method to screen a target receptor against a virtual molecular library. Applying docking-based screening to large molecular libraries can be computationally expensive; however, it constitutes a trivially parallelizable task. Most of the available parallel implementations are based on the message passing interface, relying on low-failure-rate hardware and fast network connections. Google's MapReduce revolutionized large-scale analysis, enabling the processing of massive datasets on commodity hardware and cloud resources, providing transparent scalability and fault tolerance at the software level. Open source implementations of MapReduce include Apache Hadoop and the more recent Apache Spark. We developed a method to run existing docking-based screening software on distributed cloud resources, utilizing the MapReduce approach. We benchmarked our method, which is implemented in Apache Spark, by docking a publicly available target receptor against approximately 2.2 million compounds. The performance experiments show good parallel efficiency (87%) when running in a public cloud environment. Our method enables parallel structure-based virtual screening on public cloud resources or commodity computer clusters. The degree of scalability that we achieve allows for trying out our method on relatively small libraries first and then scaling to larger libraries. Our implementation is named Spark-VS and it is freely available as open source from GitHub (https://github.com/mcapuccini/spark-vs).
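The embarrassingly parallel shape of the problem can be sketched in a few lines of PySpark; the docking binary, library path, and scoring convention below are placeholders, not Spark-VS internals.

```python
# Hedged PySpark sketch of MapReduce-style docking; 'mydock' is a placeholder for an
# external docking program, and the S3 path is an assumed location of the library.
import subprocess
from pyspark import SparkContext

def dock(smiles):
    # Score one ligand with the external docking tool (placeholder call).
    out = subprocess.run(["mydock", "--ligand", smiles], capture_output=True, text=True)
    return (smiles, float(out.stdout))

sc = SparkContext(appName="docking-sketch")
ligands = sc.textFile("s3://bucket/library.smi")            # one SMILES string per line
best = (ligands.map(dock)                                    # map: dock ligands independently
               .takeOrdered(100, key=lambda kv: kv[1]))      # reduce: keep the 100 best scores
print(best[:5])
```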
Marine CCN Activation: A Battle Between Primary and Secondary Sources
NASA Astrophysics Data System (ADS)
Fossum, K. N.; Ovadnevaite, J.; Ceburnis, D.; Preissler, J.; O'Dowd, C. D. D.
2017-12-01
Low-altitude marine clouds are cooling components of the Earth's radiative budget, and direct measurement of the properties of the cloud-forming particles, called cloud condensation nuclei (CCN), helps modellers reconstruct aerosol-to-cloud-droplet processes, improving climate predictions. In this study, CCN are directly measured (with a CCN counter commercially available from Droplet Measurement Technologies, Inc.), resolving activation efficiency at varying supersaturated conditions. Previous studies show that sub-micron sea salt particulates activate competitively, reducing the cloud peak supersaturation and inhibiting the activation of sulphate particulates into cloud droplets, making chemical composition an important component in determining cloud droplet number concentration (CDNC). This effect, and the sea salt numbers needed to induce it, has not been previously studied long-term in the natural environment. As part of this work, data were analysed from a two-month marine research ship campaign during the Antarctic austral summer of 2015. Ambient aerosol in the Scotia Sea region was sampled continuously, and through the use of multiple in-situ aerosol instruments, this study shows that CCN number in both open-ocean and ice-pack-influenced air masses is largely proportional to secondary aerosol. However, open-ocean air masses show a significant primary aerosol influence which changes the aerosol characteristics. Higher sea salt mass concentrations in the open ocean lead to better CCN activation efficiencies. Coupled with high wind speeds and sea surface turbulence, open-ocean air masses show a suppression of CDNC compared with the theoretical values expected from the sub-cloud aerosol number concentration. This is not seen in the ice-pack air masses. Work is ongoing on a long-term North Atlantic marine aerosol data set, but it would appear that chemical composition plays a large role in aerosol-to-cloud-droplet processes, and can initially restrict CDNC when sea salt is abundant and updraft velocities are relatively low.
Cloud-based Jupyter Notebooks for Water Data Analysis
NASA Astrophysics Data System (ADS)
Castronova, A. M.; Brazil, L.; Seul, M.
2017-12-01
The development and adoption of technologies by the water science community to improve our ability to openly collaborate and share workflows will have a transformative impact on how we address the challenges associated with collaborative and reproducible scientific research. Jupyter notebooks offer one solution by providing an open-source platform for creating metadata-rich toolchains for modeling and data analysis applications. Adoption of this technology within the water sciences, coupled with publicly available datasets from agencies such as USGS, NASA, and EPA, enables researchers to easily prototype and execute data-intensive toolchains. Moreover, implementing this software stack in a cloud-based environment extends its native functionality to provide researchers a mechanism to build and execute toolchains that are too large or computationally demanding for typical desktop computers. Additionally, this cloud-based solution enables scientists to disseminate data processing routines alongside journal publications in an effort to support reproducibility. For example, these data collection and analysis toolchains can be shared, archived, and published using the HydroShare platform or downloaded and executed locally to reproduce scientific analysis. This work presents the design and implementation of a cloud-based Jupyter environment and its application for collecting, aggregating, and munging various datasets in a transparent, sharable, and self-documented manner. The goals of this work are to establish a free and open source platform for domain scientists to (1) conduct data-intensive and computationally intensive collaborative research, (2) utilize high performance libraries, models, and routines within a pre-configured cloud environment, and (3) enable dissemination of research products. This presentation will discuss recent efforts towards achieving these goals, and describe the architectural design of the notebook server in an effort to support collaborative and reproducible science.
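As a flavor of the kind of self-documenting toolchain such a notebook hosts, here is a minimal sketch that pulls public USGS streamflow data into pandas; the site number and parameter code are illustrative choices.

```python
# Notebook-style sketch: fetch 7 days of discharge data from the USGS instantaneous
# values service into a pandas DataFrame. Site and parameter are example values.
import requests
import pandas as pd

params = {"format": "json",
          "sites": "01646500",       # example gauge (Potomac River near Washington, DC)
          "parameterCd": "00060",    # discharge, cubic feet per second
          "period": "P7D"}           # the last seven days
resp = requests.get("https://waterservices.usgs.gov/nwis/iv/", params=params, timeout=30)
values = resp.json()["value"]["timeSeries"][0]["values"][0]["value"]
df = pd.DataFrame(values)[["dateTime", "value"]].astype({"value": float})
print(df.describe())
```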
NASA Update for Unidata Stratcomm
NASA Technical Reports Server (NTRS)
Lynnes, Chris
2017-01-01
The NASA representative to the Unidata Strategic Committee presented a semiannual update on NASA's work with and use of Unidata technologies. The talk updated Unidata on the program of cloud computing prototypes underway for the Earth Observing System Data and Information System (EOSDIS). Also discussed was a trade study on the use of the Open-source Project for a Network Data Access Protocol (OPeNDAP) with Web Object Storage in the cloud.
Neinstein, Aaron; Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh
2016-03-01
Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application ("app"), Blip, to visualize the data. Tidepool's software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool's open source, cloud model for health data interoperability is applicable to other healthcare use cases.
Wong, Jenise; Look, Howard; Arbiter, Brandon; Quirk, Kent; McCanne, Steve; Sun, Yao; Blum, Michael; Adi, Saleh
2016-01-01
Objective: Develop a device-agnostic cloud platform to host diabetes device data and catalyze an ecosystem of software innovation for type 1 diabetes (T1D) management. Materials and Methods: An interdisciplinary team decided to establish a nonprofit company, Tidepool, and build open-source software. Results: Through a user-centered design process, the authors created a software platform, the Tidepool Platform, to upload and host T1D device data in an integrated, device-agnostic fashion, as well as an application (“app”), Blip, to visualize the data. Tidepool’s software utilizes the principles of modular components, modern web design including REST APIs and JavaScript, cloud computing, agile development methodology, and robust privacy and security. Discussion: By consolidating the currently scattered and siloed T1D device data ecosystem into one open platform, Tidepool can improve access to the data and enable new possibilities and efficiencies in T1D clinical care and research. The Tidepool Platform decouples diabetes apps from diabetes devices, allowing software developers to build innovative apps without requiring them to design a unique back-end (e.g., database and security) or unique ways of ingesting device data. It allows people with T1D to choose to use any preferred app regardless of which device(s) they use. Conclusion: The authors believe that the Tidepool Platform can solve two current problems in the T1D device landscape: 1) limited access to T1D device data and 2) poor interoperability of data from different devices. If proven effective, Tidepool’s open source, cloud model for health data interoperability is applicable to other healthcare use cases. PMID:26338218
NASA Astrophysics Data System (ADS)
McCoy, Isabel L.; Wood, Robert; Fletcher, Jennifer K.
2017-11-01
Mesoscale cellular convective (MCC) clouds occur in large-scale patterns over the ocean and have important radiative effects on the climate system. An examination of time-varying meteorological conditions associated with satellite-observed open and closed MCC clouds is conducted to illustrate the influence of large-scale meteorological conditions. Marine cold air outbreaks (MCAO) influence the development of open MCC clouds and the transition from closed to open MCC clouds. MCC neural network classifications on Moderate Resolution Imaging Spectroradiometer (MODIS) data for 2008 are collocated with Clouds and the Earth's Radiant Energy System (CERES) data and ERA-Interim reanalysis to determine the radiative effects of MCC clouds and their thermodynamic environments. Closed MCC clouds are found to have much higher albedo on average than open MCC clouds for the same cloud fraction. Three meteorological control metrics are tested: sea-air temperature difference (ΔT), estimated inversion strength (EIS), and a MCAO index (M). These predictive metrics illustrate the importance of atmospheric surface forcing and static stability for open and closed MCC cloud formation. Predictive sigmoidal relations are found between M and MCC cloud frequency globally and regionally: negative for closed MCC cloud and positive for open MCC cloud. The open MCC cloud seasonal cycle is well correlated with M, while the seasonality of closed MCC clouds is well correlated with M in the midlatitudes and EIS in the tropics and subtropics. M is found to best distinguish open and closed MCC clouds on average over shorter time scales. The possibility of a MCC cloud feedback is discussed.
Arctic sea ice melt leads to atmospheric new particle formation.
Dall Osto, M; Beddows, D C S; Tunved, P; Krejci, R; Ström, J; Hansson, H-C; Yoon, Y J; Park, Ki-Tae; Becagli, S; Udisti, R; Onasch, T; O Dowd, C D; Simó, R; Harrison, Roy M
2017-06-12
Atmospheric new particle formation (NPF) and growth significantly influences climate by supplying new seeds for cloud condensation and brightness. Currently, there is a lack of understanding of whether and how marine biota emissions affect aerosol-cloud-climate interactions in the Arctic. Here, the aerosol population was categorised via cluster analysis of aerosol size distributions taken at Mt Zeppelin (Svalbard) over an 11-year record. The daily temporal occurrence of NPF events likely caused by nucleation in the polar marine boundary layer was quantified annually as 18%, with a peak of 51% during summer months. Air mass trajectory analysis and atmospheric nitrogen and sulphur tracers link these frequent nucleation events to biogenic precursors released by open water and melting sea ice regions. The occurrence of such events across a full decade was anti-correlated with sea ice extent. New particles originating from open water and open pack ice increased the cloud condensation nuclei concentration background by at least ca. 20%, supporting a marine biosphere-climate link through sea ice melt and low-altitude clouds that may have contributed to accelerating Arctic warming. Our results prompt a better representation of biogenic aerosol sources in Arctic climate models.
Managing a tier-2 computer centre with a private cloud infrastructure
NASA Astrophysics Data System (ADS)
Bagnasco, Stefano; Berzano, Dario; Brunetti, Riccardo; Lusso, Stefano; Vallero, Sara
2014-06-01
In a typical scientific computing centre, several applications coexist and share a single physical infrastructure. An underlying Private Cloud infrastructure eases the management and maintenance of such heterogeneous applications (such as multipurpose or application-specific batch farms, Grid sites, interactive data analysis facilities and others), allowing dynamic allocation of resources to any application. Furthermore, the maintenance of large deployments of complex and rapidly evolving middleware and application software is eased by the use of virtual images and contextualization techniques. Such infrastructures are being deployed in some large centres (see e.g. the CERN Agile Infrastructure project), but with several open-source tools reaching maturity this is becoming viable for smaller sites as well. In this contribution we describe the Private Cloud infrastructure at the INFN-Torino Computer Centre, which hosts a full-fledged WLCG Tier-2 centre, an Interactive Analysis Facility for the ALICE experiment at the CERN LHC and several smaller scientific computing applications. The private cloud building blocks include the OpenNebula software stack, the GlusterFS filesystem and the OpenWRT Linux distribution (used for network virtualization); a future integration into a federated higher-level infrastructure is made possible by exposing commonly used APIs like EC2 and OCCI.
Development of a cloud-based Bioinformatics Training Platform.
Revote, Jerico; Watson-Haigh, Nathan S; Quenette, Steve; Bethwaite, Blair; McGrath, Annette; Shang, Catherine A
2017-05-01
The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP.
Development of a cloud-based Bioinformatics Training Platform
Revote, Jerico; Watson-Haigh, Nathan S.; Quenette, Steve; Bethwaite, Blair; McGrath, Annette
2017-01-01
The Bioinformatics Training Platform (BTP) has been developed to provide access to the computational infrastructure required to deliver sophisticated hands-on bioinformatics training courses. The BTP is a cloud-based solution that is in active use for delivering next-generation sequencing training to Australian researchers at geographically dispersed locations. The BTP was built to provide an easy, accessible, consistent and cost-effective approach to delivering workshops at host universities and organizations with a high demand for bioinformatics training but lacking the dedicated bioinformatics training suites required. To support broad uptake of the BTP, the platform has been made compatible with multiple cloud infrastructures. The BTP is an open-source and open-access resource. To date, 20 training workshops have been delivered to over 700 trainees at over 10 venues across Australia using the BTP. PMID:27084333
Identification of Program Signatures from Cloud Computing System Telemetry Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nichols, Nicole M.; Greaves, Mark T.; Smith, William P.
Malicious cloud computing activity can take many forms, including running unauthorized programs in a virtual environment. Detection of these malicious activities while preserving the privacy of the user is an important research challenge. Prior work has shown the potential viability of using cloud service billing metrics as a mechanism for proxy identification of malicious programs. Previously this novel detection method has been evaluated in a synthetic and isolated computational environment. In this paper we demonstrate the ability of billing metrics to identify programs in an active cloud computing environment, including multiple virtual machines running on the same hypervisor. The open source cloud computing platform OpenStack is used for private cloud management at Pacific Northwest National Laboratory. OpenStack provides a billing tool (Ceilometer) to collect system telemetry measurements. We identify four different programs running on four virtual machines under the same cloud user account. Programs were identified with up to 95% accuracy. This accuracy is dependent on the distinctiveness of telemetry measurements for the specific programs we tested. Future work will examine the scalability of this approach for a larger selection of programs to better understand the uniqueness needed to identify a program. Additionally, future work should address the separation of signatures when multiple programs are running on the same virtual machine.
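The classification step might look roughly like the sketch below, which trains an off-the-shelf classifier on per-interval telemetry features (e.g. CPU time, disk I/O, and network bytes drawn from Ceilometer meters); the feature layout and synthetic data are assumptions, not the study's pipeline.

```python
# Hedged sketch: identify which program produced a telemetry sample. The synthetic
# feature matrix stands in for Ceilometer measurements of known workloads.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 3))            # assumed columns: cpu_time, disk_io, net_bytes
y = rng.integers(0, 4, size=400)         # labels for four candidate programs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))   # near chance here: the data is random
```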
A computational- and storage-cloud for integration of biodiversity collections
Matsunaga, A.; Thompson, A.; Figueiredo, R. J.; Germain-Aubrey, C. C.; Collins, M.; Beeman, R. S.; Macfadden, B. J.; Riccardi, G.; Soltis, P. S.; Page, L. M.; Fortes, J. A. B.
2013-01-01
A core mission of the Integrated Digitized Biocollections (iDigBio) project is the building and deployment of a cloud computing environment customized to support the digitization workflow and integration of data from all U.S. nonfederal biocollections. iDigBio chose to use cloud computing technologies to deliver a cyberinfrastructure that is flexible, agile, resilient, and scalable to meet the needs of the biodiversity community. In this context, this paper describes the integration of open source cloud middleware, applications, and third party services using standard formats, protocols, and services. In addition, this paper demonstrates the value of the digitized information from collections in a broader scenario involving multiple disciplines.
GATECloud.net: a platform for large-scale, open-source text processing on the cloud.
Tablan, Valentin; Roberts, Ian; Cunningham, Hamish; Bontcheva, Kalina
2013-01-28
Cloud computing is increasingly being regarded as a key enabler of the 'democratization of science', because on-demand, highly scalable cloud computing facilities enable researchers anywhere to carry out data-intensive experiments. In the context of natural language processing (NLP), algorithms tend to be complex, which makes their parallelization and deployment on cloud platforms a non-trivial task. This study presents a new, unique, cloud-based platform for large-scale NLP research: GATECloud.net. It enables researchers to carry out data-intensive NLP experiments by harnessing the vast, on-demand compute power of the Amazon cloud. Important infrastructural issues are dealt with by the platform, completely transparently for the researcher: load balancing, efficient data upload and storage, deployment on the virtual machines, security and fault tolerance. We also include a cost-benefit analysis and usage evaluation.
Cloudy Skies over AGN: Observations with Simbol-X
NASA Astrophysics Data System (ADS)
Salvati, M.; Risaliti, G.
2009-05-01
Recent time-resolved spectroscopic X-ray studies of bright obscured AGN show that column density variability on time scales of hours/days may be common, at least for sources with NH > 10^23 cm^-2. This opens new opportunities in the analysis of the structure of the circumnuclear medium and of the X-ray source: resolving the variations due to single clouds covering/uncovering the X-ray source provides tight constraints on the source size, the clouds' size and distance, and their average number, density and column density. We show how Simbol-X will provide a breakthrough in this field, thanks to its broad band coverage, allowing (1) to precisely disentangle the continuum and NH variations, and (2) to extend the NH variability analysis to column densities >10^23 cm^-2.
Cloudweaver: Adaptive and Data-Driven Workload Manager for Generic Clouds
NASA Astrophysics Data System (ADS)
Li, Rui; Chen, Lei; Li, Wen-Syan
Cloud computing denotes the latest trend in application development for parallel computing on massive data volumes. It relies on clouds of servers to handle tasks that used to be managed by an individual server. With cloud computing, software vendors can provide business intelligence and data analytic services for internet-scale data sets. Many open source projects, such as Hadoop, offer various software components that are essential for building a cloud infrastructure. Hadoop (like many other frameworks) currently requires users to configure the cloud infrastructure via programs and APIs, and such configuration is fixed during runtime. In this chapter, we propose a workload manager (WLM), called CloudWeaver, which provides automated configuration of a cloud infrastructure for runtime execution. The workload management is data-driven and can adapt to the dynamic nature of operator throughput during different execution phases. CloudWeaver works for a single job and for a workload consisting of multiple jobs running concurrently, and aims at maximum throughput using a minimum set of processors.
NASA Astrophysics Data System (ADS)
López García, Álvaro; Fernández del Castillo, Enol; Orviz Fernández, Pablo
In this document we present an implementation of the Open Grid Forum's Open Cloud Computing Interface (OCCI) for OpenStack, namely ooi (Openstack occi interface, 2015) [1]. OCCI is an open standard for management tasks over cloud resources, focused on interoperability, portability and integration. ooi aims to implement this open interface for the OpenStack cloud middleware, promoting interoperability with other OCCI-enabled cloud management frameworks and infrastructures. ooi focuses on being non-invasive with a vanilla OpenStack installation, not tied to a particular OpenStack release version.
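Since OCCI is rendered over plain HTTP, a client needs nothing beyond standard headers. The sketch below issues a compute-creation request in the OCCI text rendering; the endpoint URL, port, and token are illustrative assumptions rather than a documented ooi deployment.

```python
# Hedged sketch of an OCCI compute-creation request against an assumed ooi endpoint.
import requests

headers = {
    "Content-Type": "text/occi",
    "X-Auth-Token": "REPLACE_WITH_KEYSTONE_TOKEN",  # assumed Keystone-issued token
    "Category": 'compute; scheme="http://schemas.ogf.org/occi/infrastructure#"; class="kind"',
    "X-OCCI-Attribute": 'occi.core.title="test-vm"',
}
resp = requests.post("https://cloud.example.org:8787/occi1.1/compute/", headers=headers)
print(resp.status_code, resp.headers.get("Location"))  # Location points at the new resource
```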
A Cloud-Based Infrastructure for Near-Real-Time Processing and Dissemination of NPP Data
NASA Astrophysics Data System (ADS)
Evans, J. D.; Valente, E. G.; Chettri, S. S.
2011-12-01
We are building a scalable cloud-based infrastructure for generating and disseminating near-real-time data products from a variety of geospatial and meteorological data sources, including the new National Polar-orbiting Operational Environmental Satellite System (NPOESS) Preparatory Project (NPP). Our approach relies on linking Direct Broadcast and other data streams to a suite of scientific algorithms coordinated by NASA's International Polar-Orbiter Processing Package (IPOPP). The resulting data products are directly accessible to a wide variety of end-user applications, via industry-standard protocols such as OGC Web Services, Unidata Local Data Manager, or OPeNDAP, using open source software components. The processing chain employs on-demand computing resources from Amazon.com's Elastic Compute Cloud and NASA's Nebula cloud services. Our current prototype targets short-term weather forecasting, in collaboration with NASA's Short-term Prediction Research and Transition (SPoRT) program and the National Weather Service. Direct Broadcast is especially crucial for NPP, whose current ground segment is unlikely to deliver data quickly enough for short-term weather forecasters and other near-real-time users. Direct Broadcast also allows full local control over data handling, from the receiving antenna to end-user applications: this provides opportunities to streamline processes for data ingest, processing, and dissemination, and thus to make interpreted data products (Environmental Data Records) available to practitioners within minutes of data capture at the sensor. Cloud computing lets us grow and shrink computing resources to meet large and rapid fluctuations in data availability (twice daily for polar orbiters) and similarly large fluctuations in demand from our target (near-real-time) users. This offers a compelling business case for cloud computing: the processing or dissemination systems can grow arbitrarily large to sustain near-real-time data access despite surges in data volumes or user demand, while that computing capacity (and its hourly costs) can be dropped almost instantly once the surge passes. Cloud computing also allows low-risk experimentation with a variety of machine architectures (processor types; bandwidth, memory, and storage capacities, etc.) and of system configurations (including massively parallel computing patterns). Finally, our service-based approach (in which user applications invoke software processes on a Web-accessible server) facilitates access to datasets of arbitrary size and resolution, and allows users to request and receive tailored products on demand. To maximize the usefulness and impact of our technology, we have emphasized open, industry-standard software interfaces. We are also using and developing open source software to facilitate the widespread adoption of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sources.
NASA Astrophysics Data System (ADS)
Balbin, Jessie R.; Pinugu, Jasmine Nadja J.; Basco, Abigail Joy S.; Cabanada, Myla B.; Gonzales, Patrisha Melrose V.; Marasigan, Juan Carlos C.
2017-06-01
The research aims to build a tool for assessing patients for post-traumatic stress disorder (PTSD). The parameters used are heart rate, skin conductivity, and facial gestures. Facial gestures are recorded using OpenFace, an open-source face recognition program that uses facial action units to track facial movements. Heart rate and skin conductivity are measured through sensors operated by a Raspberry Pi. Results are stored in a database for easy and quick access, and the database is uploaded to a cloud platform so that doctors have direct access to the data. This research aims to analyze these parameters and give an accurate assessment of the patient.
Cloud based, Open Source Software Application for Mitigating Herbicide Drift
NASA Astrophysics Data System (ADS)
Saraswat, D.; Scott, B.
2014-12-01
The spread of herbicide-resistant weeds has resulted in the need for clearly marked fields. In response to this need, the University of Arkansas Cooperative Extension Service launched a program named Flag the Technology in 2011. This program uses color-coded flags as a visual alert of the herbicide trait technology within a farm field. The flag-based program also serves to help avoid herbicide misapplication and prevent herbicide drift damage between fields with differing crop technologies. This program has been endorsed by the Southern Weed Science Society of America and is attracting interest from across the USA, Canada, and Australia. However, flags are at risk of being misplaced through mischief or lost to severe windstorms and thunderstorms. This presentation will discuss the design and development of Flag the Technology Cloud (FTTCloud), a free, cloud-based application built with open-source technologies that allows agricultural stakeholders to color-code their farm fields to indicate herbicide-resistant technologies. The developed software utilizes modern web development practices, widely used design technologies, and basic geographic information system (GIS) based interactive interfaces for representing, color-coding, searching, and visualizing fields. The program has also been made compatible with devices of different sizes: smartphones, tablets, desktops, and laptops.
MarDRe: efficient MapReduce-based removal of duplicate DNA reads in the cloud.
Expósito, Roberto R; Veiga, Jorge; González-Domínguez, Jorge; Touriño, Juan
2017-09-01
This article presents MarDRe, a de novo cloud-ready duplicate and near-duplicate removal tool that can process single- and paired-end reads from FASTQ/FASTA datasets. MarDRe takes advantage of the widely adopted MapReduce programming model to fully exploit Big Data technologies on cloud-based infrastructures. Written in Java to maximize cross-platform compatibility, MarDRe is built upon the open-source Apache Hadoop project, the most popular distributed computing framework for scalable Big Data processing. On a 16-node cluster deployed on the Amazon EC2 cloud platform, MarDRe is up to 8.52 times faster than a representative state-of-the-art tool. Source code in Java and Hadoop, as well as a user's guide, are freely available under the GNU GPLv3 license at http://mardre.des.udc.es.
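The core idea, keying reads by (a prefix of) their sequence and keeping one representative per key, can be shown in a few lines of single-node Python; this only illustrates the MapReduce shape of the problem, since MarDRe itself is a Java/Hadoop implementation.

```python
# Single-node sketch of MapReduce-style duplicate-read removal; prefix length and
# the quality-based tie-break are illustrative choices, not MarDRe's exact rules.
from collections import defaultdict

def dedup_reads(records, prefix_len=20):
    """records: iterable of (read_id, sequence, quality) tuples."""
    buckets = defaultdict(list)
    for rec in records:                          # "map": key each read by a sequence prefix
        buckets[rec[1][:prefix_len]].append(rec)
    for group in buckets.values():               # "reduce": keep the best-quality read
        yield max(group, key=lambda r: sum(map(ord, r[2])))

reads = [("r1", "ACGTACGTACGTACGTACGTA", "IIIIIIIIIIIIIIIIIIIII"),
         ("r2", "ACGTACGTACGTACGTACGTA", "!!!!!!!!!!!!!!!!!!!!!")]
print([r[0] for r in dedup_reads(reads)])        # -> ['r1'] (the higher-quality copy)
```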
Sector and Sphere: the design and implementation of a high-performance data cloud
Gu, Yunhong; Grossman, Robert L.
2009-01-01
Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming model and infrastructure. In this paper, we describe the design and implementation of the Sector storage cloud and the Sphere compute cloud. By contrast with the existing storage and compute clouds, Sector can manage data not only within a data centre, but also across geographically distributed data centres. Similarly, the Sphere compute cloud supports user-defined functions (UDFs) over data both within and across data centres. As a special case, MapReduce-style programming can be implemented in Sphere by using a Map UDF followed by a Reduce UDF. We describe some experimental studies comparing Sector/Sphere and Hadoop using the Terasort benchmark. In these studies, Sector is approximately twice as fast as Hadoop. Sector/Sphere is open source. PMID:19451100
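The Map-UDF-then-Reduce-UDF composition described above can be sketched generically as follows; this is a plain-Python word count illustrating the shape of the computation, while Sphere's real UDFs are applied to data segments in parallel across data centres.

```python
# Conceptual sketch: MapReduce expressed as a Map UDF chained into a Reduce UDF.
from collections import defaultdict

def map_udf(record):                     # emit (word, 1) for every word in the record
    return [(word, 1) for word in record.split()]

def reduce_udf(key, values):             # combine all counts emitted for one key
    return key, sum(values)

def run(records):
    groups = defaultdict(list)
    for rec in records:
        for k, v in map_udf(rec):        # apply the Map UDF to every record
            groups[k].append(v)
    return [reduce_udf(k, vs) for k, vs in groups.items()]   # then the Reduce UDF

print(run(["open source cloud", "cloud storage cloud"]))
```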
A Simple Technique for Securing Data at Rest Stored in a Computing Cloud
NASA Astrophysics Data System (ADS)
Sedayao, Jeff; Su, Steven; Ma, Xiaohao; Jiang, Minghao; Miao, Kai
"Cloud Computing" offers many potential benefits, including cost savings, the ability to deploy applications and services quickly, and the ease of scaling those application and services once they are deployed. A key barrier for enterprise adoption is the confidentiality of data stored on Cloud Computing Infrastructure. Our simple technique implemented with Open Source software solves this problem by using public key encryption to render stored data at rest unreadable by unauthorized personnel, including system administrators of the cloud computing service on which the data is stored. We validate our approach on a network measurement system implemented on PlanetLab. We then use it on a service where confidentiality is critical - a scanning application that validates external firewall implementations.
RBioCloud: A Light-Weight Framework for Bioconductor and R-based Jobs on the Cloud.
Varghese, Blesson; Patel, Ishan; Barker, Adam
2015-01-01
Large-scale ad hoc analytics of genomic data is popular using the R programming language, supported by over 700 software packages provided by Bioconductor. More recently, analytical jobs are benefitting from on-demand computing and storage, their scalability and their low maintenance cost, all of which are offered by the cloud. While biologists and bioinformaticists can take an analytical job and execute it on their personal workstations, it remains challenging to seamlessly execute the job on the cloud infrastructure without extensive knowledge of the cloud dashboard. This paper explores how analytical jobs can be executed on the cloud with minimum effort, and how both the resources and the data required by the job can be managed. An open-source, light-weight framework for executing R scripts using Bioconductor packages, referred to as 'RBioCloud', is designed and developed. RBioCloud offers a set of simple command-line tools for managing the cloud resources, the data and the execution of the job. Three biological test cases validate the feasibility of RBioCloud. The framework is available from http://www.rbiocloud.com.
Cloud Based Metalearning System for Predictive Modeling of Biomedical Data
Vukićević, Milan
2014-01-01
Rapid growth and storage of biomedical data has enabled many opportunities for predictive modeling and improvement of healthcare processes. On the other hand, the analysis of such large amounts of data is a difficult and computationally intensive task for most existing data mining algorithms. This problem is addressed by proposing a cloud-based system that integrates a metalearning framework for ranking and selection of the best predictive algorithms for the data at hand with open source big data technologies for the analysis of biomedical data. PMID:24892101
Implementation and use of a highly available and innovative IaaS solution: the Cloud Area Padovana
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Andreetto, P.; Bertocco, S.; Biasotto, M.; Dal Pra, S.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Frizziero, E.; Gulmini, M.; Michelotto, M.; Sgaravatto, M.; Traldi, S.; Venaruzzo, M.; Verlato, M.; Zangrando, L.
2015-12-01
While in the business world the cloud paradigm is typically implemented by purchasing resources and services from third-party providers (e.g. Amazon), in the scientific environment there is usually a need for on-premises IaaS infrastructures which allow efficient usage of the hardware distributed among (and owned by) different scientific administrative domains. In addition, the requirement of open source adoption has led to the choice of products like OpenStack by many organizations. We describe a use case of the Italian National Institute for Nuclear Physics (INFN) which resulted in the implementation of a unique cloud service, called 'Cloud Area Padovana', which encompasses resources spread over two different sites: the INFN Legnaro National Laboratories and the INFN Padova division. We describe how this IaaS has been implemented, which technologies have been adopted, and how services have been configured in high-availability (HA) mode. We also discuss how identity and authorization management were implemented, adopting a widely accepted standard architecture based on SAML2 and OpenID: by leveraging the versatility of those standards, the integration with authentication federations like IDEM was implemented. We also discuss some other innovative developments, such as a pluggable scheduler, implemented as an extension of the native OpenStack scheduler, which allows the allocation of resources according to a fair-share-based model and provides a persistent queuing mechanism for handling user requests that cannot be immediately served. Tools, technologies, and procedures used to install, configure, monitor, and operate this cloud service are also discussed. Finally, we present some examples that show how this IaaS infrastructure is being used.
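A toy version of a fair-share priority rule of the kind such a scheduler applies might look like the following; the decay formula and half-life are illustrative assumptions, not the INFN implementation.

```python
# Toy fair-share sketch: projects with a large share and little recent usage go first.
import heapq
import time

HALF_LIFE = 7 * 24 * 3600  # assumed one-week decay of historical usage

def priority(share, usage_history, now=None):
    """share: fraction of the cluster granted; usage_history: (timestamp, core_seconds)."""
    now = now or time.time()
    decayed = sum(u * 0.5 ** ((now - t) / HALF_LIFE) for t, u in usage_history)
    return share / (1.0 + decayed)       # higher priority is scheduled sooner

# Persistent-queue sketch: pending requests are popped in priority order.
queue = [(-priority(0.5, [(time.time() - 86400, 3.6e6)]), "alice-vm"),
         (-priority(0.2, []), "bob-vm")]
heapq.heapify(queue)
print(heapq.heappop(queue)[1])           # bob-vm: smaller share but no recent usage
```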
Development of an open-source cloud-connected sensor-monitoring platform
USDA-ARS?s Scientific Manuscript database
Rapid advances in electronics and communications technologies offer continuously evolving options for sensing and awareness of the physical environment. Many of these advances are becoming increasingly available to “non-professionals,” that is, those without formal training or expertise in discipli...
Digital Textbooks. Research Brief
ERIC Educational Resources Information Center
Johnston, Howard
2011-01-01
Despite their growing popularity, digital alternatives to conventional textbooks are stirring up controversy. With the introduction of tablet computers, and the growing trend toward "cloud computing" and "open source" software, the trend is accelerating because costs are coming down and free or inexpensive materials are becoming more available.…
Trends and New Directions in Software Architecture
2014-10-10
(Slide-deck fragments; recoverable topics include: frameworks, open source, cloud strategies, NoSQL, machine learning, MDD, incremental approaches, dashboards, distributed development, and current research on Lightweight Evaluation and Architecture Prototyping for Big Data.)
Halligan, Brian D.; Geiger, Joey F.; Vallejos, Andrew K.; Greene, Andrew S.; Twigger, Simon N.
2009-01-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters, as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases, on the Medical College of Wisconsin Proteomics Center website (http://proteomics.mcw.edu/vipdac). PMID:19358578
NASA Astrophysics Data System (ADS)
Kempton, Eliza M.-R.; Lupu, Roxana; Owusu-Asare, Albert; Slough, Patrick; Cale, Bryson
2017-04-01
We present Exo-Transmit, a software package to calculate exoplanet transmission spectra for planets of varied composition. The code is designed to generate spectra of planets with a wide range of atmospheric composition, temperature, surface gravity, and size, and is therefore applicable to exoplanets ranging in mass and size from hot Jupiters down to rocky super-Earths. Spectra can be generated with or without clouds or hazes with options to (1) include an optically thick cloud deck at a user-specified atmospheric pressure or (2) to augment the nominal Rayleigh scattering by a user-specified factor. The Exo-Transmit code is written in C and is extremely easy to use. Typically the user will only need to edit parameters in a single user input file in order to run the code for a planet of their choosing. Exo-Transmit is available publicly on Github with open-source licensing at https://github.com/elizakempton/Exo_Transmit.
Halligan, Brian D; Geiger, Joey F; Vallejos, Andrew K; Greene, Andrew S; Twigger, Simon N
2009-06-01
One of the major difficulties for many laboratories setting up proteomics programs has been obtaining and maintaining the computational infrastructure required for the analysis of the large flow of proteomics data. We describe a system that combines distributed cloud computing and open source software to allow laboratories to set up scalable virtual proteomics analysis clusters without the investment in computational hardware or software licensing fees. Additionally, the pricing structure of distributed computing providers, such as Amazon Web Services, allows laboratories or even individuals to have large-scale computational resources at their disposal at a very low cost per run. We provide detailed step-by-step instructions on how to implement the virtual proteomics analysis clusters, as well as a list of currently available preconfigured Amazon machine images containing the OMSSA and X!Tandem search algorithms and sequence databases, on the Medical College of Wisconsin Proteomics Center Web site (http://proteomics.mcw.edu/vipdac).
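In present-day terms, launching workers from such a preconfigured machine image might look like the boto3 sketch below; the AMI ID, instance type, and key name are placeholders rather than the images listed on the ViPDAC site.

```python
# Hedged sketch: start a small cluster of identical workers from a preconfigured AMI.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder for a preconfigured analysis image
    InstanceType="c5.4xlarge",         # illustrative compute-heavy instance type
    KeyName="my-keypair",              # assumed existing EC2 key pair
    MinCount=1,
    MaxCount=4,                        # up to four identical search workers
)
print([inst["InstanceId"] for inst in resp["Instances"]])
```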
Enabling a systems biology knowledgebase with Gaggle and Firegoose
DOE Office of Scientific and Technical Information (OSTI.GOV)
Baliga, Nitin S.
The overall goal of this project was to extend the existing Gaggle and Firegoose systems to develop an open-source technology that runs over the web and links desktop applications with many databases and software applications. This technology would enable researchers to incorporate workflows for data analysis that can be executed from this interface to other online applications. The four specific aims were to (1) provide one-click mapping of genes, proteins, and complexes across databases and species; (2) enable multiple simultaneous workflows; (3) expand sophisticated data analysis for online resources; and (4) enhance open-source development of the Gaggle-Firegoose infrastructure. Gaggle is an open-source Java software system that integrates existing bioinformatics programs and data sources into a user-friendly, extensible environment to allow interactive exploration, visualization, and analysis of systems biology data. Firegoose is an extension to the Mozilla Firefox web browser that enables data transfer between websites and desktop tools including Gaggle. In the last phase of this funding period, we made substantial progress on the development and application of the Gaggle integration framework. We implemented the workspace in the Network Portal. Users can capture data from Firegoose and save it to the workspace. Users can create workflows to start multiple software components programmatically and pass data between them. Results of analysis can be saved to the cloud so that they can be easily restored on any machine. We also developed the Gaggle Chrome Goose, a plugin for the Google Chrome browser, in tandem with an OpenCPU server in the Amazon EC2 cloud. This allows users to interactively perform data analysis on a single web page using the R packages deployed on the OpenCPU server. The cloud-based framework facilitates collaboration between researchers from multiple organizations. We have made a number of enhancements to the cmonkey2 application to enable and improve integration within different environments, and we have created a new tools pipeline for generating EGRIN2 models in a largely automated way.
WASS: An open-source pipeline for 3D stereo reconstruction of ocean waves
NASA Astrophysics Data System (ADS)
Bergamasco, Filippo; Torsello, Andrea; Sclavo, Mauro; Barbariol, Francesco; Benetazzo, Alvise
2017-10-01
Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community and industry. Indeed, recent advances in both computer vision algorithms and computer processing power now allow the study of the spatio-temporal wave field with unprecedented accuracy, especially at small scales. Even if simple in theory, many details are difficult to master in practice, so that the implementation of a sea-wave 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the reconstruction process from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS (http://www.dais.unive.it/wass), an open-source stereo processing pipeline for sea-wave 3D reconstruction. Our tool completely automates all the steps required to estimate dense point clouds from stereo images. Namely, it computes the extrinsic parameters of the stereo rig so that no delicate calibration has to be performed in the field. It implements a fast 3D dense stereo reconstruction procedure based on the consolidated OpenCV library and, lastly, it includes a set of filtering techniques on both the disparity map and the produced point cloud to remove the vast majority of erroneous points that naturally arise while analyzing the optically complex nature of the water surface. In this paper, we describe the architecture of WASS and the internal algorithms involved. The pipeline workflow is shown step by step and demonstrated on real datasets acquired at sea.
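The dense-stereo core of such a pipeline can be illustrated with OpenCV's semi-global matcher on a rectified image pair; the file names, matcher parameters, and calibration matrix below are illustrative stand-ins, not WASS's internal settings.

```python
# Hedged sketch of the dense stereo step: disparity from a rectified pair, then
# reprojection to a 3D point cloud using the calibration's Q matrix.
import cv2
import numpy as np

left = cv2.imread("left_rectified.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_rectified.png", cv2.IMREAD_GRAYSCALE)

sgbm = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128, blockSize=5)
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0  # SGBM output is fixed-point

Q = np.load("Q.npy")                         # assumed: 4x4 reprojection matrix from calibration
points = cv2.reprojectImageTo3D(disparity, Q)
cloud = points[disparity > disparity.min()]  # crude filter dropping invalid pixels
print(cloud.shape)                           # (N, 3) candidate sea-surface points
```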
Marine boundary layer cloud regimes and POC formation in an LES coupled to a bulk aerosol scheme
NASA Astrophysics Data System (ADS)
Berner, A. H.; Bretherton, C. S.; Wood, R.; Muhlbauer, A.
2013-07-01
A large-eddy simulation (LES) coupled to a new bulk aerosol scheme is used to study long-lived regimes of aerosol-boundary layer cloud-precipitation interaction and the development of pockets of open cells (POCs) in subtropical stratocumulus cloud layers. The aerosol scheme prognoses mass and number concentration of a single log-normal accumulation mode with surface and entrainment sources, evolving subject to processing of activated aerosol and scavenging of dry aerosol by cloud and rain. The LES with the aerosol scheme is applied to a range of steadily-forced simulations idealized from a well-observed POC case. The long-term system evolution is explored with extended two-dimensional simulations of up to 20 days, mostly with diurnally-averaged insolation. One three-dimensional two-day simulation confirms the initial development of the corresponding two-dimensional case. With weak mean subsidence, an initially aerosol-rich mixed layer deepens, the capping stratocumulus cloud slowly thickens and increasingly depletes aerosol via precipitation accretion, then the boundary layer transitions within a few hours into an open-cell regime with scattered precipitating cumuli, in which entrainment is much weaker. The inversion slowly collapses for several days until the cumulus clouds are too shallow to efficiently precipitate. Inversion cloud then reforms and radiatively drives renewed entrainment, allowing the boundary layer to deepen and become more aerosol-rich, until the stratocumulus layer thickens enough to undergo another cycle of open-cell formation. If mean subsidence is stronger, the stratocumulus never thickens enough to initiate drizzle and settles into a steady state. With lower initial aerosol concentrations, this system quickly transitions into open cells, collapses, and redevelops into a different steady state with a shallow, optically thin cloud layer. In these steady states, interstitial scavenging by cloud droplets is the main sink of aerosol number. The system is described in a reduced two-dimensional phase plane with inversion height and boundary-layer average aerosol concentrations as the state variables. Simulations with a full diurnal cycle show similar evolutions, except that open-cell formation is phase-locked into the early morning hours. The same steadily-forced modeling framework is applied to the development and evolution of a POC and the surrounding overcast boundary layer. An initial aerosol perturbation applied to a portion of the model domain leads that portion to transition into open-cell convection, forming a POC. Reduced entrainment in the POC induces a negative feedback between areal fraction covered by the POC and boundary layer depth changes. This stabilizes the system by controlling liquid water path and precipitation sinks of aerosol number in the overcast region, while also preventing boundary-layer collapse within the POC, allowing the POC and overcast to coexist indefinitely in a quasi-steady equilibrium.
BioContainers: an open-source and community-driven framework for software standardization.
da Veiga Leprevost, Felipe; Grüning, Björn A; Alves Aflitos, Saulo; Röst, Hannes L; Uszkoreit, Julian; Barsnes, Harald; Vaudel, Marc; Moreno, Pablo; Gatto, Laurent; Weber, Jonas; Bai, Mingze; Jimenez, Rafael C; Sachsenberg, Timo; Pfeuffer, Julianus; Vera Alvarez, Roberto; Griss, Johannes; Nesvizhskii, Alexey I; Perez-Riverol, Yasset
2017-08-15
BioContainers (biocontainers.pro) is an open-source and community-driven framework which provides platform-independent executable environments for bioinformatics software. BioContainers allows labs of all sizes to easily install bioinformatics software, maintain multiple versions of the same software and combine tools into powerful analysis pipelines. BioContainers is based on the popular open-source container frameworks Docker and rkt, which allow software to be installed and executed in an isolated and controlled environment. It also provides infrastructure and basic guidelines to create, manage and distribute bioinformatics containers with a special focus on omics technologies. These containers can be integrated into more comprehensive bioinformatics pipelines and different architectures (local desktop, cloud environments or HPC clusters). The software is freely available at github.com/BioContainers/. Contact: yperez@ebi.ac.uk.
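In practice a BioContainers image is pulled and run like any other container. A minimal sketch using the Docker SDK for Python; the image tag shown is illustrative rather than a verified release:

```python
# Run a tool from a BioContainers image in an isolated environment.
# Assumes a running Docker daemon and the docker-py package.
import docker

client = docker.from_env()

# The container supplies the tool with a pinned version; the tag
# below is a hypothetical example, not a verified release.
output = client.containers.run(
    image="biocontainers/blast:v2.2.31_cv2",
    command="blastp -version",
    remove=True,               # clean the container up after the run
)
print(output.decode())
```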
Key Lessons in Building "Data Commons": The Open Science Data Cloud Ecosystem
NASA Astrophysics Data System (ADS)
Patterson, M.; Grossman, R.; Heath, A.; Murphy, M.; Wells, W.
2015-12-01
Cloud computing technology has created a shift around data and data analysis by allowing researchers to push computation to data as opposed to having to pull data to an individual researcher's computer. Subsequently, cloud-based resources can provide unique opportunities to capture computing environments used both to access raw data in its original form and also to create analysis products which may be the source of data for tables and figures presented in research publications. Since 2008, the Open Cloud Consortium (OCC) has operated the Open Science Data Cloud (OSDC), which provides scientific researchers with computational resources for storing, sharing, and analyzing large (terabyte and petabyte-scale) scientific datasets. OSDC has provided compute and storage services to over 750 researchers in a wide variety of data intensive disciplines. Recently, internal users have logged about 2 million core hours each month. The OSDC also serves the research community by colocating these resources with access to nearly a petabyte of public scientific datasets in a variety of fields also accessible for download externally by the public. In our experience operating these resources, researchers are well served by "data commons," meaning cyberinfrastructure that colocates data archives, computing, and storage infrastructure and supports essential tools and services for working with scientific data. In addition to the OSDC public data commons, the OCC operates a data commons in collaboration with NASA and is developing a data commons for NOAA datasets. As cloud-based infrastructures for distributing and computing over data become more pervasive, we ask, "What does it mean to publish data in a data commons?" Here we present the OSDC perspective and discuss several services that are key in architecting data commons, including digital identifier services.
ERIC Educational Resources Information Center
Warschauer, Mark; Arada, Kathleen; Zheng, Binbin
2010-01-01
Can daily access to laptop computers help students become better writers? Are such programs affordable? Evidence from the Inspired Writing program in Littleton Public Schools, Colorado, USA, provides a resounding yes to both questions. The program employs student netbooks, open-source software, cloud computing, and social media to help students in…
Possible external sources of terrestrial cloud cover variability: the solar wind
NASA Astrophysics Data System (ADS)
Voiculescu, Mirela; Usoskin, Ilya; Condurache-Bota, Simona
2014-05-01
Cloud cover plays an important role in the terrestrial radiation budget. The possible influence of solar activity on cloud cover is still an open question with contradictory answers. An extraterrestrial factor potentially affecting the cloud cover is related to fields associated with the solar wind. We focus here on a derived quantity, the interplanetary electric field (IEF), defined as the product of the solar wind speed and the meridional component, Bz, of the interplanetary magnetic field (IMF) in the Geocentric Solar Magnetospheric (GSM) system. We show that cloud cover at mid-high latitudes systematically correlates with positive IEF, which has a clear energetic input into the atmosphere, but not with negative IEF, in general agreement with predictions of the global electric circuit (GEC)-related mechanism. Since the IEF responds differently to solar activity than, for instance, cosmic ray flux or solar irradiance, we also show that such a study allows distinguishing one solar-driven mechanism of cloud evolution, via the GEC, from others. We also present results showing that the link between cloud cover and IMF varies depending on the composition and altitude of clouds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martins, JV
2016-04-01
The Open Imaging Nephelometer (O-I-Neph) instrument is an adaptation of a proven laboratory instrument built and tested at the University of Maryland, Baltimore County (UMBC), the Polarized Imaging Nephelometer (PI-Neph). The instrument design of both imaging nephelometers uses a narrow-beam laser source and a wide-field-of-view imaging camera to capture the entire scattering-phase function in one image, quasi-instantaneously.
Cloud-Based Distributed Control of Unmanned Systems
2015-04-01
during mission execution. At best, the data is saved onto hard-drives and is accessible only by the local team. Data history in a form available and...following open source technologies: GeoServer, OpenLayers, PostgreSQL, and PostGIS are chosen to implement the back-end database and server. A brief...geospatial map data. 3. PostgreSQL: An SQL-compliant object-relational database that easily scales to accommodate large amounts of data - upwards to
NASA Astrophysics Data System (ADS)
Ham, J. M.
2016-12-01
New microprocessor boards, open-source sensors, and cloud infrastructure developed for the Internet of Things (IoT) can be used to create low-cost monitoring systems for environmental research. This project describes two applications in soil science and hydrology: 1) remote monitoring of the soil temperature regime near oil and gas operations to detect the thermal signature associated with the natural source zone degradation of hydrocarbon contaminants in the vadose zone, and 2) remote monitoring of soil water content near the surface as part of a global citizen science network. In both cases, prototype data collection systems were built around the cellular (2G/3G) "Electron" microcontroller (www.particle.io). This device allows connectivity to the cloud using a low-cost global SIM and data plan. The systems have cellular connectivity in over 100 countries and data can be logged to the cloud for storage. Users can view data in real time over any internet connection or via their smart phone. For both projects, data logging, storage, and visualization were done using IoT services like ThingSpeak (thingspeak.com). The soil thermal monitoring system was tested on experimental plots in Colorado, USA to evaluate the accuracy and reliability of different temperature sensors and 3D-printed housings. The soil water experiment included a comparison of open-source capacitance-based sensors to commercial versions. Results demonstrate the power of leveraging IoT technology for field research.
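Logging to ThingSpeak of the kind described here is a single HTTP call per sample. A minimal sketch, with a placeholder write key and example field assignments:

```python
# Push one sensor reading to a ThingSpeak channel over HTTP.
import requests

THINGSPEAK_WRITE_KEY = "XXXXXXXXXXXXXXXX"  # placeholder per-channel key

def log_reading(soil_temp_c, soil_vwc):
    """Post a soil temperature / water content sample to the cloud."""
    resp = requests.post(
        "https://api.thingspeak.com/update",
        data={
            "api_key": THINGSPEAK_WRITE_KEY,
            "field1": soil_temp_c,   # deg C
            "field2": soil_vwc,      # volumetric water content
        },
        timeout=10,
    )
    # ThingSpeak returns the new entry id, or "0" if the update failed.
    return resp.text != "0"

log_reading(18.4, 0.23)
```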
NASA Astrophysics Data System (ADS)
Clarke, A. D.; Snider, J.; Freitag, S.; Feingold, G.; Campos, T. L.; Breckhovskikh, V.; Kazil, J.
2011-12-01
The world's largest stratus deck, over the South East Pacific (SEP), was a study target for the VOCALS (http://www.eol.ucar.edu/projects/vocals/) experiment in October 2008. Aerosol-cloud interactions were one major goal of several ship and aircraft studies, including results from 14 flights of the NCAR C-130 aircraft reported here. Each flight covered about a 1000 km range with multiple profiles and legs below, in and above the Sc deck. Strong aerosol sources along the coast of Chile were expected and found to influence cloud condensation nuclei (CCN) in coastal clouds. However, "rivers" of elevated CO and black carbon (BC) associated with combustion aerosol effective as CCN at <0.3%S were also common in subsiding free tropospheric (FT) air overlying the extensive Sc deck for over 1000 km offshore. This subsidence, linked to the Hadley circulation, brought in aerosol from sources over the western Pacific as well as South America. Observed entrainment of this aerosol appeared linked to cloud-related turbulence. When present, this combustion aerosol increased available CCN and decreased effective radius compared to clouds in "clean" MBL air advected from the South Pacific. We hypothesize that this entrainment can help buffer MBL clouds over the SEP against depletion of CCN by drizzle. This may delay the transition from closed-cell to open-cell convection, potentially leading to increased lifetimes of Sc clouds that entrain such aerosol.
Enabling a new Paradigm to Address Big Data and Open Science Challenges
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; Fisher, Ward
2017-04-01
Data are not only the lifeblood of the geosciences but have become the currency of the modern world in science and society. Rapid advances in computing, communications, and observational technologies - along with concomitant advances in high-resolution modeling, ensemble and coupled-systems predictions of the Earth system - are revolutionizing nearly every aspect of our field. Modern data volumes from high-resolution ensemble prediction/projection/simulation systems and next-generation remote-sensing systems like hyper-spectral satellite sensors and phased-array radars are staggering. For example, CMIP efforts alone will generate many petabytes of climate projection data for use in assessments of climate change. And NOAA's National Climatic Data Center projects that it will archive over 350 petabytes by 2030. For researchers and educators, this deluge and the increasing complexity of data bring challenges along with opportunities for discovery and scientific breakthroughs. The potential for big data to transform the geosciences is enormous, but realizing the next frontier depends on effectively managing, analyzing, and exploiting these heterogeneous data sources, extracting knowledge and useful information in ways that were previously impossible, to enable discoveries and gain new insights. At the same time, there is a growing focus on "Reproducibility or Replicability in Science," which has implications for Open Science. The advent of cloud computing has opened new avenues for addressing both big data and Open Science challenges and for accelerating scientific discoveries. However, to successfully leverage the enormous potential of cloud technologies, data providers and the scientific communities will need to develop new paradigms to enable next-generation workflows and transform the conduct of science. Making data readily available is a necessary but not a sufficient condition. Data providers also need to give scientists an ecosystem that includes data, tools, workflows and other services needed to perform analytics, integration, interpretation, and synthesis - all in the same environment or platform. Instead of moving data to processing systems near users, as is the tradition, the cloud permits one to bring processing, computing, analysis and visualization to the data - so-called data-proximate workbench capabilities, also known as server-side processing. In this talk, I will present the ongoing work at Unidata to facilitate a new paradigm for doing science by offering a suite of tools, resources, and platforms that leverage cloud technologies to address both big data and Open Science/reproducibility challenges. That work includes the development and deployment of new protocols for data access and server-side operations, Docker container images of key applications, JupyterHub Python notebook tools, and cloud-based analysis and visualization capability via the CloudIDV tool to enable reproducible workflows and effective use of the accessed data.
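One concrete building block of such data-proximate access is subsetting remote datasets through a THREDDS server rather than downloading whole files. A hedged sketch with Unidata's Siphon client; the catalog URL is an example and may change:

```python
# Open a remote dataset through a THREDDS Data Server with Siphon,
# transferring only the slices that are actually requested.
# Requires the siphon and netCDF4 packages.
from siphon.catalog import TDSCatalog

cat = TDSCatalog(
    "https://thredds.ucar.edu/thredds/catalog/"
    "grib/NCEP/GFS/Global_0p25deg/latest.xml"   # example catalog
)
ds = list(cat.datasets.values())[0]

# remote_access() returns a netCDF4-like handle; data stay server-side
# until a variable slice is read.
nc = ds.remote_access()
print(list(nc.variables.keys())[:10])
```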
Learner Ownership of Technology-Enhanced Learning
ERIC Educational Resources Information Center
Dommett, Eleanor J.
2018-01-01
Purpose: This paper aims to examine the different ways in which learners may have ownership over technology-enhanced learning by reflecting on technical, legal and psychological ownership. Design/methodology/approach: The paper uses a variety of examples of technology-enhanced learning ranging from open-source software to cloud storage to discuss…
WASS: an open-source stereo processing pipeline for sea waves 3D reconstruction
NASA Astrophysics Data System (ADS)
Bergamasco, Filippo; Benetazzo, Alvise; Torsello, Andrea; Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro
2017-04-01
Stereo 3D reconstruction of ocean waves is gaining more and more popularity in the oceanographic community. In fact, recent advances in both computer vision algorithms and CPU processing power now allow the study of spatio-temporal wave fields with unprecedented accuracy, especially at small scales. Although simple in theory, multiple details are difficult for a practitioner to master, so the implementation of a 3D reconstruction pipeline is in general considered a complex task. For instance, camera calibration, reliable stereo feature matching and mean sea-plane estimation are all factors for which a well-designed implementation can make the difference in obtaining valuable results. For this reason, we believe that the open availability of a well-tested software package that automates the steps from stereo images to a 3D point cloud would be a valuable addition for future research in this area. We present WASS, a completely open-source stereo processing pipeline for sea waves 3D reconstruction, available at http://www.dais.unive.it/wass/. Our tool completely automates the recovery of dense point clouds from stereo images by providing three main functionalities. First, WASS can automatically recover the extrinsic parameters of the stereo rig (up to scale) so that no delicate calibration has to be performed on the field. Second, WASS implements a fast 3D dense stereo reconstruction procedure so that an accurate 3D point cloud can be computed from each stereo pair. We rely on the well-consolidated OpenCV library both for the image stereo rectification and disparity map recovery. Lastly, a set of 2D and 3D filtering techniques, both on the disparity map and the produced point cloud, is implemented to remove the vast majority of erroneous points that can naturally arise while analyzing the optically complex nature of the water surface (examples are sun glares, large white-capped areas, fog and water aerosol, etc.). Developed to be as fast as possible, WASS can process roughly four 5-megapixel stereo frames per minute (on a consumer i7 CPU) to produce a sequence of outlier-free point clouds with more than 3 million points each. Finally, it comes with an easy-to-use interface and is designed to be scalable across multiple parallel CPUs.
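WASS's dense-stereo stage is built on OpenCV, so its core can be approximated in a few lines. A minimal sketch, assuming an already-rectified image pair and a reprojection matrix Q from calibration; the file names and SGBM parameters are illustrative, and WASS adds calibration, sea-plane estimation and filtering around this step:

```python
# Dense disparity + reprojection to 3D with OpenCV, the building block
# underneath a WASS-style pipeline.
import cv2
import numpy as np

left = cv2.imread("left_rect.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_rect.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=128,   # must be a multiple of 16
    blockSize=5,
)
# StereoSGBM returns fixed-point disparities scaled by 16.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Q is the 4x4 reprojection matrix produced by cv2.stereoRectify;
# identity here is only a stand-in.
Q = np.eye(4, dtype=np.float32)
points_3d = cv2.reprojectImageTo3D(disparity, Q)
mask = disparity > 0          # keep matched pixels only
point_cloud = points_3d[mask]
```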
NASA Technical Reports Server (NTRS)
Patterson, Maria T.; Anderson, Nicholas; Bennett, Collin; Bruggemann, Jacob; Grossman, Robert L.; Handy, Matthew; Ly, Vuong; Mandl, Daniel J.; Pederson, Shane; Pivarski, James;
2016-01-01
Project Matsu is a collaboration between the Open Commons Consortium and NASA focused on developing open source technology for cloud-based processing of Earth satellite imagery, with practical applications to aid in natural disaster detection and relief. Project Matsu has developed an open source cloud-based infrastructure to process, analyze, and reanalyze large collections of hyperspectral satellite image data using OpenStack, Hadoop, MapReduce and related technologies. We describe a framework for efficient analysis of large amounts of data called the Matsu "Wheel." The Matsu Wheel is currently used to process incoming hyperspectral satellite data produced daily by NASA's Earth Observing-1 (EO-1) satellite. The framework allows batches of analytics, scanning for new data, to be applied to data as they flow in. In the Matsu Wheel, the data only need to be accessed and preprocessed once, regardless of the number or types of analytics, which can easily be slotted into the existing framework. The Matsu Wheel system provides a significantly more efficient use of computational resources than alternative methods when the data are large, have high-volume throughput, may require heavy preprocessing, and are typically used for many types of analysis. We also describe our preliminary Wheel analytics, including an anomaly detector for rare spectral signatures or thermal anomalies in hyperspectral data and a land cover classifier that can be used for water and flood detection. Each of these analytics can generate visual reports accessible via the web for the public and interested decision makers. The result products of the analytics are also made accessible through an Open Geospatial Consortium (OGC)-compliant Web Map Service (WMS) for further distribution. The Matsu Wheel allows many shared data services to be performed together to efficiently use resources for processing hyperspectral satellite image data and other large environmental datasets that may be analyzed for many purposes.
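The Wheel pattern itself is simple to express: scan once, then fan the same preprocessed scene out to every registered analytic. A hypothetical sketch of the pattern (names and paths invented, not the Project Matsu code):

```python
# Scan-once / analyze-many loop in the spirit of the Matsu Wheel.
import time
from pathlib import Path

ANALYTICS = []

def analytic(func):
    """Register a function to run on each new, preprocessed scene."""
    ANALYTICS.append(func)
    return func

@analytic
def detect_anomalies(scene):
    print("anomaly scan on", scene.name)

@analytic
def classify_land_cover(scene):
    print("land-cover classification on", scene.name)

def wheel(incoming_dir="incoming", poll_seconds=60):
    seen = set()
    while True:
        for scene in Path(incoming_dir).glob("*.h5"):
            if scene in seen:
                continue
            seen.add(scene)
            # Each scene is read and preprocessed once; every analytic
            # is then applied to the same pass over the data.
            for run in ANALYTICS:
                run(scene)
        time.sleep(poll_seconds)
```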
Evaluation of Arctic Clouds And Their Response to External Forcing in Climate Models
NASA Astrophysics Data System (ADS)
Wang, Y.; Jiang, J. H.; Ming, Y.; Su, H.; Yung, Y. L.
2017-12-01
A warming Arctic is undergoing significant environmental changes, most evident in the reduction of Arctic sea-ice extent (SIE). However, the role of Arctic clouds in determining sea ice melting remains elusive, as different phases of clouds can induce either positive or negative radiative forcing in different seasons. The possible cloud feedbacks following the opening of the ocean surface are also debated due to variations in the polar boundary structure. Therefore, Arctic cloud simulation has long been considered the largest source of uncertainty in climate sensitivity assessments. Other local or remote atmospheric factors, such as poleward moisture and heat transport as well as atmospheric aerosols seeding liquid and ice clouds, further complicate our understanding of Arctic cloud change. Our recent efforts focus on the post-CMIP5 and CMIP6 models, which improve atmospheric compositions, cloud macro- and microphysics, convection parameterizations, etc. In this study, we utilize long-term satellite measurements with high-resolution coverage and a broad wavelength spectrum to evaluate the mean states and variations of mixed-phase clouds in the Arctic, along with concurrent moisture and SIE measurements. Model sensitivity experiments to understand external perturbations on the atmosphere-cryosphere coupling in the Arctic will be presented.
Identity federation in OpenStack - an introduction to hybrid clouds
NASA Astrophysics Data System (ADS)
Denis, Marek; Castro Leon, Jose; Ormancey, Emmanuel; Tedesco, Paolo
2015-12-01
We are evaluating the cloud identity federation available in the OpenStack ecosystem, which allows on-premises deployments to burst into remote clouds using local identities (i.e. domain accounts). Further enhancements to identity federation are a clear path to hybrid cloud architectures - virtualized infrastructures layered across independent private and public clouds.
Cloud prediction of protein structure and function with PredictProtein for Debian.
Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard
2013-01-01
We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.
NASA Astrophysics Data System (ADS)
Nguyen, L.; Chee, T.; Palikonda, R.; Smith, W. L., Jr.; Bedka, K. M.; Spangenberg, D.; Vakhnin, A.; Lutz, N. E.; Walter, J.; Kusterer, J.
2017-12-01
Cloud Computing offers new opportunities for large-scale scientific data producers to utilize Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) IT resources to process and deliver data products in an operational environment where timely delivery, reliability, and availability are critical. The NASA Langley Research Center Atmospheric Science Data Center (ASDC) is building and testing a private and public facing cloud for users in the Science Directorate to utilize as an everyday production environment. The NASA SatCORPS (Satellite ClOud and Radiation Property Retrieval System) team processes and derives near real-time (NRT) global cloud products from operational geostationary (GEO) satellite imager datasets. To deliver these products, we will utilize the public facing cloud and OpenShift to deploy a load-balanced webserver for data storage, access, and dissemination. The OpenStack private cloud will host data ingest and computational capabilities for SatCORPS processing. This paper will discuss the SatCORPS migration towards, and usage of, the ASDC Cloud Services in an operational environment. Detailed lessons learned from use of prior cloud providers, specifically the Amazon Web Services (AWS) GovCloud and the Government Cloud administered by the Langley Managed Cloud Environment (LMCE) will also be discussed.
A Cloud Microphysics Model for the Gas Giant Planets
NASA Astrophysics Data System (ADS)
Palotai, Csaba J.; Le Beau, Raymond P.; Shankar, Ramanakumar; Flom, Abigail; Lashley, Jacob; McCabe, Tyler
2016-10-01
Recent studies have significantly increased the quality and the number of observed meteorological features on the jovian planets, revealing banded cloud structures and discrete features. Our current understanding of the formation and decay of those clouds also defines the conceptual models of the underlying atmospheric dynamics. The full interpretation of the new observational data set and the related theories requires modeling these features in a general circulation model (GCM). Here, we present details of our bulk cloud microphysics model that was designed to simulate clouds in the Explicit Planetary Hybrid-Isentropic Coordinate (EPIC) GCM for the jovian planets. The cloud module includes hydrological cycles for each condensable species that consist of interactive vapor, cloud and precipitation phases, and it also accounts for latent heating and cooling throughout the transfer processes (Palotai and Dowling, 2008. Icarus, 194, 303-326). Previously, the self-organizing clouds in our simulations successfully reproduced the vertical and horizontal ammonia cloud structure in the vicinity of Jupiter's Great Red Spot and Oval BA (Palotai et al. 2014, Icarus, 232, 141-156). In our recent work, we extended this model to include water clouds on Jupiter and Saturn, ammonia clouds on Saturn, and methane clouds on Uranus and Neptune. Details of our cloud parameterization scheme, our initial results and their comparison with observations will be shown. The latest version of the EPIC model is available as open source software from NASA's PDS Atmospheres Node.
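As a toy illustration of the vapor/cloud bookkeeping such a bulk scheme performs (not the EPIC implementation), the sketch below condenses supersaturated vapor and applies the latent heating, using Bolton's standard saturation formula; a real scheme would iterate, since the warming itself raises saturation:

```python
# Toy single-species bulk condensation step with latent heating.
import math

LV = 2.5e6    # latent heat of vaporization, J/kg
CP = 1004.0   # specific heat of dry air at constant pressure, J/(kg K)

def saturation_mixing_ratio(T_k, p_pa):
    """Bolton's saturation vapor pressure, then w_s = 0.622 e_s/(p-e_s)."""
    t_c = T_k - 273.15
    e_s = 611.2 * math.exp(17.67 * t_c / (t_c + 243.5))  # Pa
    return 0.622 * e_s / (p_pa - e_s)

def condense(T_k, q_vapor, q_cloud, p_pa):
    """Move supersaturated vapor into cloud and warm by latent heat.

    One-step adjustment only; warming raises saturation, so a real
    scheme iterates this to convergence.
    """
    excess = q_vapor - saturation_mixing_ratio(T_k, p_pa)
    dq = max(excess, 0.0)
    return T_k + (LV / CP) * dq, q_vapor - dq, q_cloud + dq

print(condense(280.0, 8e-3, 0.0, 85000.0))  # warms by ~1.6 K
```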
Advancing global marine biogeography research with open-source GIS software and cloud-computing
Fujioka, Ei; Vanden Berghe, Edward; Donnelly, Ben; Castillo, Julio; Cleary, Jesse; Holmes, Chris; McKnight, Sean; Halpin, Patrick
2012-01-01
Across many scientific domains, the ability to aggregate disparate datasets enables more meaningful global analyses. Within marine biology, the Census of Marine Life served as the catalyst for such a global data aggregation effort. Under the Census framework, the Ocean Biogeographic Information System was established to coordinate an unprecedented aggregation of global marine biogeography data. The OBIS data system now contains 31.3 million observations, freely accessible through a geospatial portal. The challenges of storing, querying, disseminating, and mapping a global data collection of this complexity and magnitude are significant. In the face of declining performance and expanding feature requests, a redevelopment of the OBIS data system was undertaken. Following an Open Source philosophy, the OBIS technology stack was rebuilt using PostgreSQL, PostGIS, GeoServer and OpenLayers. This approach has markedly improved the performance and online user experience while maintaining a standards-compliant and interoperable framework. Due to the distributed nature of the project and increasing needs for storage, scalability and deployment flexibility, the entire hardware and software stack was built on a Cloud Computing environment. The flexibility of the platform, combined with the power of the application stack, enabled rapid re-development of the OBIS infrastructure, and ensured complete standards-compliance.
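The queries behind such a portal are plain PostGIS SQL. A hedged sketch, with hypothetical table and column names and placeholder connection details, of fetching occurrences inside a bounding box via psycopg2:

```python
# Bounding-box query against a PostGIS-backed occurrence table.
import psycopg2

conn = psycopg2.connect("dbname=obis user=reader host=localhost")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT species_name, ST_X(geom), ST_Y(geom)
        FROM occurrences
        WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
        """,
        (-75.0, 30.0, -60.0, 45.0),  # lon/lat bounding box
    )
    for name, lon, lat in cur.fetchall():
        print(name, lon, lat)
```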
Sea spray as a source of ice nucleating particles - results from the AIDA Ocean03 campaign
NASA Astrophysics Data System (ADS)
Salter, M. E.; Ickes, L.; Adams, M.; Bierbauer, S.; Bilde, M.; Christiansen, S.; Ekman, A.; Gorokhova, E.; Höhler, K.; Kiselev, A. A.; Leck, C.; Mohr, C.; Mohler, O.; Murray, B. J.; Porter, G.; Ullrich, R.; Wagner, R.
2017-12-01
Clouds and their radiative effects are one of the major influences on the radiative fluxes in the atmosphere, but at the same time they remain the largest uncertainty in climate models. This lack of understanding is especially pronounced in the high Arctic. Summertime clouds can persist over long periods in this region, which is difficult to replicate in models based on our current understanding. The clouds most often encountered in the summertime high Arctic consist of a mixture of ice crystals and super-cooled water droplets, so-called mixed-phase clouds. This cloud type is sensitive to the availability of aerosol particles, which can act as cloud condensation nuclei and ice nuclei. However, since the high Arctic is a pristine region, aerosol particles are not very abundant, and the hypothesis of open leads in the Arctic as a potentially important source of cloud and ice nucleating particles via bubble bursting has emerged. In this context, we have conducted a series of experiments at the AIDA chamber at KIT, designed to investigate the mechanisms linking marine biology, seawater chemistry and aerosol physics/potential cloud impacts. During this campaign, two marine diatom species (Melosira arctica and Skeletonema marinoi) as well as sea surface microlayer samples collected during several Arctic Ocean research cruises were investigated. To aerosolize the samples, a variety of methods were used including a sea spray simulation chamber to mimic the process of bubble-bursting. The ice nucleating efficiency (mixed-phase cloud regime) of the samples was determined either directly in the AIDA chamber during adiabatic expansions, or using the INKA continuous flow diffusion chamber, or a cold stage. Results from the campaign along with the potential implications are presented.
BioBlend: automating pipeline analyses within Galaxy and CloudMan.
Sloggett, Clare; Goonasekera, Nuwan; Afgan, Enis
2013-07-01
We present BioBlend, a unified API in a high-level language (Python) that wraps the functionality of the Galaxy and CloudMan APIs. BioBlend makes it easy for bioinformaticians to automate end-to-end large data analysis, from scratch, in a way that is highly accessible to collaborators, by allowing them both to provide the required infrastructure and to automate complex analyses over large datasets within the familiar Galaxy environment. http://bioblend.readthedocs.org/. Automated installation of BioBlend is available via PyPI (e.g. pip install bioblend). Alternatively, the source code is available from the GitHub repository (https://github.com/afgane/bioblend) under the MIT open source license. The library has been tested and is working on Linux, Macintosh and Windows-based systems.
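A short usage sketch of the BioBlend Galaxy API; the server URL, API key and file name are placeholders for a real Galaxy instance:

```python
# Create a Galaxy history and upload a file into it with BioBlend.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="https://usegalaxy.org", key="YOUR_API_KEY")

# Histories group datasets; tools (here the upload tool) act on them.
history = gi.histories.create_history(name="bioblend-demo")
upload = gi.tools.upload_file("reads.fastq", history["id"])

print(history["id"], upload["outputs"][0]["id"])
```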
Performance testing of 3D point cloud software
NASA Astrophysics Data System (ADS)
Varela-González, M.; González-Jorge, H.; Riveiro, B.; Arias, P.
2013-10-01
LiDAR systems have been used widely in recent years for many applications in the engineering field: civil engineering, cultural heritage, mining, industry and environmental engineering. One of the most important limitations of this technology is the large computational requirements involved in data processing, especially for large mobile LiDAR datasets. Several software solutions for data management are available on the market, including open source suites; however, users often lack methodologies to properly verify their performance. In this work a methodology for LiDAR software performance testing is presented and four different suites are studied: QT Modeler, VR Mesh, AutoCAD 3D Civil and the Point Cloud Library running in software developed at the University of Vigo (SITEGI). The software based on the Point Cloud Library shows better results in the loading time of the point clouds and CPU usage. However, it is not as strong as the commercial suites in the working set and commit size tests.
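The metrics named above (loading time, CPU usage, working set, commit size) can be captured generically. A minimal sketch using psutil, where load_point_cloud stands in for whichever suite or library is under test; note that rss/vms only approximate the Windows working-set and commit counters:

```python
# Time a point-cloud load and sample the process's CPU and memory.
import time
import psutil

def benchmark(load_point_cloud, path):
    proc = psutil.Process()
    proc.cpu_percent()            # prime the CPU counter
    t0 = time.perf_counter()
    cloud = load_point_cloud(path)
    elapsed = time.perf_counter() - t0
    mem = proc.memory_info()
    return {
        "load_seconds": elapsed,
        "cpu_percent": proc.cpu_percent(),
        "working_set_mb": mem.rss / 2**20,  # resident set size
        "commit_mb": mem.vms / 2**20,       # virtual memory size
    }
```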
Study on Huizhou architecture of point cloud registration based on optimized ICP algorithm
NASA Astrophysics Data System (ADS)
Zhang, Runmei; Wu, Yulu; Zhang, Guangbin; Zhou, Wei; Tao, Yuqian
2018-03-01
Current point cloud registration software has high hardware requirements and a heavy workload, requires multiple interactive definitions, and the source code of the software with better processing results is not open. In view of this, a two-step registration method combining a normal-vector distribution feature with a coarse-feature-based iterative closest point (ICP) algorithm is proposed in this paper. The method uses the fast point feature histogram (FPFH) algorithm, defines the adjacency region of the point cloud and a calculation model for the distribution of normal vectors, sets up a local coordinate system for each key point, and obtains the transformation matrix to finish coarse registration; the coarse registration results of the two stations are then accurately registered using the ICP algorithm. Experimental results show that, compared with the traditional ICP algorithm, the method used in this paper has obvious time and precision advantages for large point clouds.
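For readers unfamiliar with ICP, the refinement stage alternates nearest-neighbour correspondence search with a closed-form rigid-transform fit. A minimal NumPy/SciPy sketch of that classic loop, not the paper's FPFH-seeded two-step variant, which would supply the coarse initial alignment:

```python
# Classic point-to-point ICP: correspondences via a k-d tree, rigid
# transform via SVD, iterated until the mean error stops improving.
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(src, dst):
    """Least-squares rigid transform mapping src onto dst (SVD method)."""
    cs, cd = src.mean(0), dst.mean(0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:       # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50, tol=1e-6):
    tree = cKDTree(dst)
    prev_err = np.inf
    for _ in range(iters):
        dists, idx = tree.query(src)           # nearest correspondences
        R, t = best_fit_transform(src, dst[idx])
        src = src @ R.T + t                    # apply the update
        err = dists.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return src, err
```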
Application research of Ganglia in Hadoop monitoring and management
NASA Astrophysics Data System (ADS)
Li, Gang; Ding, Jing; Zhou, Lixia; Yang, Yi; Liu, Lei; Wang, Xiaolei
2017-03-01
There are many applications of the Hadoop system in the fields of big data and cloud computing. The storage and application test bench for the seismic network at the Earthquake Administration of Tianjin runs on a Hadoop system, which is operated and monitored using the open-source software Ganglia. This paper reviews the functions of the Ganglia system, its installation and configuration process, and its effectiveness in operating and monitoring the Hadoop system. It also briefly introduces the idea and effect of using the Nagios software to monitor the Hadoop system. This experience is valuable for monitoring systems of cloud computing platforms in the industry.
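Ganglia's gmond daemon publishes cluster state as an XML dump on TCP port 8649, so Hadoop node metrics can also be pulled by script. A hedged sketch (the host name is a placeholder):

```python
# Read and parse the XML metric dump that gmond serves on port 8649.
import socket
import xml.etree.ElementTree as ET

def read_gmond(host="hadoop-master", port=8649):
    with socket.create_connection((host, port), timeout=5) as s:
        chunks = []
        while True:                 # gmond sends the dump then closes
            data = s.recv(65536)
            if not data:
                break
            chunks.append(data)
    root = ET.fromstring(b"".join(chunks))
    for h in root.iter("HOST"):
        for m in h.iter("METRIC"):
            if m.get("NAME") in ("load_one", "mem_free"):
                print(h.get("NAME"), m.get("NAME"), m.get("VAL"))
```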
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator in deploying a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.
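The MapReduce programs such a cloud runs can be as small as a mapper/reducer pair on stdin/stdout, launched with the standard Hadoop Streaming jar (-mapper/-reducer/-input/-output flags). A generic k-mer-counting sketch, not part of CloudDOE itself:

```python
# Hadoop Streaming mapper/reducer: tab-separated key/value pairs on
# stdin/stdout. Hadoop sorts by key between the two phases.
import sys

def mapper(k=8):
    """Emit each k-mer of every input sequence with a count of 1."""
    for line in sys.stdin:
        seq = line.strip()
        for i in range(len(seq) - k + 1):
            print(f"{seq[i:i+k]}\t1")

def reducer():
    """Sum counts for each k-mer; input arrives sorted by key."""
    current, total = None, 0
    for line in sys.stdin:
        key, value = line.rstrip("\n").split("\t")
        if key != current:
            if current is not None:
                print(f"{current}\t{total}")
            current, total = key, 0
        total += int(value)
    if current is not None:
        print(f"{current}\t{total}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```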
OntoMaton: a bioportal powered ontology widget for Google Spreadsheets.
Maguire, Eamonn; González-Beltrán, Alejandra; Whetzel, Patricia L; Sansone, Susanna-Assunta; Rocca-Serra, Philippe
2013-02-15
Data collection in spreadsheets is ubiquitous, but current solutions lack support for collaborative semantic annotation that would promote shared and interdisciplinary annotation practices, supporting geographically distributed players. OntoMaton is an open source solution that brings ontology lookup and tagging capabilities into a cloud-based collaborative editing environment, harnessing Google Spreadsheets and the NCBO Web services. It is a general purpose, format-agnostic tool that may serve as a component of the ISA software suite. OntoMaton can also be used to assist the ontology development process. OntoMaton is freely available from Google widgets under the CPAL open source license; documentation and examples at: https://github.com/ISA-tools/OntoMaton.
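OntoMaton's lookups go through the NCBO Web services, which are plain REST calls. A hedged sketch of the underlying BioPortal search endpoint; the API key is a placeholder and the fields shown are the commonly returned ones:

```python
# Ontology term lookup against the NCBO BioPortal REST API.
import requests

API_KEY = "YOUR_BIOPORTAL_KEY"   # free key via BioPortal registration

resp = requests.get(
    "https://data.bioontology.org/search",
    params={"q": "melanoma", "apikey": API_KEY},
    timeout=10,
)
resp.raise_for_status()
for hit in resp.json()["collection"][:5]:
    # prefLabel is the human-readable term; @id its ontology URI.
    print(hit.get("prefLabel"), hit.get("@id"))
```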
Wiewiórka, Marek S; Messina, Antonio; Pacholewska, Alicja; Maffioletti, Sergio; Gawrysiak, Piotr; Okoniewski, Michał J
2014-09-15
Many time-consuming analyses of next-generation sequencing data can be addressed with modern cloud computing. Apache Hadoop-based solutions have become popular in genomics because of their scalability in a cloud infrastructure. So far, most of these tools have been used for batch data processing rather than interactive data querying. The SparkSeq software has been created to take advantage of a new MapReduce framework, Apache Spark, for next-generation sequencing data. SparkSeq is a general-purpose, flexible and easily extendable library for genomic cloud computing. It can be used to build genomic analysis pipelines in Scala and run them in an interactive way. SparkSeq opens up the possibility of customized ad hoc secondary analyses and iterative machine learning algorithms. This article demonstrates its scalability and overall fast performance by running analyses of sequencing datasets. Tests of SparkSeq also prove that the use of cache and HDFS block size can be tuned for optimal performance on multiple worker nodes. Available under the open source Apache 2.0 license: https://bitbucket.org/mwiewiorka/sparkseq/.
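SparkSeq itself is a Scala library; purely as an illustration of the interactive, cached query style it enables, here is an analogous sketch in PySpark with a placeholder input path and a text-encoded read format:

```python
# Cache a dataset in cluster memory, then run ad hoc queries on it.
from pyspark import SparkContext

sc = SparkContext(appName="reads-demo")

# Load text-encoded reads once and cache them, so repeated queries
# avoid re-reading from disk/HDFS.
reads = sc.textFile("hdfs:///data/sample_reads.txt").cache()

long_reads = reads.filter(lambda r: len(r) >= 100)
gc_rich = long_reads.filter(
    lambda r: (r.count("G") + r.count("C")) / len(r) > 0.6
)
print(long_reads.count(), gc_rich.count())
sc.stop()
```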
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both high-performance computing (HPC) and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines of both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge - workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics or visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, where we are developing a new QA pipeline for the 25 PB system.
Support of Multidimensional Parallelism in the OpenMP Programming Model
NASA Technical Reports Server (NTRS)
Jin, Hao-Qiang; Jost, Gabriele
2003-01-01
OpenMP is the current standard for shared-memory programming. While providing ease of parallel programming, the OpenMP programming model also has limitations which often affect the scalability of applications. Examples of these limitations are work distribution and point-to-point synchronization among threads. We propose extensions to the OpenMP programming model which allow the user to easily distribute the work in multiple dimensions and synchronize the workflow among the threads. The proposed extensions include four new constructs and the associated runtime library. They do not require changes to the source code and can be implemented based on the existing OpenMP standard. We illustrate the concept in a prototype translator and test with benchmark codes and a cloud modeling code.
LINUX, Virtualization, and the Cloud: A Hands-On Student Introductory Lab
ERIC Educational Resources Information Center
Serapiglia, Anthony
2013-01-01
Many students are entering Computer Science education with limited exposure to operating systems and applications other than those produced by Apple or Microsoft. This gap in familiarity with the Open Source community can quickly be bridged with a simple exercise that can also be used to strengthen two other important current computing concepts,…
NASA Astrophysics Data System (ADS)
Engel, P.; Schweimler, B.
2016-04-01
The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the monitored objects over long periods. Several commercial hardware and software solutions for realizing such monitoring measurements are available on the market. In addition to them, a research team at the Neubrandenburg University of Applied Sciences (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well-known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It will be discussed how the development effort for networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.
NASA Astrophysics Data System (ADS)
Morrison, A. L.; Kay, J. E.; Chepfer, H.; Guzman, R.; Yettella, V.
2018-01-01
While the radiative influence of clouds on Arctic sea ice is known, the influence of sea ice cover on Arctic clouds is challenging to detect, separate from atmospheric circulation, and attribute to human activities. Providing observational constraints on the two-way relationship between sea ice cover and Arctic clouds is important for predicting the rate of future sea ice loss. Here we use 8 years of CALIPSO (Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations) spaceborne lidar observations from 2008 to 2015 to analyze Arctic cloud profiles over sea ice and over open water. Using a novel surface mask to restrict our analysis to where sea ice concentration varies, we isolate the influence of sea ice cover on Arctic Ocean clouds. The study focuses on clouds containing liquid water because liquid-containing clouds are the most important cloud type for radiative fluxes and therefore for sea ice melt and growth. Summer is the only season with no observed cloud response to sea ice cover variability: liquid cloud profiles are nearly identical over sea ice and over open water. These results suggest that shortwave summer cloud feedbacks do not slow long-term summer sea ice loss. In contrast, more liquid clouds are observed over open water than over sea ice in the winter, spring, and fall in the 8 year mean and in each individual year. Observed fall sea ice loss cannot be explained by natural variability alone, which suggests that observed increases in fall Arctic cloud cover over newly open water are linked to human activities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar
2014-01-01
In this paper, an architecture for building a Scalable And Mobile Environment For High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exist a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both of these types of devices to provide high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks are some of the key open-source and industry-standard components that have been adopted in this architecture.
Scaling Agile Infrastructure to People
NASA Astrophysics Data System (ADS)
Jones, B.; McCance, G.; Traylen, S.; Barrientos Arias, N.
2015-12-01
When CERN migrated its infrastructure away from homegrown fabric management tools to emerging industry-standard open-source solutions, the immediate technical challenges and motivation were clear. The move to a multi-site Cloud Computing model meant that the tool chains that were growing around this ecosystem would be a good choice, the challenge was to leverage them. The use of open-source tools brings challenges other than merely how to deploy them. Homegrown software, for all the deficiencies identified at the outset of the project, has the benefit of growing with the organization. This paper will examine what challenges there were in adapting open-source tools to the needs of the organization, particularly in the areas of multi-group development and security. Additionally, the increase in scale of the plant required changes to how Change Management was organized and managed. Continuous Integration techniques are used in order to manage the rate of change across multiple groups, and the tools and workflow for this will be examined.
Scaling the CERN OpenStack cloud
NASA Astrophysics Data System (ADS)
Bell, T.; Bompastor, B.; Bukowiec, S.; Castro Leon, J.; Denis, M. K.; van Eldik, J.; Fermin Lobo, M.; Fernandez Alvarez, L.; Fernandez Rodriguez, D.; Marino, A.; Moreira, B.; Noel, B.; Oulevey, T.; Takase, W.; Wiebalck, A.; Zilli, S.
2015-12-01
CERN has been running a production OpenStack cloud since July 2013 to support physics computing and infrastructure services for the site. In the past year, CERN Cloud Infrastructure has seen a constant increase in nodes, virtual machines, users and projects. This paper will present what has been done in order to make the CERN cloud infrastructure scale out.
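Programmatic access to an OpenStack cloud of this kind goes through its REST APIs; a hedged sketch with the official openstacksdk client, where all authentication values are placeholders (normally supplied via clouds.yaml or environment variables):

```python
# Connect to an OpenStack cloud and list servers with openstacksdk.
import openstack

conn = openstack.connect(
    auth_url="https://keystone.example.org:5000/v3",  # placeholder
    project_name="physics",
    username="user",
    password="secret",
    user_domain_name="Default",
    project_domain_name="Default",
)

# List a few servers (VMs) in the project to gauge the deployment.
for server in conn.compute.servers(limit=10):
    print(server.name, server.status)
```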
NASA Astrophysics Data System (ADS)
Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang
2016-02-01
With the extensive application of cloud computing and data centres, as well as constantly emerging services, big data with burst characteristics has brought huge challenges to optical networks. Consequently, the software-defined optical network (SDON), which combines optical networks with software-defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node, employed in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM), is proposed. An open source OpenFlow controller is extended with routing strategies. In addition, an experimental platform based on the OpenFlow protocol for software-defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity tests, protection switching and load balancing experiments on this test platform.
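The paper does not name its controller; as a generic illustration, a minimal app for the open-source Ryu OpenFlow controller is sketched below. It installs the usual table-miss flow that sends unmatched packets to the controller, the starting point onto which custom routing strategies would be added:

```python
# Minimal Ryu app: install a table-miss flow on each connecting switch.
from ryu.base import app_manager
from ryu.controller import ofp_event
from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
from ryu.ofproto import ofproto_v1_3

class MinimalRouting(app_manager.RyuApp):
    OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

    @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
    def on_switch_ready(self, ev):
        dp = ev.msg.datapath
        ofp, parser = dp.ofproto, dp.ofproto_parser
        match = parser.OFPMatch()                      # match everything
        actions = [parser.OFPActionOutput(ofp.OFPP_CONTROLLER,
                                          ofp.OFPCML_NO_BUFFER)]
        inst = [parser.OFPInstructionActions(ofp.OFPIT_APPLY_ACTIONS,
                                             actions)]
        # Priority 0: only packets missing every other flow reach us.
        dp.send_msg(parser.OFPFlowMod(datapath=dp, priority=0,
                                      match=match, instructions=inst))
```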
Open Biomedical Engineering education in Africa.
Ahluwalia, Arti; Atwine, Daniel; De Maria, Carmelo; Ibingira, Charles; Kipkorir, Emmauel; Kiros, Fasil; Madete, June; Mazzei, Daniele; Molyneux, Elisabeth; Moonga, Kando; Moshi, Mainen; Nzomo, Martin; Oduol, Vitalice; Okuonzi, John
2015-08-01
Despite the virtual revolution, the mainstream academic community in most countries remains largely ignorant of the potential of web-based teaching resources and of the expansion of open-source software, hardware and rapid prototyping. In the context of Biomedical Engineering (BME), where human safety and wellbeing are paramount, a high level of supervision and quality control is required before open-source concepts can be embraced by universities and integrated into the curriculum. In the meantime, students, more than their teachers, have become attuned to continuous streams of digital information, and teaching methods need to adapt rapidly by giving them the skills to filter meaningful information and by supporting collaboration and co-construction of knowledge using open, cloud- and crowd-based technology. In this paper we present our experience in bringing these concepts to university education in Africa as a way of enabling rapid development and self-sufficiency in health care. We describe the three summer schools held in sub-Saharan Africa, where both students and teachers embraced the philosophy of open BME education with enthusiasm, and discuss the advantages and disadvantages of opening education in this way in the developing and developed world.
Evidence in Magnetic Clouds for Systematic Open Flux Transport on the Sun
NASA Technical Reports Server (NTRS)
Crooker, N. U.; Kahler, S. W.; Gosling, J. T.; Lepping, R. P.
2008-01-01
Most magnetic clouds encountered by spacecraft at 1 AU display a mix of unidirectional suprathermal electrons signaling open field lines and counterstreaming electrons signaling loops connected to the Sun at both ends. Assuming the open fields were originally loops that underwent interchange reconnection with open fields at the Sun, we determine the sense of connectedness of the open fields found in 72 of 97 magnetic clouds identified by the Wind spacecraft in order to obtain information on the location and sense of the reconnection and resulting flux transport at the Sun. The true polarity of the open fields in each magnetic cloud was determined from the direction of the suprathermal electron flow relative to the magnetic field direction. Results indicate that the polarity of all open fields within a given magnetic cloud is the same 89% of the time, implying that interchange reconnection at the Sun most often occurs in only one leg of a flux rope loop, thus transporting open flux in a single direction, from a coronal hole near that leg to the foot point of the opposite leg. This pattern is consistent with the view that interchange reconnection in coronal mass ejections systematically transports an amount of open flux sufficient to reverse the polarity of the heliospheric field through the course of the solar cycle. Using the same electron data, we also find that the fields encountered in magnetic clouds are only a third as likely to be locally inverted as not. While one might expect inversions to be equally as common as not in flux rope coils, consideration of the geometry of spacecraft trajectories relative to the modeled magnetic cloud axes leads us to conclude that the result is reasonable.
ERIC Educational Resources Information Center
El-Seoud, M. Samir Abou; El-Sofany, Hosam F.; Taj-Eddin, Islam A. T. F.; Nosseir, Ann; El-Khouly, Mahmoud M.
2013-01-01
The information technology educational programs at most universities in Egypt face many obstacles that can be overcome using technology enhanced learning. An open source Moodle eLearning platform has been implemented at many public and private universities in Egypt, as an aid to deliver e-content and to provide the institution with various…
ERIC Educational Resources Information Center
Mumba, Frackson; Zhu, Mengxia
2013-01-01
This paper presents a Simulation-based interactive Virtual ClassRoom web system (SVCR: www.vclasie.com) powered by state-of-the-art cloud computing technology from Google. SVCR integrates popular free open-source math, science and engineering simulations and provides functions such as secure user access control and management of courses,…
Dem Generation from Close-Range Photogrammetry Using Extended Python Photogrammetry Toolbox
NASA Astrophysics Data System (ADS)
Belmonte, A. A.; Biong, M. M. P.; Macatulad, E. G.
2017-10-01
Digital elevation models (DEMs) are widely used raster data for applications concerning terrain, such as flood modelling, viewshed analysis, mining, land development and engineering design projects, to name a few. DEMs can be obtained through various methods, including topographic survey, LiDAR or photogrammetry, and internet sources. Terrestrial close-range photogrammetry is one alternative method of producing DEMs, through the processing of images in photogrammetry software. Powerful commercial photogrammetry software that can produce high-accuracy DEMs is already available, but it entails a corresponding cost. Although some of these packages offer free or demo trials, the trials limit the usable features and usage time. One alternative is free and open-source software (FOSS), such as the Python Photogrammetry Toolbox (PPT), which provides an interface for performing photogrammetric processes implemented through Python scripts. For relatively small areas, such as in mining or construction excavation, a relatively inexpensive, fast and accurate method would be advantageous. In this study, PPT was used to generate 3D point cloud data from images of an open-pit excavation. PPT was extended with an algorithm that converts the generated point cloud data into a usable DEM.
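As a hedged sketch of that final extension step (point cloud to DEM), the following grids scattered x, y, z points onto a regular raster. The interpolation method and cell size are illustrative assumptions, not the authors' exact algorithm.

```python
# Grid an (x, y, z) point cloud into a DEM raster via linear interpolation.
import numpy as np
from scipy.interpolate import griddata

def point_cloud_to_dem(points, cell_size=0.5):
    """points: (N, 3) array of x, y, z; returns (dem, x_axis, y_axis)."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    xi = np.arange(x.min(), x.max(), cell_size)
    yi = np.arange(y.min(), y.max(), cell_size)
    gx, gy = np.meshgrid(xi, yi)
    # Cells outside the convex hull of the points are left NaN (no-data).
    dem = griddata((x, y), z, (gx, gy), method="linear")
    return dem, xi, yi

# Toy usage with random points standing in for a photogrammetric cloud.
dem, xi, yi = point_cloud_to_dem(np.random.rand(1000, 3) * 100.0)
```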
Architectures for Rainfall Property Estimation From Polarimetric Radar
NASA Astrophysics Data System (ADS)
Collis, S. M.; Giangrande, S. E.; Helmus, J.; Troemel, S.
2014-12-01
Radars that transmit and receive signals in polarizations aligned both horizontally and vertically with respect to the horizon collect a number of measurements. The relations both among these measurements and between the measurements and desired microphysical quantities (such as rainfall rate) are complicated by a number of scattering mechanisms. The result is an intractable number of often incompatible techniques for extracting geophysical insight. This presentation will discuss methods developed by the Atmospheric Radiation Measurement (ARM) Climate Research Facility to streamline the creation of application chains for retrieving rainfall properties for the purposes of fine-scale model evaluation. By using a Common Data Model (CDM) approach and working in the popular open-source Python scientific environment, analysis techniques such as Linear Programming (LP) can be brought to bear on the task of retrieving insight from radar signals. This presentation will outline how we have used these techniques to disentangle polarimetric phase signals, estimate a three-dimensional precipitation field, and objectively compare it to cloud-resolving-model-derived rainfall fields from the NASA/DoE Mid-Latitude Continental Convective Clouds Experiment (MC3E). All techniques shown will be available, open source, in the Python-ARM Radar Toolkit (Py-ART).
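A minimal usage sketch of the toolkit named above follows. The input file is a placeholder, and the signature of the LP phase-processing routine is quoted from memory of the Py-ART API, so treat the details as assumptions rather than authoritative usage.

```python
# Read a radar volume into Py-ART's CDM-style Radar object and run the
# linear-programming processing of the polarimetric phase (PhiDP/KDP).
import pyart

radar = pyart.io.read("radar_volume.nc")  # placeholder file name

# Assumed call: returns reprocessed differential phase and specific
# differential phase as field dictionaries.
phidp, kdp = pyart.correct.phase_proc_lp(radar, 0.0)
radar.add_field("corrected_differential_phase", phidp)
radar.add_field("specific_differential_phase", kdp)
```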
Earthscape, a Multi-Purpose Interactive 3d Globe Viewer for Hybrid Data Visualization and Analysis
NASA Astrophysics Data System (ADS)
Sarthou, A.; Mas, S.; Jacquin, M.; Moreno, N.; Salamon, A.
2015-08-01
The hybrid visualization and interaction tool EarthScape is presented here. The software is able to display simultaneously LiDAR point clouds, draped videos with moving footprints, volumetric scientific data (using volume rendering, isosurfaces and slice planes), raster data such as still satellite images, vector data, and 3D models such as buildings or vehicles. The application runs on touch-screen devices such as tablets. The software is based on open-source libraries such as OpenSceneGraph, osgEarth and OpenCV, and shader programming is used to implement volume rendering of scientific data. The next goal of EarthScape is to perform data analysis using ENVI Services Engine, a cloud data analysis solution. EarthScape is also designed to be a client of Jagwire, which provides multi-source geo-referenced video streams. When all these components are included, EarthScape will be a multi-purpose platform that provides data analysis, hybrid visualization and complex interactions at the same time. The software is available on demand for free at france@exelisvis.com.
How Will Aerosol-Cloud Interactions Change in an Ice-Free Arctic Summer?
NASA Astrophysics Data System (ADS)
Gilgen, Anina; Katty Huang, Wan Ting; Ickes, Luisa; Lohmann, Ulrike
2016-04-01
Future temperatures in the Arctic are expected to increase more than the global mean temperature, which will lead to a pronounced retreat in Arctic sea ice. Before mid-century, most sea ice will likely have vanished in late Arctic summers. This will allow ships to cruise in the Arctic Ocean, e.g. to shorten their transport passage or to extract oil. Since both ships and open water emit aerosol particles and precursors, Arctic clouds and radiation may be affected via aerosol-cloud and cloud-radiation interactions. The change in radiation feeds back on temperature and sea ice retreat. In addition to aerosol particles, the temperature and the open ocean as a humidity source should also have a strong effect on clouds. The main goal of this study is to assess the impact of sea ice retreat on the Arctic climate, with a focus on aerosol emissions and cloud properties. For this purpose, we conducted ensemble runs with the global climate model ECHAM6-HAM2 under present-day and future (2050) conditions. ECHAM6-HAM2 was coupled with a mixed-layer ocean model, which includes a sea ice model. To estimate Arctic aerosol emissions from ships, we used a detailed ship emission inventory (Peters et al. 2011); changes in aerosol emissions from the ocean are calculated online. Preliminary results show that the sea salt aerosol and dimethyl sulfide burdens over the Arctic Ocean increase significantly. While the ice water path decreases, the total water path increases. Due to the decrease in surface albedo, the cooling effect of the Arctic clouds becomes more important in 2050. Enhanced Arctic shipping has only a very small impact. The increase in the aerosol burden due to shipping is less pronounced than the increase due to natural emissions, even if the ship emissions are increased by a factor of ten. Hence, there is hardly an effect on clouds and radiation caused by shipping. References: Peters et al. (2011), Atmos. Chem. Phys., 11, 5305-5320
flexCloud: Deployment of the FLEXPART Atmospheric Transport Model as a Cloud SaaS Environment
NASA Astrophysics Data System (ADS)
Morton, Don; Arnold, Dèlia
2014-05-01
FLEXPART (FLEXible PARTicle dispersion model) is a Lagrangian transport and dispersion model used by a growing international community. We have used it to simulate and forecast the atmospheric transport of wildfire smoke, volcanic ash and radionuclides. Additionally, FLEXPART may be run in backwards mode to provide information for the determination of emission sources such as nuclear emissions and greenhouse gases. This open-source software is distributed in source code form and has several compiler and library dependencies that users need to address. Although well documented, getting it compiled, set up, running, and post-processed is often tedious, making it difficult for the inexperienced user. Our interest is in moving scientific modeling and simulation activities from site-specific clusters and supercomputers to a cloud model-as-a-service paradigm. Choosing FLEXPART for our prototyping, our vision is to construct customised IaaS images containing fully compiled and configured FLEXPART codes, including pre-processing, execution and post-processing components. In addition, with the inclusion of a small web server in the image, we introduce a web-accessible graphical user interface that drives the system. A further initiative being pursued is the deployment of multiple, simultaneous FLEXPART ensembles in the cloud. A single front-end web interface is used to define the ensemble members, and separate cloud instances are launched, on demand, to run the individual models and to conglomerate the outputs into a unified display. The outcome of this work is a Software as a Service (SaaS) deployment whereby the details of the underlying modeling systems are hidden, allowing modelers to perform their science activities without the burden of considering implementation details.
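To make the IaaS-image idea concrete, here is a hedged sketch of booting such a pre-built image on an OpenStack cloud with the openstacksdk library. The cloud, image, flavor, and network names are invented, not values from the paper.

```python
# Launch one ensemble-member instance from a pre-built FLEXPART image.
import openstack

conn = openstack.connect(cloud="my-openstack")  # reads clouds.yaml
image = conn.compute.find_image("flexpart-saas-image")    # hypothetical
flavor = conn.compute.find_flavor("m1.large")             # hypothetical
network = conn.network.find_network("private")            # hypothetical

server = conn.compute.create_server(
    name="flexpart-member-01", image_id=image.id, flavor_id=flavor.id,
    networks=[{"uuid": network.id}])
server = conn.compute.wait_for_server(server)  # block until ACTIVE
print(server.access_ipv4)  # address of the member's small web server
```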
Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.
Trudgian, David C; Mirzaei, Hamid
2012-12-07
We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.
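A hedged sketch of the AWS side follows: launching an instance from a public image with boto3. The AMI ID, region, instance type, and key pair are placeholders, not values published by the authors.

```python
# Start a compute node from a public machine image on EC2.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical public image ID
    InstanceType="c5.4xlarge",
    KeyName="my-keypair",
    MinCount=1, MaxCount=1)
instances[0].wait_until_running()
instances[0].reload()  # refresh attributes such as the public DNS name
print(instances[0].public_dns_name)
```

Paying for such instances ad hoc, as the abstract notes, replaces the fixed cost of a local cluster with per-run compute charges.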
On the reversibility of transitions between closed and open cellular convection
Feingold, G.; Koren, I.; Yamaguchi, T.; ...
2015-07-08
The two-way transition between closed and open cellular convection is addressed in an idealized cloud-resolving modeling framework. A series of cloud-resolving simulations shows that the transition between closed and open cellular states is asymmetrical and characterized by a rapid ("runaway") transition from the closed- to the open-cell state but slower recovery to the closed-cell state. Given that precipitation initiates the closed–open cell transition and that the recovery requires a suppression of the precipitation, we apply an ad hoc time-varying drop concentration to initiate and suppress precipitation. We show that the asymmetry in the two-way transition occurs even for very rapid drop concentration replenishment. The primary barrier to recovery is the loss in turbulence kinetic energy (TKE) associated with the loss in cloud water (and associated radiative cooling) and the vertical stratification of the boundary layer during the open-cell period. In transitioning from the open to the closed state, the system faces the task of replenishing cloud water fast enough to counter precipitation losses, such that it can generate radiative cooling and TKE. It is hampered by a stable layer below cloud base that has to be overcome before water vapor can be transported more efficiently into the cloud layer. Recovery to the closed-cell state is slower when radiative cooling is inefficient, such as in the presence of free-tropospheric clouds or after sunrise, when it is hampered by the absorption of shortwave radiation. Tests suggest that recovery to the closed-cell state is faster when the drizzle is smaller in amount and of shorter duration, i.e., when the precipitation causes less boundary layer stratification. Cloud-resolving model results on recovery rates are supported by simulations with a simple predator–prey dynamical system analogue. It is suggested that the observed closing of open cells by ship effluent likely occurs when aerosol intrusions are large, when contact comes prior to the heaviest drizzle in the early morning hours, and when the free troposphere is cloud free.
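The "simple predator–prey dynamical system analogue" can be illustrated with a generic Lotka–Volterra integration, reading cloud water as prey and drizzle as predator. The coefficients below are arbitrary and do not reproduce the paper's model.

```python
# Generic predator-prey stand-in for the cloud-water/drizzle analogue.
import numpy as np
from scipy.integrate import solve_ivp

def predator_prey(t, y, a=1.0, b=0.4, c=0.3, d=0.5):
    prey, pred = y  # e.g. cloud water (prey) and drizzle (predator)
    return [a * prey - b * prey * pred,   # prey grows, is consumed
            c * prey * pred - d * pred]   # predator grows on prey, decays

sol = solve_ivp(predator_prey, (0.0, 60.0), [1.0, 0.1], dense_output=True)
t = np.linspace(0.0, 60.0, 500)
prey, pred = sol.sol(t)  # oscillatory cycles, echoing transition/recovery
```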
Building a Billion Spatio-Temporal Object Search and Visualization Platform
NASA Astrophysics Data System (ADS)
Kakkar, D.; Lewis, B.
2017-10-01
With funding from the Sloan Foundation and Harvard Dataverse, the Harvard Center for Geographic Analysis (CGA) has developed a prototype spatio-temporal visualization platform called the Billion Object Platform or BOP. The goal of the project is to lower barriers for scholars who wish to access large, streaming, spatio-temporal datasets. The BOP is now loaded with the latest billion geo-tweets, and is fed a real-time stream of about 1 million tweets per day. The geo-tweets are enriched with sentiment and census/admin boundary codes when they enter the system. The system is open source and is currently hosted on Massachusetts Open Cloud (MOC), an OpenStack environment with all components deployed in Docker orchestrated by Kontena. This paper will provide an overview of the BOP architecture, which is built on an open source stack consisting of Apache Lucene, Solr, Kafka, Zookeeper, Swagger, scikit-learn, OpenLayers, and AngularJS. The paper will further discuss the approach used for harvesting, enriching, streaming, storing, indexing, visualizing and querying a billion streaming geo-tweets.
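Since the BOP indexes geo-tweets in Solr, a spatio-temporal query of the general shape below is what a client would issue. The host, collection, and field names are assumptions for illustration, not the platform's documented schema.

```python
# Query a Solr collection for geo-tweets in a time window and bounding box.
import requests

params = {
    "q": "*:*",
    "fq": [  # multiple filter queries: time range and lat/lon range
        "created_at:[2017-01-01T00:00:00Z TO 2017-01-02T00:00:00Z]",
        "location:[42.2,-71.2 TO 42.4,-70.9]",
    ],
    "rows": 10,
    "wt": "json",
}
resp = requests.get("http://localhost:8983/solr/geotweets/select",
                    params=params)
for doc in resp.json()["response"]["docs"]:
    print(doc.get("text"))
```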
Preliminary Spreadsheet of Eruption Source Parameters for Volcanoes of the World
Mastin, Larry G.; Guffanti, Marianne; Ewert, John W.; Spiegel, Jessica
2009-01-01
Volcanic eruptions that spew tephra into the atmosphere pose a hazard to jet aircraft. For this reason, the International Civil Aviation Organization (ICAO) has designated nine Volcanic Ash Advisory Centers (VAACs) around the world whose purpose is to track ash clouds from eruptions and notify aircraft so that they may avoid them. During eruptions, VAACs and their collaborators run volcanic-ash transport-and-dispersion (VATD) models that forecast the location and movement of ash clouds. These models require as input parameters the plume height H, the mass-eruption rate, the duration D, the erupted volume V (in cubic kilometers of bubble-free or 'dense rock equivalent' [DRE] magma), and the mass fraction of erupted tephra with a particle size smaller than 63 μm (m63). Some parameters, such as the mass-eruption rate and the mass fraction of fine debris, are not obtainable by direct observation; others, such as plume height or duration, are obtainable from observations but may be unavailable in the early hours of an eruption when VATD models are being initiated. For this reason, ash-cloud modelers need to have at their disposal source parameters for a particular volcano that are based on its recent eruptive history and represent the most likely anticipated eruption. They also need source parameters that encompass the range of uncertainty in eruption size or characteristics. In spring 2007, a workshop was held at the U.S. Geological Survey (USGS) Cascades Volcano Observatory to derive a protocol for assigning eruption source parameters to ash-cloud models during eruptions. The protocol derived from this effort was published by Mastin and others (in press), along with a world map displaying the assigned eruption type for each of the world's volcanoes. Their report, however, did not include the assigned eruption types in tabular form. Therefore, this Open-File Report presents that table in the form of an Excel spreadsheet. These assignments are preliminary and will be modified to follow upcoming recommendations by the volcanological and aviation communities.
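The companion protocol by Mastin and others uses an empirical plume-height relation commonly quoted as H = 2.00 V^0.241, with H the plume height in km above the vent and V the DRE volume flux in m^3/s. The sketch below inverts it to obtain a first-guess mass-eruption rate, assuming a DRE magma density of 2500 kg/m^3; treat the constants as quoted from that literature rather than from this spreadsheet report.

```python
# First-guess mass-eruption rate from observed plume height, inverting
# the empirical relation H = 2.00 * V**0.241 (H in km, V in m^3/s DRE).
def mass_eruption_rate(plume_height_km, dre_density=2500.0):
    volume_flux = (plume_height_km / 2.00) ** (1.0 / 0.241)  # m^3/s DRE
    return volume_flux * dre_density  # kg/s

# Example: a 10 km plume implies roughly 2e6 kg/s.
print(f"{mass_eruption_rate(10.0):.3g} kg/s")
```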
NASA Technical Reports Server (NTRS)
Ganeshan, Manisha; Wu, Dongliang
2016-01-01
The increasing ice-free area during late summer has transformed the Arctic into a climate system with more dynamic boundary layer (BL) clouds and seasonal sea ice growth. The open-ocean sensible heat flux, a crucial mechanism of excessive ocean heat loss to the atmosphere during the fall freeze season, is speculated to play an important role in the recently observed cloud cover increase and BL instability. However, a lack of observations and of understanding of the resilience of the proposed mechanisms, especially in relation to meteorological and interannual variability, has left BL parameterization schemes poorly constrained in Arctic climate models. In this study, we use multiyear Japanese cruise-ship observations from RV Mirai over the open Arctic Ocean to characterize the surface sensible heat flux (SSHF) during early fall and investigate its contribution to BL turbulence. It is found that mixing by the SSHF is favored during episodes of high surface wind speed and is also influenced by the prevailing cloud regime. The deepest BLs and the maximum ocean-atmosphere temperature difference are observed during cold-air advection (associated with the stratocumulus regime), yet, contrary to previous speculation, the efficiency of sensible heat exchange is low. On the other hand, the SSHF contributes significantly to BL mixing during the uplift (low pressure) followed by the highly stable (stratus) regime. Overall, it can explain 10% of the open-ocean BL height variability, whereas cloud-driven (moisture and radiative) mechanisms appear to be the other dominant source of convective turbulence. Nevertheless, there is strong interannual variability in the relationship between the SSHF and the BL height, which can be intensified by the changing occurrence of Arctic climate patterns, such as positive surface wind speed anomalies and more frequent conditions of uplift. This study highlights the need for comprehensive BL observations like those from the RV Mirai for better understanding and predicting the dynamic nature of the Arctic climate.
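The SSHF discussed above is conventionally estimated with a bulk-aerodynamic formula, SSHF = ρ cp C_H U (T_sea − T_air). The transfer coefficient in the sketch below is a typical open-ocean value, not one fitted to the RV Mirai data.

```python
# Bulk-aerodynamic estimate of the surface sensible heat flux (W m^-2).
def bulk_sshf(wind_speed, t_sea, t_air, rho=1.3, cp=1004.0, c_h=1.0e-3):
    """Positive values correspond to ocean heat loss when t_sea > t_air."""
    return rho * cp * c_h * wind_speed * (t_sea - t_air)

# Example: 10 m/s wind over a 3 K ocean-atmosphere temperature difference.
print(bulk_sshf(10.0, 274.0, 271.0))  # ~39 W m^-2
```

The linear dependence on wind speed in this formula is consistent with the finding above that SSHF-driven mixing is favored during high-wind episodes.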
Enabling Research Network Connectivity to Clouds with Virtual Router Technology
NASA Astrophysics Data System (ADS)
Seuster, R.; Casteels, K.; Leavett-Brown, CR; Paterson, M.; Sobie, RJ
2017-10-01
The use of opportunistic cloud resources by HEP experiments has significantly increased over the past few years. Clouds that are owned or managed by the HEP community are connected to the LHCONE network or the research network, with global access to HEP computing resources. Private clouds, such as those supported by non-HEP research funds, are generally connected to the international research network; however, commercial clouds are either not connected to the research network or connect only to research sites within their national boundaries. Since research network connectivity is a requirement for HEP applications, we need to find a solution that provides a high-speed connection. We are studying a solution with a virtual router that addresses the use case in which a commercial cloud has research network connectivity only in a limited region. In this situation, we host a virtual router at our HEP site and require that all traffic from the commercial site transit through it. Although this may lengthen the network path and increase the load on the HEP site, it is a workable solution that enables the use of the remote cloud for low-I/O applications. We are exploring some simple open-source solutions. In this paper, we present the results of our studies and how they will benefit our use of private and public clouds for HEP computing.
NASA Astrophysics Data System (ADS)
Melton, R.; Thomas, J.
With the rapid growth in the number of space actors, there has been a marked increase in the complexity and diversity of the software systems used to support SSA target tracking, indication, warning, and collision avoidance. Historically, most SSA software has been built from "closed" proprietary code, which limits interoperability, inhibits the code transparency that some SSA customers need to develop domain expertise, and prevents the rapid injection of innovative concepts into these systems. Open-source aerospace software, a rapidly emerging alternative trend in code development, is based on open collaboration, which has the potential to bring greater transparency, interoperability, flexibility, and reduced development costs. Open-source software is easily adaptable, geared to rapidly changing mission needs, and can generally be delivered at lower cost to meet mission requirements. This paper outlines Ball's COSMOS C2 system, a fully open-source, web-enabled command-and-control software architecture that provides several unique capabilities for moving the legacy SSA software paradigm to an open-source model enabling pre- and post-launch asset command and control. Among the unique characteristics of COSMOS is the ease with which it can integrate with diverse hardware. This enables COSMOS to serve as the command-and-control platform for the full life-cycle development of SSA assets, from board test, to box test, to system integration and test, to on-orbit operations. The use of a modern scripting language, Ruby, also permits automated procedures to provide highly complex decision making for the tasking of SSA assets based on both telemetry data and data received from outside sources. Detailed logging enables quick anomaly detection and resolution. Integrated real-time and offline data graphing renders the visualization of both ground and on-orbit assets simple and straightforward.
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents cloud computing environments, network principles and methods for graphical development in realistic naval simulation, naval robotics and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near-real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts, coupled with the latest achievements in virtual and augmented reality, will enhance the overall experience, leading to new developments and innovations. We address a multiprocessing situation using advanced technologies and distributed applications, including remote ship scenarios and the automation of ship operations.
Open-cell and closed-cell clouds off Peru
2010-04-27
2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean. Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells warm air is rising in the center and sinking around the edges, so clouds appear in cell centers but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations: two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua. NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team. To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
Virtualized Networks and Virtualized Optical Line Terminal (vOLT)
NASA Astrophysics Data System (ADS)
Ma, Jonathan; Israel, Stephen
2017-03-01
The success of the Internet and the proliferation of Internet of Things (IoT) devices are forcing telecommunications carriers to re-architect the central office as a datacenter (CORD) so as to bring datacenter economics and cloud agility to the central office (CO). The Open Network Operating System (ONOS) is the first open-source software-defined networking (SDN) operating system capable of managing and controlling network, computing, and storage resources to support the CORD infrastructure and network virtualization. The virtualized Optical Line Termination (vOLT) is one of the key components in such virtualized networks.
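ONOS exposes a northbound REST API through which orchestration layers inspect the network it controls. The hedged sketch below lists connected devices; the host and the default 'onos'/'rocks' credentials are assumptions about a stock, out-of-the-box deployment.

```python
# List devices known to an ONOS controller via its northbound REST API.
import requests

resp = requests.get(
    "http://onos.example.org:8181/onos/v1/devices",  # hypothetical host
    auth=("onos", "rocks"))  # assumed stock credentials; change in prod
resp.raise_for_status()
for dev in resp.json().get("devices", []):
    print(dev["id"], dev["type"], dev["available"])
```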
Executable research compendia in geoscience research infrastructures
NASA Astrophysics Data System (ADS)
Nüst, Daniel
2017-04-01
From generation through analysis and collaboration to communication, scientific research requires the right tools. Scientists create their own software using third-party libraries and platforms. Cloud computing, Open Science, public data infrastructures, and Open Source provide scientists with unprecedented opportunities, nowadays often in a field "Computational X" (e.g. computational seismology) or X-informatics (e.g. geoinformatics) [0]. This increases complexity and generates more innovation, e.g. Environmental Research Infrastructures (environmental RIs [1]). Researchers in Computational X write their software relying on both source code (e.g. from https://github.com) and binary libraries (e.g. from package managers such as APT, https://wiki.debian.org/Apt, or CRAN, https://cran.r-project.org/). They download data from domain-specific (cf. https://re3data.org) or generic (e.g. https://zenodo.org) data repositories, and deploy computations remotely (e.g. European Open Science Cloud). The results themselves are archived, given persistent identifiers, connected to other works (e.g. using https://orcid.org/), and listed in metadata catalogues. A single researcher, intentionally or not, interacts with all sub-systems of RIs: data acquisition, data access, data processing, data curation, and community support [2]. To preserve computational research, [3] proposes the Executable Research Compendium (ERC), a container format that closes the gap of dependency preservation by encapsulating the runtime environment. ERCs and RIs can be integrated for different uses: (i) Coherence: ERC services validate completeness, integrity and results. (ii) Metadata: ERCs connect the different parts of a piece of research and facilitate discovery. (iii) Exchange and Preservation: ERCs as usable building blocks are the shared and archived entities. (iv) Self-consistency: ERCs remove dependence on ephemeral sources. (v) Execution: ERC services create and execute a packaged analysis but integrate with existing platforms for display and control. These integrations are vital for capturing workflows in RIs and for connecting key stakeholders (scientists, publishers, librarians). They are demonstrated using developments from the DFG-funded project Opening Reproducible Research (http://o2r.info). Semi-automatic creation of ERCs based on research workflows is a core goal of the project. References [0] Tony Hey, Stewart Tansley, Kristin Tolle (eds), 2009. The Fourth Paradigm: Data-Intensive Scientific Discovery. Microsoft Research. [1] P. Martin et al., Open Information Linking for Environmental Research Infrastructures, 2015 IEEE 11th International Conference on e-Science, Munich, 2015, pp. 513-520. doi: 10.1109/eScience.2015.66 [2] Y. Chen et al., Analysis of Common Requirements for Environmental Science Research Infrastructures, The International Symposium on Grids and Clouds (ISGC) 2013, Taipei, 2013, http://pos.sissa.it/archive/conferences/179/032/ISGC [3] Opening Reproducible Research, Geophysical Research Abstracts Vol. 18, EGU2016-7396, 2016, http://meetingorganizer.copernicus.org/EGU2016/EGU2016-7396.pdf
NASA Astrophysics Data System (ADS)
Yue, S. S.; Wen, Y. N.; Lv, G. N.; Hu, D.
2013-10-01
In recent years, the rapid development of cloud computing technologies has laid a critical foundation for efficiently solving complicated geographic issues. However, it is still difficult to realize the cooperative operation of massive heterogeneous geographical models. Traditional cloud architecture tends to provide centralized solutions to end users, while all the required resources are often offered by large enterprises or special agencies; thus it is a closed framework from the perspective of resource utilization. Solving comprehensive geographic issues requires integrating multifarious heterogeneous geographical models and data. In this case, an open computing platform is needed, with which model owners can conveniently package and deploy their models into the cloud, while model users can search, access and utilize those models through cloud facilities. Based on this concept, open cloud service strategies for the sharing of heterogeneous geographic analysis models are studied in this article. The key technologies (a unified cloud interface strategy, a sharing platform based on cloud services, and a computing platform based on cloud services) are discussed in detail, and related experiments are conducted for further verification.
Seqcrawler: biological data indexing and browsing platform.
Sallou, Olivier; Bretaudeau, Anthony; Roult, Aurelien
2012-07-24
Seqcrawler has its roots in software such as SRS and Lucegene. It provides an indexing platform to ease the search of data and metadata in biological banks, and it can scale to face the current flood of data. While many biological bank search tools are available on the Internet, mainly provided by large organizations to search their own data, there is a lack of free and open-source solutions for browsing one's own set of data with a flexible query system that can scale from a single computer to a cloud system. A personal index platform will help labs and bioinformaticians to search their metadata and to build a larger information system with custom subsets of data. The software is scalable from a single computer to a cloud-based infrastructure. It has been successfully tested in a private cloud with three index shards (pieces of index) hosting ~400 million sequence records (the whole of GenBank, UniProt, PDB and others) for a total size of 600 GB in a fault-tolerant (high-availability) architecture. It has also been successfully integrated with software that adds extra metadata from BLAST results to enhance users' result analysis. Seqcrawler provides a complete open-source search-and-store solution for labs or platforms needing to manage large amounts of data/metadata with a flexible and customizable web interface. All components (search engine, visualization and data storage), though independent, share a common and coherent data system that can be queried with a simple HTTP interface. The solution scales easily and can also provide a high-availability infrastructure.
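The "simple HTTP interface" invites a query sketch like the following. The endpoint path, parameter names, and response fields are hypothetical, not taken from the Seqcrawler documentation.

```python
# Query a (hypothetical) Seqcrawler-style HTTP search endpoint.
import requests

resp = requests.get(
    "http://localhost:8080/search",  # placeholder host and path
    params={"q": "organism:human AND keyword:kinase",
            "start": 0, "rows": 20})
resp.raise_for_status()
for hit in resp.json().get("hits", []):  # assumed response shape
    print(hit.get("id"), hit.get("description"))
```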
NASA Astrophysics Data System (ADS)
Bada, Adedayo; Wang, Qi; Alcaraz-Calero, Jose M.; Grecos, Christos
2016-04-01
This paper proposes a new approach to improving 3D video rendering and streaming by jointly exploring and optimizing both cloud-based virtualization and web-based delivery. The proposed web service architecture first establishes a software virtualization layer based on QEMU (Quick Emulator), open-source virtualization software that has been able to virtualize system components except for 3D rendering, which is still in its infancy. The architecture then exploits the cloud environment to boost the speed of rendering at the QEMU software virtualization layer. The capabilities and inherent limitations of Virgil 3D, one of the most advanced 3D virtual Graphics Processing Units (GPUs) available, are analyzed through benchmarking experiments and integrated into the architecture to further speed up the rendering. Experimental results are reported and analyzed to demonstrate the benefits of the proposed approach.
Efficient LIDAR Point Cloud Data Managing and Processing in a Hadoop-Based Distributed Framework
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Sha, D.; Han, X.
2017-10-01
Light Detection and Ranging (LiDAR) is one of the most promising technologies in surveying and mapping, city management, forestry, object recognition, computer vision engineering and other fields. However, it is challenging to efficiently store, query and analyze high-resolution 3D LiDAR data due to its volume and complexity. In order to improve the productivity of LiDAR data processing, this study proposes a Hadoop-based framework to efficiently manage and process LiDAR data in a distributed and parallel manner, taking advantage of Hadoop's storage and computing capabilities. At the same time, the Point Cloud Library (PCL), an open-source project for 2D/3D image and point cloud processing, is integrated with HDFS and MapReduce to run the LiDAR data analysis algorithms provided by PCL in a parallel fashion. The experimental results show that the proposed framework can efficiently manage and process big LiDAR data.
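To illustrate the MapReduce pattern on LiDAR points, here is a hedged Hadoop Streaming mapper that bins ASCII "x y z" records into tiles so a reducer can aggregate per tile. The input format and tile size are invented, and the paper's framework integrates PCL algorithms rather than this toy binning.

```python
#!/usr/bin/env python
# mapper.py -- emit (tile_key, 1) for each LiDAR point on stdin so that
# Hadoop Streaming can shuffle points of the same tile to one reducer.
import sys

TILE = 100.0  # tile edge length in metres (illustrative choice)

for line in sys.stdin:
    try:
        x, y, z = map(float, line.split()[:3])
    except ValueError:
        continue  # skip malformed records
    key = f"{int(x // TILE)}_{int(y // TILE)}"
    print(f"{key}\t1")
```

A matching reducer would simply sum the counts per tile key (or hand each tile's points to a PCL-style processing routine).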
Context-aware distributed cloud computing using CloudScheduler
NASA Astrophysics Data System (ADS)
Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.
2017-10-01
The distributed cloud using the CloudScheduler VM provisioning service is one of the longest-running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize the context-aware computing designs currently employed in the mobile communication industry. Context awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest instance of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud, which will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high-I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide access to the existing storage elements used by the HEP experiments.
Aerosol and CCN over the Southern Ocean: Sources, Sinks and Processes
NASA Astrophysics Data System (ADS)
Clarke, A. D.; Freitag, S.; Howell, S. G.; Snider, J. R.; Kazil, J.; Feingold, G.; McNaughton, C. S.; Brekhovskikh, V.; Kapustin, V.; Campos, T. L.; Shank, L.
2013-12-01
Aerosols able to activate as cloud condensation nuclei (CCN) in marine stratus play an important role in cloud properties and processes. The 2008 VOCALS experiment (http://www.eol.ucar.edu/projects/vocals/) explored the aerosol-cloud system over the South East Pacific (SEP). There, marine boundary layer (MBL) air from the Southern Ocean is directed north parallel to the South American coast and exposed to continental emissions. During this transport, the initially clean MBL aerosol is modified in response to production, processing, entrainment, mixing, and removal. Here we discuss how the aerosol, the CCN and the clouds over the SEP are coupled by these processes. VOCALS data along 20S indicated the cleanest air offshore and west of about 78W. However, some of the cleanest air (lowest CO concentrations) over the SEP was present in pockets of open cells (POCs). This suggests POCs are favored where remnants of Southern Ocean MBL air experienced the least mixing with higher CO sources during transport, either coastal or via entrainment of free-troposphere air. Entrainment from the free troposphere (FT) was found to be an important source of MBL aerosol in both near-shore and offshore regions, while direct advection of continental aerosol tended to influence aerosol and CCN closer to the coast. Entrainment from the FT included diverse sources from South America as well as long-range transport from the western Pacific. Entrainment of FT aerosol can resupply the MBL with CCN, and this process appears greatly enhanced when patchy 'rivers' of pollution lie directly above the inversion. This process was evident both offshore and near the coast. Production of CCN from sea spray aerosol (SSA) was found to increase with wind speed, but atmospheric concentrations did not generally increase in the higher-wind offshore regions because these regions had greater drizzle removal that compensated for the increased production. Generally, SSA larger than 60 nm were effective as marine cloud CCN but were only about 20% of the total. Elevated organic aerosol was usually associated with biomass burning sources, and the natural marine organic aerosol source was weak. Although nucleation in clean scavenged air layers near the top of the boundary layer was observed under certain conditions, the resulting small aerosol sizes did not appear to provide an effective source of CCN sizes compared to other processes.
NASA Astrophysics Data System (ADS)
Hasenkopf, C. A.
2017-12-01
Increasingly, open data, open-source projects are unearthing rich datasets and tools previously impossible for more traditional avenues to generate. These projects are possible in part because of the emergence of online collaborative and code-sharing tools, the decreasing costs of cloud-based services to fetch, store, and serve data, and the increasing interest of individuals in contributing their time and skills to 'open projects.' While such projects have generated palpable enthusiasm from many sectors, many face uncharted paths to sustainability, visibility, and acceptance. Our project, OpenAQ, is an example of an open-source, open data community that is currently forging its own uncharted path. OpenAQ is an open air quality data platform that aggregates and universally formats government and research-grade air quality data from 50 countries across the world. To date, we make available more than 76 million air quality (PM2.5, PM10, SO2, NO2, O3, CO and black carbon) data points through an open Application Programming Interface (API) and a user-customizable download interface at https://openaq.org. The goal of the platform is to enable an ecosystem of users to advance air pollution efforts from science to policy to the private sector. The platform is also an open-source project (https://github.com/openaq) and has only been made possible through the coding and data contributions of individuals around the world. In our first two years of existence, we have seen requests for data to our API skyrocket to more than 6 million data points per month, with use cases as varied as ingesting aggregated data into real-time wildfire models, building open-source statistical packages (e.g. ropenaq and py-openaq) on top of the platform, and creating public-friendly apps and chatbots. We will share a whirlwind tour of our evolution and the many lessons learned so far related to platform structure, community engagement, organizational model and sustainability.
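A minimal pull from the open API described above is sketched below. The v1 endpoint and response fields match the platform as publicly documented around this period, but the API has continued to evolve, so treat paths and parameters as indicative rather than current.

```python
# Fetch a few PM2.5 measurements from the OpenAQ API (v1-era endpoint).
import requests

resp = requests.get(
    "https://api.openaq.org/v1/measurements",
    params={"city": "Delhi", "parameter": "pm25", "limit": 5})
resp.raise_for_status()
for m in resp.json()["results"]:
    print(m["date"]["utc"], m["value"], m["unit"])
```

The community packages named in the abstract (ropenaq, py-openaq) wrap calls of this shape behind analysis-friendly data frames.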
H II Region G46.5-0.2: The Interplay between Ionizing Radiation, Molecular Gas, and Star Formation
NASA Astrophysics Data System (ADS)
Paron, S.; Ortega, M. E.; Dubner, G.; Yuan, Jing-Hua; Petriella, A.; Giacani, E.; Zeng Li, Jin; Wu, Yuefang; Liu, Hongli; Huang, Ya Fang; Zhang, Si-Ju
2015-06-01
H ii regions are particularly interesting because they can generate dense layers of gas and dust, elongated columns or pillars of gas pointing toward the ionizing sources, and cometary globules of dense gas where triggered star formation can occur. Understanding the interplay between the ionizing radiation and the dense surrounding gas is very important to explain the origin of these peculiar structures, and hence to characterize triggered star formation. G46.5-0.2 (G46), a poorly studied galactic H ii region located at about 4 kpc, is an excellent target for performing this kind of study. Using public molecular data extracted from the Galactic Ring Survey (13CO J = 1-0) and from the James Clerk Maxwell Telescope data archive (12CO, 13CO, C18O J = 3-2, HCO+, and HCN J = 4-3), and infrared data from the GLIMPSE and MIPSGAL surveys, we perform a complete study of G46, its molecular environment, and the young stellar objects (YSOs) placed around it. We found that G46, probably excited by an O7V star, is located close to the edge of the GRSMC G046.34-00.21 molecular cloud. It presents a horseshoe morphology opening in the direction of the cloud. We observed a filamentary structure in the molecular gas likely related to G46 and no considerable molecular emission toward its open border. We found that about 10′ to the southwest of G46 there are some pillar-like features, shining at 8 μm and pointing toward the H ii region open border. We propose that the pillar-like features were carved and sculpted by the ionizing flux from G46. We found several YSOs likely embedded in the molecular cloud, grouped in two main concentrations: one closer to the G46 open border consisting of Class II sources, and another mostly composed of Class I YSOs located just ahead of the pillar-like features, strongly suggesting an age gradient in the YSO distribution.
Waggle: A Framework for Intelligent Attentive Sensing and Actuation
NASA Astrophysics Data System (ADS)
Sankaran, R.; Jacob, R. L.; Beckman, P. H.; Catlett, C. E.; Keahey, K.
2014-12-01
Advances in sensor-driven computation and computationally steered sensing will greatly enable future research in fields including the environmental and atmospheric sciences. We will present "Waggle," an open-source hardware and software infrastructure developed with two goals: (1) reducing the separation and latency between sensing and computing and (2) improving the reliability and longevity of sensing-actuation platforms in challenging and costly deployments. Inspired by "deep-space probe" systems, the Waggle platform design includes features that can support longitudinal studies, deployments with varying communication links, and remote management capabilities. Waggle lowers the barrier for scientists to incorporate real-time data from their sensors into their computations and to manipulate the sensors or provide feedback through actuators. A standardized software and hardware design allows the quick addition of new sensors/actuators and associated software in the nodes and enables them to be coupled with computational codes both in situ and on external compute infrastructure. The Waggle framework currently drives the deployment of two observational systems: a portable and self-sufficient weather platform for the study of small-scale effects in Chicago's urban core, and an open-ended distributed instrument in Chicago that aims to support several research pursuits across a broad range of disciplines, including urban planning, microbiology and computer science. Built around open-source software, hardware, and the Linux OS, the Waggle system comprises two components: the Waggle field node and the Waggle cloud-computing infrastructure. The Waggle field node affords a modular, scalable, fault-tolerant, secure, and extensible platform for hosting sensors and actuators in the field. It supports in situ computation and data storage, and integration with cloud-computing infrastructure. The Waggle cloud infrastructure is designed with the goal of scaling to several hundreds of thousands of Waggle nodes. It supports aggregating data from sensors hosted by the nodes, staging computation, relaying feedback to the nodes and serving data to end users. We will discuss the Waggle design principles and their applicability to various observational research pursuits, and demonstrate its capabilities.
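Purely as an illustration of the "attentive sensing" idea (compute near the sensor, ship only interesting windows to the cloud), here is a toy loop. No Waggle API is used or implied; read_sensor and publish are stand-ins for the hardware and transport layers.

```python
# Toy attentive-sensing loop: summarize quiet windows, ship anomalies raw.
import random
import statistics
import time

def read_sensor():
    # Stand-in for an attached sensor; replace with real hardware I/O.
    return random.gauss(10.0, 1.0)

def publish(msg):
    # Stand-in for the node-to-cloud transport.
    print(msg)

window, THRESHOLD = [], 3.0
while True:
    window.append(read_sensor())
    if len(window) >= 60:  # one-minute window at 1 Hz sampling
        mean = statistics.mean(window)
        if statistics.pstdev(window) > THRESHOLD:
            publish({"t": time.time(), "mean": mean, "raw": window})
        else:
            publish({"t": time.time(), "mean": mean})  # summary only
        window = []
    time.sleep(1.0)
```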
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Cloud Computing, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software called NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user is capable of recruiting private cloud, public cloud, HPC, and local servers simultaneously with a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
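A minimal NEURON-in-Python run of the kind such automation dispatches to cloud or HPC back ends is sketched below. The model is a stock Hodgkin-Huxley soma, not the published model benchmarked in the paper.

```python
# Single-compartment Hodgkin-Huxley simulation with NEURON's Python API.
from neuron import h
h.load_file("stdrun.hoc")  # brings in the standard run system (h.run)

soma = h.Section(name="soma")
soma.L = soma.diam = 20.0  # microns
soma.insert("hh")          # Hodgkin-Huxley channels

stim = h.IClamp(soma(0.5))
stim.delay, stim.dur, stim.amp = 5.0, 20.0, 0.2  # ms, ms, nA

v, t = h.Vector(), h.Vector()
v.record(soma(0.5)._ref_v)  # membrane potential trace
t.record(h._ref_t)          # time base

h.tstop = 40.0  # ms
h.run()
print(v.max())  # peak membrane potential (mV); spikes if the stim suffices
```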
Cloud Response to Arctic Sea Ice Loss and Implications for Feedbacks in the CESM1 Climate Model
NASA Astrophysics Data System (ADS)
Morrison, A.; Kay, J. E.; Chepfer, H.; Guzman, R.; Bonazzola, M.
2017-12-01
Clouds have the potential to accelerate or slow the rate of Arctic sea ice loss through their radiative influence on the surface. Cloud feedbacks can therefore play into Arctic warming as clouds respond to changes in sea ice cover. As the Arctic moves toward an ice-free state, understanding how cloud - sea ice relationships change in response to sea ice loss is critical for predicting the future climate trajectory. From satellite observations we know the effect of present-day sea ice cover on clouds, but how will clouds respond to sea ice loss as the Arctic transitions to a seasonally open water state? In this study we use a lidar simulator to first evaluate cloud - sea ice relationships in the Community Earth System Model (CESM1) against present-day observations (2006-2015). In the current climate, the cloud response to sea ice is well-represented in CESM1: we see no summer cloud response to changes in sea ice cover, but more fall clouds over open water than over sea ice. Since CESM1 is credible for the current Arctic climate, we next assess if our process-based understanding of Arctic cloud feedbacks related to sea ice loss is relevant for understanding future Arctic clouds. In the future Arctic, summer cloud structure continues to be insensitive to surface conditions. As the Arctic warms in the fall, however, the boundary layer deepens and cloud fraction increases over open ocean during each consecutive decade from 2020 - 2100. This study will also explore seasonal changes in cloud properties such as opacity and liquid water path. Results thus far suggest that a positive fall cloud - sea ice feedback exists in the present-day and future Arctic climate.
ERIC Educational Resources Information Center
Ramanarayanan, Vikram; Suendermann-Oeft, David; Lange, Patrick; Ivanov, Alexei V.; Evanini, Keelan; Yu, Zhou; Tsuprun, Eugene; Qian, Yao
2016-01-01
We propose a crowdsourcing-based framework to iteratively and rapidly bootstrap a dialog system from scratch for a new domain. We leverage the open-source modular HALEF dialog system to deploy dialog applications. We illustrate the usefulness of this framework using four different prototype dialog items with applications in the educational domain…
C+ detection of warm dark gas in diffuse clouds
NASA Astrophysics Data System (ADS)
Langer, W. D.; Velusamy, T.; Pineda, J. L.; Goldsmith, P. F.; Li, D.; Yorke, H. W.
2010-10-01
We present the first results of the Herschel open-time key program Galactic Observations of Terahertz C+ (GOT C+), a survey of the [CII] 2P3/2-2P1/2 fine-structure line at 1.9 THz (158 μm) using the HIFI instrument on Herschel. We detected 146 interstellar clouds along sixteen lines of sight towards the inner Galaxy. We also acquired HI and CO isotopologue data along each line of sight for analysis of the physical conditions in these clouds. Here we analyze 29 diffuse clouds (AV < 1.3 mag) in this sample characterized by having [CII] and HI emission, but no detectable CO. We find that [CII] emission is generally stronger than expected for diffuse atomic clouds, and in a number of sources is much stronger than anticipated based on their HI column density. We show that the excess [CII] emission in these clouds is best explained by the presence of a significant component of diffuse warm H2 'dark gas'. This first [CII] 158 μm detection of warm dark gas demonstrates the value of this tracer for mapping this gas throughout the Milky Way and in galaxies. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA.
Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B; Schürmann, Felix; Segev, Idan; Markram, Henry
2016-01-01
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but the setting up of such optimisations and the choice of a specific search algorithm and its parameters are often non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience-specific use cases.
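For readers unfamiliar with this class of method, the following is a minimal sketch of the evolutionary-algorithm loop that frameworks like BluePyOpt automate; the two-parameter "model", target features and bounds are invented for illustration and do not reflect BluePyOpt's actual API.

```python
import random

# Hypothetical two-parameter model scored against target features; this
# mimics the structure of an optimisation a framework would set up, but
# uses plain Python rather than any real package's classes.
BOUNDS = {"gnabar": (0.0, 1.0), "gkbar": (0.0, 0.1)}  # invented parameters
TARGET = {"spike_count": 5.0, "mean_rate_hz": 12.0}   # invented features

def simulate(params):
    # Stand-in for a real simulator run: maps parameters to features
    # via an arbitrary smooth function.
    return {"spike_count": 50 * params["gnabar"] * (1 - params["gkbar"] * 5),
            "mean_rate_hz": 120 * params["gnabar"] ** 2}

def fitness(params):
    feats = simulate(params)
    return sum((feats[k] - TARGET[k]) ** 2 for k in TARGET)  # lower is better

def random_individual():
    return {k: random.uniform(*b) for k, b in BOUNDS.items()}

def mutate(ind, sigma=0.05):
    child = dict(ind)
    k = random.choice(list(BOUNDS))
    lo, hi = BOUNDS[k]
    child[k] = min(hi, max(lo, child[k] + random.gauss(0, sigma)))
    return child

pop = [random_individual() for _ in range(20)]
for gen in range(50):
    pop.sort(key=fitness)                 # rank by error
    survivors = pop[:10]                  # truncation selection
    pop = survivors + [mutate(random.choice(survivors)) for _ in range(10)]
best = min(pop, key=fitness)
print("best:", best, "error:", fitness(best))
```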
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
Taylor, Ronald C
2010-12-21
Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976
Design Patterns to Achieve 300x Speedup for Oceanographic Analytics in the Cloud
NASA Astrophysics Data System (ADS)
Jacob, J. C.; Greguska, F. R., III; Huang, T.; Quach, N.; Wilson, B. D.
2017-12-01
We describe how we achieve super-linear speedup over standard approaches for oceanographic analytics on a cluster computer and the Amazon Web Services (AWS) cloud. NEXUS is an open source platform for big data analytics in the cloud that enables this performance through a combination of horizontally scalable data parallelism with Apache Spark and rapid data search, subset, and retrieval with tiled array storage in cloud-aware NoSQL databases like Solr and Cassandra. NEXUS is the engine behind several public portals at NASA and OceanWorks is a newly funded project for the ocean community that will mature and extend this capability for improved data discovery, subset, quality screening, analysis, matchup of satellite and in situ measurements, and visualization. We review the Python language API for Spark and how to use it to quickly convert existing programs to use Spark to run with cloud-scale parallelism, and discuss strategies to improve performance. We explain how partitioning the data over space, time, or both leads to algorithmic design patterns for Spark analytics that can be applied to many different algorithms. We use NEXUS analytics as examples, including area-averaged time series, time averaged map, and correlation map.
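As an illustration of the partition-by-space-and-time design pattern described above, here is a minimal PySpark sketch of an area-averaged time series; the tile tuples are synthetic stand-ins for what NEXUS would fetch from Solr/Cassandra, and the names are ours, not the NEXUS API.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("area-averaged-ts").getOrCreate()
sc = spark.sparkContext

# Synthetic "tiles": ((time_index, lat_band), (sum_of_values, count)).
tiles = [((t, band), (float(t + band), 100))
         for t in range(365) for band in range(18)]
rdd = sc.parallelize(tiles, numSlices=64)  # partition over space and time

# Area-averaged time series: combine partial sums within each time step
# across all spatial bands, then divide to get the mean.
ts = (rdd.map(lambda kv: (kv[0][0], kv[1]))                    # key by time only
         .reduceByKey(lambda a, b: (a[0] + b[0], a[1] + b[1]))  # merge partials
         .mapValues(lambda s: s[0] / s[1])                      # mean per step
         .sortByKey()
         .collect())
print(ts[:5])
spark.stop()
```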
75 FR 13258 - Announcing a Meeting of the Information Security and Privacy Advisory Board
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-19
Agenda: Cloud Computing Implementations; Health IT; OpenID; Pending Cyber Security [...]. The meeting will be available for the public and media.
A Geospatial Information Grid Framework for Geological Survey
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and evaluation service is introduced in this paper. PMID:26710255
The Research of the Parallel Computing Development from the Angle of Cloud Computing
NASA Astrophysics Data System (ADS)
Peng, Zhensheng; Gong, Qingge; Duan, Yanyu; Wang, Yun
2017-10-01
Cloud computing is the development of parallel computing, distributed computing and grid computing. The development of cloud computing makes parallel computing come into people's lives. Firstly, this paper expounds the concept of cloud computing and introduces several traditional parallel programming models. Secondly, it analyzes the principles, advantages and disadvantages of OpenMP, MPI and MapReduce respectively. Finally, it compares the MPI and OpenMP models with MapReduce from the angle of cloud computing. The results of this paper are intended to provide a reference for the development of parallel computing.
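To make the contrast concrete, a toy map/reduce in plain Python shows the programming style the paper compares against OpenMP and MPI; this is a conceptual sketch, not Hadoop code.

```python
from collections import defaultdict
from multiprocessing import Pool

def map_phase(line):
    # Map: emit (key, 1) pairs; word count is the canonical example.
    return [(word, 1) for word in line.split()]

def reduce_phase(pairs):
    # Shuffle + reduce: group values by key, then sum each group.
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

if __name__ == "__main__":
    lines = ["cloud computing extends parallel computing",
             "parallel computing meets cloud computing"]
    with Pool(2) as pool:                      # mappers run in parallel
        mapped = pool.map(map_phase, lines)
    flat = [pair for chunk in mapped for pair in chunk]
    print(reduce_phase(flat))                  # e.g. {'computing': 4, ...}
```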
Processing shotgun proteomics data on the Amazon cloud with the Trans-Proteomic Pipeline
Slagel, Joseph; Mendoza, Luis; Shteynberg, David; Deutsch, Eric W.; Moritz, Robert L.
2015-02-01
Cloud computing, where scalable, on-demand compute cycles and storage are available as a service, has the potential to accelerate mass spectrometry-based proteomics research by providing simple, expandable, and affordable large-scale computing to all laboratories regardless of location or information technology expertise. We present new cloud computing functionality for the Trans-Proteomic Pipeline, a free and open-source suite of tools for the processing and analysis of tandem mass spectrometry datasets. Enabled with Amazon Web Services cloud computing, the Trans-Proteomic Pipeline now accesses large scale computing resources, limited only by the available Amazon Web Services infrastructure, for all users. The Trans-Proteomic Pipeline runs in an environment fully hosted on Amazon Web Services, where all software and data reside on cloud resources to tackle large search studies. In addition, it can also be run on a local computer with computationally intensive tasks launched onto the Amazon Elastic Compute Cloud service to greatly decrease analysis times. We describe the new Trans-Proteomic Pipeline cloud service components, compare the relative performance and costs of various Elastic Compute Cloud service instance types, and present on-line tutorials that enable users to learn how to deploy cloud computing technology rapidly with the Trans-Proteomic Pipeline. We provide tools for estimating the necessary computing resources and costs given the scale of a job and demonstrate the use of the cloud-enabled Trans-Proteomic Pipeline by processing over 1100 tandem mass spectrometry files through four proteomic search engines in 9 h and at a very low cost. PMID:25418363
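Purely as an illustration of the Elastic Compute Cloud call pattern such a deployment rests on, here is a hedged boto3 sketch; the AMI ID, region and instance type are placeholders, and the Trans-Proteomic Pipeline ships its own managed tooling for this.

```python
import boto3

# Placeholder values: a real TPP cloud deployment supplies its own AMI
# and sizing; this only illustrates the EC2 launch/wait call pattern.
ec2 = boto3.client("ec2", region_name="us-east-1")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical analysis AMI
    InstanceType="c5.4xlarge",         # CPU-heavy type for search engines
    MinCount=1,
    MaxCount=4,                        # scale out for large search studies
)
ids = [i["InstanceId"] for i in resp["Instances"]]
ec2.get_waiter("instance_running").wait(InstanceIds=ids)
print("launched:", ids)
```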
Open Source Dataturbine (OSDT) Android Sensorpod in Environmental Observing Systems
NASA Astrophysics Data System (ADS)
Fountain, T. R.; Shin, P.; Tilak, S.; Trinh, T.; Smith, J.; Kram, S.
2014-12-01
The OSDT Android SensorPod is a custom-designed mobile computing platform for assembling wireless sensor networks for environmental monitoring applications. Funded by an award from the Gordon and Betty Moore Foundation, the OSDT SensorPod represents a significant technological advance in the application of mobile and cloud computing technologies to near-real-time applications in environmental science, natural resources management, and disaster response and recovery. It provides a modular architecture based on open standards and open-source software that allows system developers to align their projects with industry best practices and technology trends, while avoiding commercial vendor lock-in to expensive proprietary software and hardware systems. The integration of mobile and cloud-computing infrastructure represents a disruptive technology in the field of environmental science, since basic assumptions about technology requirements are now open to revision, e.g., the roles of special-purpose data loggers and dedicated site infrastructure. The OSDT Android SensorPod was designed with these considerations in mind, and the resulting system exhibits the following characteristics: it is flexible, efficient and robust. The system was developed and tested in three science applications: 1) a fresh water limnology deployment in Wisconsin, 2) a near-coastal marine science deployment at the UCSD Scripps Pier, and 3) a terrestrial ecological deployment in the mountains of Taiwan. As part of a public education and outreach effort, a Facebook page with daily ocean pH measurements from the UCSD Scripps pier was developed. Wireless sensor networks and the virtualization of data and network services are the future of environmental science infrastructure. The OSDT Android SensorPod was designed and developed to harness these new technology developments for environmental monitoring applications.
NASA Astrophysics Data System (ADS)
Miles, B.; Chepudira, K.; LaBar, W.
2017-12-01
The Open Geospatial Consortium (OGC) SensorThings API (STA) specification, ratified in 2016, is a next-generation open standard for enabling real-time communication of sensor data. Building on over a decade of OGC Sensor Web Enablement (SWE) standards, STA offers a rich data model that can represent a range of sensor and phenomena types (e.g. fixed sensors sensing fixed phenomena, fixed sensors sensing moving phenomena, mobile sensors sensing fixed phenomena, and mobile sensors sensing moving phenomena) and is data agnostic. Additionally, and in contrast to previous SWE standards, STA is developer-friendly, as is evident from its convenient JSON serialization and expressive OData-based query language (with support for geospatial queries); with its Message Queue Telemetry Transport (MQTT) support, STA is also well suited to efficient real-time data publishing and discovery. All these attributes make STA potentially useful in environmental monitoring sensor networks. Here we present Kinota(TM), an open-source NoSQL implementation of OGC SensorThings for large-scale high-resolution real-time environmental monitoring. Kinota, which roughly stands for Knowledge from Internet of Things Analyses, relies on Cassandra as its underlying data store, which is a horizontally scalable, fault-tolerant open-source database that is often used to store time-series data for Big Data applications (though integration with other NoSQL or relational databases is possible). With this foundation, Kinota can scale to store data from an arbitrary number of sensors collecting data every 500 milliseconds. Additionally, the Kinota architecture is very modular, allowing for customization by adopters who can choose to replace parts of the existing implementation when desirable. The architecture is also highly portable, providing the flexibility to choose between cloud providers like Azure, Amazon, Google, etc. The scalable, flexible and cloud-friendly architecture of Kinota makes it ideal for use in next-generation large-scale and high-resolution real-time environmental monitoring networks used in domains such as hydrology, geomorphology, and geophysics, as well as management applications such as flood early warning and regulatory enforcement.
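A brief sketch of what interacting with a SensorThings endpoint such as Kinota's might look like; the root URL and Datastream id are assumptions, while the entity model and OData query options follow the OGC STA specification.

```python
import requests

STA_ROOT = "https://example.org/SensorThings/v1.0"  # hypothetical endpoint

# Create an Observation linked to an existing Datastream (id=1), per the
# SensorThings entity model; the id and URL are placeholders.
obs = {
    "phenomenonTime": "2017-09-01T12:00:00Z",
    "result": 0.42,
    "Datastream": {"@iot.id": 1},
}
r = requests.post(f"{STA_ROOT}/Observations", json=obs, timeout=10)
r.raise_for_status()
print("created:", r.headers.get("Location"))

# Query recent observations with an OData filter, also per the STA spec.
q = {"$filter": "result gt 0.2", "$orderby": "phenomenonTime desc", "$top": 5}
print(requests.get(f"{STA_ROOT}/Observations", params=q, timeout=10).json())
```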
Curvature computation in volume-of-fluid method based on point-cloud sampling
NASA Astrophysics Data System (ADS)
Kassar, Bruno B. M.; Carneiro, João N. E.; Nieckele, Angela O.
2018-01-01
This work proposes a novel approach to compute interface curvature in multiphase flow simulation based on the Volume of Fluid (VOF) method. It is well documented in the literature that curvature and normal vector computation in VOF may lack accuracy, mainly due to abrupt changes in the volume fraction field across the interfaces. This may degrade the interface tension force estimates, often resulting in inaccurate results for interface-tension-dominated flows. Many techniques have been presented over the last years to enhance the accuracy of normal vector and curvature estimates, including height functions, parabolic fitting of the volume fraction, reconstructed distance functions, coupling the Level Set method with VOF, and convolving the volume fraction field with smoothing kernels, among others. We propose a novel technique based on a representation of the interface by a cloud of points. The curvatures and the interface normal vectors are computed geometrically at each point of the cloud and projected onto the Eulerian grid in a Front-Tracking manner. Results are compared to benchmark data, and a significant reduction in spurious currents as well as an improvement in the pressure jump are observed. The method was developed in the open source suite OpenFOAM® by extending its standard VOF implementation, the interFoam solver.
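The geometric computation at each cloud point can be illustrated with a small numpy sketch, assuming neighbours have been expressed in a local tangent-plane frame: fit z ≈ ax² + bxy + cy², so the mean curvature at the origin is H = a + c. The sphere test data are synthetic, and this is not the authors' OpenFOAM implementation.

```python
import numpy as np

def mean_curvature(neighbors):
    """Estimate mean curvature at the origin of a local tangent frame.

    neighbors: (N, 3) points with z along the surface normal, so that
    locally z ~ a*x^2 + b*x*y + c*y^2; then H = a + c at the origin.
    """
    x, y, z = neighbors[:, 0], neighbors[:, 1], neighbors[:, 2]
    A = np.column_stack([x * x, x * y, y * y])
    (a, b, c), *_ = np.linalg.lstsq(A, z, rcond=None)
    return a + c

# Synthetic test: points near the pole of a sphere of radius R satisfy
# z = R - sqrt(R^2 - x^2 - y^2), and the true mean curvature is 1/R.
R = 2.0
rng = np.random.default_rng(0)
xy = rng.uniform(-0.2, 0.2, size=(200, 2))
z = R - np.sqrt(R**2 - (xy**2).sum(axis=1))
pts = np.column_stack([xy, z])
print(mean_curvature(pts), "vs 1/R =", 1.0 / R)
```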
NASA Astrophysics Data System (ADS)
McCoy, Isabel; Wood, Robert; Fletcher, Jennifer
Marine low clouds are key influencers of the climate and contribute significantly to uncertainty in model climate sensitivity due to their small scale and complex processes. Many low clouds occur in large-scale cellular patterns, known as open and closed mesoscale cellular convection (MCC), which have significantly different radiative and microphysical properties. Investigating MCC development and meteorological controls will improve our understanding of their impacts on the climate. We conducted an examination of time-varying meteorological conditions associated with satellite-determined open and closed MCC. The spatial and temporal patterns of MCC clouds were compared with key meteorological control variables calculated from ERA-Interim Reanalysis to highlight dependencies and major differences. This illustrated the influence of environmental stability and surface forcing as well as the role of marine cold air outbreaks (MCAO, the movement of cold air from polar regions across warmer waters) in MCC cloud formation. Such outbreaks are important to open MCC development and may also influence the transition from open to closed MCC. Our results may lead to improvements in the parameterization of cloudiness and advance the simulation of marine low clouds. National Science Foundation Graduate Research Fellowship Grant (DGE-1256082).
MicMac GIS application: free open source
NASA Astrophysics Data System (ADS)
Duarte, L.; Moutinho, O.; Teodoro, A.
2016-10-01
The use of Remotely Piloted Aerial Systems (RPAS) for remote sensing applications is becoming more frequent as the technology of on-board cameras and of the platforms themselves makes them a serious contender to satellite and airplane imagery. MicMac is a photogrammetric tool for image matching that can be used in different contexts. It is open source software and can be used from the command line or through a graphical interface (for each command). The main objective of this work was the integration of MicMac with QGIS, which is also open source software, in order to create a new open source tool for photogrammetry/remote sensing. The Python language was used to develop the application. This tool would be very useful in the manipulation and 3D modelling of a set of images. The aim was to create a toolbar in QGIS exposing the basic functionalities through intuitive graphical interfaces. The toolbar is composed of three buttons: produce the point cloud, create the Digital Elevation Model (DEM) and produce the orthophoto of the study area. The application was tested on 35 photos, a subset of images acquired by a RPAS in the Aguda beach area, Porto, Portugal. They were used to create a 3D terrain model and, from this model, obtain an orthophoto and the corresponding DEM. The code is open and can be modified according to the user's requirements. This integration, combining MicMac with GIS capabilities, would be very useful to the photogrammetry and remote sensing community.
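A plug-in of this kind essentially shells out to the MicMac pipeline; the sketch below shows the idea with subprocess, where the command names (Tapioca, Tapas, Malt) follow the MicMac documentation but the arguments are illustrative, and mm3d is assumed to be on PATH.

```python
import subprocess

def run(cmd):
    # Echo and run one MicMac step; raise if it fails.
    print("running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

# Illustrative pipeline on a folder of JPG images (arguments are examples;
# consult the MicMac documentation for the options your survey needs).
run(["mm3d", "Tapioca", "MulScale", ".*JPG", "500", "1200"])  # tie points
run(["mm3d", "Tapas", "RadialStd", ".*JPG"])                  # orientation
run(["mm3d", "Malt", "Ortho", ".*JPG", "RadialStd"])          # DEM + ortho
```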
Bimodal pair f-KdV dynamics in star-forming clouds
NASA Astrophysics Data System (ADS)
Karmakar, Pralay Kumar; Haloi, Archana; Roy, Supriya
2018-04-01
A theoretical formalism for investigating the bimodal conjugational mode dynamics of hybrid source, dictated by a unique pair of forced Korteweg-de Vries (f-KdV) equations in a complex turbo-magnetized star-forming cloud, is reported. It uses a standard multi-scale analysis executed over the cloud-governing equations in a closure form to derive the conjugated pair f-KdV system. We numerically see the structural features of two distinctive classes of eigenmode patterns stemming from the conjoint gravito-electrostatic interplay. The electrostatic compressive monotonic aperiodic shock-like patterns and gravitational compressive non-monotonic oscillatory shock-like structures are excitable. It is specifically revealed that the constitutive grain-charge (grain-mass) acts as electrostatic stabilizer (gravitational destabilizer) against the global cloud collapse dynamics. The basic features of the nonlinear coherent structures are confirmed in systematic phase-plane landscapes, indicating electrostatic irregular non-homoclinic open trajectories and gravitational atypical non-chaotic homoclinic fixed-point attractors. The relevance in the real astro-cosmic scenarios of the early phases of structure formation via wave-driven fluid-accretive transport processes is summarily emphasized.
Cloud Environment Automation: from infrastructure deployment to application monitoring
NASA Astrophysics Data System (ADS)
Aiftimiei, C.; Costantini, A.; Bucchi, R.; Italiano, A.; Michelotto, D.; Panella, M.; Pergolesi, M.; Saletta, M.; Traldi, S.; Vistoli, C.; Zizzi, G.; Salomoni, D.
2017-10-01
The potential offered by the cloud paradigm is often limited by technical issues, rules and regulations. In particular, the activities related to the design and deployment of the Infrastructure as a Service (IaaS) cloud layer can be difficult to apply and time-consuming for the infrastructure maintainers. In this paper the research activity, carried out during the Open City Platform (OCP) research project [1], aimed at designing and developing an automatic tool for cloud-based IaaS deployment is presented. Open City Platform is an industrial research project funded by the Italian Ministry of University and Research (MIUR), started in 2014. It intends to research, develop and test new technological solutions that are open, interoperable and usable on demand in the field of Cloud Computing, along with new sustainable organizational models that can be deployed for and adopted by Public Administrations (PA). The presented work and the related outcomes are aimed at simplifying the deployment and maintenance of a complete IaaS cloud-based infrastructure.
Truong, Dennis Q; Hüber, Mathias; Xie, Xihe; Datta, Abhishek; Rahman, Asif; Parra, Lucas C; Dmochowski, Jacek P; Bikson, Marom
2014-01-01
Computational models of brain current flow during transcranial electrical stimulation (tES), including transcranial direct current stimulation (tDCS) and transcranial alternating current stimulation (tACS), are increasingly used to understand and optimize clinical trials. We propose that broad dissemination requires simple graphical user interface (GUI) software that allows users to explore and design montages in real time, based on their own clinical/experimental experience and objectives. We introduce two complementary open-source platforms for this purpose: BONSAI and SPHERES. BONSAI is a web (cloud) based application (available at neuralengr.com/bonsai) that can be accessed through any flash-supported browser interface. SPHERES (available at neuralengr.com/spheres) is a stand-alone GUI application that allows consideration of arbitrary montages on a concentric sphere model by leveraging an analytical solution. These open-source tES modeling platforms are designed to be upgraded and enhanced. Trade-offs between open-access approaches that balance ease of access, speed, and flexibility are discussed.
Design and Implementation of a Modern Automatic Deformation Monitoring System
NASA Astrophysics Data System (ADS)
Engel, Philipp; Schweimler, Björn
2016-03-01
The deformation monitoring of structures and buildings is an important task in modern engineering surveying, ensuring the stability and reliability of the supervised objects over a long period. Several commercial hardware and software solutions for realizing such monitoring measurements are available on the market. In addition to them, a research team at the University of Applied Sciences in Neubrandenburg (NUAS) is actively developing a software package for monitoring purposes in geodesy and geotechnics, which is distributed under an open source licence and free of charge. The task of managing an open source project is well known in computer science, but it is fairly new in a geodetic context. This paper contributes to that issue by detailing applications, frameworks, and interfaces for the design and implementation of open hardware and software solutions for sensor control, sensor networks, and data management in automatic deformation monitoring. It also discusses how the development effort of networked applications can be reduced by using free programming tools, cloud computing technologies, and rapid prototyping methods.
Providing Access and Visualization to Global Cloud Properties from GEO Satellites
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.; Ayers, J. K.
2015-12-01
Providing public access to cloud macro- and microphysical properties is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and method that allows end users to easily browse and access cloud information that is otherwise difficult to acquire and manipulate. The core of the tool is an application programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the dynamically generated imagery as an input into their own work flows for both image generation and cloud product requisition. This project builds upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link, there is value in making satellite-derived cloud product information available through a simple access method as well as allowing users to browse and view that imagery as they need, rather than in a manner most convenient for the data provider. Using the Open Geospatial Consortium's Web Processing Service as our access method, we describe a system that uses a hybrid local and cloud-based parallel processing system that can return both satellite imagery and cloud product imagery as well as the binary data used to generate them in multiple formats. The images and cloud products are sourced from multiple satellites and also "merged" datasets created by temporally and spatially matching satellite sensors. Finally, the tool and API allow users to access information spanning the full time ranges for which our group has information available. In the case of satellite imagery, the temporal range can span the entire lifetime of the sensor.
Open-cell and closed-cell clouds off Peru [detail
2017-12-08
2010/107 - 04/17 at 21:05 UTC. Open-cell and closed-cell clouds off Peru, Pacific Ocean. To view the full frame of this image go to: www.flickr.com/photos/gsfc/4557497219/ Resembling a frosted window on a cold winter's day, this lacy pattern of marine clouds was captured off the coast of Peru in the Pacific Ocean by the MODIS on the Aqua satellite on April 19, 2010. The image reveals both open- and closed-cell cumulus cloud patterns. These cells, or parcels of air, often occur in roughly hexagonal arrays in a layer of fluid (the atmosphere often behaves like a fluid) that begins to "boil," or convect, due to heating at the base or cooling at the top of the layer. In "closed" cells warm air is rising in the center and sinking around the edges, so clouds appear in cell centers but evaporate around cell edges. This produces cloud formations like those that dominate the lower left. The reverse flow can also occur: air can sink in the center of the cell and rise at the edge. This process is called "open cell" convection, and clouds form at cell edges around open centers, which creates a lacy, hollow-looking pattern like the clouds in the upper right. Closed and open cell convection represent two stable atmospheric configurations — two sides of the convection coin. But what determines which path the "boiling" atmosphere will take? Apparently the process is highly chaotic, and there appears to be no way to predict whether convection will result in open or closed cells. Indeed, the atmosphere may sometimes flip between one mode and another in no predictable pattern. Satellite: Aqua NASA/GSFC/Jeff Schmaltz/MODIS Land Rapid Response Team To learn more about MODIS go to: rapidfire.sci.gsfc.nasa.gov/gallery/?latest NASA Goddard Space Flight Center is home to the nation's largest organization of combined scientists, engineers and technologists that build spacecraft, instruments and new technology to study the Earth, the sun, our solar system, and the universe.
Microphysical and macrophysical responses of marine stratocumulus polluted by underlying ships
NASA Astrophysics Data System (ADS)
Christensen, Matthew Wells
Multiple sensors flying in the A-train constellation of satellites were used to determine the extent to which aerosol plumes from ships passing below marine stratocumulus alter the microphysical and macrophysical properties of the clouds. Aerosol plumes generated by ships sometimes influence cloud microphysical properties (effective radius) and, to a largely undetermined extent, cloud macrophysical properties (liquid water path, coverage, depth, precipitation, and longevity). Aerosol indirect effects were brought into focus using observations from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) and the 94-GHz radar onboard CloudSat. To assess local cloud-scale responses to aerosol, the locations of over one thousand ship tracks coinciding with the radar were meticulously logged by hand from Moderate Resolution Imaging Spectroradiometer (MODIS) imagery. MODIS imagery was used to distinguish ship tracks that were embedded in closed, open, and unclassifiable mesoscale cellular cloud structures. The impact of aerosol on the microphysical cloud properties in both the closed and open cell regimes was consistent with the changes predicted by the Twomey hypothesis. For the macrophysical changes, differences in the sign and magnitude of these properties were observed between cloud regimes. The results demonstrate that the spatial extent of rainfall (rain cover fraction) and its intensity decrease in the clouds contaminated by the ship plume compared to the ambient pristine clouds. Although reductions of precipitation were common amongst the clouds with detectable rainfall (72% of cases), a substantial fraction of ship tracks (28% of cases) exhibited the opposite response. The sign and strength of the response was tied to the type of stratocumulus (e.g., closed vs open cells), the depth of the boundary layer, and the humidity in the free troposphere. When closed cellular clouds were identified, liquid water path, drizzle rate, and rain cover fraction (an average relative decrease of 61%) were significantly smaller in the ship-contaminated clouds. Differences in drizzle rate resulted primarily from the reductions in rain cover fraction (i.e., fewer pixels were identified with rain in the clouds polluted by the ship). The opposite occurred in the open cell regime. Ship plumes ingested into this regime resulted in significantly deeper and brighter clouds with higher liquid water amounts and rain rates. Enhanced rain rates (average relative increase of 89%) were primarily due to changes in intensity (i.e., rain rates on the 1.1 km pixel scale were higher in the ship-contaminated clouds) and, to a lesser extent, rain cover fraction. One implication of these differences is that the local aerosol indirect radiative forcing was more than five times larger for ship tracks observed in the open cell regime (-59 W m⁻²) compared to those identified in the closed cell regime (-12 W m⁻²). The results presented here underline the need to consider the mesoscale structure of stratocumulus when examining the cloud dynamic response to changes in aerosol concentration. In the final part of the dissertation, the focus shifted to the climate scale to examine the impact of shipping on the Earth's radiation budget. Two studies were employed. In the first, changes to the radiative properties of boundary layer clouds (i.e., cloud top heights less than 3 km) were examined in response to the substantial decreases in ship traffic that resulted from the world economic recession in 2008.
Differences in the annually averaged droplet effective radius and top-of-atmosphere outgoing shortwave radiative flux between 2007 and 2009 did not manifest as a clear response in the climate system, and were probably masked either by competing aerosol-cloud feedbacks or by interannual climate variability. In the second study, a method was developed to estimate the radiative forcing from shipping by convolving lanes of densely populated ships onto the global distributions of closed and open cell stratocumulus clouds. Closed cells were observed more than twice as often as open cells. Despite the smaller abundance of open cells, a significant portion of the radiative forcing from shipping was claimed by this regime. On the whole, the global radiative forcing from ship tracks was small (approximately -0.45 mW m⁻²) compared to the radiative forcing associated with the atmospheric buildup of anthropogenic CO2.
JINR cloud infrastructure evolution
NASA Astrophysics Data System (ADS)
Baranov, A. V.; Balashov, N. A.; Kutovskiy, N. A.; Semenov, R. N.
2016-09-01
To fulfil JINR commitments in different national and international projects related to the use of modern information technologies, such as cloud and grid computing, and to provide a modern tool for JINR users' scientific research, a cloud infrastructure was deployed at the Laboratory of Information Technologies of the Joint Institute for Nuclear Research. OpenNebula software was chosen as the cloud platform. Initially it was set up in a simple configuration with a single front-end host and a few cloud nodes. Some custom development was done to tune the JINR cloud installation to local needs: a web form in the cloud web-interface for resource requests, a menu item with cloud utilization statistics, user authentication via Kerberos, and a custom driver for OpenVZ containers. Because of high demand for the cloud service and over-utilization of its resources, it was re-designed to cover increasing users' needs in capacity, availability and reliability. Recently a new cloud instance has been deployed in a high-availability configuration with a distributed network file system and additional computing power.
In situ observations of Arctic cloud properties across the Beaufort Sea marginal ice zone
NASA Astrophysics Data System (ADS)
Corr, C.; Moore, R.; Winstead, E.; Thornhill, K. L., II; Crosbie, E.; Ziemba, L. D.; Beyersdorf, A. J.; Chen, G.; Martin, R.; Shook, M.; Corbett, J.; Smith, W. L., Jr.; Anderson, B. E.
2016-12-01
Clouds play an important role in Arctic climate. This is particularly true over the Arctic Ocean, where feedbacks between clouds and sea ice impact the surface radiation budget through modifications of sea ice extent, ice thickness, cloud base height, and cloud cover. This work summarizes measurements of Arctic cloud properties made aboard the NASA C-130 aircraft over the Beaufort Sea during ARISE (Arctic Radiation - IceBridge Sea and Ice Experiment) in September 2014. The influence of surface type on cloud properties is also investigated. Specifically, liquid water content (LWC), droplet concentrations, and droplet size distributions are compared for clouds sampled over three distinct regimes in the Beaufort Sea: 1) open water, 2) the marginal ice zone (MIZ), and 3) sea ice. Regardless of surface type, nearly all clouds intercepted during ARISE were liquid-phase clouds. However, differences in droplet size distributions and concentrations were evident between the surface types; clouds over the MIZ and sea ice generally had fewer and larger droplets compared to those over open water. The potential implications these results have for understanding cloud-surface albedo climate feedbacks in the Arctic are discussed.
Georeferencing UAS Derivatives Through Point Cloud Registration with Archived Lidar Datasets
NASA Astrophysics Data System (ADS)
Magtalas, M. S. L. Y.; Aves, J. C. L.; Blanco, A. C.
2016-10-01
Georeferencing gathered images is a common step before performing spatial analysis and other processes on datasets acquired using unmanned aerial systems (UAS). Spatial information is applied to aerial images or their derivatives through onboard GPS (Global Positioning System) geotagging, or by tying models to GCPs (Ground Control Points) acquired in the field. Currently, UAS derivatives are limited to meter-level accuracy when their generation is unaided by points of known position on the ground. The use of ground control points established using survey-grade GPS or GNSS receivers can greatly reduce model errors to centimeter levels. However, this comes with additional costs, not only in instrument acquisition and survey operations, but also in actual time spent in the field. This study uses a workflow for cloud-based post-processing of UAS data in combination with already existing LiDAR data. The georeferencing of the UAV point cloud is executed using the Iterative Closest Point (ICP) algorithm. It is applied through the open-source CloudCompare software (Girardeau-Montaut, 2006) on a "skeleton point cloud". This skeleton point cloud consists of manually extracted features consistent in both the LiDAR and UAV data. For this cloud, roads and buildings with minimal deviations given their differing dates of acquisition are considered consistent. Transformation parameters are computed for the skeleton cloud, which can then be applied to the whole UAS dataset. In addition, a separate cloud consisting of non-vegetation features automatically derived using the CANUPO classification algorithm (Brodu and Lague, 2012) was used to generate a separate set of parameters. A ground survey was done to validate the transformed cloud. An RMSE value of around 16 centimeters was found when comparing validation data to the models georeferenced using the CANUPO cloud and the manual skeleton cloud. Cloud-to-cloud distance computations of the CANUPO and manual skeleton clouds yielded values for both of around 0.67 meters at 1.73 standard deviation.
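For intuition, a compact numpy/scipy sketch of the ICP loop used on the skeleton clouds is given below; this is a generic implementation on synthetic data, not CloudCompare's.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_fit_transform(A, B):
    # Rigid transform (R, t) aligning A onto B via SVD (Kabsch algorithm).
    ca, cb = A.mean(axis=0), B.mean(axis=0)
    H = (A - ca).T @ (B - cb)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # avoid reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cb - R @ ca

def icp(source, target, iters=30):
    tree = cKDTree(target)
    src = source.copy()
    for _ in range(iters):
        _, idx = tree.query(src)      # nearest reference point per UAS point
        R, t = best_fit_transform(src, target[idx])
        src = src @ R.T + t
    return src

# Synthetic check: recover a known rotation + offset of a random cloud.
rng = np.random.default_rng(1)
target = rng.uniform(0, 10, size=(500, 3))
theta = 0.1
Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
               [np.sin(theta),  np.cos(theta), 0],
               [0, 0, 1]])
source = target @ Rz.T + np.array([0.5, -0.3, 0.1])
aligned = icp(source, target)
print("mean residual:", np.linalg.norm(aligned - target, axis=1).mean())
```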
Accessing Cloud Properties and Satellite Imagery: A tool for visualization and data mining
NASA Astrophysics Data System (ADS)
Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Palikonda, R.
2016-12-01
Providing public access to imagery of cloud macro- and microphysical properties and the underlying satellite imagery is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a tool and system that allows end users to easily browse cloud information and satellite imagery that is otherwise difficult to acquire and manipulate. The tool has two uses: one to visualize the data and the other to access the data directly. It uses widely adopted access protocols, the Open Geospatial Consortium's Web Map and Web Processing Services, to encourage users to access the data we produce. Internally, we leverage our practical experience with large, scalable applications to develop a system that has the largest potential for scalability as well as the ability to be deployed on the cloud. One goal of the tool is to provide a demonstration of the back-end capability to end users so that they can use the dynamically generated imagery and data as an input to their own work flows or to set up data mining constraints. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information and satellite imagery accessible and easily searchable. Increasingly, information is used in a "mash-up" form where multiple sources of information are combined to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much cutting-edge scientific knowledge, observations and products available to the citizen science, research and interested communities for these kinds of "mash-ups" as well as provide a means for automated systems to data mine our information. This tool and access method provides a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.
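A hedged example of the access pattern: a standard WMS 1.3.0 GetMap request made with the requests library; the endpoint and layer name are hypothetical, while the query parameters follow the OGC WMS specification.

```python
import requests

WMS = "https://example.nasa.gov/wms"     # hypothetical endpoint
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "cloud_effective_radius",  # hypothetical layer name
    "styles": "",
    "crs": "EPSG:4326",
    "bbox": "-60,-180,60,180",           # lat/lon axis order in WMS 1.3.0
    "width": 1024,
    "height": 512,
    "format": "image/png",
    "time": "2016-07-01T12:00:00Z",
}
r = requests.get(WMS, params=params, timeout=30)
r.raise_for_status()
with open("cloud_product.png", "wb") as f:
    f.write(r.content)                   # save the rendered map image
```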
Precipitation-generated oscillations in open cellular cloud fields.
Feingold, Graham; Koren, Ilan; Wang, Hailong; Xue, Huiwen; Brewer, Wm Alan
2010-08-12
Cloud fields adopt many different patterns that can have a profound effect on the amount of sunlight reflected back to space, with important implications for the Earth's climate. These cloud patterns can be observed in satellite images of the Earth and often exhibit distinct cell-like structures associated with organized convection at scales of tens of kilometres. Recent evidence has shown that atmospheric aerosol particles, through their influence on precipitation formation, help to determine whether cloud fields take on closed (more reflective) or open (less reflective) cellular patterns. The physical mechanisms controlling the formation and evolution of these cells, however, are still poorly understood, limiting our ability to simulate realistically the effects of clouds on global reflectance. Here we use satellite imagery and numerical models to show how precipitating clouds produce an open cellular cloud pattern that oscillates between different, weakly stable states. The oscillations are a result of precipitation causing downward motion and outflow from clouds that were previously positively buoyant. The evaporating precipitation drives air down to the Earth's surface, where it diverges and collides with the outflows of neighbouring precipitating cells. These colliding outflows form surface convergence zones and new cloud formation. In turn, the newly formed clouds produce precipitation and new colliding outflow patterns that are displaced from the previous ones. As successive cycles of this kind unfold, convergence zones alternate with divergence zones and new cloud patterns emerge to replace old ones. The result is an oscillating, self-organized system with a characteristic cell size and precipitation frequency.
Battlespace Awareness: Heterogeneous Sensor Maps of Large Scale, Complex Environments
2017-06-13
[...] reference frames enable a system designer to describe the position of any sensor or platform at any point in time. [...] CloudCompare, an open-source tool designed for this purpose [65], is used to evaluate the quality of reconstructions created by our algorithms. [...] The data term seeks to keep the proposed solution (u) similar to the originally observed values (f).
Unified Geophysical Cloud Platform (UGCP) for Seismic Monitoring and other Geophysical Applications.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Starovoit, Y. O.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.
2016-12-01
We present the Unified Geophysical Cloud Platform (UGCP), or UniGeoCloud, an innovative approach to geophysical data processing in the Cloud environment, with the ability to run any type of data processing software in an isolated environment within a single Cloud platform. We have developed a simple and quick method for installing several widely known open-source seismic software packages (SeisComp3, Earthworm, Geotool, MSNoise) that requires no knowledge of system administration, configuration, or OS compatibility issues, sparing users the often tedious details of system configuration. The installation process is simplified to a mouse click on the selected software package in the Cloud marketplace. The main objective of the developed capability was to provide software tools with which users can quickly design and install their own highly reliable and highly available virtual IT infrastructure for organizing seismic (and, in the future, other geophysical) data processing, for either research or monitoring purposes. These tools provide access to any seismic station data available in open IP configuration from the different networks affiliated with different institutions and organizations. They also allow users to set up their own network as desired, by selecting either regionally deployed stations or a worldwide network based on station selection from the global map. The processing software, products and research results can easily be monitored from everywhere using a variety of user devices, from desktop computers to mobile devices. Current efforts of the development team are directed at achieving Scalability, Reliability and Sustainability (SRS) of the proposed solutions, allowing any user to run their applications with confidence of no data loss and no failure of the monitoring or research software components. The system is suitable for quick rollout of the NDC-in-Box software package developed for State Signatories and aimed at promoting the processing of data collected by the IMS Network.
Athey, Brian D; Braxenthaler, Michael; Haas, Magali; Guo, Yike
2013-01-01
tranSMART is an emerging global open source public private partnership community developing a comprehensive informatics-based analysis and data-sharing cloud platform for clinical and translational research. The tranSMART consortium includes pharmaceutical and other companies, not-for-profits, academic entities, patient advocacy groups, and government stakeholders. The tranSMART value proposition relies on the concept that the global community of users, developers, and stakeholders is the best source of innovation for applications and for useful data. Continued development and use of the tranSMART platform will create a means to enable "pre-competitive" data sharing broadly, saving money and potentially accelerating research translation to cures. Significant transformative effects of tranSMART include 1) allowing its entire user community to benefit from experts globally, 2) capturing the best of innovation in analytic tools, 3) a growing "big data" resource, 4) convergent standards, and 5) new informatics-enabled translational science in the pharma, academic, and not-for-profit sectors.
Sharing knowledge of Planetary Datasets through the Web-Based PRoGIS
NASA Astrophysics Data System (ADS)
Giordano, M. G.; Morley, J. M.; Muller, J. P. M.; Barnes, R. B.; Tao, Y. T.
2015-10-01
The large amount of raw and derived data available from various planetary surface missions (e.g. Mars and the Moon in our case) has been integrated with co-registered and geocoded orbital image data to provide rover traverses and camera site locations in universal global co-ordinates [1]. This then allows an integrated GIS to use these geocoded products for scientific applications: we aim to create a web interface, PRoGIS, with minimal controls, focusing on usability and visualisation of the data, to allow planetary geologists to share annotated surface observations. These observations in a common context are shared between different tools and software (PRoGIS, PRo3D, 3D point cloud viewer). Our aim is to use only Open Source components that integrate Open Web Services for planetary data to make available a universal platform with a WebGIS interface, as well as a 3D point cloud and a panorama viewer to explore derived data. On top of these tools we are building capabilities to make and share annotations amongst users. We use Python and Django for the server-side framework and OpenLayers 3 for the WebGIS client. For good performance when previewing 3D data (point clouds, pictures on the surface and panoramas) we employ ThreeJS, a WebGL JavaScript library. Additionally, user and group controls allow scientists to store and share their observations. PRoGIS not only displays data but also launches sophisticated 3D vision reprocessing (PRoVIP) and an immersive 3D analysis environment (PRo3D).
Community Intercomparison Suite (CIS) v1.4.0: a tool for intercomparing models and observations
NASA Astrophysics Data System (ADS)
Watson-Parris, Duncan; Schutgens, Nick; Cook, Nicholas; Kipling, Zak; Kershaw, Philip; Gryspeerdt, Edward; Lawrence, Bryan; Stier, Philip
2016-09-01
The Community Intercomparison Suite (CIS) is an easy-to-use command-line tool which has been developed to allow the straightforward intercomparison of remote sensing, in situ and model data. While there are a number of tools available for working with climate model data, the large diversity of sources (and formats) of remote sensing and in situ measurements necessitated a novel software solution. Developed by a professional software company, CIS supports a large number of gridded and ungridded data sources "out-of-the-box", including climate model output in NetCDF or the UK Met Office pp file format, CloudSat, CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization), MODIS (MODerate resolution Imaging Spectroradiometer), Cloud and Aerosol CCI (Climate Change Initiative) level 2 satellite data and a number of in situ aircraft and ground station data sets. The open-source architecture also supports user-defined plugins to allow many other sources to be easily added. Many of the key operations required when comparing heterogeneous data sets are provided by CIS, including subsetting, aggregating, collocating and plotting the data. Output data are written to CF-compliant NetCDF files to ensure interoperability with other tools and systems. The latest documentation, including a user manual and installation instructions, can be found on our website (http://cistools.net). Here, we describe the need which this tool fulfils, followed by descriptions of its main functionality (as at version 1.4.0) and plugin architecture which make it unique in the field.
Sirepo for Synchrotron Radiation Workshop
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, Robert; Moeller, Paul; Rakitin, Maksim
Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5 compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model–view–controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is the Synchrotron Radiation Workshop (SRW). SRW computes synchrotron radiation from relativistic electrons in arbitrary magnetic fields and propagates the radiation wavefronts through optical beamlines. SRW is open source and is primarily supported by Dr. Oleg Chubar of NSLS-II at Brookhaven National Laboratory.
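As a minimal illustration of why Flask is described as lightweight, a complete JSON endpoint fits in a few lines; the route and payload here are invented, not Sirepo's actual API.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/api/simulation", methods=["POST"])
def run_simulation():
    # Invented endpoint: accept parameters, return a canned status.
    params = request.get_json(force=True)
    return jsonify(status="queued", received=params)

if __name__ == "__main__":
    app.run(port=8000)
```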
Leveraging human oversight and intervention in large-scale parallel processing of open-source data
NASA Astrophysics Data System (ADS)
Casini, Enrico; Suri, Niranjan; Bradshaw, Jeffrey M.
2015-05-01
The popularity of cloud computing, along with the increased availability of cheap storage, has led to the need to process and transform large volumes of open-source data in parallel. One way to handle such extensive volumes of information properly is to take advantage of distributed computing frameworks like MapReduce. Unfortunately, an entirely automated approach that excludes human intervention is often unpredictable and error prone. Highly accurate data processing and decision-making can be achieved by supporting an automatic process through human collaboration, in a variety of environments such as warfare, cyber security and threat monitoring. Although this mutual participation seems easily exploitable, human-machine collaboration in the field of data analysis presents several challenges. First, due to the asynchronous nature of human intervention, it is necessary to verify that once a correction is made, all the necessary reprocessing is done in a chain. Second, it is often necessary to minimize the amount of reprocessing in order to optimize the usage of limited resources. To address these strict requirements, this paper introduces improvements to an innovative approach for human-machine collaboration in the processing of large amounts of open-source data in parallel.
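One way to meet these two requirements, chained reprocessing after a correction and minimal recomputation, is to track the derivation graph and invalidate only downstream artifacts; the following toy sketch (graph and names invented) shows the idea.

```python
from collections import deque

# Invented derivation graph: artifact -> artifacts computed from it.
downstream = {
    "raw_feed": ["tokens"],
    "tokens": ["entities", "language_id"],
    "entities": ["report"],
    "language_id": ["report"],
    "report": [],
}

def invalidate(corrected):
    """Return every artifact that must be recomputed after a human
    corrects `corrected`, in breadth-first (dependency) order."""
    seen, order, q = {corrected}, [], deque([corrected])
    while q:
        node = q.popleft()
        for child in downstream[node]:
            if child not in seen:
                seen.add(child)
                order.append(child)
                q.append(child)
    return order

print(invalidate("tokens"))  # ['entities', 'language_id', 'report']
```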
NASA Astrophysics Data System (ADS)
Green, Joel D.; DIGIT OTKP Team
2010-01-01
The DIGIT (Dust, Ice, and Gas In Time) Open Time Key Project utilizes the PACS spectrometer (57-210 um) onboard the Herschel Space Observatory to study the colder regions of young stellar objects and protostellar cores, complementary to recent observations from Spitzer and ground-based observatories. DIGIT focuses on 30 embedded sources and 64 disk sources, and includes supporting photometry from PACS and SPIRE, as well as spectroscopy from HIFI, selected from nearby molecular clouds. For the embedded sources, PACS spectroscopy will allow us to address the origin of [CI] and high-J CO lines observed with ISO-LWS. Our observations are sensitive to the presence of cold crystalline water ice, diopside, and carbonates. Additionally, PACS scans are 5x5 maps of the embedded sources and their outflows. Observations of more evolved disk sources will sample low and intermediate mass objects as well as a variety of spectral types from A to M. Many of these sources are extremely rich in mid-IR crystalline dust features, enabling us to test whether similar features can be detected at larger radii, via colder dust emission at longer wavelengths. If processed grains are present only in the inner disk (in the case of full disks) or from the emitting wall surface which marks the outer edge of the gap (in the case of transitional disks), there must be short timescales for dust processing; if processed grains are detected in the outer disk, radial transport must be rapid and efficient. Weak bands of forsterite and clino- and ortho-enstatite in the 60-75 um range provide information about the conditions under which these materials were formed. For the Science Demonstration Phase we are observing an embedded protostar (DK Cha) and a Herbig Ae/Be star (HD 100546), exemplars of the kind of science that DIGIT will achieve over the full program.
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
Background As large genomics and phenotypic datasets are becoming more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Methods Bionimbus is an open source cloud-computing platform that is based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, which is a high-performance clustered file system. Bionimbus also includes Tukey, which is a portal and associated middleware that provides a single point of entry and single sign-on for the various Bionimbus resources; and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Results Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h. BAM file sizes ranged from 5 GB to 10 GB for each sample. Conclusions Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computing resources to manage and analyze the data. Cloud computing platforms, such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics. PMID:24464852
MarFS, a Near-POSIX Interface to Cloud Objects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inman, Jeffrey Thornton; Vining, William Flynn; Ransom, Garrett Wilson
The engineering forces driving development of “cloud” storage have produced resilient, cost-effective storage systems that can scale to 100s of petabytes, with good parallel access and bandwidth. These features would make a good match for the vast storage needs of High-Performance Computing datacenters, but cloud storage gains some of its capability from its use of HTTP-style Representational State Transfer (REST) semantics, whereas most large datacenters have legacy applications that rely on POSIX file-system semantics. MarFS is an open-source project at Los Alamos National Laboratory that allows us to present cloud-style object-storage as a scalable near-POSIX file system. We have also developed a new storage architecture to improve bandwidth and scalability beyond what’s available in commodity object stores, while retaining their resilience and economy. Additionally, we present a scheme for scaling the POSIX interface to allow billions of files in a single directory and trillions of files in total.
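The sketch below illustrates the general idea of presenting object storage through a file-system-style interface: translating a POSIX-like path and byte range into a list of cloud object keys, with large files striped across fixed-size objects so reads can proceed in parallel. The chunk size and key-naming scheme are assumptions made for illustration, not MarFS's actual layout.

OBJECT_SIZE = 64 * 1024 * 1024  # hypothetical 64 MiB chunk size

def object_keys_for_read(path, offset, length):
    """Return (key, offset_within_object, nbytes) tuples covering the range."""
    keys = []
    end = offset + length
    while offset < end:
        chunk = offset // OBJECT_SIZE
        within = offset % OBJECT_SIZE
        nbytes = min(OBJECT_SIZE - within, end - offset)
        # Encode the path and chunk index into a flat object key.
        key = f"{path.strip('/').replace('/', '.')}.{chunk:06d}"
        keys.append((key, within, nbytes))
        offset += nbytes
    return keys

print(object_keys_for_read("/campaigns/run42/data.bin", 60_000_000, 10_000_000))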
AUGUSTO'S Sundial: Image-Based Modeling for Reverse Engineering Purposes
NASA Astrophysics Data System (ADS)
Baiocchi, V.; Barbarella, M.; Del Pizzo, S.; Giannone, F.; Troisi, S.; Piccaro, C.; Marcantonio, D.
2017-02-01
A photogrammetric survey of a unique archaeological site is reported in this paper. The survey was performed using both a panoramic image-based solution and a classical photogrammetric procedure. The panoramic image-based solution was carried out employing a commercial system: the Trimble V10 Imaging Rover (IR). This instrument is an integrated camera system that captures 360-degree digital panoramas, each composed of 12 images, with a single push. The point clouds obtained with the traditional photogrammetric procedure and with the V10 stations, referenced to the same GCP coordinates, were compared directly in CloudCompare, an open source software package that compares two point clouds and reports all the main statistical measures. The site is a portion of the dial plate of the "Horologium Augusti", inaugurated in 9 B.C.E. in the area of Campo Marzio and still intact in its original position, in a cellar of a building in Rome, about 7 meters below the present ground level.
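The following sketch shows the kind of cloud-to-cloud comparison performed here: nearest-neighbour distances from one point cloud to another, with the summary statistics a tool such as CloudCompare reports. The data are synthetic stand-ins for the two survey clouds; the workflow is a simplification, not the paper's exact procedure.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
reference = rng.random((10_000, 3))  # e.g. classical photogrammetric cloud
compared = reference + rng.normal(0, 0.005, reference.shape)  # e.g. V10 cloud

# Distance from each compared point to its nearest reference point.
dist, _ = cKDTree(reference).query(compared)
print(f"mean {dist.mean():.4f}  std {dist.std():.4f}  max {dist.max():.4f}")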
NASA Astrophysics Data System (ADS)
Goren, Tom; Muelmenstaedt, Johannes; Rosenfeld, Daniel; Quaas, Johannes
2017-04-01
Marine stratocumulus clouds (MSC) occur in two main cloud regimes, open and closed cells, which differ significantly in their cloud cover. Closed cells are gradually cleansed of high CCN concentrations in a process that involves the initiation of drizzle, which breaks the full cloud cover into open cells. The drizzle creates downdrafts that organize the convection along converging gust fronts, which in turn produce stronger updrafts that can sustain more cloud water, compensating for the depletion of cloud water by the rain. In addition, stronger updrafts allow the clouds to grow relatively deep before rain starts to deplete their cloud water. Therefore, lower droplet concentrations and stronger rain would lead to a lower cloud fraction, but not necessarily also to a lower liquid water path (LWP). The fundamental relationships between these key variables derived from global climate model (GCM) simulations are analyzed against observations to determine whether the GCM parameterizations represent well the physical mechanisms governing MSC regime transitions. The results are used to evaluate the feasibility of GCMs for estimating the aerosol cloud-mediated radiative forcing associated with MSC regime transitions, which is responsible for the largest aerosol cloud-mediated radiative forcing.
Van Geit, Werner; Gevaert, Michael; Chindemi, Giuseppe; Rössert, Christian; Courcol, Jean-Denis; Muller, Eilif B.; Schürmann, Felix; Segev, Idan; Markram, Henry
2016-01-01
At many scales in neuroscience, appropriate mathematical models take the form of complex dynamical systems. Parameterizing such models to conform to the multitude of available experimental constraints is a global non-linear optimisation problem with a complex fitness landscape, requiring numerical techniques to find suitable approximate solutions. Stochastic optimisation approaches, such as evolutionary algorithms, have been shown to be effective, but often the setting up of such optimisations and the choice of a specific search algorithm and its parameters is non-trivial, requiring domain-specific expertise. Here we describe BluePyOpt, a Python package targeted at the broad neuroscience community to simplify this task. BluePyOpt is an extensible framework for data-driven model parameter optimisation that wraps and standardizes several existing open-source tools. It simplifies the task of creating and sharing these optimisations, and the associated techniques and knowledge. This is achieved by abstracting the optimisation and evaluation tasks into various reusable and flexible discrete elements according to established best-practices. Further, BluePyOpt provides methods for setting up both small- and large-scale optimisations on a variety of platforms, ranging from laptops to Linux clusters and cloud-based compute infrastructures. The versatility of the BluePyOpt framework is demonstrated by working through three representative neuroscience specific use cases. PMID:27375471
Cloud-based Web Services for Near-Real-Time Web access to NPP Satellite Imagery and other Data
NASA Astrophysics Data System (ADS)
Evans, J. D.; Valente, E. G.
2010-12-01
We are building a scalable, cloud computing-based infrastructure for Web access to near-real-time data products synthesized from the U.S. National Polar-Orbiting Environmental Satellite System (NPOESS) Preparatory Project (NPP) and other geospatial and meteorological data. Given recent and ongoing changes in the NPP and NPOESS programs (now Joint Polar Satellite System), the need for timely delivery of NPP data is urgent. We propose an alternative to a traditional, centralized ground segment, using distributed Direct Broadcast facilities linked to industry-standard Web services by a streamlined processing chain running in a scalable cloud computing environment. Our processing chain, currently implemented on Amazon.com's Elastic Compute Cloud (EC2), retrieves raw data from NASA's Moderate Resolution Imaging Spectroradiometer (MODIS) and synthesizes data products such as Sea-Surface Temperature, Vegetation Indices, etc. The cloud computing approach lets us grow and shrink computing resources to meet large and rapid fluctuations (twice daily) in both end-user demand and data availability from polar-orbiting sensors. Early prototypes have delivered various data products to end-users with latencies between 6 and 32 minutes. We have begun to replicate machine instances in the cloud, so as to reduce latency and maintain near-real-time data access regardless of increased data input rates or user demand -- all at quite moderate monthly costs. Our service-based approach (in which users invoke software processes on a Web-accessible server) facilitates access into datasets of arbitrary size and resolution, and allows users to request and receive tailored and composite (e.g., false-color multiband) products on demand. To facilitate broad impact and adoption of our technology, we have emphasized open, industry-standard software interfaces and open source software. Through our work, we envision the widespread establishment of similar, derived, or interoperable systems for processing and serving near-real-time data from NPP and other sensors. A scalable architecture based on cloud computing ensures cost-effective, real-time processing and delivery of NPP and other data. Access via standard Web services maximizes its interoperability and usefulness.
CCN and IN concentration measurements during the Antarctic Circumnavigation Expedition
NASA Astrophysics Data System (ADS)
Stratmann, F.; Henning, S.; Löffler, M.; Welti, A.; Hartmann, M.; Wernli, H.; Baccarini, A.; Schmale, J.
2017-12-01
Cloud condensation nuclei (CCN) and ice nuclei (IN) concentrations measured during the Antarctic Circumnavigation Expedition (ACE) within the Study of Preindustrial-like Aerosol-Climate Effects (SPACE) are presented. The measurements provide a circumpolar transect through the sub-Antarctic Ocean, where existing measurements are scarce. ACE took place during the austral summer 2016/17 and included exploration of different environments, from pristine open ocean to Antarctic islands and the southernmost ports of the three surrounding continents. CCN concentrations were measured over the entire range of expected in-cloud supersaturations, from 0.1 to 1%, using a CCNc instrument from DMT. IN concentrations were determined from filter samples at water-saturated conditions from -5°C to -25°C, covering common temperatures of mixed-phase cloud glaciation. The sensitivity of measured IN and CCN concentrations to meteorological parameters, marine biological activity and location is assessed to gain insight into potential sources of CCN and IN. Back-trajectory modelling is used to attribute regional variations to aerosol sources originating in the marine boundary layer or in long-range transport. The resulting datasets constrain CCN and IN concentrations in the marine boundary layer along the cruise track. The comprehensive set of parameters measured in parallel during ACE allows us to evaluate the contributions of local ocean-surface sources versus long-range transport to sub-Antarctic CCN and IN. The measurements can be used as input to climate models; for example, pristine sub-Antarctic conditions can provide an approximation of a pre-industrial environment.
QCloud: A cloud-based quality control system for mass spectrometry-based proteomics laboratories
Chiva, Cristina; Olivella, Roger; Borràs, Eva; Espadas, Guadalupe; Pastor, Olga; Solé, Amanda
2018-01-01
The increasing number of biomedical and translational applications in mass spectrometry-based proteomics poses new analytical challenges and raises the need for automated quality control systems. Despite previous efforts to set standard file formats, data processing workflows and key evaluation parameters for quality control, automated quality control systems are not yet widespread among proteomics laboratories, which limits the acquisition of high-quality results, inter-laboratory comparisons and the assessment of variability of instrumental platforms. Here we present QCloud, a cloud-based system to support proteomics laboratories in daily quality assessment using a user-friendly interface, easy setup, automated data processing and archiving, and unbiased instrument evaluation. QCloud supports the most common targeted and untargeted proteomics workflows, accepts data formats from different vendors, and enables the annotation of acquired data and the reporting of incidents. A complete version of the QCloud system has been successfully developed and is now open to the proteomics community (http://qcloud.crg.eu). The QCloud system is an open source project, publicly available under a Creative Commons Attribution-ShareAlike 4.0 license. PMID:29324744
NASA Astrophysics Data System (ADS)
Vijay Singh, Ran; Agilandeeswari, L.
2017-11-01
Handling the large volume of clients' data in an open cloud raises many security issues. Clients' private data should not be visible to other group members without the data owner's valid permission, and clients are sometimes prevented from accessing open cloud servers by various restrictions. To overcome these security issues and the restrictions related to storage, data sharing in an inter-domain network and privacy checking, we propose a model based on identity-based cryptography for data transmission, an intermediate entity that holds the client's identity reference and controls the handling of data transmission in an open cloud environment, and an extended remote privacy-checking technique that operates on the administrator's side. Acting under the data owner's authority, the proposed model supports secure cryptography in data transmission and remote privacy checking in private, public or instructed modes. The hardness of the Computational Diffie-Hellman assumption underlying the key exchange makes the proposed model more secure than existing models used in public cloud environments.
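For readers unfamiliar with the hardness assumption invoked above, the following toy sketch shows a plain Diffie-Hellman exchange: both parties derive the same shared key, while recovering it from the public values alone requires solving the computational Diffie-Hellman problem. The parameters are tiny and insecure, chosen only for illustration; the paper's identity-based scheme itself is not reproduced here.

import secrets

p = 0xFFFFFFFB  # small prime for illustration only; real systems use
g = 5           # standardized large primes or elliptic-curve groups

a = secrets.randbelow(p - 2) + 2   # client's secret exponent
b = secrets.randbelow(p - 2) + 2   # server's secret exponent

A = pow(g, a, p)  # public value sent to the server
B = pow(g, b, p)  # public value sent to the client

# Each side combines its own secret with the other's public value.
assert pow(B, a, p) == pow(A, b, p)
print("shared key:", pow(B, a, p))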
The Arctic Summer Cloud Ocean Study (ASCOS): overview and experimental design
NASA Astrophysics Data System (ADS)
Tjernström, M.; Leck, C.; Birch, C. E.; Bottenheim, J. W.; Brooks, B. J.; Brooks, I. M.; Bäcklin, L.; Chang, R. Y.-W.; de Leeuw, G.; Di Liberto, L.; de la Rosa, S.; Granath, E.; Graus, M.; Hansel, A.; Heintzenberg, J.; Held, A.; Hind, A.; Johnston, P.; Knulst, J.; Martin, M.; Matrai, P. A.; Mauritsen, T.; Müller, M.; Norris, S. J.; Orellana, M. V.; Orsini, D. A.; Paatero, J.; Persson, P. O. G.; Gao, Q.; Rauschenberg, C.; Ristovski, Z.; Sedlar, J.; Shupe, M. D.; Sierau, B.; Sirevaag, A.; Sjogren, S.; Stetzer, O.; Swietlicki, E.; Szczodrak, M.; Vaattovaara, P.; Wahlberg, N.; Westberg, M.; Wheeler, C. R.
2014-03-01
The climate in the Arctic is changing faster than anywhere else on Earth. Poorly understood feedback processes relating to Arctic clouds and aerosol-cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007-2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time, extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean and associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could, however, not explain observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains open. Lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can and is being used for validation of satellite retrievals, operational models, and reanalysis data sets.
Voltages induced on a power distribution line by overhead cloud lightning
NASA Technical Reports Server (NTRS)
Yacoub, Ziad; Rubinstein, Marcos; Uman, Martin A.; Thomson, Ewen M.; Medelius, Pedro J.
1991-01-01
Voltages induced by overhead cloud lightning on a 448 m open-circuited power distribution line, and the corresponding north-south component of the lightning magnetic field, were simultaneously measured at the NASA Kennedy Space Center during the summer of 1986. The incident electric field was calculated from the measured magnetic field and was then used as input to the computer program EMPLIN, which calculated the voltages at the two ends of the power line. EMPLIN models the frequency-domain field-to-power-line coupling theory found, for example, in Ianoz et al. The direction of the source, which is also one of the inputs to EMPLIN, was crudely determined from a three-station time-delay technique. The authors found reasonably good agreement between calculated and measured waveforms.
Observations of the interstellar gas with the Copernicus satellite
NASA Technical Reports Server (NTRS)
Morton, D. C.
1975-01-01
Results are reviewed for Copernicus far-UV measurements of the absorption lines of H I, D I, H2, and heavier elements in the interstellar gas. Column densities along several lines of sight, as estimated from Ly-alpha absorption-line profiles, confirm that wide differences in the gas density are present in various directions. The measurement of interstellar D I implies an open universe unless alternate sources for this nuclide are found. Analysis of reddened stars for which the line of sight passes through one or more interstellar clouds indicates a depletion of several heavy elements in the gas. It is suggested that the depleted elements may be present in grains rather than molecules and that the intercloud medium may consist primarily of H II with a few small H I clouds.
Research on Visualization of Ground Laser Radar Data Based on Osg
NASA Astrophysics Data System (ADS)
Huang, H.; Hu, C.; Zhang, F.; Xue, H.
2018-04-01
Three-dimensional (3D) laser scanning is an advanced technology integrating optical, mechanical, electronic, and computer technologies. It can scan the complete shape of spatial objects with high precision, allowing the point cloud data of a ground object to be collected directly and structured for rendering. A capable 3D rendering engine is then needed to optimize and display the 3D model, meeting the demands of real-time realistic rendering of complex scenes. OpenSceneGraph (OSG) is an open source 3D graphics engine. Compared with the current mainstream 3D rendering engines, OSG is practical, economical, and easy to extend; it is therefore widely used in virtual simulation, virtual reality, and scientific and engineering visualization. In this paper, a dynamic, interactive ground LiDAR data visualization platform is built on OSG and the cross-platform C++ application development framework Qt. For point cloud data in .txt format and triangulation network data files in .obj format, the platform implements the display of 3D laser point clouds and triangulation networks. Experiments show that the platform has strong practical value, as it is easy to operate and provides good interaction.
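As an illustration of the data-loading step such a platform performs for .txt point clouds, the sketch below parses a plain-text file with one "x y z" record per line into a NumPy array ready to hand to a rendering engine. The file layout is an assumption for the example, and the OSG/Qt rendering itself (which is C++) is not reproduced here.

import numpy as np

def load_xyz_txt(path):
    """Read an 'x y z [extras]' text file into an (N, 3) float array."""
    rows = []
    with open(path) as fh:
        for line in fh:
            parts = line.split()
            if len(parts) >= 3:           # skip blank or malformed lines
                rows.append([float(v) for v in parts[:3]])
    return np.asarray(rows)

# Example usage (assumes a hypothetical scan file exists):
# pts = load_xyz_txt("scan.txt")
# print(pts.shape, pts.min(axis=0), pts.max(axis=0))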
NASA Astrophysics Data System (ADS)
Cura, Rémi; Perret, Julien; Paparoditis, Nicolas
2017-05-01
In addition to more traditional geographical data such as images (rasters) and vectors, point cloud data are becoming increasingly available. Such data are appreciated for their precision and true three-dimensional (3D) nature. However, managing point clouds can be difficult due to scaling problems and specificities of this data type. Several methods exist but are usually fairly specialised and solve only one aspect of the management problem. In this work, we propose a comprehensive and efficient point cloud management system based on a database server that works on groups of points (patches) rather than individual points. This system is specifically designed to cover the basic needs of point cloud users: fast loading, compressed storage, powerful patch and point filtering, easy data access and exporting, and integrated processing. Moreover, the proposed system fully integrates metadata (like sensor position) and can conjointly use point clouds with other geospatial data, such as images, vectors, topology and other point clouds. Point cloud (parallel) processing can be done in-base with fast prototyping capabilities. Lastly, the system is built on open source technologies; therefore it can be easily extended and customised. We test the proposed system with several billion points obtained from Lidar (aerial and terrestrial) and stereo-vision. We demonstrate loading speeds in the ˜50 million pts/h per process range, user-transparent compression at ratios of 2:1 to 4:1 and better, patch filtering in the 0.1 to 1 s range, and output in the 0.1 million pts/s per process range, along with classical processing methods, such as object detection.
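A hedged sketch of the patch idea follows: points are grouped into fixed-size spatial cells, each patch carries a bounding box, and queries filter whole patches before touching individual points. This toy NumPy version only illustrates the principle; the system described above is database-resident and far more capable.

import numpy as np

def make_patches(points, cell=10.0):
    """Group an (N, 3) array into patches keyed by 2D grid cell."""
    keys = np.floor(points[:, :2] / cell).astype(int)
    patches = {}
    for key in map(tuple, np.unique(keys, axis=0)):
        mask = np.all(keys == key, axis=1)
        pts = points[mask]
        patches[key] = {"bbox": (pts.min(axis=0), pts.max(axis=0)), "pts": pts}
    return patches

def filter_patches(patches, xmin, xmax):
    """Cheap patch-level filter on x extent; point-level tests come later."""
    return {k: p for k, p in patches.items()
            if p["bbox"][1][0] >= xmin and p["bbox"][0][0] <= xmax}

cloud = np.random.default_rng(1).random((100_000, 3)) * 100
patches = make_patches(cloud)
print(len(patches), "patches;", len(filter_patches(patches, 0, 20)), "pass")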
Distributed MRI reconstruction using Gadgetron-based cloud computing.
Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S
2015-03-01
To expand the open source Gadgetron reconstruction framework to support distributed computing and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high temporal resolution, real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm(3) isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed computing enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
Sreedharan, Vipin T; Schultheiss, Sebastian J; Jean, Géraldine; Kahles, André; Bohnert, Regina; Drewe, Philipp; Mudrakarta, Pramod; Görnitz, Nico; Zeller, Georg; Rätsch, Gunnar
2014-05-01
We present Oqtans, an open-source workbench for quantitative transcriptome analysis, that is integrated in Galaxy. Its distinguishing features include customizable computational workflows and a modular pipeline architecture that facilitates comparative assessment of tool and data quality. Oqtans integrates an assortment of machine learning-powered tools into Galaxy, which show superior or equal performance to state-of-the-art tools. Implemented tools comprise a complete transcriptome analysis workflow: short-read alignment, transcript identification/quantification and differential expression analysis. Oqtans and Galaxy facilitate persistent storage, data exchange and documentation of intermediate results and analysis workflows. We illustrate how Oqtans aids the interpretation of data from different experiments in easy to understand use cases. Users can easily create their own workflows and extend Oqtans by integrating specific tools. Oqtans is available as (i) a cloud machine image with a demo instance at cloud.oqtans.org, (ii) a public Galaxy instance at galaxy.cbio.mskcc.org, (iii) a git repository containing all installed software (oqtans.org/git); most of which is also available from (iv) the Galaxy Toolshed and (v) a share string to use along with Galaxy CloudMan.
Enhanced K-means clustering with encryption on cloud
NASA Astrophysics Data System (ADS)
Singh, Iqjot; Dwivedi, Prerna; Gupta, Taru; Shynu, P. G.
2017-11-01
This paper addresses the problem of storing and managing big files on the cloud by implementing hashing on Hadoop in big data, while ensuring security when uploading and downloading files. Cloud computing is a term that emphasizes sharing data and facilitates the sharing of infrastructure and resources.[10] Hadoop is open source software that allows big files to be stored and managed on the cloud according to our needs. The K-means clustering algorithm partitions data by computing the distance between each data point and the cluster centroids. Hashing stores and retrieves data using hash keys; the hashing algorithm, called a hash function, maps the original data to a key that is later used to fetch the data stored under it. [17] Encryption is a process that transforms electronic data into an unreadable form known as cipher text. Decryption is the opposite process: it transforms the cipher text back into plain text that the end user can read and understand. For encryption and decryption we use a symmetric key cryptographic algorithm, specifically DES, for secure storage of the files. [3]
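For concreteness, the sketch below implements the K-means step described above: points are assigned to the nearest centroid and centroids are updated until they stop moving. The data are synthetic, and the hashing and DES encryption layers of the proposed system are not reproduced here.

import numpy as np

def kmeans(data, k, iters=100, seed=0):
    rng = np.random.default_rng(seed)
    centroids = data[rng.choice(len(data), k, replace=False)]
    for _ in range(iters):
        # Distance from every point to every centroid, then assign.
        d = np.linalg.norm(data[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # Recompute centroids; keep the old one if a cluster emptied.
        new = np.array([data[labels == i].mean(axis=0) if np.any(labels == i)
                        else centroids[i] for i in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

data = np.random.default_rng(2).random((500, 2))
labels, centroids = kmeans(data, k=3)
print(centroids)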
76 FR 62373 - Notice of Public Meeting-Cloud Computing Forum & Workshop IV
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-07
...--Cloud Computing Forum & Workshop IV AGENCY: National Institute of Standards and Technology (NIST), Commerce. ACTION: Notice. SUMMARY: NIST announces the Cloud Computing Forum & Workshop IV to be held on... to help develop open standards in interoperability, portability and security in cloud computing. This...
Open Reading Frame Phylogenetic Analysis on the Cloud
2013-01-01
Phylogenetic analysis has become essential in researching the evolutionary relationships between viruses. These relationships are depicted on phylogenetic trees, in which viruses are grouped based on sequence similarity. Viral evolutionary relationships are identified from open reading frames rather than from complete sequences. Recently, cloud computing has become popular for developing internet-based bioinformatics tools. Biocloud is an efficient, scalable, and robust bioinformatics computing service. In this paper, we propose a cloud-based open reading frame phylogenetic analysis service. The proposed service integrates the Hadoop framework, virtualization technology, and phylogenetic analysis methods to provide a high-availability, large-scale bioservice. In a case study, we analyze the phylogenetic relationships among Norovirus. Evolutionary relationships are elucidated by aligning different open reading frame sequences. The proposed platform correctly identifies the evolutionary relationships between members of Norovirus. PMID:23671843
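The sketch below illustrates the open-reading-frame extraction that such a service parallelizes: scan each forward reading frame for a start codon and report the sequence up to the next in-frame stop codon. Real pipelines also scan the reverse complement; the minimum length and the test sequence here are arbitrary choices for the example.

START, STOPS = "ATG", {"TAA", "TAG", "TGA"}

def find_orfs(seq, min_len=30):
    """Return ORFs (start codon through stop codon) in the three forward frames."""
    orfs = []
    for frame in range(3):
        i = frame
        while i <= len(seq) - 3:
            if seq[i:i+3] == START:
                for j in range(i + 3, len(seq) - 2, 3):
                    if seq[j:j+3] in STOPS:
                        if j + 3 - i >= min_len:
                            orfs.append(seq[i:j+3])
                        i = j  # resume scanning after this ORF
                        break
            i += 3
    return orfs

print(find_orfs("CCATGAAATTTGGGTAGGCATGCCCTTTAAAGGGTGACC", min_len=15))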
The Arctic Summer Cloud-Ocean Study (ASCOS): overview and experimental design
NASA Astrophysics Data System (ADS)
Tjernström, M.; Leck, C.; Birch, C. E.; Brooks, B. J.; Brooks, I. M.; Bäcklin, L.; Chang, R. Y.-W.; Granath, E.; Graus, M.; Hansel, A.; Heintzenberg, J.; Held, A.; Hind, A.; de la Rosa, S.; Johnston, P.; Knulst, J.; de Leeuw, G.; Di Liberto, L.; Martin, M.; Matrai, P. A.; Mauritsen, T.; Müller, M.; Norris, S. J.; Orellana, M. V.; Orsini, D. A.; Paatero, J.; Persson, P. O. G.; Gao, Q.; Rauschenberg, C.; Ristovski, Z.; Sedlar, J.; Shupe, M. D.; Sierau, B.; Sirevaag, A.; Sjogren, S.; Stetzer, O.; Swietlicki, E.; Szczodrak, M.; Vaattovaara, P.; Wahlberg, N.; Westberg, M.; Wheeler, C. R.
2013-05-01
The climate in the Arctic is changing faster than anywhere else on Earth. Poorly understood feedback processes relating to Arctic clouds and aerosol-cloud interactions contribute to a poor understanding of the present changes in the Arctic climate system, and also to a large spread in projections of future climate in the Arctic. The problem is exacerbated by the paucity of research-quality observations in the central Arctic. Improved formulations in climate models require such observations, which can only come from measurements in situ in this difficult-to-reach region with logistically demanding environmental conditions. The Arctic Summer Cloud-Ocean Study (ASCOS) was the most extensive central Arctic Ocean expedition with an atmospheric focus during the International Polar Year (IPY) 2007-2008. ASCOS focused on the study of the formation and life cycle of low-level Arctic clouds. ASCOS departed from Longyearbyen on Svalbard on 2 August and returned on 9 September 2008. In transit into and out of the pack ice, four short research stations were undertaken in the Fram Strait: two in open water and two in the marginal ice zone. After traversing the pack ice northward, an ice camp was set up on 12 August at 87°21' N, 01°29' W and remained in operation through 1 September, drifting with the ice. During this time extensive measurements were taken of atmospheric gas and particle chemistry and physics, mesoscale and boundary-layer meteorology, marine biology and chemistry, and upper ocean physics. ASCOS provides a unique interdisciplinary data set for development and testing of new hypotheses on cloud processes, their interactions with the sea ice and ocean and associated physical, chemical, and biological processes and interactions. For example, the first-ever quantitative observation of bubbles in Arctic leads, combined with the unique discovery of marine organic material, polymer gels with an origin in the ocean, inside cloud droplets suggests the possibility of primary marine organically derived cloud condensation nuclei in Arctic stratocumulus clouds. Direct observations of surface fluxes of aerosols could, however, not explain observed variability in aerosol concentrations, and the balance between local and remote aerosol sources remains open. Lack of cloud condensation nuclei (CCN) was at times a controlling factor in low-level cloud formation, and hence for the impact of clouds on the surface energy budget. ASCOS provided detailed measurements of the surface energy balance from late summer melt into the initial autumn freeze-up, and documented the effects of clouds and storms on the surface energy balance during this transition. In addition to such process-level studies, the unique, independent ASCOS data set can and is being used for validation of satellite retrievals, operational models, and reanalysis data sets.
NASA Astrophysics Data System (ADS)
Schwind, Michael
Structure from Motion (SfM) is a photogrammetric technique whereby three-dimensional (3D) structures are estimated from overlapping two-dimensional (2D) image sequences. It is studied in the field of computer vision and utilized in fields such as archeology, engineering, and the geosciences. Currently, many SfM software packages exist that allow for the generation of 3D point clouds. Little work has been done to show how topographic data generated by these packages differ over varying terrain types and why they might produce different results. This work aims to compare and characterize the differences between point clouds generated by three different SfM software packages: two well-known proprietary solutions (Pix4D, Agisoft PhotoScan) and one open source solution (OpenDroneMap). Five terrain types were imaged utilizing a DJI Phantom 3 Professional small unmanned aircraft system (sUAS). These terrain types include a marsh environment, a gently sloped sandy beach and jetties, a forested peninsula, a house, and a flat parking lot. Each set of imagery was processed with each software package and the results were then directly compared to each other. Before processing the sets of imagery, the software settings were analyzed and chosen to be as similar as possible across the three packages, in an attempt to minimize point cloud differences caused by dissimilar settings. The characteristics of the resultant point clouds were then compared with each other. Furthermore, a terrestrial light detection and ranging (LiDAR) survey was conducted over the flat parking lot using a Riegl VZ-400 scanner. These data served as ground truth for an accuracy assessment of the sUAS-SfM point clouds. Differences were found between the results, apparent not only in the characteristics of the clouds but also in their accuracy. This study gives users of SfM photogrammetry a better understanding of how different processing packages compare and of the inherent sensitivity of SfM automation in 3D reconstruction. Because this study used mostly default settings within the software, it would be beneficial for further research to investigate the effects that changing parameters have on the fidelity of point cloud datasets generated by different SfM software packages.
ASDC Advances in the Utilization of Microservices and Hybrid Cloud Environments
NASA Astrophysics Data System (ADS)
Baskin, W. E.; Herbert, A.; Mazaika, A.; Walter, J.
2017-12-01
The Atmospheric Science Data Center (ASDC) is transitioning many of its software tools and applications to standalone microservices deployable in a hybrid cloud, offering benefits such as scalability and efficient environment management. This presentation features several projects the ASDC staff have implemented leveraging the OpenShift Container Application Platform and OpenStack Hybrid Cloud Environment, focusing on key tools and techniques applied to: Earth Science data processing; spatial-temporal metadata generation, validation, repair, and curation; and archived data discovery, visualization, and access.
NASA Technical Reports Server (NTRS)
Meixner, Margaret; Panuzzo, P.; Roman-Duval, J.; Engelbracht, C.; Babler, B.; Seale, J.; Hony, S.; Montiel, E.; Sauvage, M.; Gordon, K.;
2013-01-01
We present an overview of the HERschel Inventory of The Agents of Galaxy Evolution (HERITAGE) in the Magellanic Clouds project, which is a Herschel Space Observatory open time key program. We mapped the Large Magellanic Cloud (LMC) and Small Magellanic Cloud (SMC) at 100, 160, 250, 350, and 500 micron with the Spectral and Photometric Imaging Receiver (SPIRE) and Photodetector Array Camera and Spectrometer (PACS) instruments on board Herschel using the SPIRE/PACS parallel mode. The overriding science goal of HERITAGE is to study the life cycle of matter as traced by dust in the LMC and SMC. The far-infrared and submillimeter emission is an effective tracer of the interstellar medium (ISM) dust, the most deeply embedded young stellar objects (YSOs), and the dust ejected by the most massive stars. We describe in detail the data processing, particularly for the PACS data, which required some custom steps because of the large angular extent of a single observational unit and the overall large amount of data to be processed as an ensemble. We report total global fluxes for the LMC and SMC and demonstrate their agreement with measurements by prior missions. The HERITAGE maps of the LMC and SMC are dominated by the ISM dust emission and bear most resemblance to the tracers of ISM gas rather than the stellar content of the galaxies. We describe the point source extraction processing and the criteria used to establish a catalog for each waveband for the HERITAGE program. The 250 micron band is the most sensitive, and the source catalogs for this band have approx. 25,000 objects for the LMC and approx. 5500 objects for the SMC. These data enable studies of ISM dust properties, submillimeter excess dust emission, dust-to-gas ratio, Class 0 YSO candidates, dusty massive evolved stars, supernova remnants (including SN1987A), H II regions, and dust evolution in the LMC and SMC. All images and catalogs are delivered to the Herschel Science Center as part of the community support aspects of the project. These HERITAGE images and catalogs provide an excellent basis for future research and follow-up with other facilities.
Agile Infrastructure Monitoring
NASA Astrophysics Data System (ADS)
Andrade, P.; Ascenso, J.; Fedorko, I.; Fiorini, B.; Paladin, M.; Pigueiras, L.; Santos, M.
2014-06-01
At the present time, data centres are facing a massive rise in virtualisation and cloud computing. The Agile Infrastructure (AI) project is working to deliver new solutions to ease the management of CERN data centres. Part of the solution consists in a new "shared monitoring architecture" which collects and manages monitoring data from all data centre resources. In this article, we present the building blocks of this new monitoring architecture, the different open source technologies selected for each architecture layer, and how we are building a community around this common effort.
NASA Astrophysics Data System (ADS)
Mitasova, H.; Hardin, E. J.; Kratochvilova, A.; Landa, M.
2012-12-01
Multitemporal data acquired by modern mapping technologies provide unique insights into processes driving land surface dynamics. These high resolution data also offer an opportunity to improve the theoretical foundations and accuracy of process-based simulations of evolving landforms. We discuss the development of a new generation of visualization and analytics tools for GRASS GIS designed for 3D multitemporal data from repeated lidar surveys and from landscape process simulations. We focus on data and simulation methods that are based on point sampling of continuous fields and lead to the representation of evolving surfaces as series of raster map layers or voxel models. For multitemporal lidar data we present workflows that combine open source point cloud processing tools with GRASS GIS and custom Python scripts to model and analyze the dynamics of coastal topography (Figure 1), and we outline the development of a coastal analysis toolbox. The simulations focus on the particle sampling method for solving continuity equations and its application to geospatial modeling of landscape processes. In addition to water and sediment transport models, already implemented in GIS, the new capabilities under development combine OpenFOAM for wind shear stress simulation with a new module for aeolian sand transport and dune evolution simulations. Comparison of observed dynamics with the results of simulations is supported by a new, integrated 2D and 3D visualization interface that provides highly interactive and intuitive access to the redesigned and enhanced visualization tools. Several case studies will be used to illustrate the presented methods and tools, demonstrate the power of workflows built with FOSS, and highlight their interoperability.
Figure 1. Isosurfaces representing the evolution of the shoreline and the z = 4.5 m contour between the years 1997-2011 at Cape Hatteras, NC, extracted from a voxel model derived from a series of lidar-based DEMs.
A Web-based Tool for Transparent, Collaborative Urban Water System Planning for Monterrey, Mexico
NASA Astrophysics Data System (ADS)
Rheinheimer, D. E.; Medellin-Azuara, J.; Garza Díaz, L. E.; Ramírez, A. I.
2017-12-01
Recent rapid advances in web technologies and cloud computing show great promise for facilitating collaboration and transparency in water planning efforts. Water resources planning increasingly takes place in the context of a rapidly urbanizing world, particularly in developing countries. In such countries with democratic traditions, the degree of transparency and collaboration in water planning can mean the difference between success and failure of water planning efforts. This is exemplified in the city of Monterrey, Mexico, where an effort to build a new long-distance aqueduct to increase the city's water supply failed dramatically due to a lack of transparency and top-down planning. To help address this, we used a new, web-based water system modeling platform, called OpenAgua, to develop a prototype decision support system for water planning in Monterrey. OpenAgua is designed to promote transparency and collaboration, as well as provide strong, cloud-based water system modeling capabilities. We developed and assessed five water management options intended to increase water supply yield and/or reliability, a dominant water management concern in Latin America generally: 1) a new long-distance source (the previously rejected project), 2) a new nearby reservoir, 3) expansion/re-operation of an existing major canal, 4) desalination, and 5) industrial water reuse. Using the integrated modeling and analytic capabilities of OpenAgua, and some customization, we assessed the performance of these options for water supply yield and reliability to help identify the most promising ones. In presenting this assessment, we demonstrate the viability of using online, cloud-based modeling systems for improving transparency and collaboration in decision making and for reducing the gap between citizens, policy makers and water managers, and we outline future directions.
Interfacing HTCondor-CE with OpenStack
NASA Astrophysics Data System (ADS)
Bockelman, B.; Caballero Bejar, J.; Hover, J.
2017-10-01
Over the past few years, Grid Computing technologies have reached a high level of maturity. One key aspect of this success has been the development and adoption of newer Compute Elements to interface the external Grid users with local batch systems. These new Compute Elements allow for better handling of jobs requirements and a more precise management of diverse local resources. However, despite this level of maturity, the Grid Computing world is lacking diversity in local execution platforms. As Grid Computing technologies have historically been driven by the needs of the High Energy Physics community, most resource providers run the platform (operating system version and architecture) that best suits the needs of their particular users. In parallel, the development of virtualization and cloud technologies has accelerated recently, making available a variety of solutions, both commercial and academic, proprietary and open source. Virtualization facilitates performing computational tasks on platforms not available at most computing sites. This work attempts to join the technologies, allowing users to interact with computing sites through one of the standard Computing Elements, HTCondor-CE, but running their jobs within VMs on a local cloud platform, OpenStack, when needed. The system will re-route, in a transparent way, end user jobs into dynamically-launched VM worker nodes when they have requirements that cannot be satisfied by the static local batch system nodes. Also, once the automated mechanisms are in place, it becomes straightforward to allow an end user to invoke a custom Virtual Machine at the site. This will allow cloud resources to be used without requiring the user to establish a separate account. Both scenarios are described in this work.
NOAA's Data Catalog and the Federal Open Data Policy
NASA Astrophysics Data System (ADS)
Wengren, M. J.; de la Beaujardiere, J.
2014-12-01
The 2013 Open Data Policy Presidential Directive requires Federal agencies to create and maintain a 'public data listing' that includes all agency data that is currently or will be made publicly available in the future. The directive requires the use of machine-readable and open formats that make use of 'common core' and extensible metadata formats according to the best practices published in an online repository called 'Project Open Data', to use open licenses where possible, and to adhere to existing metadata and other technology standards to promote interoperability. In order to meet the requirements of the Open Data Policy, the National Oceanic and Atmospheric Administration (NOAA) has implemented an online data catalog that combines metadata from all subsidiary NOAA metadata catalogs into a single master inventory. The NOAA Data Catalog is available to the public for search and discovery, providing access to the NOAA master data inventory through multiple means, including web-based text search, an OGC CS-W endpoint, and a native Application Programming Interface (API) for programmatic query. It generates on a daily basis the Project Open Data JavaScript Object Notation (JSON) file required for compliance with the Presidential directive. The Data Catalog is based on the open source Comprehensive Knowledge Archive Network (CKAN) software and runs on the Amazon Federal GeoCloud. This presentation will cover topics including mappings of existing metadata in standard formats (FGDC-CSDGM and ISO 19115 XML) to the Project Open Data JSON metadata schema, representation of metadata elements within the catalog, and compatible metadata sources used to feed the catalog, including Web Accessible Folder (WAF), Catalog Services for the Web (CS-W), and Esri ArcGIS.com. It will also discuss related open source technologies that can be used together to build a spatial data infrastructure compliant with the Open Data Policy.
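For readers unfamiliar with the 'public data listing' format mentioned above, the sketch below builds a single dataset entry following the Project Open Data JSON ('data.json') schema. The field names follow the published schema; the values are invented placeholders, not a real NOAA record.

import json

# Hypothetical dataset entry; every value here is a placeholder.
entry = {
    "title": "Example Sea Surface Temperature Analysis",
    "description": "Placeholder description of a gridded SST product.",
    "keyword": ["oceans", "sea surface temperature"],
    "modified": "2014-01-01",
    "publisher": {"name": "National Oceanic and Atmospheric Administration"},
    "contactPoint": {"fn": "Example Curator",
                     "hasEmail": "mailto:curator@example.gov"},
    "identifier": "gov.noaa.example:sst-analysis",
    "accessLevel": "public",
}

# The catalog would emit one such entry per dataset in its daily data.json.
print(json.dumps({"dataset": [entry]}, indent=2))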
The Community Intercomparison Suite (CIS)
NASA Astrophysics Data System (ADS)
Watson-Parris, Duncan; Schutgens, Nick; Cook, Nick; Kipling, Zak; Kershaw, Phil; Gryspeerdt, Ed; Lawrence, Bryan; Stier, Philip
2017-04-01
Earth observations (both remote and in-situ) create vast amounts of data providing invaluable constraints for the climate science community. Efficient exploitation of these complex and highly heterogeneous datasets has been limited, however, by the lack of suitable software tools, particularly for comparison of gridded and ungridded data, thus reducing scientific productivity. CIS (http://cistools.net) is an open-source, command line tool and Python library which allows the straightforward quantitative analysis, intercomparison and visualisation of remote sensing, in-situ and model data. CIS can read gridded and ungridded remote sensing, in-situ and model data, and many other data sources, 'out-of-the-box', such as the ESA Aerosol and Cloud CCI products, MODIS, CloudSat and AERONET. Perhaps most importantly, however, CIS also employs a modular plugin architecture to allow for the reading of limitless different data types. Users are able to write their own plugins for reading the data sources with which they are familiar, and share them within the community, allowing all to benefit from their expertise. To enable the intercomparison of these data, CIS provides a number of operations including: the aggregation of ungridded and gridded datasets to coarser representations using a number of different built-in averaging kernels; the subsetting of data to reduce its extent or dimensionality; the co-location of two distinct datasets onto a single set of co-ordinates; the visualisation of the input or output data through a number of different plots and graphs; the evaluation of arbitrary mathematical expressions against any number of datasets; and a number of other supporting functions such as a statistical comparison of two co-located datasets. These operations can be performed efficiently on local machines or large computing clusters, and CIS is already available on the JASMIN computing facility. A case study using the GASSP collection of in-situ aerosol observations will demonstrate the power of using CIS to perform model evaluations. The use of an open-source, community-developed tool in this way opens up a huge amount of data which would previously have been inaccessible to many users, while also providing replicable, repeatable analysis which scientists and policy-makers alike can trust and understand.
Tracking Clouds with low cost GNSS chips aided by the Arduino platform
NASA Astrophysics Data System (ADS)
Hameed, Saji; Realini, Eugenio; Ishida, Shinya
2016-04-01
The Global Navigation Satellite System (GNSS) is a constellation of satellites that is used to provide geo-positioning services. Beyond positioning, the GNSS system is important for a wide range of scientific and civilian applications: GNSS receivers are routinely used in civilian applications such as surveying and in scientific applications such as the study of crustal deformation. Another important scientific application of GNSS is in meteorological research, where it is mainly used to determine the total water vapour content of the troposphere, hereafter Precipitable Water Vapor (PWV). However, both research-grade GNSS receivers and software carry prohibitively high prices for a variety of reasons. To overcome this somewhat artificial barrier we are exploring the use of low-cost GNSS receivers along with open source GNSS software for scientific research, in particular for GNSS meteorology research. To achieve this aim, we have developed a custom Arduino-compatible data logging board that operates together with a specific low-cost single-frequency GNSS receiver chip from NVS Technologies AG. We have also developed an open-source software bundle that includes a new Arduino core for the Atmel ATmega324P chip, which is the main processor used in our custom logger, as well as software for data collection, logging and parsing of the GNSS data stream. Additionally, we have comprehensively evaluated the low-power characteristics of the GNSS receiver and logger boards. We are currently exploring several open source or free-for-research software packages to map GNSS delays to PWV, including the open source goGPS (http://www.gogps-project.org/) and gLAB (http://gage.upc.edu/gLAB) and the openly available GAMIT software from the Massachusetts Institute of Technology (MIT). We note that all the firmware and software developed as part of this project is available under an open source license.
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because network resources are shared, there is no guarantee of available bandwidth for each experiment, which can cause link congestion. On the other side, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be provisioned elastically, which becomes the bottleneck restricting the flexible application of cloud computing. To solve the above problems, we propose an elastic cloud data center network architecture based on SDN, and we design a high performance controller cluster based on OpenDaylight. Finally, we present our current test results.
Trusted computing strengthens cloud authentication.
Ghazizadeh, Eghbal; Zamani, Mazdak; Ab Manan, Jamalul-lail; Alizadeh, Mojtaba
2014-01-01
Cloud computing is a new generation of technology which is designed to provide the commercial necessities, solve the IT management issues, and run the appropriate applications. Another entry on the list of cloud functions which has traditionally been handled internally is Identity Access Management (IAM). Companies encounter IAM as a security challenge as they adopt more cloud technologies. Trust multi-tenancy and trusted computing based on a Trusted Platform Module (TPM) are great technologies for solving the trust and security concerns in the cloud identity environment. Single sign-on (SSO) and OpenID have been released to solve security and privacy problems for cloud identity. This paper proposes the use of trusted computing, Federated Identity Management, and OpenID Web SSO to solve identity theft in the cloud. The proposed model has been simulated in a .Net environment. Security analysis, simulation, and the BLP confidentiality model are the three ways used to evaluate and analyze the proposed model. PMID:24701149
An open science cloud for scientific research
NASA Astrophysics Data System (ADS)
Jones, Bob
2016-04-01
The Helix Nebula initiative was presented at EGU 2013 (http://meetingorganizer.copernicus.org/EGU2013/EGU2013-1510-2.pdf) and has continued to expand with more research organisations, providers and services. The hybrid cloud model deployed by Helix Nebula has grown to become a viable approach for provisioning ICT services for research communities from both public and commercial service providers (http://dx.doi.org/10.5281/zenodo.16001). The relevance of this approach for all those communities facing societal challenges is explained in a recent EIROforum publication (http://dx.doi.org/10.5281/zenodo.34264). This presentation will describe how this model brings together a range of stakeholders to implement a common platform for data intensive services that builds upon existing publicly funded e-infrastructures and commercial cloud services to promote open science. It explores the essential characteristics of a European Open Science Cloud if it is to address the big data needs of the latest generation of Research Infrastructures. The high-level architecture and key services, as well as the role of standards, are described, together with a governance and financial model and the roles of the stakeholders, including commercial service providers and downstream business sectors, that will ensure a European Open Science Cloud can innovate, grow and be sustained beyond the current project cycles.
Turbulent aerosol fluxes over the Arctic Ocean: 2. Wind-driven sources from the sea
NASA Astrophysics Data System (ADS)
Nilsson, E. D.; Rannik, Ü.; Swietlicki, E.; Leck, C.; Aalto, P. P.; Zhou, J.; Norman, M.
2001-12-01
An eddy-covariance flux system was successfully applied over open sea, leads and ice floes during the Arctic Ocean Expedition in July-August 1996. Wind-driven upward aerosol number fluxes were observed over open sea and leads in the pack ice. These particles must originate from droplets ejected into the air by the bursting of small air bubbles at the water surface. The source flux F (in 10^6 m^-2 s^-1) had a strong dependency on wind speed, log(F) = 0.20U - 1.71 over the open sea and log(F) = 0.11U - 1.93 over the leads (where U is the local wind speed at about 10 m height). Over the open sea the wind-driven aerosol source flux consisted of a film drop mode centered at ~100 nm diameter and a jet drop mode centered at ~1 μm diameter. Over the leads in the pack ice, a jet drop mode at ~2 μm diameter dominated. The jet drop mode consisted of sea salt, but oxalate indicated an organic contribution, and bacteria and other biogenic particles were identified by single particle analysis. Particles with diameters less than ~100 nm appear to have contributed to the flux, but their chemical composition is unknown. Whitecaps were probably the bubble source at open sea and on the leads at high wind speed, but a different bubble source is needed in the leads owing to their small fetch; melting of ice in the leads is probably the best candidate. The flux over the open sea was of such a magnitude that it could give a significant contribution to the cloud condensation nuclei (CCN) population. Although the flux from the leads was roughly an order of magnitude smaller and the leads cover only a small fraction of the pack ice, the local source may still be important for the CCN population in Arctic fogs. The primary marine aerosol source will increase both with increased wind speed and with decreased ice fraction and extent. The local CCN production may therefore increase and influence cloud or fog albedo and lifetime in response to greenhouse warming in the Arctic Ocean region.
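For concreteness, the wind-speed regressions quoted above can be evaluated directly; the short sketch below (illustrative only, using the slopes and intercepts from the abstract) converts a 10 m wind speed into the implied number source flux.

```python
# Illustrative evaluation of the wind-speed regressions quoted in the abstract:
# log(F) = 0.20*U - 1.71 (open sea) and log(F) = 0.11*U - 1.93 (leads),
# with F in units of 1e6 particles m^-2 s^-1 and U the ~10 m wind speed in m/s.
def aerosol_source_flux(u10_ms: float, surface: str = "open_sea") -> float:
    slope, intercept = {"open_sea": (0.20, -1.71), "leads": (0.11, -1.93)}[surface]
    return 10.0 ** (slope * u10_ms + intercept) * 1e6  # particles m^-2 s^-1

for u in (5.0, 10.0, 15.0):
    print(f"U = {u:4.1f} m/s: open sea {aerosol_source_flux(u):.2e}, "
          f"leads {aerosol_source_flux(u, 'leads'):.2e} m^-2 s^-1")
```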
Science Gateways, Scientific Workflows and Open Community Software
NASA Astrophysics Data System (ADS)
Pierce, M. E.; Marru, S.
2014-12-01
Science gateways and scientific workflows occupy different ends of the spectrum of user-focused cyberinfrastructure. Gateways, sometimes called science portals, provide a way to enable large numbers of users to take advantage of advanced computing resources (supercomputers, advanced storage systems, science clouds) through Web and desktop interfaces and supporting services. Scientific workflows, at the other end of the spectrum, support advanced usage of cyberinfrastructure by "power users" who undertake computational experiments that are not easily done through the usual mechanisms (managing simulations across multiple sites, for example). Despite these different target communities, gateways and workflows share many similarities and can potentially be accommodated by the same software system. For example, pipelines to process InSAR imagery sets or to data-mine GPS time series are workflows. The results and the ability to make downstream products may be made available through a gateway, and power users may want to provide their own custom pipelines. In this abstract, we discuss our efforts to build an open source software system, Apache Airavata, that can accommodate both gateway and workflow use cases. Our approach is general, and we have applied the software to problems in a number of scientific domains. In this talk, we discuss usage scenarios specific to earth science, focusing on earthquake physics examples drawn from the QuakeSim.org and GeoGateway.org efforts. We also examine the role of the Apache Software Foundation's open community model as a way to build up common community codes that do not depend upon a single "owner" to sustain them. Pushing beyond open source software, we also see the need to provide gateways and workflow systems as cloud services. These services centralize operations, provide well-defined programming interfaces, scale elastically, and have global-scale fault tolerance. We discuss our work providing Apache Airavata as a hosted service with these features.
Bionimbus: a cloud for managing, analyzing and sharing large genomics datasets.
Heath, Allison P; Greenway, Matthew; Powell, Raymond; Spring, Jonathan; Suarez, Rafael; Hanley, David; Bandlamudi, Chai; McNerney, Megan E; White, Kevin P; Grossman, Robert L
2014-01-01
As large genomics and phenotypic datasets become more common, it is increasingly difficult for most researchers to access, manage, and analyze them. One possible approach is to provide the research community with several petabyte-scale cloud-based computing platforms containing these data, along with tools and resources to analyze them. Bionimbus is an open source cloud-computing platform based primarily upon OpenStack, which manages on-demand virtual machines that provide the required computational resources, and GlusterFS, a high-performance clustered file system. Bionimbus also includes Tukey, a portal and associated middleware that provides a single point of entry and single sign-on for the various Bionimbus resources, and Yates, which automates the installation, configuration, and maintenance of the required software infrastructure. Bionimbus is used by a variety of projects to process genomics and phenotypic data. For example, it is used by an acute myeloid leukemia resequencing project at the University of Chicago. The project requires several computational pipelines, including pipelines for quality control, alignment, variant calling, and annotation. For each sample, the alignment step requires eight CPUs for about 12 h, and BAM file sizes ranged from 5 GB to 10 GB per sample. Most members of the research community have difficulty downloading large genomics datasets and obtaining sufficient storage and computing resources to manage and analyze the data. Cloud computing platforms such as Bionimbus, with data commons that contain large genomics datasets, are one choice for broadening access to research data in genomics.
Optical instruments synergy in determination of optical depth of thin clouds
NASA Astrophysics Data System (ADS)
Viviana Vlăduţescu, Daniela; Schwartz, Stephen E.; Huang, Dong
2018-04-01
Optically thin clouds have a strong radiative effect and need to be represented accurately in climate models. Cloud optical depth of thin clouds was retrieved using high resolution digital photography, lidar, and a radiative transfer model. The Doppler Lidar was operated at 1.5 μm, minimizing return from Rayleigh scattering, emphasizing return from aerosols and clouds. This approach examined cloud structure on scales 3 to 5 orders of magnitude finer than satellite products, opening new avenues for examination of cloud structure and evolution.
Services for domain specific developments in the Cloud
NASA Astrophysics Data System (ADS)
Schwichtenberg, Horst; Gemuend, André
2015-04-01
We will discuss and demonstrate the possibilities of new Cloud services in which the complete development cycle, from programming to testing, takes place in the Cloud. This can also be combined with dedicated, research-domain-specific services that hide the burden of accessing the available infrastructures. As an example, we will show a service intended to complement the VERCE project's infrastructure: a service that utilizes Cloud resources to offer simplified execution of data pre- and post-processing scripts. It offers users access to the ObsPy seismological toolbox for processing data with the Python programming language, executed on virtual Cloud resources in a secured sandbox. The solution encompasses a frontend with a modern graphical user interface, a messaging infrastructure, and Python worker nodes for background processing. All components are deployable in the Cloud and have been tested on different environments based on OpenStack and OpenNebula. Deployments on commercial, public Clouds will be tested in the future.
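To give a flavor of the kind of ObsPy processing such a sandboxed worker might execute (a minimal sketch; the service's actual scripts are not shown in the abstract), the snippet below runs entirely on ObsPy's bundled example waveform.

```python
# Minimal ObsPy processing sketch of the kind a Cloud worker node might run.
# ObsPy ships an example waveform, so this runs without external data.
from obspy import read

st = read()                                       # load the bundled example stream
st.detrend("linear")                              # remove a linear trend
st.filter("bandpass", freqmin=1.0, freqmax=10.0)  # band-pass filter in Hz
print(st)
for tr in st:
    print(tr.id, "max amplitude:", abs(tr.data).max())
```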
Plenario: An Open Data Discovery and Exploration Platform for Urban Science
DOE Office of Scientific and Technical Information (OSTI.GOV)
Catlett, Charlie; Malik, Tanu; Goldstein, Brett J.
2014-12-01
The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes, and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that finding and applying the data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automate time-consuming tasks associated with the discovery, exploration, and application of open city data, and, in so doing, to reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
Open-cell cloud formation over the Bahamas
NASA Technical Reports Server (NTRS)
2002-01-01
What atmospheric scientists refer to as open cell cloud formation is a regular occurrence on the back side of a low-pressure system or cyclone in the mid-latitudes. In the Northern Hemisphere, a low-pressure system will draw in surrounding air and spin it counterclockwise. That means that on the back side of the low-pressure center, cold air will be drawn in from the north, and on the front side, warm air will be drawn up from latitudes closer to the equator. This movement of an air mass is called advection, and when cold air advection occurs over warmer waters, open cell cloud formations often result. This MODIS image shows open cell cloud formation over the Atlantic Ocean off the southeast coast of the United States on February 19, 2002. This particular formation is the result of a low-pressure system sitting out in the North Atlantic Ocean a few hundred miles east of Massachusetts. (The low can be seen as the comma-shaped figure in the GOES-8 Infrared image from February 19, 2002.) Cold air is being drawn down from the north on the western side of the low and the open cell cumulus clouds begin to form as the cold air passes over the warmer Caribbean waters. For another look at the scene, check out the MODIS Direct Broadcast Image from the University of Wisconsin. Image courtesy Jacques Descloitres, MODIS Land Rapid Response Team at NASA GSFC
NASA Astrophysics Data System (ADS)
Schäfer, M.; Bierwirth, E.; Ehrlich, A.; Jäkel, E.; Wendisch, M.
2015-01-01
Based on airborne spectral imaging observations, three-dimensional (3-D) radiative effects between Arctic boundary layer clouds and ice floes have been identified and quantified. A method is presented to discriminate sea ice and open water under clouds from imaging radiance measurements. This separation simultaneously reveals that under clouds the transition of radiance between open water and sea ice is not instantaneous but horizontally smoothed. In general, clouds reduce the nadir radiance above bright surfaces in the vicinity of sea ice - open water boundaries, while the nadir radiance above dark surfaces is enhanced compared to situations with clouds located above horizontally homogeneous surfaces. With the help of the observations and 3-D radiative transfer simulations, this effect was quantified to range between 0 and 2200 m distance from the sea ice edge. The affected distance ΔL was found to depend on both cloud and sea ice properties. For a low-level cloud at 0-200 m altitude, increasing the cloud optical thickness from τ = 1 to τ = 10 decreases ΔL from 600 to 250 m, while increasing cloud base altitude or cloud geometrical thickness can increase ΔL; ΔL(τ = 1/10) = 2200 m/1250 m for a cloud at 500-1000 m altitude. To quantify the effect for different shapes and sizes of the ice floes, various albedo fields (infinite straight ice edge, circles, squares, realistic ice floe field) were modelled. Simulations show that ΔL increases with the radius of the ice floe and, for sizes larger than 6 km (500-1000 m cloud altitude), asymptotically reaches the maximum value, which corresponds to an infinite straight ice edge. Furthermore, the impact of these 3-D radiative effects on the retrieval of cloud optical properties was investigated. The enhanced brightness of a dark pixel next to an ice edge results in uncertainties of up to 90 and 30% in retrievals of cloud optical thickness and effective radius reff, respectively. With the help of ΔL quantified here, an estimate is given of the distance to the ice edge beyond which the retrieval errors are negligible.
Depth of a strong jovian jet from a planetary-scale disturbance driven by storms.
Sánchez-Lavega, A; Orton, G S; Hueso, R; García-Melendo, E; Pérez-Hoyos, S; Simon-Miller, A; Rojas, J F; Gómez, J M; Yanamandra-Fisher, P; Fletcher, L; Joels, J; Kemerer, J; Hora, J; Karkoschka, E; de Pater, I; Wong, M H; Marcus, P S; Pinilla-Alonso, N; Carvalho, F; Go, C; Parker, D; Salway, M; Valimberti, M; Wesley, A; Pujic, Z
2008-01-24
The atmospheres of the gas giant planets (Jupiter and Saturn) contain jets that dominate the circulation at visible levels. The power source for these jets (solar radiation, internal heat, or both) and their vertical structure below the upper cloud are major open questions in the atmospheric circulation and meteorology of giant planets. Several observations and in situ measurements found intense winds at a depth of 24 bar, which has been interpreted as supporting an internal heat source. This issue remains controversial, in part because of effects from the local meteorology. Here we report observations and modelling of two plumes in Jupiter's atmosphere that erupted at the same latitude as the strongest jet (23 degrees N). The plumes reached a height of 30 km above the surrounding clouds, moved faster than any other feature (169 m s^-1), and left in their wake a turbulent planetary-scale disturbance containing red aerosols. On the basis of dynamical modelling, we conclude that the data are consistent only with a wind that extends well below the level where solar radiation is deposited.
McIDAS-V: Advanced Visualization for 3D Remote Sensing Data
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2010-12-01
McIDAS-V is a Java-based, open-source, freely available software package for the analysis and visualization of geophysical data. Its advanced capabilities provide highly interactive 4-D displays, including 3D volumetric rendering and fast sub-manifold slicing, linked to an abstract mathematical data model with built-in metadata for units, coordinate system transforms and sampling topology. A Jython interface provides user-defined analysis and computation in terms of the internal data model. These powerful capabilities to integrate data, analysis and visualization are being applied to hyperspectral sounding retrievals, e.g., AIRS and IASI, of moisture and cloud density, to interrogate and analyze their 3D structure as well as to validate them against instruments such as CALIPSO, CloudSat and MODIS. The object-oriented framework design allows for specialized extensions for novel displays and new sources of data. Community-defined CF conventions for gridded data are understood by the software, so compliant data can be immediately imported into the application. This presentation will show examples of how McIDAS-V is used in 3-dimensional data analysis, display and evaluation.
H2 emission as a tracer of molecular hydrogen: Large-scale observations of Orion
NASA Technical Reports Server (NTRS)
Luhman, M. L.; Jaffe, D. T.; Keller, L. D.; Pak, Soojong
1994-01-01
We have detected extremely extended (greater than 1.5 deg, or 12 pc) near-infrared H2 line emission from the Orion A molecular cloud. We have mapped emission in the 1.601 μm v = 6-4 Q(1) and 2.121 μm v = 1-0 S(1) lines of H2 along an approximately 2 deg R.A. cut and from a 6' x 6' region near θ¹ Ori C. The surface brightness of the extended H2 line emission is 10^-6 to 10^-5 erg s^-1 cm^-2 sr^-1. Based on the distribution and relative strengths of the H2 lines, we conclude that UV fluorescence is most likely the dominant H2 emission mechanism in the outer parts of the Orion cloud. Shock-heated gas does not make a major contribution to the H2 emission in this region. The fluorescent component of the total H2 v = 1-0 S(1) luminosity from Orion is 30-40 solar luminosities. Molecular hydrogen excited by UV radiation from nearby OB stars contributes 98%-99% of the global H2 line emission from the Orion molecular cloud, even though this cloud has a powerful shock-excited H2 source in its core. The ability to detect large-scale H2 directly opens up new possibilities for the study of molecular clouds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nagler, Robert; Moeller, Paul
Sirepo is an open source framework for cloud computing. The graphical user interface (GUI) for Sirepo, also known as the client, executes in any HTML5-compliant web browser on any computing platform, including tablets. The client is built in JavaScript, making use of the following open source libraries: Bootstrap, which is fundamental for cross-platform web applications; AngularJS, which provides a model-view-controller (MVC) architecture and GUI components; and D3.js, which provides interactive plots and data-driven transformations. The Sirepo server is built on the following Python technologies: Flask, which is a lightweight framework for web development; Jinja, which is a secure and widely used templating language; and Werkzeug, a utility library that is compliant with the WSGI standard. We use Nginx as the HTTP server and proxy, which provides a scalable event-driven architecture. The physics codes supported by Sirepo execute inside a Docker container. One of the codes supported by Sirepo is Warp, a particle-in-cell (PIC) code designed to simulate high-intensity charged particle beams and plasmas in both the electrostatic and electromagnetic regimes, with a wide variety of integrated physics models and diagnostics. At present, Sirepo supports a small subset of Warp's capabilities. Warp is open source and is part of the Berkeley Lab Accelerator Simulation Toolkit.
On the Interaction between Marine Boundary Layer Cellular Cloudiness and Surface Heat Fluxes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kazil, J.; Feingold, G.; Wang, Hailong
2014-01-02
The interaction between marine boundary layer cellular cloudiness and surface fluxes of sensible and latent heat is investigated. The investigation focuses on the non-precipitating closed-cell state and the precipitating open-cell state at low geostrophic wind speed. The Advanced Research WRF model is used to conduct cloud-system-resolving simulations with interactive surface fluxes of sensible heat, latent heat, and sea salt aerosol, and with a detailed representation of the interaction between aerosol particles and clouds. The mechanisms responsible for the temporal evolution and spatial distribution of the surface heat fluxes in the closed- and open-cell states are investigated and explained. It is found that the horizontal spatial structure of the closed-cell state determines, by entrainment of dry free-tropospheric air, the spatial distribution of surface air temperature and water vapor, and, to a lesser degree, of the surface sensible and latent heat flux. The synchronized dynamics of the open-cell state drives oscillations in surface air temperature, water vapor, and the surface fluxes of sensible heat, latent heat, and sea salt aerosol. Open-cell cloud formation, cloud optical depth and liquid water path, and cloud and rain water path are identified as good predictors of the spatial distribution of surface air temperature and sensible heat flux, but not of surface water vapor and latent heat flux. It is shown that by enhancing the surface sensible heat flux, the open-cell state creates conditions by which it is maintained. While the open-cell state under consideration is not depleted in aerosol, and is insensitive to variations in sea salt fluxes, it also enhances the sea salt flux relative to the closed-cell state. In aerosol-depleted conditions, this enhancement may replenish the aerosol needed for cloud formation, and hence contribute to the perpetuation of the open-cell state as well. Spatial homogenization of the surface fluxes is found to have only a small effect on cloud properties in the investigated cases. This indicates that sub-grid-scale spatial variability in the surface fluxes of sensible and latent heat and of sea salt aerosol may not be required in large-scale and global models to describe marine boundary layer cellular cloudiness.
Construction and application of Red5 cluster based on OpenStack
NASA Astrophysics Data System (ADS)
Wang, Jiaqing; Song, Jianxin
2017-08-01
With the application and development of cloud computing technology in various fields, the resource utilization rate of data centers has improved markedly, and systems based on cloud computing platforms have gained in scalability and stability. Deployed in the traditional way, Red5 cluster resource utilization is low and system stability is poor. This paper uses the efficient resource allocation of cloud computing to build a Red5 server cluster based on OpenStack, to which multimedia applications can be published. The system achieves flexible provisioning of computing resources and also greatly improves the stability and service efficiency of the cluster.
NASA Astrophysics Data System (ADS)
Penasa, Luca; Franceschi, Marco; Preto, Nereo; Girardeau-Montaut, Daniel
2015-04-01
Three-dimensional Virtual Outcrop Models (VOMs), often produced using terrestrial laser scanning or photogrammetry, have become popular in the Geosciences. The main feature of a VOM is that it allows for a quantification of the 3D geometry and/or distribution of geologic features that range from rock properties to structural elements. This has generated much of the interest in VOMs by the oil and gas industry. The potential importance of VOMs in stratigraphy, however, does not seem to have been fully explored yet. Outcrops are the primary sources of data for a number of stratigraphic studies (e.g. palaeontology, sedimentology, cyclostratigraphy, geochemistry...). All the observations are typically reported on stratigraphic logs, which constitute an idealized representation of the stratigraphic series, drawn by the researcher on the basis of the features that have to be highlighted. The observations are localized by means of manual measurements, and a certain amount of subjectivity in log drawing is involved. These facts can prevent the log from being properly pinned to the real outcrop. Moreover, the integration of stratigraphic logs made by different researchers studying the same outcrop may be difficult. The exposure conditions of outcrops can change through time, to the point that they can become inaccessible or even be destroyed. In such a case, linking the stratigraphic log to its physical counterpart becomes impossible. This can be particularly relevant when a classical outcrop or even a GSSP is considered. A VOM can help tackle these issues by providing a more objective stratigraphic reference for measurements and by preserving an outcrop through time as a visual representation, thus permitting reference to, and accurate comparison between, observations made through time. Finally, a VOM itself may contain relevant stratigraphic information (e.g. scalar fields associated with the point cloud such as intensity, RGB data or hyperspectral information from passive remote sensing devices). This information needs to be merged with geological data collected in the field in a consistent and reproducible way. We present Vombat, a proof-of-concept open-source software package that illustrates some of the possibilities in terms of storage, visualization and exploitation of outcrop stratigraphic information. Our solution integrates with CloudCompare, a software package for visualizing and editing point clouds. A dedicated algorithm estimates stratigraphic attitudes from point cloud data, without the need for exposed planar bedding surfaces. These attitudes can be used to define a virtual stratigraphic section, and composite sections can be realized by defining stratigraphic constraints between different reference frames. Any observation can be displayed in a stratigraphic framework that is directly generated from a VOM. The virtual outcrop, the samples and the stratigraphic reference frames can be saved to an XML file. In the future, the adoption of a standard format (e.g. GeoSciML) will permit easier exchange of stratigraphic data among researchers. The software constitutes a first step towards the full exploitation of VOMs in stratigraphy; it is open source and is hosted at http://github.com/luca-penasa/vombat. Comments and suggestions are most welcome and will help focus and refine the software and its tools.
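The attitude-estimation algorithm itself is not given in the abstract; as a hedged sketch of the underlying idea only, a best-fit plane through a patch of bedding-parallel points can be obtained from an SVD of the centered coordinates, and its normal converted to a dip direction and dip angle. This is a generic least-squares plane fit, not Vombat's actual algorithm.

```python
# Hedged sketch: estimate a bedding attitude (dip direction, dip) from a
# patch of points assumed to lie on a bedding-parallel surface. Generic
# least-squares plane fit, not Vombat's actual algorithm.
import numpy as np

def plane_attitude(points: np.ndarray) -> tuple:
    """points: (N, 3) array of x, y, z. Returns (dip_direction_deg, dip_deg)."""
    centered = points - points.mean(axis=0)
    # The right-singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    n = vt[-1]
    if n[2] < 0:                                   # make the normal point upward
        n = -n
    dip = np.degrees(np.arccos(n[2]))              # angle from vertical
    dip_dir = np.degrees(np.arctan2(n[0], n[1])) % 360.0  # azimuth (x=E, y=N)
    return dip_dir, dip

# Synthetic test: a plane dipping 30 degrees towards azimuth 090 (east).
x, y = np.meshgrid(np.linspace(0, 10, 20), np.linspace(0, 10, 20))
z = -np.tan(np.radians(30.0)) * x
pts = np.column_stack([x.ravel(), y.ravel(), z.ravel()])
print(plane_attitude(pts))  # approx (90.0, 30.0)
```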
NASA Astrophysics Data System (ADS)
Grochocka, M.
2013-12-01
Mobile laser scanning is a dynamically developing measurement technology which is becoming increasingly widespread in the acquisition of three-dimensional spatial information. Continuous technical progress, based on new tools and the development of the technology, and thus on better use of existing resources, reveals new horizons for the extensive use of MLS technology. Mobile laser scanning systems are usually used for mapping linear objects, and in particular for inventories of roads, railways, bridges, shorelines, shafts, tunnels, and even geometrically complex urban spaces. The measurement is done from the perspective from which the object is used, yet does not interfere with movement or ongoing work. This paper presents the initial results of the segmentation of data acquired by MLS. The data used in this work were obtained as part of an inventory measurement of railway line infrastructure. The point clouds were measured using profile scanners installed on a railway platform. To process the data, the tools of the open source Point Cloud Library (PCL) were used; these tools take the form of templated programming libraries. PCL is an open, independent, large-scale project for 2D/3D image and point cloud processing. The PCL software is released under the terms of the BSD license (Berkeley Software Distribution License), which means it is free for commercial and research use. The article presents a number of issues related to the use of this software and its capabilities. Data segmentation is based on the pcl_segmentation template library, which contains segmentation algorithms for separating clusters. These algorithms are best suited to processing point clouds consisting of a number of spatially isolated regions. The template library performs cluster extraction based on model fitting with the sample consensus method for various parametric models (planes, cylinders, spheres, lines, etc.). Most of the mathematical operations are carried out on the basis of the Eigen library, a set of templates for linear algebra.
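PCL's cluster extraction is C++; as a rough Python analogue of the Euclidean clustering idea described above (an illustrative sketch, not the paper's pipeline), spatially isolated regions can be separated with a density-based clustering of the point coordinates.

```python
# Rough Python analogue of Euclidean cluster extraction on a point cloud:
# points closer than `eps` are chained into the same spatially isolated cluster.
import numpy as np
from sklearn.cluster import DBSCAN

rng = np.random.default_rng(0)
# Two synthetic, spatially isolated blobs standing in for trackside objects.
cloud = np.vstack([
    rng.normal(loc=(0.0, 0.0, 0.0), scale=0.2, size=(500, 3)),
    rng.normal(loc=(5.0, 0.0, 0.0), scale=0.2, size=(500, 3)),
])

labels = DBSCAN(eps=0.3, min_samples=10).fit_predict(cloud)  # -1 marks noise
for k in sorted(set(labels) - {-1}):
    pts = cloud[labels == k]
    print(f"cluster {k}: {len(pts)} points, centroid {pts.mean(axis=0).round(2)}")
```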
dCache, Sync-and-Share for Big Data
NASA Astrophysics Data System (ADS)
Millar, AP; Fuhrmann, P.; Mkrtchyan, T.; Behrmann, G.; Bernardt, C.; Buchholz, Q.; Guelzow, V.; Litvintsev, D.; Schwank, K.; Rossi, A.; van der Reest, P.
2015-12-01
The availability of cheap, easy-to-use sync-and-share cloud services has split the scientific storage world into the traditional big data management systems and the very attractive sync-and-share services. With the former, the location of data is well understood, while the latter are mostly operated in the Cloud, resulting in a rather complex legal situation. Besides legal issues, these two worlds have little overlap in user authentication and access protocols. While traditional storage technologies popular in HEP are based on X.509, cloud services and sync-and-share software technologies are generally based on username/password authentication or mechanisms like SAML or OpenID Connect. Similarly, the data access models offered by the two are somewhat different, with sync-and-share services often using proprietary protocols. As both approaches are very attractive, dCache.org developed a hybrid system providing the best of both worlds. To avoid reinventing the wheel, dCache.org decided to embed another open source project: OwnCloud. This offers the required modern access capabilities but does not support the managed data functionality needed for large-capacity data storage. With this hybrid system, scientists can share files and synchronize their data with laptops or mobile devices as easily as with any other cloud storage service. On top of this, the same data can be accessed via established mechanisms, like GridFTP to serve the Globus Transfer Service or the WLCG FTS3 tool, or the data can be made available to worker nodes or HPC applications via a mounted filesystem. As dCache provides a flexible authentication module, the same user can access its storage via different authentication mechanisms, e.g., X.509 and SAML. Additionally, users can specify the desired quality of service or trigger media transitions as necessary, thus tuning data access latency to the planned access profile. Such features are a natural consequence of using dCache. We will describe the design of the hybrid dCache/OwnCloud system, report on several months of operational experience running it at DESY, and elucidate the future roadmap.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rosenfeld, Daniel; Wang, Hailong; Rasch, Philip J.
Numerical simulations described in previous studies showed that adding cloud condensation nuclei to marine stratocumulus can prevent their breakup from closed into open cells. Additional analyses of the same simulations show that the suppression of rain is well described in terms of the cloud drop effective radius (re). Rain is initiated when re near cloud top is around 12-14 μm. Cloud water starts to get depleted when the column-maximum rain intensity (Rmax) exceeds 0.1 mm h-1. This happens when cloud-top re reaches 14 μm. Rmax is mostly less than 0.1 mm h-1 at re < 14 μm, regardless of the cloud water path, but increases rapidly when re exceeds 14 μm. This is in agreement with recent aircraft observations and theoretical considerations for convective clouds, so the mechanism is not limited to marine stratocumulus. These results support the hypothesis that the onset of significant precipitation is determined by the number of nucleated cloud drops and the height (H) above cloud base that is required for cloud drops to reach re of 14 μm. In turn, this can explain the conditions for the initiation of significant drizzle and the opening of closed cells, providing the basis for a simple parameterization for GCMs that unifies the representation of both precipitating and non-precipitating clouds as well as the transition between them. Furthermore, satellite global observations of cloud depth (from base to top) and cloud-top re can be used to derive and validate this parameterization.
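A minimal sketch of the threshold rule described above (illustrative only; the 14 μm value is the one quoted in the abstract, not a tuned GCM scheme) flags precipitation onset from the cloud-top effective radius alone.

```python
# Illustrative sketch of the re-threshold rule described above: drizzle onset
# is tied to cloud-top effective radius reaching ~14 um, regardless of LWP.
RE_ONSET_UM = 14.0  # threshold effective radius quoted in the abstract

def significant_precip(cloud_top_re_um: float) -> bool:
    """Flag drizzle/rain onset from cloud-top effective radius alone."""
    return cloud_top_re_um >= RE_ONSET_UM

for re in (8.0, 12.0, 14.5):
    print(f"re = {re:4.1f} um -> precipitating: {significant_precip(re)}")
```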
Generalized Intelligent Framework for Tutoring (GIFT) Cloud/Virtual Open Campus Quick-Start Guide
2016-03-01
This document serves as the quick-start guide for GIFT Cloud, the web-based... to users with a GIFT Account at no cost. GIFT Cloud is a new implementation of GIFT. This web-based application allows learners, authors, and... Requirements for GIFT Cloud: GIFT Cloud is accessed via a web browser. Officially, GIFT Cloud has been tested to work on
Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis.
Zheng, Yi; Peter, Michael; Zhong, Ruofei; Oude Elberink, Sander; Zhou, Quan
2018-06-05
Indoor space subdivision is an important aspect of scene analysis that provides essential information for many applications, such as indoor navigation and evacuation route planning. Until now, most proposed scene understanding algorithms have been based on whole point clouds, which leads to complicated operations, high computational loads and low processing speeds. This paper presents novel methods to efficiently extract the location of openings (e.g., doors and windows) and to subdivide space by analyzing scanlines. An opening detection method is demonstrated that analyses the local geometric regularity in scanlines to refine the extracted openings. Moreover, a space subdivision method based on the extracted openings and the scanning system trajectory is described. Finally, the opening detection and space subdivision results are saved as point cloud labels, which will be used for further investigations. The method has been tested on a real dataset collected by ZEB-REVO. The experimental results validate the completeness and correctness of the proposed method for different indoor environments and scanning paths.
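The paper's scanline analysis is not reproduced in the abstract; as a loose illustration of the general idea only (assumed logic, not the authors' algorithm), a run of ranges much larger than the local wall distance along a single scanline can hint at an opening such as a doorway.

```python
# Loose illustration of scanline-based opening detection (not the authors'
# algorithm): a run of ranges much larger than the local wall distance
# suggests the scan passed through an opening such as a doorway.
import numpy as np

def opening_candidates(ranges: np.ndarray, wall_dist: float, jump: float = 1.0):
    """Return index runs where the measured range exceeds wall_dist + jump."""
    mask = ranges > wall_dist + jump
    runs, start = [], None
    for i, m in enumerate(mask):
        if m and start is None:
            start = i
        elif not m and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(mask) - 1))
    return runs

# Synthetic scanline: a wall ~2 m away with a doorway between beams 40-55.
scan = np.full(100, 2.0) + np.random.default_rng(1).normal(0, 0.02, 100)
scan[40:56] = 5.0  # beams through the opening hit a far wall
print(opening_candidates(scan, wall_dist=2.0))  # -> [(40, 55)]
```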
Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing
NASA Technical Reports Server (NTRS)
Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane
2012-01-01
Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, disaster response requires rapid access to large data volumes, substantial storage space and high-performance processing capability. The processing and distribution of these data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation covers work conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data were developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open source process code on a local prototype platform, and then transitioning this code, with its associated environment requirements, onto an analogous but memory- and processor-enhanced cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and similar processes. Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
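The two indices named above are simple band ratios; a minimal numpy sketch (with synthetic reflectance arrays standing in for the project's actual UAS imagery) is:

```python
# Minimal band-ratio sketch for the indices named above, using synthetic
# reflectance arrays in place of the project's UAS imagery.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red)."""
    return (nir - red) / (nir + red + 1e-12)

def ndmi(nir: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """NDMI = (NIR - SWIR) / (NIR + SWIR)."""
    return (nir - swir) / (nir + swir + 1e-12)

rng = np.random.default_rng(42)
nir, red, swir = (rng.uniform(0.05, 0.6, (256, 256)) for _ in range(3))
print("NDVI range:", ndvi(nir, red).min(), ndvi(nir, red).max())
print("NDMI range:", ndmi(nir, swir).min(), ndmi(nir, swir).max())
```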
NASA Astrophysics Data System (ADS)
Lague, D.
2014-12-01
High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet all instruments and methods currently generating HRT datasets (e.g., ALS, TLS, SfM, stereo satellite imagery) natively output 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally results in a loss of position accuracy and spatial resolution, and in more or less controlled interpolation. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster-based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of the uncertainty associated with topographic change measurements, and are more suitable for studying vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare point-cloud-based and raster-based workflows with various examples involving ALS, TLS and SfM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimation directly from point clouds) and the interaction of vegetation, hydraulics and sedimentation in salt marshes. These workflows use two recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2), now implemented in the open source software CloudCompare.
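M3C2 itself uses local surface normals and averaging scales; as a much-simplified stand-in that conveys the flavor of point-cloud change detection (not the M3C2 algorithm), a plain cloud-to-cloud nearest-neighbor distance can be computed directly on the raw points.

```python
# Much-simplified stand-in for point-cloud change detection: plain
# cloud-to-cloud nearest-neighbor distances. (M3C2 additionally uses local
# normals and averaging scales; this sketch does not.)
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(3)
before = rng.uniform(0, 10, (5000, 3))            # pre-event survey
after = before + np.array([0.0, 0.0, 0.05])       # uniform 5 cm uplift
after[:1000, 2] += 0.5                            # localized 50 cm change

dist, _ = cKDTree(before).query(after)            # NN distance per point
print("median change:", np.median(dist), "95th pct:", np.percentile(dist, 95))
```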
Investigating the Accuracy of Point Clouds Generated for Rock Surfaces
NASA Astrophysics Data System (ADS)
Seker, D. Z.; Incekara, A. H.
2016-12-01
Point clouds produced by means of different techniques are widely used to model rocks and to obtain properties of rock surfaces such as roughness, volume and area. These point clouds can be generated by laser scanning and close range photogrammetry. Laser scanning is the most common method of producing point clouds; the laser scanner produces a 3D point cloud at regular intervals. In close range photogrammetry, a point cloud can be produced from photographs taken under appropriate conditions, a capability that depends on developing hardware and software technology. Much photogrammetric software, open source or not, currently supports point cloud generation. The two methods are close to each other in terms of accuracy: sufficient accuracy in the mm to cm range can be obtained with a qualified digital camera or laser scanner. With both methods, field work is completed in less time than with conventional techniques. In close range photogrammetry, any part of a rock surface can be completely represented owing to overlapping oblique photographs. Despite the similarity of the resulting data, the two methods differ considerably in cost. In this study, we investigate whether point clouds produced from photographs can be used instead of point clouds produced by a laser scanner. For this purpose, rock surfaces with complex and irregular shapes located on the İstanbul Technical University Ayazağa Campus were selected as the study object. The selected object is a mixture of different rock types and consists of both partly weathered and fresh parts. The study was performed on a 30 m x 10 m portion of the rock surface. 2D and 3D analyses were performed for several regions selected from the point clouds of the surface models; the 2D analysis is area-based and the 3D analysis is volume-based. The analyses showed that the two point clouds are similar and can be used as alternatives to each other. This demonstrates that point clouds produced from photographs, which are economical and can be generated in less time, can be used in several studies instead of point clouds produced by a laser scanner.
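For the kind of area- and volume-based comparison described, one simple approach (an illustrative sketch, not necessarily the authors' procedure) is to compare the convex-hull surface area and volume of corresponding regions extracted from the two point clouds.

```python
# Illustrative area/volume comparison of two point clouds of the same rock
# region (not necessarily the authors' exact procedure): compare convex hulls.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(7)
photo_pts = rng.uniform(0, 1, (2000, 3))                        # photogrammetry
laser_pts = photo_pts + rng.normal(0, 0.005, photo_pts.shape)   # laser scan

h1, h2 = ConvexHull(photo_pts), ConvexHull(laser_pts)
print("surface area ratio:", h1.area / h2.area)
print("volume ratio:      ", h1.volume / h2.volume)
```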
SenseMyHeart: A cloud service and API for wearable heart monitors.
Pinto Silva, P M; Silva Cunha, J P
2015-01-01
In the era of ubiquitous computing, the growing adoption of wearable systems and body sensor networks is paving the way for new research and software for cardiovascular intensity, energy expenditure, and stress and fatigue detection through cardiovascular monitoring. Several systems have received clinical certification and provide huge amounts of reliable heart-related data on a continuous basis. PhysioNet provides equally reliable open-source software tools for ECG processing and analysis that can be combined with these devices. However, this software remains difficult to use in a mobile environment and for researchers unfamiliar with Linux-based systems. In the present paper we present an approach that aims to tackle these limitations by developing a cloud service that provides an API for a PhysioNet-based pipeline for ECG processing and Heart Rate Variability measurement. We describe the proposed solution, along with its advantages and tradeoffs. We also present some client tools (Windows and Android) and several projects where the developed cloud service has been used successfully as a standard for Heart Rate and Heart Rate Variability studies in different scenarios.
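The service's exact API is not specified in the abstract; as a hedged sketch of the kind of HRV measures such a pipeline returns, standard time-domain statistics can be computed from a series of RR intervals.

```python
# Hedged sketch of standard time-domain HRV statistics of the kind such a
# pipeline returns (the service's actual API is not given in the abstract).
import numpy as np

def hrv_time_domain(rr_ms: np.ndarray) -> dict:
    """Compute mean HR, SDNN and RMSSD from RR intervals in milliseconds."""
    diffs = np.diff(rr_ms)
    return {
        "mean_hr_bpm": 60000.0 / rr_ms.mean(),
        "sdnn_ms": rr_ms.std(ddof=1),              # overall variability
        "rmssd_ms": np.sqrt(np.mean(diffs ** 2)),  # beat-to-beat variability
    }

rr = np.array([812, 790, 805, 821, 799, 786, 830, 815], dtype=float)
print(hrv_time_domain(rr))
```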
A network approach to the geometric structure of shallow cloud fields
NASA Astrophysics Data System (ADS)
Glassmeier, F.; Feingold, G.
2017-12-01
The representation of shallow clouds and their radiative impact is one of the largest challenges for global climate models. While the bulk properties of cloud fields, including effects of organization, are a very active area of research, the potential of the geometric arrangement of cloud fields for the development of new parameterizations has hardly been explored. Self-organized patterns are particularly evident in the cellular structure of stratocumulus (Sc) clouds so readily visible in satellite imagery. Inspired by similar patterns in biology and physics, we approach pattern formation in Sc fields from the perspective of natural cellular networks. Our network analysis is based on large-eddy simulations of open- and closed-cell Sc cases. We find the network structure to be neither random nor characteristic of natural convection. It is independent of macroscopic cloud field properties like the Sc regime (open vs closed) and its typical length scale (boundary layer height). The latter is a consequence of entropy maximization (Lewis's law with parameter 0.16). The cellular pattern is on average hexagonal, where non-six-sided cells occur according to a neighbor-number distribution variance of about 2. Reflecting the continuously renewing dynamics of Sc fields, large (many-sided) cells tend to neighbor small (few-sided) cells (Aboav-Weaire law with parameter 0.9). These macroscopic network properties emerge independently of the Sc regime because the different processes governing the evolution of closed as compared to open cells correspond to topologically equivalent network dynamics. By developing a heuristic model, we show that open- and closed-cell dynamics can both be mimicked by versions of cell division and cell disappearance and are biased towards the expansion of smaller cells. This model offers for the first time a fundamental and universal explanation for the geometric pattern of Sc clouds, and it may contribute to the development of advanced Sc parameterizations. As an outlook, we discuss how a similar network approach can be applied to describe and quantify the geometric structure of shallow cumulus cloud fields.
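The network statistics quoted above (neighbor-number mean and variance) can be illustrated on synthetic cell centers; the sketch below (illustrative only, not the paper's analysis) builds the neighbor graph from a Delaunay triangulation of cell centers and reports the neighbor-number distribution.

```python
# Illustrative computation of cellular-network neighbor statistics: build the
# neighbor graph of synthetic cell centers from a Delaunay triangulation and
# report the neighbor-number mean and variance discussed above.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(11)
centers = rng.uniform(0, 1, (400, 2))          # stand-in for Sc cell centers
tri = Delaunay(centers)

neighbors = [set() for _ in range(len(centers))]
for simplex in tri.simplices:                  # each triangle links 3 neighbors
    for i in simplex:
        for j in simplex:
            if i != j:
                neighbors[i].add(j)

# Ignore hull cells, whose neighbor counts are truncated by the domain edge.
hull = set(tri.convex_hull.ravel())
counts = np.array([len(neighbors[i]) for i in range(len(centers)) if i not in hull])
print("mean neighbors:", counts.mean(), "variance:", counts.var())
```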
Abstracting application deployment on Cloud infrastructures
NASA Astrophysics Data System (ADS)
Aiftimiei, D. C.; Fattibene, E.; Gargana, R.; Panella, M.; Salomoni, D.
2017-10-01
Deploying a complex application on a Cloud-based infrastructure can be a challenging task. In this contribution we present an approach for Cloud-based deployment of applications and its present or future implementation in the framework of several projects, such as "!CHAOS: a cloud of controls" [1], a project funded by MIUR (Italian Ministry of Research and Education) to create a Cloud-based deployment of a control system and data acquisition framework; "INDIGO-DataCloud" [2], an EC H2020 project targeting, among other things, high-level deployment of applications on hybrid Clouds; and "Open City Platform" [3], an Italian project aiming to provide open Cloud solutions for Italian Public Administrations. We chose an orchestration service to hide the complex deployment of the application components, and built an abstraction layer on top of the orchestration one. Through the Heat [4] orchestration service, we prototyped a dynamic, on-demand, scalable platform of software components, based on OpenStack infrastructures. On top of the orchestration service we developed a prototype of a web interface exploiting the Heat APIs. The user can start an instance of the application without having knowledge of the underlying Cloud infrastructure and services. Moreover, the platform instance can be customized by choosing parameters related to the application, such as the size of a file system or the number of instances in a NoSQL DB cluster. As soon as the desired platform is running, the web interface offers the possibility to scale some infrastructure components. In this contribution we describe the solution design and implementation, based on the application requirements, the details of the development of both the Heat templates and the web interface, together with possible exploitation strategies of this work in Cloud data centers.
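The web interface described above drives Heat through its APIs; as a hedged sketch (the endpoint, token, and template below are placeholders, and a real deployment would obtain the token and Heat URL from Keystone), a stack could be launched with Heat's stack-create REST call like this:

```python
# Hedged sketch of launching a Heat stack through its REST API. Endpoint,
# token and template here are placeholders, not values from the paper.
import requests

HEAT_URL = "http://heat.example.org:8004/v1/TENANT_ID"   # hypothetical endpoint
TOKEN = "gAAAA..."                                       # hypothetical token

template = {
    "heat_template_version": "2016-04-08",
    "parameters": {"db_nodes": {"type": "number", "default": 3}},
    "resources": {},   # application resources would be declared here
}

resp = requests.post(
    f"{HEAT_URL}/stacks",
    headers={"X-Auth-Token": TOKEN, "Content-Type": "application/json"},
    json={"stack_name": "app-platform",
          "template": template,
          "parameters": {"db_nodes": 5}},   # user-chosen customization
    timeout=30,
)
resp.raise_for_status()
print("stack id:", resp.json()["stack"]["id"])
```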
Grubelich, Mark C.
2001-01-01
A diversionary device has a housing having at least one opening and containing a non-explosive propellant and a quantity of fine powder packed within the housing, with the powder being located between the propellant and the opening. When the propellant is activated, it has sufficient energy to propel the powder through the opening to produce a cloud of powder outside the housing. An igniter is also provided for igniting the cloud of powder to create a diversionary flash and bang, but at a low enough pressure to avoid injuring nearby people.
Aerosols and Aerosol-related haze forecasting in China Meteorological Adminstration
NASA Astrophysics Data System (ADS)
Zhou, Chunhong; Zhang, Xiaoye; Gong, Sunling; Liu, Hongli; Xue, Min
2017-04-01
The CMA Unified Atmospheric Chemistry Environmental Forecasting System (CUACE) is a unified numerical chemical weather forecasting system covering BC, OC, sulfate, nitrate, ammonia, dust and sea salt aerosols and their sources, gas-to-particle processes, SOA, microphysics and transformation. With an open interface, CUACE has been coupled online to the mesoscale model MM5 and to the new NWP system GRAPES (Global/Regional Assimilation and Prediction Enhanced System) in CMA. With Chinese emissions from Cao and Zhang (2012 and 2013), a forecasting system called CUACE/Haze-fog has been running in real time in CMA, issuing 5-day PM10, O3 and visibility forecasts. A comprehensive ACI scheme has also been developed in CUACE. Calculated by a sectional aerosol activation scheme, based on the size and mass information from CUACE and the thermodynamic and humidity states from the weather model at each time step, the cloud condensation nuclei (CCN) are fed online and interactively into a two-moment cloud scheme (WDM6) and a convective parameterization to drive the cloud physics and precipitation formation processes. The results show that interactive aerosols with WDM6 in CUACE clearly improve the cloud properties and precipitation, showing 24% to 48% enhancements of the TS score for 6-h precipitation.
IRAS observations of dust heating and energy balance in the Rho Ophiuchi dark cloud
NASA Technical Reports Server (NTRS)
Greene, Thomas P.; Young, Erick T.
1989-01-01
The equilibrium-process dust emission in the Rho Ophiuchi dark cloud is studied. The luminosity of the cloud is found to closely match the luminosity of the cloud's known embedded and external radiation sources. There is no evidence for a large population of undetected low-luminosity sources within the cloud, and unknown external heating is also only a minor source of energy. Most of the cloud's luminosity is emitted in the mid-to-far-IR. Dust temperature maps indicate that the dust is not hot enough to heat the gas to the observed temperatures. A simple cloud model with a radiation field composed of flux from HD 147889, S1, and the Sco OB2 association predicts the observed IRAS 60 to 100 micron in-band flux ratios for a mean cloud density n(H2) = 1400 cm^-3. Flattened 12 and 25 micron observations show much extended emission in these bands, suggesting stochastic heating of very small grains or large molecules.
Cloud and surface textural features in polar regions
NASA Technical Reports Server (NTRS)
Welch, Ronald M.; Kuo, Kwo-Sen; Sengupta, Sailes K.
1990-01-01
The study examines the textural signatures of clouds, ice-covered mountains, solid and broken sea ice and floes, and open water. The textural features are computed from sum and difference histogram and gray-level difference vector statistics defined at various pixel displacement distances, derived from Landsat multispectral scanner data. Polar cloudiness, snow-covered mountainous regions, solid sea ice, glaciers, and open water have distinguishable textural features. This suggests that textural measures can be successfully applied to the detection of clouds over snow-covered mountains, an ability of considerable importance for the modeling of snow-melt runoff. However, broken stratocumulus cloud decks and thin cirrus over broken sea ice remain difficult to distinguish texturally. It is concluded that even with high spatial resolution imagery, it may not be possible to distinguish broken stratocumulus and thin clouds from sea ice in the marginal ice zone using visible-channel textural features alone.
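The texture statistics named above come from sum and difference histograms of pixel pairs; a compact numpy sketch of such features for one pixel displacement (illustrative, not the study's exact feature set) is:

```python
# Compact sketch of sum-and-difference texture features for one pixel
# displacement (illustrative; not the study's exact feature set).
import numpy as np

def sum_diff_features(img: np.ndarray, dx: int = 1, dy: int = 0) -> dict:
    """Texture features from sums/differences of pixel pairs at offset (dy, dx)."""
    a = img[: img.shape[0] - dy, : img.shape[1] - dx].astype(float)
    b = img[dy:, dx:].astype(float)
    s, d = a + b, a - b
    return {
        "mean": s.mean() / 2.0,                    # local mean brightness
        "contrast": np.mean(d ** 2),               # gray-level difference energy
        "homogeneity": np.mean(1.0 / (1.0 + d ** 2)),
    }

rng = np.random.default_rng(5)
smooth = rng.normal(128, 2, (64, 64))              # stand-in for open water
rough = rng.normal(128, 40, (64, 64))              # stand-in for broken sea ice
print("smooth:", sum_diff_features(smooth))
print("rough: ", sum_diff_features(rough))
```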
Use of Open Standards and Technologies at the Lunar Mapping and Modeling Project
NASA Astrophysics Data System (ADS)
Law, E.; Malhotra, S.; Bui, B.; Chang, G.; Goodale, C. E.; Ramirez, P.; Kim, R. M.; Sadaqathulla, S.; Rodriguez, L.
2011-12-01
The Lunar Mapping and Modeling Project (LMMP), led by the Marshall Space Flight Center (MSFC), is tasked by NASA with developing an information system to support lunar exploration activities. It provides lunar explorers a set of tools and lunar map and model products that are predominantly derived from present lunar missions (e.g., the Lunar Reconnaissance Orbiter (LRO)) and from historical missions (e.g., Apollo). At the Jet Propulsion Laboratory (JPL), we have built the LMMP interoperable geospatial information system's underlying infrastructure and a single point of entry, the LMMP Portal, by employing a number of open standards and technologies. The Portal exposes a set of services that allow users to search, visualize, subset, and download lunar data managed by the system. Users also have access to a set of tools that visualize, analyze and annotate the data. The infrastructure and Portal are based on a web service oriented architecture. We designed the system to support solar system bodies in general, including asteroids, earth and planets. We employed a combination of custom software, commercial and open-source components, off-the-shelf hardware and pay-by-use cloud computing services. The use of open standards and web service interfaces facilitates platform- and application-independent access to the services and data, offering, for instance, iPad and Android mobile applications and large-screen multi-touch applications with 3-D terrain viewing functions, for a rich browsing and analysis experience from a variety of platforms. The web services make use of open standards including Representational State Transfer (REST) and the Open Geospatial Consortium (OGC)'s Web Map Service (WMS), Web Coverage Service (WCS) and Web Feature Service (WFS). The data management services have been built on top of a set of open technologies including: Object Oriented Data Technology (OODT), an open source data catalog, archive, file management and data grid framework; OpenSSO, an open source access management and federation platform; Solr, an open source enterprise search platform; Redmine, an open source project collaboration and management framework; GDAL, an open source geospatial data abstraction library; and others. The data products are compliant with the Federal Geographic Data Committee (FGDC) metadata standard. This standardization allows users to access the data products via custom-written applications or off-the-shelf applications such as Google Earth. We will demonstrate this ready-to-use system for data discovery and visualization by walking through the data services provided through the portal, such as browse, search, and other tools. We will further demonstrate image viewing and layering of lunar map images from the Internet via mobile devices such as Apple's iPad.
Coupling West WRF to GSSHA with GSSHApy
NASA Astrophysics Data System (ADS)
Snow, A. D.
2017-12-01
West WRF output is gridded NetCDF containing the forcing data required to run a GSSHA simulation. These data include precipitation, pressure, temperature, relative humidity, cloud cover, wind speed, and solar radiation. Tools to reproject, resample, and reformat the data for GSSHA have recently been added to the open source Python library GSSHApy (https://github.com/ci-water/gsshapy). These tools provide a connection that makes it possible to run forecasts using the West WRF forcing data with GSSHA to produce both streamflow and lake level predictions.
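A minimal sketch of the kind of forcing-data preparation involved, using xarray; the variable names (RAINNC, PSFC, T2) are typical WRF output names and may differ in a given West WRF file, and GSSHApy's own API is deliberately not reproduced here.

```python
import xarray as xr

ds = xr.open_dataset("west_wrf_output.nc")   # gridded West WRF forcing

precip = ds["RAINNC"]        # accumulated grid-scale precipitation
pressure = ds["PSFC"]        # surface pressure
temperature = ds["T2"]       # 2-m temperature

# Block-average onto a 2x coarser grid as a stand-in for resampling to
# the GSSHA grid, then write out a NetCDF file for downstream tools.
coarse = xr.Dataset({
    name: da.coarsen(south_north=2, west_east=2, boundary="trim").mean()
    for name, da in [("precip", precip),
                     ("pressure", pressure),
                     ("temperature", temperature)]
})
coarse.to_netcdf("west_wrf_coarse.nc")
```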
Millstone: software for multiplex microbial genome analysis and engineering
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goodman, Daniel B.; Kuznetsov, Gleb; Lajoie, Marc J.
Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. Here, we describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.
Millstone: software for multiplex microbial genome analysis and engineering.
Goodman, Daniel B; Kuznetsov, Gleb; Lajoie, Marc J; Ahern, Brian W; Napolitano, Michael G; Chen, Kevin Y; Chen, Changping; Church, George M
2017-05-25
Inexpensive DNA sequencing and advances in genome editing have made computational analysis a major rate-limiting step in adaptive laboratory evolution and microbial genome engineering. We describe Millstone, a web-based platform that automates genotype comparison and visualization for projects with up to hundreds of genomic samples. To enable iterative genome engineering, Millstone allows users to design oligonucleotide libraries and create successive versions of reference genomes. Millstone is open source and easily deployable to a cloud platform, local cluster, or desktop, making it a scalable solution for any lab.
NASA Astrophysics Data System (ADS)
Schäfer, M.; Bierwirth, E.; Ehrlich, A.; Jäkel, E.; Wendisch, M.
2015-07-01
Based on airborne spectral imaging observations, three-dimensional (3-D) radiative effects between Arctic boundary layer clouds and highly variable Arctic surfaces were identified and quantified. A method is presented to discriminate between sea ice and open water under cloudy conditions based on airborne nadir reflectivity γλ measurements in the visible spectral range. In cloudy cases the transition of γλ from open water to sea ice is not instantaneous but horizontally smoothed. In general, clouds reduce γλ above bright surfaces in the vicinity of open water, while γλ above open sea is enhanced. With the help of observations and 3-D radiative transfer simulations, this effect was quantified to extend up to 2200 m from the sea ice edge (for a dark-ocean albedo of αwater = 0.042 and a sea-ice albedo of αice = 0.91 at 645 nm wavelength). The affected distance ΔL was found to depend on both cloud and sea ice properties. For a low-level cloud at 0-200 m altitude, as observed during the Arctic field campaign VERtical Distribution of Ice in Arctic clouds (VERDI) in 2012, an increase in the cloud optical thickness τ from 1 to 10 leads to a decrease in ΔL from 600 to 250 m. An increase in the cloud base altitude or cloud geometrical thickness results in an increase in ΔL; for a cloud at 500-1000 m altitude, ΔL = 2200 m for τ = 1 and 1250 m for τ = 10. To quantify the effect for different shapes and sizes of ice floes, radiative transfer simulations were performed with various albedo fields (infinitely long straight ice edge, circular ice floes, squares, realistic ice floe field). The simulations show that ΔL increases with increasing radius of the ice floe and reaches maximum values for ice floes with radii larger than 6 km (500-1000 m cloud altitude), which matches the results found for an infinitely long, straight ice edge. Furthermore, the influence of these 3-D radiative effects on the retrieved cloud optical properties was investigated. The enhanced brightness of a dark pixel next to an ice edge results in uncertainties of up to 90 % and 30 % in retrievals of τ and effective radius reff, respectively. With the help of ΔL, an estimate is given of the distance to the ice edge beyond which the retrieval uncertainties due to 3-D radiative effects are negligible.
NASA Astrophysics Data System (ADS)
Fisher, W. I.
2017-12-01
The rise in cloud computing, coupled with the growth of "Big Data", has led to a migration away from local scientific data storage. The increasing size of remote scientific data sets, however, makes it difficult for scientists to subject them to large-scale analysis and visualization. These large datasets can take an inordinate amount of time to download; subsetting is a potential solution, but subsetting services are not yet ubiquitous. Data providers may also pay steep prices, as many cloud providers meter data based on how much leaves their cloud service. The solution to this problem is deceptively simple: move data analysis and visualization tools to the cloud, so that scientists may perform data-proximate analysis and visualization. This results in increased transfer speeds, while egress costs are lowered or completely eliminated. Moving standard desktop analysis and visualization tools to the cloud is enabled via a technique called "Application Streaming". This technology allows a program to run entirely on a remote virtual machine while still allowing for interactivity and dynamic visualizations. When coupled with containerization technology such as Docker, we are able to easily deploy legacy analysis and visualization software to the cloud whilst retaining access via a desktop, netbook, a smartphone, or the next generation of hardware, whatever it may be. Unidata has created a Docker-based solution for easily adapting legacy software for Application Streaming. This technology stack, dubbed Cloudstream, allows desktop software to run in the cloud with little-to-no effort. The Docker container is configured by editing text files, and the legacy software does not need to be modified in any way. This work will discuss the underlying technologies used by Cloudstream, and outline how to use Cloudstream to run and access an existing desktop application in the cloud.
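A minimal sketch of the deployment step, using the Docker SDK for Python; the image name, port, and environment variable are hypothetical, and Cloudstream's actual text-file configuration is not reproduced here.

```python
import docker

client = docker.from_env()

# Launch a containerized legacy desktop application whose GUI is exposed
# through a browser-accessible endpoint (e.g. noVNC) on port 6080.
container = client.containers.run(
    "example/legacy-viz-app:latest",          # hypothetical image
    detach=True,
    ports={"6080/tcp": 6080},
    environment={"DISPLAY_GEOMETRY": "1920x1080"},   # hypothetical setting
)
print("Container started:", container.short_id)
# The streamed application is then reachable at http://<host>:6080
```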
Using mid-range laser scanners to digitize cultural-heritage sites.
Spring, Adam P; Peters, Caradoc; Minns, Tom
2010-01-01
Here, we explore new, more accessible ways of modeling 3D data sets that both professionals and amateurs can employ in areas such as architecture, forensics, geotechnics, cultural heritage, and even hobbyist modeling. To support our arguments, we present images from a recent case study in digital preservation of cultural heritage using a mid-range laser scanner. Our appreciation of the increasing variety of methods for capturing 3D spatial data inspired our research. Available methods include photogrammetry, airborne lidar, sonar, total stations (a combined electronic and optical survey instrument), and mid- and close-range scanning. They all can produce point clouds of varying density. In our case study, the point cloud produced by a mid-range scanner demonstrates how open source software can make modeling and disseminating data easier. Normally, researchers would model this data using expensive specialized software, and the data wouldn't extend beyond the laser-scanning community.
Clothes as a Source of Particles Contributing to the "Personal Cloud"
Previous studies such as EPA's PTEAM Study have documented increased personal exposures to particles compared to either indoor or outdoor concentrations, a finding that has been characterized as a "personal cloud." The sources of the personal cloud are unknown, but co...
WebGL Visualisation of 3D Environmental Models Based on Finnish Open Geospatial Data Sets
NASA Astrophysics Data System (ADS)
Krooks, A.; Kahkonen, J.; Lehto, L.; Latvala, P.; Karjalainen, M.; Honkavaara, E.
2014-08-01
Recent developments in spatial data infrastructures have enabled real-time GIS analysis and visualization using open input data sources and service interfaces. In this study we present a new concept in which metric point clouds derived from national open airborne laser scanning (ALS) and photogrammetric image data are processed, analysed, and finally visualised through open service interfaces to produce user-driven analysis products for targeted areas. The concept is demonstrated in three environmental applications: assessment of forest storm damage, assessment of volumetric changes in an open-pit mine, and 3D city-model visualization. One of the main objectives was to study the usability and requirements of national-level photogrammetric imagery in these applications. The results demonstrated that user-driven 3D geospatial analyses are possible with the proposed approach and current technology; for instance, a landowner could easily assess the number of fallen trees within his property borders after a storm using any web browser. On the other hand, our study indicated that there are still many uncertainties, especially due to the insufficient standardization of photogrammetric products and processes and their quality indicators.
Structure and organization of Stratocumulus fields: A network approach
NASA Astrophysics Data System (ADS)
Glassmeier, Franziska; Feingold, Graham
2017-04-01
The representation of Stratocumulus (Sc) clouds and their radiative impact is one of the large challenges for global climate models. Aerosol-cloud-precipitation interactions greatly contribute to this challenge by influencing the morphology of Sc fields: in the absence of rain, Sc are arranged in a relatively regular pattern of cloudy cells separated by cloud-free rings of downwelling air ('closed cells'). Raining cloud fields, in contrast, exhibit an oscillating pattern of cloudy rings surrounding cloud-free cells of negatively buoyant air caused by sedimentation and evaporation of rain ('open cells'). Surprisingly, these regular structures of open- and closed-cell Sc fields and their potential for the development of new parameterizations have hardly been explored. In this contribution, we approach the organization of Sc from the perspective of a 2-dimensional random network. We find that cellular networks derived from LES simulations of open- and closed-cell Sc cases are almost indistinguishable and share the following features: (i) the distributions of nearest neighbors, or cell degree, are centered at six; this corresponds to approximately hexagonal cloud cells and is a direct mathematical consequence (Euler's formula) of the triple junctions featured by Sc organization. (ii) The degree of an individual cell is proportional to the normalized size of the cell, which means that cell arrangement is independent of the typical cell size. (iii) Reflecting the continuously renewing dynamics of Sc fields, large (high-degree) cells tend to be neighbored by small (low-degree) cells and vice versa. These macroscopic network properties emerge independent of the state of the Sc field because the different processes governing the evolution of closed as compared to open cells correspond to topologically equivalent network dynamics. By developing a heuristic model, we show that open- and closed-cell dynamics can both be mimicked by versions of cell division and cell disappearance and are biased towards the expansion of smaller cells. As a conclusion of our network analysis, Sc organization can be characterized by a typical length scale and a scale-independent cell arrangement. While the typical length scale emerges from the full complexity of aerosol-cloud-precipitation-radiation interactions, cell arrangement is independent of cloud processes and its evolution could be parameterized based on our heuristic model.
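The statement that triple junctions force a mean cell degree of six is standard planar-graph arithmetic and can be spelled out briefly. For a large network with V vertices, E edges, and F cells in which every vertex joins exactly three edges:

```latex
\[
  V - E + F = 2 \quad\text{(Euler's formula)}, \qquad
  3V = 2E \;\Rightarrow\; V = \tfrac{2}{3}E ,
\]
\[
  F = 2 + E - V = 2 + \tfrac{E}{3}, \qquad
  \langle k \rangle = \frac{2E}{F} = \frac{2E}{2 + E/3}
  \;\xrightarrow{\;E \to \infty\;}\; 6 .
\]
```

Here ⟨k⟩ is the mean number of edges (equivalently, neighbours) per cell, so the degree distribution of any trivalent cellular network must be centered at six.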
Multi-Dimensional Optimization for Cloud Based Multi-Tier Applications
ERIC Educational Resources Information Center
Jung, Gueyoung
2010-01-01
Emerging trends toward cloud computing and virtualization have been opening new avenues to meet enormous demands of space, resource utilization, and energy efficiency in modern data centers. By being allowed to host many multi-tier applications in consolidated environments, cloud infrastructure providers enable resources to be shared among these…
NASA Astrophysics Data System (ADS)
Fisher, Daniel; Poulsen, Caroline A.; Thomas, Gareth E.; Muller, Jan-Peter
2016-03-01
In this paper we evaluate the impact on the cloud parameter retrievals of the ORAC (Optimal Retrieval of Aerosol and Cloud) algorithm of including stereo-derived cloud top heights as a priori information. This is performed in a mathematically rigorous way using the ORAC optimal estimation retrieval framework, which includes the facility to use such independent a priori information. Key to the use of a priori information is a characterisation of its associated uncertainty. This paper demonstrates the improvements that are possible using this approach and also considers their impact on the retrieved microphysical cloud parameters. The Along-Track Scanning Radiometer (AATSR) instrument has two views and three thermal channels, so it is well placed to demonstrate the synergy of the two techniques. The stereo retrieval is able to improve the accuracy of the retrieved cloud top height when compared to collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), particularly in the presence of boundary layer inversions and high clouds. The impact of the stereo a priori information on the microphysical cloud properties of cloud optical thickness (COT) and effective radius (RE) was evaluated and generally found to be very small for single-layer cloud conditions over open water (mean RE differences of 2.2 (±5.9) microns and mean COT differences of 0.5 (±1.8) for single-layer ice clouds over open water at elevations above 9 km, which are most strongly affected by the inclusion of the a priori).
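For readers unfamiliar with the framework, the way a priori information and its uncertainty enter an optimal estimation retrieval can be summarized by the standard (Rodgers-type) cost function; the notation below is generic rather than ORAC-specific.

```latex
\[
  J(\mathbf{x}) =
  \left[\mathbf{y} - F(\mathbf{x})\right]^{\mathsf T} \mathbf{S}_y^{-1}
  \left[\mathbf{y} - F(\mathbf{x})\right]
  + \left(\mathbf{x} - \mathbf{x}_a\right)^{\mathsf T} \mathbf{S}_a^{-1}
  \left(\mathbf{x} - \mathbf{x}_a\right)
\]
```

Here y is the measurement vector, F the forward model, x the state (including cloud top height, COT, and RE), x_a the a priori state, and S_y and S_a the measurement and a priori error covariances; the stereo-derived cloud top height enters through x_a, with its characterised uncertainty in S_a.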
A Cloud-Computing Service for Environmental Geophysics and Seismic Data Processing
NASA Astrophysics Data System (ADS)
Heilmann, B. Z.; Maggi, P.; Piras, A.; Satta, G.; Deidda, G. P.; Bonomi, E.
2012-04-01
Cloud computing is becoming established worldwide as a new high-performance computing paradigm that offers formidable possibilities to industry and science. The presented cloud-computing portal, part of the GRIDA3 project, provides an innovative approach to seismic data processing by combining open-source state-of-the-art processing software and cloud-computing technology, making possible the effective use of distributed computation and data management with administratively distant resources. We replace demanding user-side hardware and software requirements with remote access to high-performance grid-computing facilities. As a result, data processing can be done in quasi real time, controlled ubiquitously via the Internet through a user-friendly web-browser interface. Besides the obvious advantages over locally installed seismic-processing packages, the presented cloud-computing solution creates completely new possibilities for scientific education, collaboration, and presentation of reproducible results. The web-browser interface of our portal is based on the commercially supported grid portal EnginFrame, an open framework based on Java, XML, and Web Services. We selected the hosted applications with the objective of allowing the construction of typical 2D time-domain seismic-imaging workflows as used for environmental studies and, originally, for hydrocarbon exploration. For data visualization and pre-processing, we chose the free software package Seismic Un*x. We ported tools for trace balancing, amplitude gaining, muting, frequency filtering, dip filtering, deconvolution, and rendering, with a customized choice of options, as services onto the cloud-computing portal. For structural imaging and velocity-model building, we developed a grid version of the Common-Reflection-Surface (CRS) stack, a data-driven imaging method that requires no user interaction at run time, such as manual picking in prestack volumes or velocity spectra. Due to its high level of automation, CRS stacking can benefit largely from the hardware parallelism provided by the cloud deployment. The resulting output, post-stack section, coherence, and NMO-velocity panels are used to generate a smooth migration-velocity model. Residual static corrections are calculated as a by-product of the stack and can be applied iteratively. As a final step, a time-migrated subsurface image is obtained by a parallelized Kirchhoff time migration scheme. Processing can be done step by step or using a graphical workflow editor that can launch a series of pipelined tasks. The status of submitted jobs is monitored by a dedicated service. All results are stored in project directories, where they can be downloaded or viewed directly in the browser. Currently, the portal has access to three research clusters with a total of 70 nodes of 4 cores each, shared with four other cloud-computing applications bundled within the GRIDA3 project. To demonstrate the functionality of our "seismic cloud lab", we will present results obtained for three different types of data, all taken from hydrogeophysical studies: (1) a seismic reflection data set, made of compressional waves from explosive sources, recorded in Muravera, Sardinia; (2) a shear-wave data set from Sardinia; and (3) a multi-offset ground-penetrating-radar data set from Larreule, France. The presented work was funded by the government of the Autonomous Region of Sardinia and by the Italian Ministry of Research and Education.
Enhancing data utilization through adoption of cloud-based data architectures (Invited Paper 211869)
NASA Astrophysics Data System (ADS)
Kearns, E. J.
2017-12-01
A traditional approach to the distribution and utilization of open government data involves continuously moving those data from a central government location to each potential user, who then utilizes them on local computer systems. An alternate approach is to bring the users to the open government data, where they also have access to computing and analytics capabilities that support data utilization. NOAA's Big Data Project is exploring such an alternate approach through an experimental collaboration with Amazon Web Services, Google Cloud Platform, IBM, Microsoft Azure, and the Open Commons Consortium. As part of this ongoing experiment, NOAA provides open data of interest which are freely hosted by the Big Data Project Collaborators, who provide a variety of cloud-based services and capabilities to enable utilization by data users. By the terms of the agreement, the Collaborators may charge for those value-added services and processing capacities to recover the costs of freely hosting the data and to generate profits if so desired. Initial results have shown sustained increases in data utilization of 2 to over 100 times the access rates previously observed under the traditional approach. Significantly increased utilization speed compared to the traditional approach has also been reported by NOAA data users who have volunteered their experiences on these cloud-based systems. The potential for implementing and sustaining the alternate cloud-based approach as part of a change in operational data utilization strategies will be discussed.
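As a concrete example of data-proximate access, NOAA datasets hosted by a cloud collaborator can be listed anonymously with a few lines of boto3; the bucket and prefix below (the GOES-16 archive on AWS) are an example, and layouts vary by dataset.

```python
import boto3
from botocore import UNSIGNED
from botocore.config import Config

# Anonymous (unsigned) client: the open data requires no credentials.
s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))

resp = s3.list_objects_v2(
    Bucket="noaa-goes16",
    Prefix="ABI-L2-CMIPF/2017/",
    MaxKeys=5,
)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```

Analysis run in the same cloud region then reads these objects directly, which is what removes the download and egress bottlenecks described above.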
Plenario: A Spatio-Temporal Platform for Discovery and Exploration of Urban Science Data
NASA Astrophysics Data System (ADS)
Engler, W. H.; Malik, T.; Catlett, C.; Foster, I.; Goldstein, B.
2015-12-01
The past decade has seen the widespread release of open data concerning city services, conditions, and activities by government bodies and public institutions of all sizes. Hundreds of open data portals now host thousands of datasets of many different types. These new data sources represent enormous potential for improved understanding of urban dynamics and processes and, ultimately, for more livable, efficient, and prosperous communities. However, those who seek to realize this potential quickly discover that finding and applying the data relevant to any particular question can be extraordinarily difficult, due to decentralized storage, heterogeneous formats, and poor documentation. In this context, we introduce Plenario, a platform designed to automate time-consuming tasks associated with the discovery, exploration, and application of open city data and, in so doing, to reduce barriers to data use for researchers, policymakers, service providers, journalists, and members of the general public. Key innovations include a geospatial data warehouse that allows data from many sources to be registered into a common spatial and temporal frame; simple and intuitive interfaces that permit rapid discovery and exploration of data subsets pertaining to a particular area and time, regardless of type and source; easy export of such data subsets for further analysis; a user-configurable data ingest framework for automated importing and periodic updating of new datasets into the data warehouse; cloud hosting for elastic scaling and rapid creation of new Plenario instances; and an open source implementation to enable community contributions. We describe here the architecture and implementation of the Plenario platform, discuss lessons learned from its use by several communities, and outline plans for future work.
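A sketch of the kind of spatio-temporal query the platform's interfaces support is shown below; the endpoint path, dataset name, and parameter names are assumptions for illustration, not a documented Plenario API reference.

```python
import requests

BASE = "https://plenar.io/v1/api/detail"     # assumed endpoint

params = {
    "dataset_name": "crimes_2001_to_present",    # hypothetical dataset id
    "obs_date__ge": "2015-01-01",                # time window start
    "obs_date__le": "2015-02-01",                # time window end
    # GeoJSON polygon bounding the area of interest
    "location_geom__within": (
        '{"type":"Polygon","coordinates":[[[-87.70,41.80],[-87.70,41.90],'
        '[-87.60,41.90],[-87.60,41.80],[-87.70,41.80]]]}'
    ),
}

rows = requests.get(BASE, params=params).json()
print(len(rows.get("objects", [])), "records in the window")
```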
Massive stereo-based DTM production for Mars on cloud computers
NASA Astrophysics Data System (ADS)
Tao, Y.; Muller, J.-P.; Sidiropoulos, P.; Xiong, Si-Ting; Putri, A. R. D.; Walter, S. H. G.; Veitch-Michaelis, J.; Yershov, V.
2018-05-01
Digital Terrain Model (DTM) creation is essential to improving our understanding of the formation processes of the Martian surface. Although there have been previous demonstrations of open-source or commercial planetary 3D reconstruction software, planetary scientists still struggle to create good quality DTMs that meet their science needs, especially when a large number of high quality DTMs must be produced using "free" software. In this paper, we describe a new open source system that overcomes many of these obstacles, demonstrating results in the context of issues found from experience with several planetary DTM pipelines. We introduce a new fully automated multi-resolution DTM processing chain for NASA Mars Reconnaissance Orbiter (MRO) Context Camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) stereo processing, called the Co-registration Ames Stereo Pipeline (ASP) Gotcha Optimised (CASP-GO), based on the open source NASA ASP. CASP-GO employs tie-point based multi-resolution image co-registration, and Gotcha sub-pixel refinement and densification. The CASP-GO pipeline is used to produce planet-wide CTX and HiRISE DTMs that guarantee global geo-referencing compliance with respect to High Resolution Stereo Colour imaging (HRSC), and thence to the Mars Orbiter Laser Altimeter (MOLA), providing refined stereo matching completeness and accuracy. All software and good quality products introduced in this paper are being made open source to the planetary science community through collaboration with NASA Ames, the United States Geological Survey (USGS) and the Jet Propulsion Laboratory (JPL) Advanced Multi-Mission Operations System (AMMOS) Planetary Data System (PDS) Pipeline Service (APPS-PDS4), as well as being browseable and visualisable through the iMars web-based Geographic Information System (webGIS).
NASA Astrophysics Data System (ADS)
Bolick, Leslie; Harguess, Josh
2016-05-01
An emerging technology in the realm of airborne intelligence, surveillance, and reconnaissance (ISR) systems is structure-from-motion (SfM), which enables the creation of three-dimensional (3D) point clouds and 3D models from two-dimensional (2D) imagery. Several existing tools, such as VisualSFM and the open source project OpenSfM, assist in this process; however, it is well known that pristine imagery is usually required to create meaningful 3D data from the imagery. In military applications, such as the use of unmanned aerial vehicles (UAVs) for surveillance operations, imagery is rarely pristine. Therefore, we present an analysis of structure-from-motion packages on imagery that has been degraded in a controlled manner.
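A minimal sketch of degrading imagery in a controlled manner with OpenCV is given below; the blur, noise, and compression settings are illustrative parameters, not those used in the paper.

```python
import cv2
import numpy as np

def degrade(path_in, path_out, blur_sigma=2.0, noise_std=10.0, jpeg_q=40):
    """Apply controlled optical blur, additive sensor noise, and lossy
    compression to a frame before feeding it to an SfM pipeline."""
    img = cv2.imread(path_in)
    img = cv2.GaussianBlur(img, (0, 0), blur_sigma)        # optical blur
    noise = np.random.normal(0.0, noise_std, img.shape)    # sensor noise
    img = np.clip(img.astype(np.float32) + noise, 0, 255).astype(np.uint8)
    cv2.imwrite(path_out, img, [cv2.IMWRITE_JPEG_QUALITY, jpeg_q])

degrade("frame_0001.png", "frame_0001_degraded.jpg")
```

Sweeping these parameters and re-running the SfM packages on each degraded set yields the kind of controlled comparison the study describes.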
Origin and dynamics of emission line clouds in cooling flow environments
NASA Technical Reports Server (NTRS)
Loewenstein, Michael
1990-01-01
The author suggests that since clouds are born co-moving in a turbulent intra-cluster medium (ICM), the allowed parameter space can now be opened up to a more acceptable range. Large-scale motions can be driven in the central parts of cooling flows by a number of mechanisms including the motion of the central and other galaxies, and the dissipation of advected, focussed rotational and magnetic energy. In addition to the velocity width paradox, two other paradoxes (Heckman et al. 1989) can be solved if the ICM is turbulent. Firstly, the heating source for the emission line regions has always been puzzling - line luminosities are extremely high for a given (optical or radio) galaxy luminosity compared to those in non-cooling flow galaxies, therefore a mechanism peculiar to cooling flows must be at work. However most, if not all, previously suggested heating mechanisms either fail to provide enough ionization or give the wrong line ratios, or both. The kinetic energy in the turbulence provides a natural energy source if it can be efficiently converted to cloud heat. Researchers suggest that this can be done via magneto-hydrodynamic waves through plasma slip. Secondly, while the x ray observations indicate extended mass deposition, the optical line emission is more centrally concentrated. Since many of the turbulence-inducing mechanisms are strongest in the central regions of the ICM, so is the method of heating. In other words material is dropping out everywhere but only being lit up in the center.
NoSQL: collection document and cloud by using a dynamic web query form
NASA Astrophysics Data System (ADS)
Abdalla, Hemn B.; Lin, Jinzhao; Li, Guoquan
2015-07-01
MongoDB (from "humongous") is an open-source document database and the leading NoSQL database. NoSQL ("Not Only SQL") refers to a new generation of databases that are non-relational, distributed, open source, and horizontally scalable, providing a mechanism for the storage and retrieval of documents. Whereas data were previously stored and retrieved using SQL queries, here we use MongoDB, avoiding MySQL and SQL queries altogether: documents are imported directly into our drives and retrieved from them without SQL, using buffered IO readers and writers; a buffered reader imports document files of a given type into a folder (drive), and a buffered writer retrieves the document files from that particular folder or drive. We also provide security for the stored files: if documents were simply stored in a local folder, anyone could view or modify them. To prevent this, the original document files are converted to another format (in this paper, a binary format) before being stored in a folder, and the storage space provides a private key for accessing each file. Whenever a user tries to open a document file, its data appear only in the binary format; the file's owner alone can view the original format, using a personal secret key received from the cloud.
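A minimal sketch of the described store-as-binary-with-a-key workflow, using pymongo; the collection and field names are ours, not the paper's, and the key handling is simplified to a single generated token.

```python
import secrets
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")
db = client["document_store"]

# Read the document as raw bytes: the "binary format" of the stored file.
with open("report.docx", "rb") as f:
    payload = f.read()

access_key = secrets.token_hex(16)       # per-file private key
db.documents.insert_one({
    "filename": "report.docx",
    "data": payload,                     # stored as BSON binary
    "access_key": access_key,
})

# Retrieval succeeds only when the caller presents the matching key;
# without it, only the opaque binary record exists.
doc = db.documents.find_one({"filename": "report.docx",
                             "access_key": access_key})
```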
Looking Down Through the Clouds – Optical Attenuation through Real-Time Clouds
NASA Astrophysics Data System (ADS)
Burley, J.; Lazarewicz, A.; Dean, D.; Heath, N.
Detecting and identifying nuclear explosions in the atmosphere and on the surface of the Earth is critical to the Air Force Technical Applications Center (AFTAC) treaty monitoring mission. Optical signals from surface or atmospheric nuclear explosions detected by satellite sensors are attenuated by the atmosphere and clouds. Clouds present a particularly complex challenge as they cover up to seventy percent of the Earth's surface. Moreover, their highly variable and diverse nature requires physics-based modeling. Determining the attenuation for each optical ray path is uniquely dependent on the source geolocation, the specific optical transmission characteristics along that ray path, and sensor detection capabilities. This research details a collaborative AFTAC and AFIT effort to fuse worldwide weather data from a variety of sources to provide near-real-time profiles of atmospheric and cloud conditions and the resulting radiative transfer analysis for virtually any wavelength(s) of interest from source to satellite. AFIT has developed a means to model global clouds using the U.S. Air Force's World Wide Merged Cloud Analysis (WWMCA) cloud data in a new toolset that enables radiance calculations through clouds from UV to RF wavelengths.
Source Region Identification for Low Latitude Whistlers (L=1.08)
NASA Astrophysics Data System (ADS)
Gokani, S. A.; Singh, R.; Maurya, A. K.; Bhaskara, V.; Cohen, M.; Kumar, S.; Lichtenberger, J.
2014-12-01
Though whistlers have been known and studied for the past century, the scientific community still strives to understand the generation and propagation mechanisms of whistlers in the very low latitude region. One of the solutions comes from locating the causative lightning discharges and the source region of low-latitude whistlers. In the present study, ~2000 whistlers recorded over one year (Dec 2010 to Jan 2011) at Allahabad (geomagnetic lat. 16.79° N; L=1.08), India, are correlated with lightning activity detected by the World Wide Lightning Location Network (WWLLN) at and around the conjugate region. About 63 % of the whistlers are correlated with lightning strikes around the conjugate region. To further confirm this correlation, the arrival azimuths of the causative sferics were determined; the obtained azimuths point towards the conjugate region of Allahabad. The characteristics of the thunderclouds generating these whistlers were examined, and clouds with a south-east alignment were found to be more prone to triggering whistler waves. The seasonal and diurnal variations of whistler parameters such as occurrence rate, power spectral density, and dispersion are also studied and explained on the basis of ionospheric conditions at low latitudes. The results open a new window for investigating the propagation mechanism of low-latitude whistlers.
Exploring the nonlinear cloud and rain equation
NASA Astrophysics Data System (ADS)
Koren, Ilan; Tziperman, Eli; Feingold, Graham
2017-01-01
Marine stratocumulus cloud decks are regarded as the reflectors of the climate system, returning a significant part of the incoming solar radiation back to space and thus cooling the atmosphere. Such clouds can exist in two stable modes, open and closed cells, for a wide range of environmental conditions. This emergent behavior of the system, and its sensitivity to aerosol and environmental properties, is captured by a set of nonlinear equations. Here, using linear stability analysis, we express the transition from a steady to a limit-cycle state analytically, showing how it depends on the model parameters. We show that the control of the droplet concentration (N), the environmental carrying capacity (H0), and the cloud recovery parameter (τ) can be linked by a single nondimensional parameter μ = √N/(ατH0), suggesting that for deeper clouds the transition from open (oscillating) to closed (stable fixed point) cells will occur at higher droplet concentration (i.e., higher aerosol loading). The analytical calculations of the possible states, and how they are affected by changes in aerosol and the environmental variables, provide an enhanced understanding of the complex interactions of clouds and rain.
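To illustrate the stated scaling, one can solve the definition of μ for the droplet concentration at which a cloud of given depth sits at the transition; the critical value μ_c and the parameter values below are placeholders, not numbers from the paper.

```python
import numpy as np

def mu(N, alpha, tau, H0):
    # Nondimensional parameter from the abstract: mu = sqrt(N)/(alpha*tau*H0)
    return np.sqrt(N) / (alpha * tau * H0)

# Fixing mu at an (illustrative) critical value mu_c and inverting gives
# N_c = (mu_c * alpha * tau * H0)**2, so the transition concentration grows
# quadratically with the carrying capacity H0: deeper clouds need higher
# aerosol loading to switch from open to closed cells.
mu_c, alpha, tau = 1.0, 1.0, 60.0       # placeholder values
for H0 in (100.0, 200.0, 400.0):
    N_c = (mu_c * alpha * tau * H0) ** 2
    print(f"H0 = {H0:6.0f}  ->  N_c = {N_c:.3g}")
```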
Tidal disruption of open clusters in their parent molecular clouds
NASA Technical Reports Server (NTRS)
Long, Kevin
1989-01-01
A simple model of tidal encounters has been applied to the problem of an open cluster in a clumpy molecular cloud. The parameters of the clumps are taken from the Blitz, Stark, and Long (1988) catalog of clumps in the Rosette molecular cloud. Encounters are modeled as impulsive, rectilinear collisions between Plummer spheres, but the tidal approximation is not invoked. Mass and binding energy changes during an encounter are computed by considering the velocity impulses given to individual stars in a random realization of a Plummer sphere. Mean rates of mass and binding energy loss are then computed by integrating over many encounters. Self-similar evolutionary calculations using these rates indicate that the disruption process is most sensitive to the cluster radius and relatively insensitive to cluster mass. The calculations indicate that clusters which are born in a cloud similar to the Rosette with a cluster radius greater than about 2.5 pc will not survive long enough to leave the cloud. The majority of clusters, however, have smaller radii and will survive the passage through their parent cloud.
NASA Astrophysics Data System (ADS)
Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.
2015-04-01
To address the challenges of effective data handling faced by Small and Medium-sized Enterprises (SMEs), a cloud-based infrastructure for accessing and processing Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To provide homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC) conformant catalogue, a consolidation module for format conversion, and an OGC-conformant ordering framework. Metadata from various EO sources, following different standards, are harvested by a Metadata Harvester, converted to the OGC-conformant Earth Observation Product standard, and inserted into the catalogue. The IDP can be accessed for search and ordering of the harvested datasets by the services implemented on the cloud infrastructure. Different land-surface services have been realised by the project partners using the implemented IDP and cloud infrastructure. Results of these are customer-ready products as well as pre-products (e.g. atmospherically corrected EO data) serving as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented: searching and downloading of the SAR data can be performed in an automated way. With the Sentinel-1 Toolbox and our own software, processing of the datasets for further use, for example for Vista's snow monitoring, which delivers input for the flood forecast services, can also be performed automatically. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented. The tests comprised automated, synchronised processing of one entire Landsat-8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems. The results, showing a performance improvement by a factor of six, proved the high flexibility and computing power of the cloud environment. To make full use of the cloud capabilities, automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources into market-ready products can thus be realised, so that growing customer needs and numbers can be satisfied without loss of accuracy or quality.
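A sketch of what the automated Sentinel-1 search step looks like against the hub's OpenSearch interface; the URL, credentials, and query string are illustrative (the hub has since been superseded by the Copernicus Data Space Ecosystem).

```python
import requests

SEARCH = "https://scihub.copernicus.eu/dhus/search"
query = ('platformname:Sentinel-1 AND producttype:GRD '
         'AND footprint:"Intersects(48.1, 11.6)"')   # point of interest

resp = requests.get(SEARCH,
                    params={"q": query, "rows": 10},
                    auth=("user", "password"))       # placeholder credentials
resp.raise_for_status()
print(resp.text[:500])   # Atom XML listing of matching products
```

Parsing the returned Atom feed for product identifiers and download links is what lets the portal chain search, download, and SAR processing without manual steps.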
A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system
NASA Astrophysics Data System (ADS)
Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.
2014-06-01
The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.
Optimizing the resource usage in Cloud based environments: the Synergy approach
NASA Astrophysics Data System (ADS)
Zangrando, L.; Llorens, V.; Sgaravatto, M.; Verlato, M.
2017-10-01
Managing resource allocation in a cloud-based data centre serving multiple virtual organizations is a challenging issue. In fact, while batch systems are able to allocate resources to different user groups according to specific shares imposed by the data centre administrator, without a static partitioning of such resources, this is not so straightforward in the most common cloud frameworks, e.g. OpenStack. In the current OpenStack implementation, it is only possible to grant fixed quotas to the different user groups, and these quotas cannot be exceeded by one group even if there are unused resources allocated to other groups. Moreover, in the existing OpenStack implementation, when no resources are available, new requests are simply rejected: it is then up to the client to re-issue the request later. The recently started EU-funded INDIGO-DataCloud project is addressing this issue through "Synergy", a new advanced scheduling service targeted at OpenStack. Synergy adopts a fair-share model for resource provisioning which guarantees that resources are distributed among users following the fair-share policies defined by the administrator, taking into account also the past usage of such resources. We present the architecture of Synergy, the status of its implementation, some preliminary results, and the foreseen evolution of the service.
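The fair-share idea can be conveyed with a toy priority function in which a group's next-in-line priority is its target share minus its decayed historical usage; this is a generic fair-share heuristic for illustration, not Synergy's actual algorithm.

```python
def priority(target_share, usage_history, decay=0.5):
    """Target share minus exponentially decayed mean of past usage;
    usage_history[0] is the most recent accounting period."""
    weighted = sum(u * decay**age for age, u in enumerate(usage_history))
    norm = sum(decay**age for age in range(len(usage_history))) or 1.0
    return target_share - weighted / norm

# Two groups with administrator-defined shares and recent usage fractions.
groups = {"cms": (0.60, [0.70, 0.50, 0.40]),
          "atlas": (0.40, [0.20, 0.30, 0.50])}

ranked = sorted(groups, key=lambda g: priority(*groups[g]), reverse=True)
print("Next group to serve:", ranked[0])   # the most under-served group
```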
Cloud-Aerosol Interaction and Its Impact on the Onset of the East Asian Summer Monsoon
NASA Technical Reports Server (NTRS)
Kim, Kyu-Myong; Lau, William K.-M.; Hsu, N. Christina; Tsay, Si-Chee
2004-01-01
The effect of aerosols from biomass burning on the early development of the East Asian monsoon is investigated using various satellite and in situ observations, including the TOMS Aerosol Index (AI), GPCP precipitation, ISCCP cloud cover, and GISS surface air temperature. Based on the TRMM fire product and mean wind fields at 850 mb, we identified the source and interaction regions of aerosols and investigated aerosol-cloud-precipitation characteristics in those regions. During March-April, northern Thailand, Myanmar, and Laos are major sources of smoke from the combustion of agricultural waste. Excessive smoke, represented by high AI, is observed especially during dry and cloud-free years. On the other hand, there is no ground source of smoke in the interaction region; most of the aerosols in this area are believed to be transported from the source region. AI appears to be correlated with more clouds and less precipitation in the interaction region, suggesting that aerosol-cloud interaction can alter the distribution of cloud and the characteristics of regional hydrology. Aerosol-induced changes in atmospheric stability and the associated circulation turn out to be very important to the pre-monsoon rainfall pattern in southern China. Prolonged biomass burning is especially effective in changing the rainfall pattern during April and May. The results suggest that excessive aerosol transported from the source region may intensify the pre-monsoon rain band over central China in May and lead to early monsoon onset.
Increasing the value of geospatial informatics with open approaches for Big Data
NASA Astrophysics Data System (ADS)
Percivall, G.; Bermudez, L. E.
2017-12-01
Open approaches to big data provide geoscientists with new capabilities to address problems of unmatched size and complexity. Consensus approaches for Big Geo Data have been addressed in multiple international workshops and testbeds organized by the Open Geospatial Consortium (OGC) in the past year. Participants came from government (NASA, ESA, USGS, NOAA, DOE); research (ORNL, NCSA, IU, JPL, CRIM, RENCI); industry (ESRI, Digital Globe, IBM, rasdaman); standards (JTC 1/NIST); and open source software communities. Results from the workshops and testbeds are documented in Testbed reports and a White Paper published by the OGC. The White Paper identifies the following set of use cases: Collection and Ingest (remotely sensed data processing, data stream processing); Prepare and Structure (SQL and NoSQL databases, data linking, feature identification); Analytics and Visualization (spatial-temporal analytics, machine learning, data exploration); and Modeling and Prediction (integrated environmental models, urban 4D models). Open implementations were developed in the Arctic Spatial Data Pilot using Discrete Global Grid Systems (DGGS) and in Testbeds using WPS and ESGF to publish climate predictions. Further development activities to advance open implementations of Big Geo Data include the following. Open Cloud Computing: avoid vendor lock-in through API interoperability and application portability. Open Source Extensions: implement geospatial data representations in projects from Apache, LocationTech, and OSGeo; investigate parallelization strategies for N-dimensional spatial data. Geospatial Data Representations: schemas to improve processing and analysis using geospatial concepts (Features, Coverages, DGGS); use geospatial encodings like NetCDF and GeoPackage. Big Linked Geodata: use linked-data methods scaled to big geodata. Analysis Ready Data: support "download as last resort" and "analytics as a service"; promote elements common to "datacubes".
Impact of Aerosols on Convective Clouds and Precipitation
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Chen, Jen-Ping; Li, Zhanqing; Wang, Chien; Zhang, Chidong; Li, Xiaowen
2012-01-01
Aerosols are a critical factor in the atmospheric hydrological cycle and radiation budget. As a major agent for clouds to form and a significant attenuator of solar radiation, aerosols affect climate in several ways. Current research suggests that aerosols have a major impact on the dynamics, microphysics, and electrification properties of continental mixed-phase convective clouds. In addition, high aerosol concentrations in urban environments could affect precipitation variability by providing a significant source of cloud condensation nuclei (CCN). Such pollution effects on precipitation potentially have enormous climatic consequences, both in terms of feedbacks involving the land surface via rainfall and the surface energy budget, and changes in latent heat input to the atmosphere. Basically, aerosol concentrations can influence cloud droplet size distributions, the warm-rain process, the cold-rain process, cloud-top heights, the depth of the mixed-phase region, and the occurrence of lightning. Recently, many cloud-resolving models (CRMs) have been used to examine the role of aerosols in mixed-phase convective clouds. These modeling studies differ widely in terms of model configuration (two- or three-dimensional), domain size, grid spacing (150-3000 m), microphysics (two-moment bulk, simple or sophisticated spectral-bin), turbulence (1st or 1.5 order turbulent kinetic energy (TKE)), radiation, lateral boundary conditions (i.e., closed, radiative open, or cyclic), cases (isolated convection, tropical or midlatitude squall lines), and model integration time (e.g., 2.5 to 48 hours). Among these modeling studies, the most striking difference is that cumulative precipitation can either increase or decrease in response to higher concentrations of CCN. In this presentation, we review past efforts and summarize our current understanding of the effect of aerosols on convective precipitation processes. Specifically, this paper addresses the following topics: observational evidence of the effect of aerosols on precipitation processes, and results from CRM simulations. Note that this presentation is mainly based on a recent paper published in Reviews of Geophysics (Tao et al. 2012).
Evapotranspiration and cloud variability at regional sub-grid scales
NASA Astrophysics Data System (ADS)
Vila-Guerau de Arellano, Jordi; Sikma, Martin; Pedruzo-Bagazgoitia, Xabier; van Heerwaarden, Chiel; Hartogensis, Oscar; Ouwersloot, Huug
2017-04-01
In regional and global models, uncertainties arise from our incomplete understanding of the coupling between biochemical and physical processes. Representing their impact depends on our ability to calculate these processes using physically sound parameterizations, since they are unresolved at scales smaller than the grid size. More specifically over land, the coupling between evapotranspiration, turbulent transport of heat and moisture, and clouds lacks a combined representation that takes these sub-grid scale interactions into account. Our approach is based on understanding how radiation, surface exchange, turbulent transport, and moist convection interact from the leaf scale to the cloud scale. We therefore place special emphasis on plant stomatal aperture as the main regulator of CO2 assimilation and water transpiration, a key source of moisture to the atmosphere. Plant functionality is critically modulated by interactions with atmospheric conditions occurring at very short spatiotemporal scales, such as cloud radiation perturbations or water vapour turbulent fluctuations. By explicitly resolving these processes, the LES (large-eddy simulation) technique enables us to characterize and better understand the interactions between canopies and the local atmosphere, including the adaptation time of vegetation to rapid changes in atmospheric conditions driven by turbulence or the presence of cumulus clouds. Our LES experiments are based on explicitly coupling the diurnal atmospheric dynamics to a plant physiology model. Our general hypothesis is that different partitioning of direct and diffuse radiation leads to different responses of the vegetation; as a result there are changes in water use efficiencies and shifts in the partitioning of sensible and latent heat fluxes under the presence of clouds. Our presentation is as follows. First, we discuss the ability of LES to reproduce the surface energy balance, including photosynthesis and soil CO2 respiration, coupled to the dynamics of a convective boundary layer. LES results are compared with a complete set of surface and upper-air meteorological and carbon dioxide observations gathered during a representative day at the 213-meter meteorological tower at Cabauw. Second, we perform systematic numerical experiments under a wide range of background wind conditions and stomatal aperture response times. Our analysis unravels how thin clouds, characterized by lower values of cloud optical depth, have a different impact on evapotranspiration than thick clouds, due to differences in the partitioning between direct and diffuse radiation at canopy level. Related to this detailed simulation, we discuss how new instrumental techniques, e.g. scintillometry, enable us to obtain new observational insight into the coupling between clouds and vegetation. We close the presentation with open questions regarding the need to include parameterizations of these interactions at short spatiotemporal scales in regional or climate models.
NASA Astrophysics Data System (ADS)
Lee, H.-H.; Chen, S.-H.; Kleeman, M. J.; Zhang, H.; DeNero, S. P.; Joe, D. K.
2015-11-01
The source-oriented Weather Research and Forecasting chemistry model (SOWC) was modified to include warm cloud processes and applied to investigate how aerosol mixing states influence fog formation and optical properties in the atmosphere. SOWC tracks a 6-dimensional chemical variable (X, Z, Y, Size Bins, Source Types, Species) through an explicit simulation of atmospheric chemistry and physics. A source-oriented cloud condensation nuclei module was implemented into the SOWC model to simulate warm clouds using the modified two-moment Purdue Lin microphysics scheme. The Goddard shortwave and longwave radiation schemes were modified to interact with source-oriented aerosols and cloud droplets so that aerosol direct and indirect effects could be studied. The enhanced SOWC model was applied to study a fog event that occurred on 17 January 2011 in the Central Valley of California. Tule fog occurred because an atmospheric river effectively advected high moisture into the Central Valley and nighttime drainage flow brought cold air from the mountains into the valley. The SOWC model produced a reasonable liquid water path and a realistic spatial distribution and duration of the fog event. The inclusion of aerosol-radiation interaction only slightly modified simulation results, since cloud optical thickness dominated the radiation budget in fog events. The source-oriented mixture representation of particles reduced cloud droplet number relative to the internal mixture approach, which artificially coats hydrophobic particles with hygroscopic components. The fraction of aerosols activating into CCN at a supersaturation of 0.5 % in the Central Valley decreased from 94 % in the internal mixture model to 80 % in the source-oriented model. This increased surface energy flux by 3-5 W m-2 and surface temperature by as much as 0.25 K in the daytime.
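The source-oriented bookkeeping is easiest to picture as an array indexed by the six dimensions named above; the dimension sizes here are made up for illustration.

```python
import numpy as np

# (X, Z, Y, size bin, source type, species) -- the 6-D chemical variable.
nx, nz, ny = 40, 20, 40
n_bins, n_sources, n_species = 8, 5, 12
chem = np.zeros((nx, nz, ny, n_bins, n_sources, n_species), dtype=np.float32)

# Example: 3-D field of species 0 from source type 2, summed over size bins.
one_source = chem[:, :, :, :, 2, 0].sum(axis=3)

# An internal-mixture model effectively collapses the source dimension,
# which is what artificially coats hydrophobic particles with hygroscopic
# components; the source-oriented approach keeps the dimension separate.
internal_mixture = chem.sum(axis=4)
print(one_source.shape, internal_mixture.shape)
```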
The Experiment Factory: Standardizing Behavioral Experiments.
Sochat, Vanessa V; Eisenberg, Ian W; Enkavi, A Zeynep; Li, Jamie; Bissett, Patrick G; Poldrack, Russell A
2016-01-01
The administration of behavioral and experimental paradigms for psychology research is hindered by lack of a coordinated effort to develop and deploy standardized paradigms. While several frameworks (Mason and Suri, 2011; McDonnell et al., 2012; de Leeuw, 2015; Lange et al., 2015) have provided infrastructure and methods for individual research groups to develop paradigms, missing is a coordinated effort to develop paradigms linked with a system to easily deploy them. This disorganization leads to redundancy in development, divergent implementations of conceptually identical tasks, disorganized and error-prone code lacking documentation, and difficulty in replication. The ongoing reproducibility crisis in psychology and neuroscience research (Baker, 2015; Open Science Collaboration, 2015) highlights the urgency of this challenge: reproducible research in behavioral psychology is conditional on deployment of equivalent experiments. A large, accessible repository of experiments for researchers to develop collaboratively is most efficiently accomplished through an open source framework. Here we present the Experiment Factory, an open source framework for the development and deployment of web-based experiments. The modular infrastructure includes experiments, virtual machines for local or cloud deployment, and an application to drive these components and provide developers with functions and tools for further extension. We release this infrastructure with a deployment (http://www.expfactory.org) that researchers are currently using to run a set of over 80 standardized web-based experiments on Amazon Mechanical Turk. By providing open source tools for both deployment and development, this novel infrastructure holds promise to bring reproducibility to the administration of experiments, and accelerate scientific progress by providing a shared community resource of psychological paradigms.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost effective, and open-source based tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of the box to process Illumina RNA-Seq datasets.
NASA Astrophysics Data System (ADS)
Papageorgas, Panagiotis G.; Agavanakis, Kyriakos; Dogas, Ioannis; Piromalis, Dimitrios D.
2018-05-01
A cloud-based architecture is presented for the internetworking of sensors and actuators through a universal gateway, network server, and application user interface design. The proposed approach targets energy efficiency and sustainability in a holistic way, integrating an open-source test-bed prototype based on long-range, low-bandwidth wireless networking technology for sensing and actuation as the elementary block of a viable, cost-effective, and reliable solution. The prototype presented is capable of supporting both sensors and actuators, processing data locally, and transmitting the results of the imposed computations to a higher-level node. Additionally, it is combined with a service-oriented architecture and involves publish/subscribe middleware protocols and cloud technology to meet the system needs in terms of data volume and processing power. In this context, the integration of instant messaging (chat) services is demonstrated so that they can become part of an emerging global-scope ecosystem of Cyber-Physical Systems supporting a wide variety of IoT applications, with strong advantages such as usability, scalability, and security, while adopting a unified gateway design and a simple yet powerful user interface.
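A minimal sketch of the publish/subscribe step, with the gateway pushing a processed reading to a cloud broker over MQTT using the paho-mqtt 1.x client API; the broker address, topic, and payload fields are examples.

```python
import json
import paho.mqtt.client as mqtt

client = mqtt.Client()
client.connect("broker.example.org", 1883)   # placeholder broker

reading = {"node": "lora-42",                # hypothetical sensor node id
           "temperature_c": 21.7,
           "battery_v": 3.6}

# QoS 1: the broker acknowledges receipt, suiting low-bandwidth uplinks.
client.publish("site/energy/sensors", json.dumps(reading), qos=1)
client.disconnect()
```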
3D Data Acquisition Based on OpenCV for Close-Range Photogrammetry Applications
NASA Astrophysics Data System (ADS)
Jurjević, L.; Gašparović, M.
2017-05-01
Developments in cameras, computers and algorithms for 3D reconstruction of objects from images have increased the popularity of photogrammetry. Algorithms for 3D model reconstruction are now so advanced that almost anyone can make a 3D model of a photographed object. The main goal of this paper is to examine the possibility of obtaining 3D data for close-range photogrammetry applications based on open-source technologies. All steps of obtaining a 3D point cloud are covered in this paper. Special attention is given to camera calibration, for which a two-step calibration process is used. Both the presented algorithm and the accuracy of the point cloud are tested by calculating the spatial difference between reference and produced point clouds. During algorithm testing, the robustness and speed of obtaining 3D data were noted, and the usage of this and similar algorithms certainly has a lot of potential in real-time applications. That is the reason why this research can find application in architecture, spatial planning, protection of cultural heritage, forensics, mechanical engineering, traffic management, medicine and other fields.
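The paper's two-step calibration is not detailed in the abstract, but its foundation is the standard chessboard-based intrinsic calibration that OpenCV provides. The sketch below shows that standard step only; the board geometry and image paths are placeholders.

```python
# Standard chessboard intrinsic calibration with OpenCV (first step only;
# board size and image paths are placeholders).
import glob
import cv2
import numpy as np

pattern = (9, 6)                       # inner corners per chessboard row/column
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.jpg"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        # Refine corner locations to sub-pixel accuracy.
        corners = cv2.cornerSubPix(
            gray, corners, (11, 11), (-1, -1),
            (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001))
        obj_points.append(objp)
        img_points.append(corners)

# Camera matrix K and distortion coefficients feed the reconstruction pipeline.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("RMS reprojection error:", rms)
```

A low RMS reprojection error (well under a pixel) is the usual sanity check before the calibration is trusted by downstream reconstruction.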
A Low-Cost Panoramic Camera for the 3D Documentation of Contaminated Crime Scenes
NASA Astrophysics Data System (ADS)
Abate, D.; Toschi, I.; Sturdy-Colls, C.; Remondino, F.
2017-11-01
Crime scene documentation is a fundamental task which has to be undertaken in a fast, accurate and reliable way, highlighting evidence which can be further used for ensuring justice for victims and for guaranteeing the successful prosecution of perpetrators. The main focus of this paper is on the documentation of a typical crime scene and on the rapid recording of any possible contamination that could have influenced its original appearance. A 3D reconstruction of the environment is first generated by processing panoramas acquired with the low-cost Ricoh Theta 360 camera, and further analysed to highlight the potentials and limits of this emerging and consumer-grade technology. Then, a methodology is proposed for the rapid recording of changes occurring between the original and the contaminated crime scene. The approach is based on an automatic 3D feature-based data registration, followed by a cloud-to-cloud distance computation, given as input the 3D point clouds generated before and after, e.g., the misplacement of evidence. All the algorithms adopted for panorama pre-processing, photogrammetric 3D reconstruction, and 3D geometry registration and analysis are presented and are currently available in open-source or low-cost software solutions.
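The registration-then-distance workflow can be approximated with open-source tools. The sketch below uses Open3D as one possible choice (not necessarily the software used by the authors), substitutes a simple ICP refinement for their feature-based registration, and uses placeholder file names.

```python
# Register the "contaminated" scene to the original one, then compute
# cloud-to-cloud distances (Open3D; file names are placeholders).
import numpy as np
import open3d as o3d

before = o3d.io.read_point_cloud("scene_before.ply")
after = o3d.io.read_point_cloud("scene_after.ply")

# Fine alignment by point-to-point ICP; a coarse feature-based alignment,
# as in the paper, would normally precede this step.
icp = o3d.pipelines.registration.registration_icp(
    after, before, max_correspondence_distance=0.05,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
after.transform(icp.transformation)

# Cloud-to-cloud distance: large values flag moved or misplaced evidence.
dist = np.asarray(after.compute_point_cloud_distance(before))
print("mean: %.4f m, max: %.4f m" % (dist.mean(), dist.max()))
```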
3D Model of Al Zubarah Fortress in Qatar - Terrestrial Laser Scanning vs. Dense Image Matching
NASA Astrophysics Data System (ADS)
Kersten, T.; Mechelke, K.; Maziull, L.
2015-02-01
In September 2011 the fortress Al Zubarah, built in 1938 as a typical Arabic fortress and restored in 1987 as a museum, was recorded by the HafenCity University Hamburg using terrestrial laser scanning with the IMAGER 5006h and digital photogrammetry for the Qatar Museum Authority within the framework of the Qatar Islamic Archaeology and Heritage Project. One goal of the object recording was to provide detailed 2D/3D documentation of the fortress, which has been used to complete specific detailed restoration work in recent years. From the registered laser scanning point clouds several cuttings and 2D plans were generated, as well as a 3D surface model by triangle meshing. Additionally, point clouds and surface models were automatically generated from digital imagery from a Nikon D70 using the open-source software Bundler/PMVS2, the free software VisualSFM, the Autodesk web service 123D Catch beta, and the low-cost software Agisoft PhotoScan. These outputs were compared with the results from terrestrial laser scanning. The point clouds and surface models derived from imagery could not achieve the same geometrical accuracy as laser scanning (i.e. 1-2 cm).
phpMs: A PHP-Based Mass Spectrometry Utilities Library.
Collins, Andrew; Jones, Andrew R
2018-03-02
The recent establishment of cloud computing, high-throughput networking, and more versatile web standards and browsers has led to a renewed interest in web-based applications. While big data has traditionally been the domain of optimized desktop and server applications, it is now possible to store vast amounts of data and perform the necessary calculations offsite in cloud storage and computing providers, with the results visualized in a high-quality cross-platform interface via a web browser. There are a number of emerging platforms for cloud-based mass spectrometry data analysis; however, there is limited pre-existing code accessible to web developers, especially for those constrained to a shared hosting environment where Java and C applications are often forbidden by the hosting provider. To remedy this, we provide an open-source mass spectrometry library for one of the most commonly used web development languages, PHP. Our new library, phpMs, provides objects for storing and manipulating spectra and identification data, utilities for file reading, file writing, calculations, peptide fragmentation, and protein digestion, as well as a software interface for controlling search engines. We provide a working demonstration of some of the capabilities at http://pgb.liv.ac.uk/phpMs.
Integration of Cloud resources in the LHCb Distributed Computing
NASA Astrophysics Data System (ADS)
Úbeda García, Mario; Méndez Muñoz, Víctor; Stagni, Federico; Cabarrou, Baptiste; Rauschmayr, Nathalie; Charpentier, Philippe; Closier, Joel
2014-06-01
This contribution describes how Cloud resources have been integrated in the LHCb Distributed Computing. LHCb uses its specific DIRAC extension (LHCbDirac) as an interware for its Distributed Computing, which so far has seamlessly integrated Grid resources and computer clusters. The cloud extension of DIRAC (VMDIRAC) allows the integration of Cloud computing infrastructures. It is able to interact with multiple types of infrastructures in commercial and institutional clouds, supported by multiple interfaces (Amazon EC2, OpenNebula, OpenStack and CloudStack), and it instantiates, monitors and manages Virtual Machines running on this aggregation of Cloud resources. Moreover, specifications for institutional Cloud resources proposed by the Worldwide LHC Computing Grid (WLCG), mainly by the High Energy Physics Unix Information Exchange (HEPiX) group, have been taken into account. Several initiatives and computing resource providers in the eScience environment have already deployed IaaS in production during 2013. Keeping this in mind, the pros and cons of a cloud-based infrastructure have been studied in contrast with the current setup. As a result, this work addresses four different use cases which represent a major improvement on several levels of our infrastructure. We describe the solution implemented by LHCb for the contextualisation of the VMs based on the idea of a Cloud Site. We report on operational experience of using in production several institutional Cloud resources that are thus becoming an integral part of the LHCb Distributed Computing resources. Furthermore, we also describe the gradual migration of our Service Infrastructure towards a fully distributed architecture following the Software as a Service (SaaS) model.
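The multi-interface instantiation that VMDIRAC performs is conceptually the same abstraction that cross-cloud client libraries expose. As an illustration only (this is not VMDIRAC's code, and credentials, image and size IDs are placeholders), the sketch below uses Apache Libcloud, where swapping Provider.EC2 for Provider.OPENSTACK or Provider.CLOUDSTACK retargets another backend with the same calls.

```python
# Illustration of provider-agnostic VM instantiation with Apache Libcloud
# (not VMDIRAC's own code; credentials and IDs are placeholders).
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

cls = get_driver(Provider.EC2)      # or Provider.OPENSTACK / Provider.CLOUDSTACK
driver = cls("ACCESS_KEY", "SECRET_KEY", region="us-east-1")

sizes = driver.list_sizes()
images = driver.list_images(ex_image_ids=["ami-0123456789abcdef0"])

# Instantiate, then monitor: the same calls work across backends.
node = driver.create_node(name="worker-01", image=images[0], size=sizes[0])
for n in driver.list_nodes():
    print(n.name, n.state, n.public_ips)
```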
NASA Astrophysics Data System (ADS)
Chidburee, P.; Mills, J. P.; Miller, P. E.; Fieber, K. D.
2016-06-01
Close-range photogrammetric techniques offer a potentially low-cost approach in terms of implementation and operation for initial assessment and monitoring of landslide processes over small areas. In particular, the Structure-from-Motion (SfM) pipeline is now extensively used to help overcome many constraints of traditional digital photogrammetry, offering increased user-friendliness to nonexperts, as well as lower costs. However, a landslide monitoring approach based on the SfM technique also presents some potential drawbacks due to the difficulty in managing and processing a large volume of data in real time. This research addresses the aforementioned issues by attempting to combine a mobile device with cloud computing technology to develop a photogrammetric measurement solution as part of a monitoring system for landslide hazard analysis. The research presented here focusses on (i) the development of an Android mobile application; (ii) the implementation of SfM-based open-source software on the Amazon cloud computing web service, and (iii) performance assessment through a simulated environment using data collected at a recognized landslide test site in North Yorkshire, UK. Whilst the landslide monitoring mobile application is under development, this paper describes experiments carried out to ensure effective performance of the system in the future. Investigations presented here describe the initial assessment of a cloud-implemented approach, which is developed around the well-known VisualSFM algorithm. Results are compared to point clouds obtained from alternative SfM 3D reconstruction approaches considering a commercial software solution (Agisoft PhotoScan) and a web-based system (Autodesk 123D Catch). Investigations demonstrate that the cloud-based photogrammetric measurement system is capable of providing results of centimeter-level accuracy, evidencing its potential to provide an effective approach for quantifying and analyzing landslide hazard at a local scale.
Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing
NASA Astrophysics Data System (ADS)
Tang, Jingyin; Matyas, Corene J.
2018-02-01
Big Data in geospatial technology is a grand challenge for processing capacity. The ability to use a GIS for geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users lack the ability to migrate existing research tools to a Cloud Computing or HPC-based environment because of the incompatibility between the market-dominating ArcGIS software stack and the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server while other functions run in the native Python environment. It uses functional programming and meta-programming to dynamically construct Python code containing the actual geospatial calculations, send it to a server and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing and HPC environment with minimal or no modification. It also supports parallelizing tasks using multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
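The "construct code, send it, retrieve results" pattern can be illustrated with a tiny proxy. This sketch is a generic illustration of the meta-programming idea, not arc4nix's actual implementation; the tool name mimics an arcpy-style call, and the remote transport is replaced by a local exec() for self-containment.

```python
# Generic sketch of a meta-programming client that builds geoprocessing code
# as text and "submits" it (not arc4nix's implementation; execution is local).
class RemoteToolbox:
    """Captures attribute access and builds Python source for the server."""

    def __getattr__(self, tool):
        def call(*args):
            argtxt = ", ".join(repr(a) for a in args)
            src = f"result = {tool}({argtxt})"
            return self._submit(src)
        return call

    def _submit(self, src):
        # A real client would POST `src` to the remote geoprocessing server
        # and poll for the result; here we just execute it locally.
        scope = {"Slope": lambda raster: f"slope({raster})"}  # stand-in tool
        exec(src, scope)
        return scope["result"]

gp = RemoteToolbox()
print(gp.Slope("dem.tif"))    # prints: slope(dem.tif)
```

The client never imports the GIS stack itself, which is exactly what allows an arcpy-style script to run from a Linux node while the licensed software runs elsewhere.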
NASA Astrophysics Data System (ADS)
Lee, Hsiang-He; Chen, Shu-Hua; Kleeman, Michael J.; Zhang, Hongliang; DeNero, Steven P.; Joe, David K.
2016-07-01
The source-oriented Weather Research and Forecasting chemistry model (SOWC) was modified to include warm cloud processes and was applied to investigate how aerosol mixing states influence fog formation and optical properties in the atmosphere. SOWC tracks a 6-D chemical variable (X, Z, Y, size bins, source types, species) through an explicit simulation of atmospheric chemistry and physics. A source-oriented cloud condensation nuclei module was implemented into the SOWC model to simulate warm clouds using the modified two-moment Purdue Lin microphysics scheme. The Goddard shortwave and long-wave radiation schemes were modified to interact with source-oriented aerosols and cloud droplets so that aerosol direct and indirect effects could be studied. The enhanced SOWC model was applied to study a fog event that occurred on 17 January 2011, in the Central Valley of California. Tule fog occurred because an atmospheric river effectively advected high moisture into the Central Valley and nighttime drainage flow brought cold air from mountains into the valley. The SOWC model produced reasonable liquid water path, spatial distribution and duration of fog events. The inclusion of aerosol-radiation interaction only slightly modified simulation results since cloud optical thickness dominated the radiation budget in fog events. The source-oriented mixture representation of particles reduced cloud droplet number relative to the internal mixture approach that artificially coats hydrophobic particles with hygroscopic components. The fraction of aerosols activating into cloud condensation nuclei (CCN) at a supersaturation of 0.5 % in the Central Valley decreased from 94 % in the internal mixture model to 80 % in the source-oriented model. This increased surface energy flux by 3-5 W m-2 and surface temperature by as much as 0.25 K in the daytime.
Calibration and Field Deployment of the NSF G-V VCSEL Hygrometer
NASA Astrophysics Data System (ADS)
DiGangi, J. P.; O'Brien, A.; Diao, M.; Hamm, C.; Zhang, Q.; Beaton, S. P.; Zondlo, M. A.
2012-12-01
Cloud formation and dynamics have a significant influence on the Earth's radiative forcing budget, which illustrates the importance of clouds with respect to global climate. Therefore, an accurate understanding of the microscale processes dictating cloud formation is crucial for accurate computer modeling of global climate change. A critical tool for understanding these processes from an airborne platform is an instrument capable of measuring water vapor with both high accuracy and high temporal, and thus spatial, resolution. Our work focuses on an open-path, compact, vertical-cavity surface-emitting laser (VCSEL) absorption-based hygrometer, capable of 25 Hz temporal resolution, deployed on the NSF/NCAR Gulfstream-V aircraft platform. The open-path nature of our instrument also helps to minimize sampling artifacts. We will discuss our efforts toward achieving within 5% accuracy over 5 orders of magnitude of water vapor concentrations. This involves an intercomparison of five independent calibration methods: ice surface saturators using an oil temperature bath, solvent slush baths (e.g. chloroform/LN2, water/ice), a research-grade frost point hygrometer, static pressure experiments, and Pt-catalyzed hydrogen gas. This wide variety of available tools allows us to accurately constrain the calibrant water vapor concentrations both before and after the VCSEL hygrometer sampling chamber. For example, the mixing ratio as measured by the research-grade frost point hygrometer after the VCSEL hygrometer agreed within 2% of the mixing ratio expected from the water/ice bubbler source before the VCSEL over the temperature range -50°C to 20°C. Finally, due to the compact nature of our instrument, we are able to perform these calibrations simultaneously at the same temperatures (-80°C to 30°C) and pressures (150 mbar to 760 mbar) as sampled ambient air during a flight. This higher accuracy can significantly influence the science utilizing this data, which we will illustrate using preliminary data from our most recent field deployment, the NSF Deep Convective Clouds and Chemistry Experiment in May-June 2012.
A Modular Approach to Video Designation of Manipulation Targets for Manipulators
2014-05-12
[Figure captions recovered from extraction residue: side view of a ray going through a point cloud of a water bottle sitting on the ground; the bottom left image shows the same point cloud after processing.] The Robot Operating System (ROS), the Point Cloud Library (PCL), and OpenRAVE were used to a great extent to help promote reusability of the code developed during this work.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pais Pitta de Lacerda Ruivo, Tiago; Bernabeu Altayo, Gerard; Garzoglio, Gabriele
2014-11-11
It has been widely accepted that software virtualization has a big negative impact on high-performance computing (HPC) application performance. This work explores the potential use of InfiniBand hardware virtualization in an OpenNebula cloud towards the efficient support of MPI-based workloads. We have implemented, deployed, and tested an InfiniBand network on the FermiCloud private Infrastructure-as-a-Service (IaaS) cloud. To avoid software virtualization and thereby minimize the virtualization overhead, we employed a technique called Single Root Input/Output Virtualization (SR-IOV). Our solution spanned modifications to the Linux hypervisor as well as the OpenNebula manager. We evaluated the performance of the hardware virtualization on up to 56 virtual machines connected by up to 8 DDR InfiniBand network links, with micro-benchmarks (latency and bandwidth) as well as with an MPI-intensive application (the HPL Linpack benchmark).
NASA Astrophysics Data System (ADS)
Shameoni Niaei, M.; Kilic, Y.; Yildiran, B. E.; Yüzlükoglu, F.; Yesilyaprak, C.
2016-12-01
We describe new software (MIPS) for the analysis and image processing of meteorological satellite (Meteosat) data for an astronomical observatory. This software helps produce atmospheric forecasts (cloud, humidity, rain) from Meteosat data for robotic telescopes. MIPS uses a Python library for EUMETSAT data, aims to be completely open source, and is licensed under the GNU General Public License (GPL). MIPS is platform independent and uses h5py, NumPy, and PIL with the general-purpose, high-level programming language Python and the Qt framework.
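As a sketch of the toolchain the abstract lists (h5py, NumPy, PIL), the snippet below opens an HDF5 satellite granule and renders one channel as a quick-look image. The file name and dataset path are hypothetical placeholders; actual EUMETSAT product layouts differ.

```python
# Open an HDF5 satellite granule and render one channel to PNG
# (file name and dataset path are hypothetical placeholders).
import h5py
import numpy as np
from PIL import Image

with h5py.File("meteosat_granule.h5", "r") as f:
    f.visit(print)                            # inspect the product layout
    band = np.array(f["IR_108/image_data"])   # hypothetical dataset path

# Scale to 8-bit for a quick-look image used in cloud/humidity forecasting.
scaled = (255 * (band - band.min()) / max(np.ptp(band), 1)).astype(np.uint8)
Image.fromarray(scaled).save("quicklook.png")
```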
NASA Astrophysics Data System (ADS)
MacDonald, Alexander B.; Dadashazar, Hossein; Chuang, Patrick Y.; Crosbie, Ewan; Wang, Hailong; Wang, Zhen; Jonsson, Haflidi H.; Flagan, Richard C.; Seinfeld, John H.; Sorooshian, Armin
2018-04-01
This study uses airborne cloud water composition measurements to characterize the vertical structure of air-equivalent mass concentrations of water-soluble species in marine stratocumulus clouds off the California coast. A total of 385 cloud water samples were collected in the months of July and August between 2011 and 2016 and analyzed for water-soluble ionic and elemental composition. Three characteristic profiles emerge: (i) a reduction of concentration with in-cloud altitude for particulate species directly emitted from sources below cloud without in-cloud sources (e.g., Cl- and Na+), (ii) an increase of concentration with in-cloud altitude (e.g., NO2- and formate), and (iii) species exhibiting a peak in concentration in the middle of cloud (e.g., non-sea-salt SO42-, NO3-, and organic acids). Vertical profiles of rainout parameters such as loss frequency, lifetime, and change in concentration with respect to time show that the scavenging efficiency throughout the cloud depth depends strongly on the thickness of the cloud. Thin clouds exhibit a greater scavenging loss frequency at cloud top, while thick clouds have a greater scavenging loss frequency at cloud base. The implications of these results for treatment of wet scavenging in models are discussed.
IRAS and the Boston University Arecibo Galactic H I Survey: A catalog of cloud properties
NASA Technical Reports Server (NTRS)
Bania, Thomas M.
1992-01-01
The Infrared Astronomy Satellite (IRAS) Galactic Plane Surface Brightness Images were used to identify infrared emission associated with cool, diffuse H I clouds detected by the Boston University-Arecibo Galactic H I Survey. These clouds are associated with galactic star clusters, H II regions, and molecular clouds. Using emission-absorption experiments toward galactic H II regions, we determined the H I properties of cool H I clouds seen in absorption against the thermal continuum, including their kinematic distances. Correlations were then made between IRAS sources and these H II regions; thus some of the spatial confusion associated with the IRAS fields near the galactic plane was resolved, since the distances to these sources were known. Because we can also correlate the BU-Arecibo clouds with existing CO surveys, these results will allow us to determine the intrinsic properties of the gas (neutral and ionized atomic as well as molecular) and dust for interstellar clouds in the inner galaxy. For the IRAS-identified H II region sample, we have established the far-infrared (FIR) luminosities and galactic distribution of these sources.
ROSAT X-ray sources embedded in the rho Ophiuchi cloud core
NASA Astrophysics Data System (ADS)
Casanova, Sophie; Montmerle, Thierry; Feigelson, Eric D.; Andre, Philippe
1995-02-01
We present a deep ROSAT Position Sensitive Proportional Counter (PSPC) image of the central region of the rho Oph star-forming region. The selected area, about 35 x 35 arcmin in size, is rich with dense molecular cores and young stellar objects (YSOs). Fifty-five reliable X-ray sources are detected (and up to 50 more candidates may be present) above approximately 1 keV, doubling the number of Einstein sources in this area. These sources are cross-identified with an updated list of 88 YSOs associated with the rho Oph cloud core. A third of the reliable X-ray sources do not have optical counterparts on photographic plates. Most can be cross-identified with Class II and Class III infrared (IR) sources, which are embedded T Tauri stars, but three reliable X-ray sources and up to seven candidate sources are tentatively identified with Class I protostars. Eighteen reliable, and up to 20 candidate, X-ray sources are probably new cloud members. The overall detection rate of the bona fide cloud population is very high (73% for the Class II and Class III objects). The spatial distribution of the X-ray sources closely follows that of the molecular gas. The visual extinctions Av (estimated from near-IR data) of the ROSAT sources can be as high as 50 or more, confirming that most are embedded in the cloud core and are presumably very young. Using bolometric luminosities Lbol estimated from J magnitudes, a tight correlation between Lx and Lbol is found, similar to that seen for older T Tauri stars in the Cha I cloud: Lx approximately 10^-4 Lbol. A general relation Lx proportional to Lbol Lj seems to apply to all T Tauri-like YSOs. The near equality of the extinction in the IR J band and in the keV X-ray range implies that this relation is valid for the detected fluxes as well as for the dereddened fluxes. The X-ray luminosity function of the embedded sources in rho Oph spans a range of Lx approximately 10^28.5 to greater than or approximately 10^31.5 ergs/s and is statistically indistinguishable from that of X-ray-detected visible T Tauri stars. We estimate a total X-ray luminosity Lx,Oph greater than or approximately 6 x 10^32 ergs/s from approximately 200 X-ray sources in the cloud core, down to Lbol approximately 0.1 solar luminosity or Mstar approximately 0.3 solar mass. We discuss several consequences of in situ irradiation of molecular clouds by X-rays from embedded YSOs. These X-rays must partially ionize the inner regions of circumstellar disk coronae, possibly playing an important role in coupling magnetic fields and winds or bipolar outflows. Photon-stimulated desorption of large molecules by YSO X-rays may be partly responsible for the bright 12 micrometer halos seen in some molecular clouds.
NASA Astrophysics Data System (ADS)
Jones, Bob; Casu, Francesco
2013-04-01
The feasibility of using commercial cloud services for scientific research is of great interest to research organisations such as CERN, ESA and EMBL, to the suppliers of cloud-based services and to the national and European funding agencies. Through the Helix Nebula - the Science Cloud [1] initiative and with the support of the European Commission, these stakeholders are driving a two-year pilot phase during which procurement processes and governance issues for a framework of public/private partnership will be appraised. Three initial flagship use cases from high-energy physics, molecular biology and earth observation are being used to validate the approach, enable a cost-benefit analysis to be undertaken and prepare the next stage of the Science Cloud Strategic Plan [2] to be developed and approved. The power of Helix Nebula lies in a shared set of services for initially three very different sciences, each supporting a global community and thus building a common e-Science platform. Of particular relevance is the ESA-sponsored flagship application SuperSites Exploitation Platform (SSEP [3]) that offers the global geo-hazard community a common platform for the correlation and processing of observation data for supersites monitoring. The US-NSF EarthCube [4] and Ocean Observatories Initiative (OOI) [5] are taking a similar approach for data-intensive science. The work of Helix Nebula and its recent architecture model [6] has shown that it is technically feasible to allow publicly funded infrastructures, such as EGI [7] and GEANT [8], to interoperate with commercial cloud services. Such hybrid systems are in the interest of the existing users of publicly funded infrastructures and funding agencies because they will provide "freedom of choice" over the type of computing resources to be consumed and the manner in which they can be obtained. But to offer such freedom of choice across a spectrum of suppliers, various issues such as intellectual property, legal responsibility, service quality agreements and related issues need to be addressed. Investigating these issues is one of the goals of the Helix Nebula initiative. The next generation of researchers will put aside the historical categorisation of research as a neatly defined set of disciplines and integrate the data from different sources and instruments into complex models that are as applicable to earth observation or biomedicine as they are to high-energy physics. This aggregation of datasets and development of new models will accelerate scientific development, but will only be possible if the issues of data-intensive science described above are addressed. The culture of science has the possibility to develop with the availability of Helix Nebula as a "Science Cloud" because: • Large-scale datasets from many disciplines will be accessible • Scientists and others will be able to develop and contribute open source tools to expand the set of services available • Collaboration of scientists will take place around the on-demand availability of data, tools and services • Cross-domain research will advance at a faster pace due to the availability of a common platform. References: [1] http://www.helix-nebula.eu/ [2] http://cdsweb.cern.ch/record/1374172/files/CERN-OPEN-2011-036.pdf [3] http://www.helix-nebula.eu/index.php/helix-nebula-use-cases/uc3.html [4] http://www.nsf.gov/geo/earthcube/ [5] http://www.oceanobservatories.org/ [6] http://cdsweb.cern.ch/record/1478364/files/HelixNebula-NOTE-2012-001.pdf [7] http://www.egi.eu/ [8] http://www.geant.net/
[Porting Radiotherapy Software of Varian to Cloud Platform].
Zou, Lian; Zhang, Weisha; Liu, Xiangxiang; Xie, Zhao; Xie, Yaoqin
2017-09-30
To develop a low-cost private cloud platform for radiotherapy software. First, a private cloud platform based on OpenStack and virtual GPU hardware was built. Then, on the private cloud platform, all the Varian radiotherapy software modules were installed on virtual machines and the corresponding function configuration was completed. Finally, the software on the cloud could be accessed through a virtual desktop client. The function test results of the cloud workstation show that a cloud workstation is equivalent to an isolated physical workstation, and any client on the LAN can use the cloud workstation smoothly. The porting to the cloud platform in this study is economical and practical. The project not only improves the utilization rates of radiotherapy software, but also makes it possible for cloud computing technology to expand its applications into the field of radiation oncology.
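The provisioning step for such a cloud workstation can be sketched with the OpenStack SDK. This is a generic illustration rather than the authors' deployment scripts; the cloud name, image, flavor, and network names are placeholders, and GPU passthrough configuration is omitted.

```python
# Provision one cloud workstation on an OpenStack private cloud
# (openstacksdk; cloud/image/flavor/network names are placeholders).
import openstack

conn = openstack.connect(cloud="radiotherapy-cloud")   # defined in clouds.yaml

image = conn.compute.find_image("win10-tps-workstation")
flavor = conn.compute.find_flavor("g1.vgpu.large")     # vGPU-backed flavor
network = conn.network.find_network("lan")

server = conn.compute.create_server(
    name="tps-workstation-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, "is", server.status)
```

Once the instance is active, a virtual desktop client pointed at it behaves like the isolated physical workstation described above.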
Effect of ship-stack effluents on cloud reflectivity
NASA Technical Reports Server (NTRS)
Coakley, James A., Jr.; Bernstein, Robert L.; Durkee, Philip A.
1987-01-01
Under stable meteorological conditions the effect of ship-stack exhaust on overlying clouds was detected in daytime satellite images as an enhancement in cloud reflectivity at 3.7 micrometers. The exhaust is a source of cloud-condensation nuclei that increases the number of cloud droplets while reducing droplet size. This reduction in droplet size causes the reflectivity at 3.7 micrometers to be greater than the levels for nearby noncontaminated clouds of similar physical characteristics. The increase in droplet number causes the reflectivity at 0.63 micrometer to be significantly higher for the contaminated clouds despite the likelihood that the exhaust is a source of particles that absorb at visible wavelengths. The effect of aerosols on cloud reflectivity is expected to have a larger influence on the earth's albedo than that due to the direct scattering and absorption of sunlight by the aerosols alone.
Contrasting Cloud Composition Between Coupled and Decoupled Marine Boundary Layer Clouds
NASA Astrophysics Data System (ADS)
WANG, Z.; Mora, M.; Dadashazar, H.; MacDonald, A.; Crosbie, E.; Bates, K. H.; Coggon, M. M.; Craven, J. S.; Xian, P.; Campbell, J. R.; AzadiAghdam, M.; Woods, R. K.; Jonsson, H.; Flagan, R. C.; Seinfeld, J.; Sorooshian, A.
2016-12-01
Marine stratocumulus clouds often become decoupled from the vertical layer immediately above the ocean surface. This study contrasts cloud chemical composition between coupled and decoupled marine stratocumulus clouds. Cloud water and droplet residual particle composition were measured in clouds off the California coast during three airborne experiments in July-August of separate years (E-PEACE 2011, NiCE 2013, BOAS 2015). Decoupled clouds exhibited significantly lower overall mass concentrations in both cloud water and droplet residual particles, consistent with reduced cloud droplet number concentration and sub-cloud aerosol (Dp > 100 nm) number concentration, owing to detachment from surface sources. Non-refractory sub-micrometer aerosol measurements show that coupled clouds exhibit higher sulfate mass fractions in droplet residual particles, owing to more abundant precursor emissions from the ocean and ships. Consequently, decoupled clouds exhibited higher mass fractions of organics, nitrate, and ammonium in droplet residual particles, owing to effects of long-range transport from more distant sources. Total cloud water mass concentration in coupled clouds was dominated by sodium and chloride, and their mass fractions and concentrations exceeded those in decoupled clouds. Conversely, with the exception of sea salt constituents (e.g., Cl, Na, Mg, K), cloud water mass fractions of all species examined were higher in decoupled clouds relative to coupled clouds. These results suggest that an important variable is the extent to which clouds are coupled to the surface layer when interpreting microphysical data relevant to clouds and aerosol particles.
OpenNEX, a private-public partnership in support of the national climate assessment
NASA Astrophysics Data System (ADS)
Nemani, R. R.; Wang, W.; Michaelis, A.; Votava, P.; Ganguly, S.
2016-12-01
The NASA Earth Exchange (NEX) is a collaborative computing platform that has been developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX is funded as an enabling tool for sustaining the national climate assessment. Over the past five years, researchers have used the NEX platform and produced a number of data sets highly relevant to the National Climate Assessment. These include high-resolution climate projections using different downscaling techniques and trends in historical climate from satellite data. To enable a broader community in exploiting the above datasets, the NEX team partnered with public cloud providers to create the OpenNEX platform. OpenNEX provides ready access to NEX data holdings on a number of public cloud platforms along with pertinent analysis tools and workflows in the form of Machine Images and Docker Containers, lectures and tutorials by experts. We will showcase some of the applications of OpenNEX data and tools by the community on Amazon Web Services, Google Cloud and the NEX Sandbox.
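Ready access to the NEX holdings on public cloud storage can be sketched with an anonymous S3 listing. The bucket and prefix below ("nasanex", "NEX-DCP30/") are assumptions based on the public OpenNEX datasets on Amazon Web Services and may have changed.

```python
# Anonymously list OpenNEX data objects on AWS S3
# (bucket and prefix are assumptions and may have changed).
import boto3
from botocore import UNSIGNED
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version=UNSIGNED))
resp = s3.list_objects_v2(Bucket="nasanex", Prefix="NEX-DCP30/", MaxKeys=10)
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```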
An overview of the DII-HEP OpenStack based CMS data analysis
NASA Astrophysics Data System (ADS)
Osmani, L.; Tarkoma, S.; Eerola, P.; Komu, M.; Kortelainen, M. J.; Kraemer, O.; Lindén, T.; Toor, S.; White, J.
2015-05-01
An OpenStack-based private cloud with the Cluster File System has been built and used with both CMS analysis and Monte Carlo simulation jobs in the Datacenter Indirection Infrastructure for Secure High Energy Physics (DII-HEP) project. On the cloud we run the ARC middleware, which allows running CMS applications without changes on the job submission side. Our test results indicate that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability. To manage the virtual machines (VMs) dynamically in an elastic fashion, we are testing the EMI authorization service (Argus) and the Execution Environment Service (Argus-EES). An OpenStack plugin has been developed for Argus-EES. The Host Identity Protocol (HIP) has been designed for mobile networks and provides a secure method for IP multihoming. HIP separates the end-point identifier and locator roles of the IP address, which increases network availability for the applications. Our solution leverages HIP for traffic management. This presentation gives an update on the status of the work and our lessons learned in creating an OpenStack-based cloud for HEP.
Managing competing elastic Grid and Cloud scientific computing applications using OpenNebula
NASA Astrophysics Data System (ADS)
Bagnasco, S.; Berzano, D.; Lusso, S.; Masera, M.; Vallero, S.
2015-12-01
Elastic cloud computing applications, i.e. applications that automatically scale according to computing needs, work on the ideal assumption of infinite resources. While large public cloud infrastructures may be a reasonable approximation of this condition, scientific computing centres like WLCG Grid sites usually work in a saturated regime, in which applications compete for scarce resources through queues, priorities and scheduling policies, and keeping a fraction of the computing cores idle to allow for headroom is usually not an option. In our particular environment one of the applications (a WLCG Tier-2 Grid site) is much larger than all the others and cannot autoscale easily. Nevertheless, other smaller applications can benefit from automatic elasticity; the implementation of this property in our infrastructure, based on the OpenNebula cloud stack, will be described and the very first operational experiences with a small number of strategies for timely allocation and release of resources will be discussed.
NASA Astrophysics Data System (ADS)
Li, Ming; Yin, Hongxi; Xing, Fangyuan; Wang, Jingchao; Wang, Honghuan
2016-02-01
With its features of network virtualization and resource programmability, Software Defined Optical Networking (SDON) is considered the future development trend of optical networks, provisioning more flexible, efficient and open network functions and supporting the intraconnection and interconnection of data centers. Meanwhile, a cloud platform can provide powerful computing, storage and management capabilities. In this paper, coordinating SDON with a cloud platform, a multi-domain SDON architecture based on a cloud control plane is proposed, composed of data centers with a database (DB), a path computation element (PCE), SDON controllers and an orchestrator. In addition, the structures of the multi-domain SDON orchestrator and an OpenFlow-enabled optical node are proposed to realize an effective combination of centralized and distributed management and control. Finally, functional verification and demonstration are performed through our optical experiment network.
gemcWeb: A Cloud Based Nuclear Physics Simulation Software
NASA Astrophysics Data System (ADS)
Markelon, Sam
2017-09-01
gemcWeb allows users to run nuclear physics simulations from the web. Because it is completely device agnostic, scientists can run simulations from anywhere with an Internet connection. With a full user system, gemcWeb allows users to revisit and revise their projects, and share configurations and results with collaborators. gemcWeb is based on the simulation software gemc, which is in turn based on standard Geant4. gemcWeb requires no C++, gemc, or Geant4 knowledge. A simple but powerful GUI allows users to configure their project from geometries and configurations stored on the deployment server. Simulations are then run on the server, with results posted to the user and then securely stored. Python-based and open source, the main version of gemcWeb is hosted internally at Jefferson National Laboratory and used by the CLAS12 and Electron-Ion Collider Project groups. However, as the software is open source and hosted as a GitHub repository, an instance can be deployed on the open web or on any institution's intranet. An instance can be configured to host experiments specific to an institution, and the code base can be modified by any individual or group. Special thanks to: Maurizio Ungaro, PhD, creator of gemc; Markus Diefenthaler, PhD, advisor; and Kyungseon Joo, PhD, advisor.
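A minimal sketch of the server-side flow, a web endpoint that accepts a configuration and launches a gemc run, is shown below. It is hypothetical (gemcWeb's real code lives in its GitHub repository, and the gemc command-line flags shown are assumptions), intended only to illustrate the path from GUI submission to server-side simulation.

```python
# Hypothetical sketch of a web endpoint queuing a gemc simulation run
# (not gemcWeb's actual code; the gemc invocation is assumed).
import subprocess
import uuid
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/run", methods=["POST"])
def run_simulation():
    job_id = str(uuid.uuid4())
    gcard = f"/tmp/{job_id}.gcard"
    with open(gcard, "w") as f:
        f.write(request.get_data(as_text=True))  # geometry/config from the GUI
    # Launch the simulation asynchronously; results are posted back and
    # stored once the run completes.
    subprocess.Popen(["gemc", gcard, "-N=1000"])
    return jsonify({"job": job_id, "status": "queued"})

if __name__ == "__main__":
    app.run()
```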
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.
2017-12-01
The NASA CERES project continues to provide the scientific community a wide variety of satellite-derived data products such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, and cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. CERES data are used mostly by climate modeling communities but also by a wide variety of educational institutions. To better serve our users, a web-based Ordering and Visualization Tool (OVT) was developed using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others. Due to increased demand by our own scientists, we also implemented a series of specialized functions to be used in the process of CERES data quality control (QC), such as 1- and 2-D histograms, anomalies and differences, temporal and spatial averaging, and side-by-side parameter comparison, which made the process of QC far easier and faster, but more importantly far more portable. With the integration of ground-site observed surface fluxes, we further enable the CERES project to QC the CERES computed surface fluxes. An overview of the basic functions of the CERES OVT using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
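QC functions like the 2-D histogram are straightforward to reproduce outside the OVT. The sketch below compares two flux fields with NumPy and Matplotlib using synthetic data as stand-ins for, e.g., observed versus computed TOA fluxes.

```python
# 2-D histogram QC plot comparing two flux fields
# (synthetic data as stand-ins for observed vs. computed fluxes).
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
observed = rng.normal(240.0, 40.0, 100_000)                 # W m^-2
computed = observed + rng.normal(0.0, 8.0, observed.size)   # "model" + noise

h, xedges, yedges = np.histogram2d(observed, computed, bins=100)
plt.pcolormesh(xedges, yedges, h.T, cmap="viridis")
plt.plot(xedges, xedges, "w--", lw=1)                       # 1:1 reference line
plt.xlabel("Observed flux (W m$^{-2}$)")
plt.ylabel("Computed flux (W m$^{-2}$)")
plt.colorbar(label="Counts")
plt.savefig("qc_hist2d.png", dpi=150)
```

Departures of the point cloud from the 1:1 line immediately flag biases or outliers, which is what makes this view useful for rapid QC.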
A Model Evaluation Data Set for the Tropical ARM Sites
Jakob, Christian
2008-01-15
This data set has been derived from various ARM and external data sources with the main aim of providing modelers easy access to quality controlled data for model evaluation. The data set contains highly aggregated (in time) data from a number of sources at the tropical ARM sites at Manus and Nauru. It spans the years of 1999 and 2000. The data set contains information on downward surface radiation; surface meteorology, including precipitation; atmospheric water vapor and cloud liquid water content; hydrometeor cover as a function of height; and cloud cover, cloud optical thickness and cloud top pressure information provided by the International Satellite Cloud Climatology Project (ISCCP).
NASA Astrophysics Data System (ADS)
Kuji, M.; Hagiwara, M.; Hori, M.; Shiobara, M.
2017-12-01
Shipboard observations of cloud fraction were carried out along the round-trip research cruise between East Asia and Antarctica from November 2015 to April 2016 using a whole-sky camera and a ceilometer onboard Research Vessel (R/V) Shirase. We retrieved cloud fraction from the whole-sky camera based on the brightness and color of the images, while we estimated cloud fraction from the ceilometer as a cloud frequency of occurrence. As a result, the average cloud fractions over the outward open ocean, the sea ice region, and the returning open ocean were approximately 56% (60%), 44% (64%), and 67% (72%), respectively, with the whole-sky camera (ceilometer). Comparing the daily-averaged cloud fractions from the whole-sky camera and the ceilometer, the correlation coefficient was 0.73 for the 129 match-up datasets between East Asia and Antarctica, including the sea ice region as well as the open ocean. The results are qualitatively consistent between the two observations as a whole, but there is some underestimation by the whole-sky camera compared to the ceilometer. One reason is possibly that the imager is apt to miss optically thinner clouds that can be detected by the ceilometer. On the other hand, the difference in view angles between the imager and the ceilometer possibly affects the estimation. Therefore, it is necessary to elucidate the cloud properties with detailed match-up analyses in future work. Another future task is to compare the cloud fractions with satellite observations such as MODIS cloud products. Shipboard observations in themselves are very valuable for the validation of products from satellite observation, because we do not have many validation sites over the Southern Ocean and the sea ice region in particular.
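Cloud fraction retrieval from sky-camera brightness and color is commonly done with a red-to-blue ratio test; the sketch below shows that standard technique. The threshold value and the absence of horizon masking are simplifications, and this is not necessarily the authors' exact algorithm.

```python
# Cloud fraction from a whole-sky image via the common red/blue ratio test
# (threshold and masking simplified; not the authors' exact algorithm).
import numpy as np
from PIL import Image

img = np.asarray(Image.open("allsky.jpg"), dtype=float)
r, b = img[..., 0], img[..., 2]

ratio = r / np.maximum(b, 1.0)   # clouds are whitish: ratio approaches 1
cloudy = ratio > 0.75            # clear sky is blue: ratio well below 1

cloud_fraction = 100.0 * cloudy.mean()
print(f"cloud fraction: {cloud_fraction:.1f}%")
```

The threshold choice is exactly where optically thin clouds get lost, consistent with the underestimation relative to the ceilometer noted above.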
NASA Technical Reports Server (NTRS)
Zhang, Zhibo; Platnick, Steven E.; Ackerman, Andrew S.; Cho, Hyoun-Myoung
2014-01-01
Low-level warm marine boundary layer (MBL) clouds cover large regions of Earth's surface. They have a significant role in Earth's radiative energy balance and hydrological cycle. Despite the fundamental role of low-level warm water clouds in climate, our understanding of these clouds is still limited. In particular, connections between their properties (e.g. cloud fraction, cloud water path, and cloud droplet size) and environmental factors such as aerosol loading and meteorological conditions continue to be uncertain or unknown. Modeling these clouds in climate models remains a challenging problem. As a result, the influence of aerosols on these clouds in the past and future, and the potential impacts of these clouds on global warming remain open questions leading to substantial uncertainty in climate projections. To improve our understanding of these clouds, we need continuous observations of cloud properties on both a global scale and over a long enough timescale for climate studies. At present, satellite-based remote sensing is the only means of providing such observations.
Scalable computing for evolutionary genomics.
Prins, Pjotr; Belhachemi, Dominique; Möller, Steffen; Smant, Geert
2012-01-01
Genomic data analysis in evolutionary biology is becoming so computationally intensive that analysis of multiple hypotheses and scenarios takes too long on a single desktop computer. In this chapter, we discuss techniques for scaling computations through parallelization of calculations, after giving a quick overview of advanced programming techniques. Unfortunately, parallel programming is difficult and requires special software design. The alternative, especially attractive for legacy software, is to introduce poor man's parallelization by running whole programs in parallel as separate processes, using job schedulers. Such pipelines are often deployed on bioinformatics computer clusters. Recent advances in PC virtualization have made it possible to run a full computer operating system, with all of its installed software, on top of another operating system, inside a "box," or virtual machine (VM). Such a VM can flexibly be deployed on multiple computers, in a local network, e.g., on existing desktop PCs, and even in the Cloud, to create a "virtual" computer cluster. Many bioinformatics applications in evolutionary biology can be run in parallel, running processes in one or more VMs. Here, we show how a ready-made bioinformatics VM image, named BioNode, effectively creates a computing cluster and pipeline in a few steps. This allows researchers to scale up computations from their desktop, using available hardware, anytime it is required. BioNode is based on Debian Linux and can run on networked PCs and in the Cloud. Over 200 bioinformatics and statistical software packages of interest to evolutionary biology are included, such as PAML, Muscle, MAFFT, MrBayes, and BLAST. Most of these software packages are maintained through the Debian Med project. In addition, BioNode contains convenient configuration scripts for parallelizing bioinformatics software. Where Debian Med encourages packaging free and open source bioinformatics software through one central project, BioNode encourages creating free and open source VM images, for multiple targets, through one central project. BioNode can be deployed on Windows, OSX, Linux, and in the Cloud. Next to the downloadable BioNode images, we provide tutorials online, which empower bioinformaticians to install and run BioNode in different environments, as well as information for future initiatives on creating and building such images.
The Impact of Cloud Correction on the Redistribution of Reactive Nitrogen Species
NASA Astrophysics Data System (ADS)
Pour Biazar, A.; McNider, R. T.; Doty, K.; Cameron, R.
2007-12-01
Clouds are particularly important to air quality. Yet, correct prediction of clouds in time and space remains a great challenge for air quality models. One aspect of cloud impact on air quality is the modification of photolysis reaction rates by clouds. Clouds can significantly alter the solar radiation in the wavelengths affecting the photolysis rates. Such modifications significantly impact atmospheric photochemistry and alter the chemical composition of the boundary layer. Clouds also alter the partitioning of chemical compounds by creating a new equilibrium state. Since air quality models are often used for air quality and emission reduction assessment, understanding the uncertainty caused by inaccurate cloud prediction is imperative. In this study we investigate the radiative impact of clouds in altering the partitioning of nitrogen species in emission source regions. Such alterations affect the local nitrogen budget and thereby alter the atmospheric composition within the boundary layer. We analyzed results from two model simulations: a control in which model-predicted clouds were used, and a second in which satellite-observed clouds were assimilated into the model. We use satellite-retrieved cloud transmissivity, cloud top height, and observed cloud fraction to correct photolysis rates for cloud cover in the Community Multiscale Air Quality (CMAQ) modeling system. The simulations were performed on 4- and 12-km resolution domains over Texas, extending east to Mississippi, for the period of August 24 to August 31, 2000. The results clearly indicate that not using the cloud observations in the model can drastically alter the predicted atmospheric chemical composition within the boundary layer and exaggerate or under-predict the ozone concentrations. Cloud impact is acute and more pronounced over the emission source regions and can lead to drastic errors in the model predictions of ozone and its precursors. Clouds also increased the lifetime of ozone precursors, leading to their transport out of the source regions and causing further ozone production downwind. The longer lifetimes for NOx and its transport over regions high in biogenic hydrocarbon emissions (in the eastern part of the domain) led to increased ozone production that was missing in the control simulation. An indirect impact of the clouds in the emission source areas is the alteration in the partitioning of nitrogen oxides and the impact on the nitrogen budget due to surface removal. This is caused by the disparity between the deposition velocity of NOx and the nitrates that are produced from oxidation of NOx. Under clear skies, NOx undergoes a chemical transformation and produces nitrates such as HNO3 and PAN. In the presence of thick clouds, due to the reduction in photochemical activity, nitrogen monoxide (NO) rapidly consumes ozone (O3) and produces nitrogen dioxide (NO2), while the production of HNO3 and the loss of NOx due to chemical transformation are reduced. Therefore, under clear skies there is more loss of nitrogen in the vicinity of emission sources. A detailed analysis of two emission source regions, the Houston-Galveston and New Orleans areas, will be presented. Acknowledgments. This work was accomplished under partial support from a Cooperative Agreement between the University of Alabama in Huntsville and the Minerals Management Service on Gulf of Mexico issues.
NASA Astrophysics Data System (ADS)
Gallagher, J. H. R.; Jelenak, A.; Potter, N.; Fulker, D. W.; Habermann, T.
2017-12-01
Providing data services based on cloud computing technology that are equivalent to those developed for traditional computing and storage systems is critical for successful migration to cloud-based architectures for data production, scientific analysis and storage. OPeNDAP Web-service capabilities (comprising the Data Access Protocol (DAP) specification plus open-source software for realizing DAP in servers and clients) are among the most widely deployed means for achieving data-as-a-service functionality in the Earth sciences. OPeNDAP services are especially common in traditional data center environments where servers offer access to datasets stored in (very large) file systems, and a preponderance of the source data for these services is stored in the Hierarchical Data Format Version 5 (HDF5). Three candidate architectures for serving NASA satellite Earth science HDF5 data via Hyrax running on Amazon Web Services (AWS) were developed and their performance examined for a set of representative use cases. Performance was evaluated in terms of both runtime and incurred cost. The three architectures differ in how HDF5 files are stored in the Amazon Simple Storage Service (S3) and how the Hyrax server (as an EC2 instance) retrieves their data. Results for both serial and parallel access to HDF5 data in S3 will be presented. While the study focused on HDF5 data, OPeNDAP and the Hyrax data server, the architectures are generic and the analysis can be extrapolated to many different data formats, web APIs, and data servers.
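From the client side, all three architectures look the same: a DAP URL opened remotely, with only the requested subset crossing the network. The sketch below uses xarray (with its netCDF4/DAP backend) against a hypothetical Hyrax endpoint; the URL and variable names are placeholders.

```python
# Subset a remote HDF5 granule through a Hyrax OPeNDAP endpoint
# (URL and variable names are hypothetical placeholders).
import xarray as xr

URL = "https://hyrax.example.org/opendap/granules/sample_granule.h5"

ds = xr.open_dataset(URL)           # DAP request; data are fetched lazily
subset = ds["T2M"].sel(lat=slice(30, 50), lon=slice(-130, -100))
print(subset.mean().values)         # only this subset crosses the network
```

Because DAP transfers only the constrained subset, the server-side storage layout (the variable across the three architectures) dominates the observed runtime and cost.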
Enhancing Security by System-Level Virtualization in Cloud Computing Environments
NASA Astrophysics Data System (ADS)
Sun, Dawei; Chang, Guiran; Tan, Chunguang; Wang, Xingwei
Many trends are opening up the era of cloud computing, which will reshape the IT industry. Virtualization techniques have become an indispensable ingredient of almost all cloud computing systems. Through virtual environments, a cloud provider is able to run the variety of operating systems needed by each cloud user. Virtualization can improve the reliability, security, and availability of applications by using consolidation, isolation, and fault tolerance. In addition, it is possible to balance workloads by using live migration techniques. In this paper, the definition of cloud computing is given, and then the service and deployment models are introduced. An analysis of security issues and challenges in the implementation of cloud computing is presented. Moreover, a system-level virtualization case is established to enhance the security of cloud computing environments.
NASA Astrophysics Data System (ADS)
Zheng, X.; Albrecht, B.; Jonsson, H. H.; Khelif, D.; Feingold, G.; Minnis, P.; Ayers, K.; Chuang, P.; Donaher, S.; Rossiter, D.; Ghate, V.; Ruiz-Plancarte, J.; Sun-Mack, S.
2011-09-01
Aircraft observations made off the coast of northern Chile in the Southeastern Pacific (20° S, 72° W; named Point Alpha) from 16 October to 13 November 2008 during the VAMOS Ocean-Cloud-Atmosphere-Land Study-Regional Experiment (VOCALS-REx), combined with meteorological reanalysis, satellite measurements, and radiosonde data, are used to investigate the boundary layer (BL) and aerosol-cloud-drizzle variations in this region. On days without predominant synoptic and mesoscale influences, the BL at Point Alpha was typical of a non-drizzling stratocumulus-topped BL. Entrainment rates calculated from the near-cloud-top fluxes and turbulence in the BL at Point Alpha appeared to be weaker than those in the BL over the open ocean west of Point Alpha and the BL near the coast of the northeast Pacific. The cloud liquid water path (LWP) varied between 15 g m-2 and 160 g m-2. The BL had a depth of 1140 ± 120 m and, in the absence of predominant synoptic and mesoscale influences, was generally well mixed and capped by a sharp inversion. The wind direction generally switched from southerly within the BL to northerly above the inversion. On days when a synoptic system and related mesoscale coastal circulations affected conditions at Point Alpha (29 October-4 November), a moist layer above the inversion moved over Point Alpha, and the total-water mixing ratio above the inversion was larger than that within the BL. The accumulation-mode aerosol varied from 250 to 700 cm-3 within the BL, and CCN at 0.2% supersaturation within the BL ranged between 150 and 550 cm-3. The main aerosol source at Point Alpha was horizontal advection within the BL from the south. The average cloud droplet number concentration ranged between 80 and 400 cm-3. While the mean LWP retrieved from GOES was in good agreement with the in situ measurements, the GOES-derived cloud droplet effective radius tended to be larger than that from the aircraft in situ observations near cloud top. The aerosol and cloud LWP relationship reveals that during the typical well-mixed BL days the cloud LWP increased with the CCN concentrations. On the other hand, meteorological factors and decoupling processes have large influences on the cloud LWP variation as well.
Cloud feedback mechanisms and their representation in global climate models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ceppi, Paulo; Brient, Florent; Zelinka, Mark D.; ...
2017-05-11
Cloud feedback—the change in top-of-atmosphere radiative flux resulting from the cloud response to warming—constitutes by far the largest source of uncertainty in the climate response to CO2 forcing simulated by global climate models (GCMs). In this paper, we review the main mechanisms for cloud feedbacks, and discuss their representation in climate models and the sources of intermodel spread. Global-mean cloud feedback in GCMs results from three main effects: (1) rising free-tropospheric clouds (a positive longwave effect); (2) decreasing tropical low cloud amount (a positive shortwave [SW] effect); (3) increasing high-latitude low cloud optical depth (a negative SW effect). These cloud responses simulated by GCMs are qualitatively supported by theory, high-resolution modeling, and observations. Rising high clouds are consistent with the fixed anvil temperature (FAT) hypothesis, whereby enhanced upper-tropospheric radiative cooling causes anvil cloud tops to remain at a nearly fixed temperature as the atmosphere warms. Tropical low cloud amount decreases are driven by a delicate balance between the effects of vertical turbulent fluxes, radiative cooling, large-scale subsidence, and lower-tropospheric stability on the boundary-layer moisture budget. High-latitude low cloud optical depth increases are dominated by phase changes in mixed-phase clouds. Finally, the causes of intermodel spread in cloud feedback are discussed, focusing particularly on the role of unresolved parameterized processes such as cloud microphysics, turbulence, and convection.
NASA Astrophysics Data System (ADS)
Horowitz, H. M.; Alexander, B.; Bitz, C. M.; Jaegle, L.; Burrows, S. M.
2017-12-01
In polar regions, sea ice is a major source of sea salt aerosol through lofting of saline frost flowers or blowing saline snow from the sea ice surface. Under continued climate warming, an ice-free Arctic in summer with only first-year, more saline sea ice in winter is likely. Previous work has focused on climate impacts in summer from increasing open ocean sea salt aerosol emissions following complete sea ice loss in the Arctic, with conflicting results suggesting no net radiative effect or a negative climate feedback resulting from a strong first aerosol indirect effect. However, the radiative forcing from changes to the sea ice sources of sea salt aerosol in a future, warmer climate has not previously been explored. Understanding how sea ice loss affects the Arctic climate system requires investigating both open-ocean and sea ice sources of sea salt aerosol and their potential interactions. Here, we implement a blowing snow source of sea salt aerosol into the Community Earth System Model (CESM) dynamically coupled to the latest version of the Los Alamos sea ice model (CICE5). Snow salinity is a key parameter affecting blowing snow sea salt emissions, and previous work has assumed constant regional snow salinity over sea ice. We develop a parameterization for dynamic snow salinity in the sea ice model and examine how its spatial and temporal variability impacts the production of sea salt from blowing snow. We evaluate and constrain the snow salinity parameterization using available observations. Present-day coupled CESM-CICE5 simulations of sea salt aerosol concentrations including sea ice sources are evaluated against in situ and satellite (CALIOP) observations in polar regions. We then quantify the present-day radiative forcing from the addition of blowing snow sea salt aerosol with respect to aerosol-radiation and aerosol-cloud interactions. The relative contributions of sea ice vs. open ocean sources of sea salt aerosol to radiative forcing in polar regions are discussed.
High resolution far-infrared observations of the evolved H II region M16
DOE Office of Scientific and Technical Information (OSTI.GOV)
McBreen, B.; Fazio, G.G.; Jaffe, D.T.
1982-03-01
M16 is an evolved, extremely density bounded H II region, which now consists only of a series of ionization fronts at molecular cloud boundaries. The source of ionization is the OB star cluster (NGC 6611), which is about 5 × 10^6 years old. We used the CFA/UA 102 cm balloon-borne telescope to map this region and detected three far-infrared (far-IR) sources embedded in an extended ridge of emission. Source I is an unresolved far-IR source embedded in a molecular cloud near a sharp ionization front. An H2O maser is associated with this source, but no radio continuum emission has been observed. The other two far-IR sources (II and III) are associated with ionized gas-molecular cloud interfaces, with the far-IR radiation arising from dust at the boundary heated by the OB cluster. Source II is located at the southern prominent neutral intrusion with its associated bright rims and dark "elephant trunk" globules that delineate the current progress of the ionization front into the neutral material, and Source III arises at the interface of the northern molecular cloud fragment.
Observations of SO in dark and molecular clouds
NASA Technical Reports Server (NTRS)
Rydbeck, O. E. H.; Hjalmarson, A.; Rydbeck, G.; Ellder, J.; Kollberg, E.; Irvine, W. M.
1980-01-01
The 1(0)-0(1) transition of SO at 30 GHz has been observed in several sources, including the first detection of sulfur monoxide in cold dark clouds without apparent internal energy sources. The SO transition appears to be an excellent tracer of structure in dark clouds, and the data support suggestions that self-absorption is important in determining emission profiles in such regions for large line-strength transitions. Column densities estimated from a comparison of the results for the two isotopic species indicate a high fractional abundance of SO in dark clouds.
O'Reilly-Shah, Vikas; Mackey, Sean
2016-06-03
We describe here Survalytics, a software module designed to address two broad areas of need. The first area is in the domain of surveys and app analytics: developers of mobile apps in both academic and commercial environments require information about their users, as well as how the apps are being used, to understand who their users are and how to optimally approach app development. The second area of need is in the field of ecological momentary assessment, also referred to as experience sampling: researchers in a wide variety of fields, spanning from the social sciences to psychology to clinical medicine, would like to be able to capture daily or even more frequent data from research subjects while in their natural environment. Survalytics is an open-source solution for the collection of survey responses as well as arbitrary analytic metadata from users of Android operating system apps. Surveys may be administered in any combination of one-time questions and ongoing questions. The module may be deployed as a stand-alone app for experience sampling purposes or as an add-on to existing apps. The module takes advantage of free-tier NoSQL cloud database management offered by the Amazon Web Services DynamoDB platform to package a secure, flexible, extensible data collection module. DynamoDB is capable of Health Insurance Portability and Accountability Act compliant storage of personal health information. The provided example app may be used without modification for a basic experience sampling project, and we provide example questions for daily collection of blood glucose data from study subjects. The module will help researchers in a wide variety of fields rapidly develop tailor-made Android apps for a variety of data collection purposes.
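For readers unfamiliar with the storage layer mentioned above, the following hedged sketch shows the kind of DynamoDB write such a module could perform using the standard boto3 AWS SDK; the table name and attributes are hypothetical, not taken from the Survalytics source code.

```python
# Hypothetical example of storing one survey answer in DynamoDB via boto3.
# Table and attribute names are illustrative assumptions.
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
table = dynamodb.Table("SurveyResponses")  # hypothetical table name

# One ecological-momentary-assessment response, keyed by device and time.
table.put_item(
    Item={
        "device_id": "anon-1234",            # anonymized respondent key
        "timestamp": "2016-06-03T12:00:00Z",
        "question": "blood_glucose_mg_dl",
        "answer": "104",
    }
)
```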
AWIPS II in the University Community: Unidata's efforts and capabilities of the software
NASA Astrophysics Data System (ADS)
Ramamurthy, Mohan; James, Michael
2015-04-01
The Advanced Weather Interactive Processing System, version II (AWIPS II) is a weather forecasting, display and analysis tool that is used by the National Oceanic and Atmospheric Administration/National Weather Service (NOAA/NWS) and the National Centers for Environmental Prediction (NCEP) to ingest, analyze and disseminate operational weather data. The AWIPS II software is built on a Service Oriented Architecture, takes advantage of open source software, and its design affords expandability, flexibility, and portability. Since many university meteorology programs are eager to use the same tools used by NWS forecasters, Unidata community interest in AWIPS II is high. The Unidata Program Center (UPC) has worked closely with NCEP staff during AWIPS II development in order to devise a way to make it available to the university community. The Unidata AWIPS II software was released in beta form in 2014, and it incorporates a number of key changes to the baseline U.S. National Weather Service release to process and display additional data formats and run all components in a single-server standalone configuration. In addition to making available open-source instances of the software libraries that can be downloaded and run at any university, Unidata has also deployed the data-server side of AWIPS II, known as EDEX, in the Amazon Web Services and Microsoft Azure cloud environments. In this setup, universities receive all of the data from remote cloud instances, while they only have to run the AWIPS II client, known as CAVE, to analyze and visualize the data. In this presentation, we will describe Unidata's AWIPS II efforts, including the capabilities of the software in visualizing many different types of real-time meteorological data and its myriad uses in university and other settings.
Generic Module for Collecting Data in Smart Cities
NASA Astrophysics Data System (ADS)
Martinez, A.; Ramirez, F.; Estrada, H.; Torres, L. A.
2017-09-01
The Future Internet brings new technologies into people's everyday lives, such as the Internet of Things, Cloud Computing and Big Data. All these technologies have changed the way people communicate and the way devices interact with their context, giving rise to new paradigms such as smart cities. Currently, mobile devices represent one of the main sources of information for new applications that take the user context into account, such as apps for mobility, health, or security. Several platforms have been proposed for the development of Future Internet applications; however, no generic modules can be found that implement the collection of context data from smartphones. In this research work we present a generic module that collects data from the different sensors of mobile devices and sends these data, in a standard manner, to the open FIWARE Cloud to be stored or analyzed by software tools. The proposed module enables the human-as-a-sensor approach for the FIWARE Platform.
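As a sketch of what sending context data "in a standard manner" to FIWARE can look like, the example below creates an NGSI v2 entity on an Orion Context Broker with the requests library; the broker URL and entity layout are illustrative assumptions, not the module's actual implementation.

```python
# Hedged sketch: pushing smartphone sensor readings to a FIWARE Orion
# Context Broker through the NGSI v2 API. URL and attributes are assumed.
import requests

ORION = "http://orion.example.org:1026"  # hypothetical broker endpoint

entity = {
    "id": "urn:ngsi:Smartphone:001",
    "type": "Smartphone",
    "location": {
        "type": "geo:json",
        "value": {"type": "Point", "coordinates": [-99.13, 19.43]},
    },
    "accelerometer": {"type": "Number", "value": 0.98},
}

resp = requests.post(f"{ORION}/v2/entities", json=entity)
resp.raise_for_status()  # Orion answers 201 Created on success
```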
Juicebox.js Provides a Cloud-Based Visualization System for Hi-C Data.
Robinson, James T; Turner, Douglass; Durand, Neva C; Thorvaldsdóttir, Helga; Mesirov, Jill P; Aiden, Erez Lieberman
2018-02-28
Contact mapping experiments such as Hi-C explore how genomes fold in 3D. Here, we introduce Juicebox.js, a cloud-based web application for exploring the resulting datasets. Like the original Juicebox application, Juicebox.js allows users to zoom in and out of such datasets using an interface similar to Google Earth. Juicebox.js also has many features designed to facilitate data reproducibility and sharing. Furthermore, Juicebox.js encodes the exact state of the browser in a shareable URL. Creating a public browser for a new Hi-C dataset does not require coding and can be accomplished in under a minute. The web app also makes it possible to create interactive figures online that can complement or replace ordinary journal figures. When combined with Juicer, this makes the entire process of data analysis transparent, insofar as every step from raw reads to published figure is publicly available as open source code.
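The shareable-URL idea lends itself to a compact illustration: serialize every parameter needed to reproduce the current view into a query string. The sketch below (in Python rather than the JavaScript of Juicebox.js, and with made-up parameter names) demonstrates only the concept.

```python
# Toy illustration of encoding viewer state in a shareable URL.
# Parameter names and the host are hypothetical, not Juicebox.js's own.
from urllib.parse import urlencode

state = {
    "hicUrl": "https://example.org/data/sample.hic",  # dataset location
    "locus": "8:48700000-48900000",                   # current zoom region
    "colorScale": "2000",                             # display setting
}
share_url = "https://example.org/juicebox/?" + urlencode(state)
print(share_url)  # anyone opening this URL reproduces the exact view
```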
TomoMiner and TomoMinerCloud: A software platform for large-scale subtomogram structural analysis
Frazier, Zachary; Xu, Min; Alber, Frank
2017-01-01
Cryo-electron tomography (cryoET) captures the 3D electron density distribution of macromolecular complexes in close to native state. With the rapid advance of cryoET acquisition technologies, it is possible to generate large numbers (>100,000) of subtomograms, each containing a macromolecular complex. Often, these subtomograms represent a heterogeneous sample due to variations in the structure and composition of a complex in its in situ form, or because the particles are a mixture of different complexes. In this case subtomograms must be classified. However, classification of large numbers of subtomograms is a time-intensive task and often a limiting bottleneck. This paper introduces an open source software platform, TomoMiner, for large-scale subtomogram classification, template matching, subtomogram averaging, and alignment. Its scalable and robust parallel processing allows efficient classification of tens to hundreds of thousands of subtomograms. Additionally, TomoMiner provides a pre-configured TomoMinerCloud computing service permitting users without sufficient computing resources instant access to TomoMiner's high-performance features. PMID:28552576
NASA Technical Reports Server (NTRS)
Gopalswamy, N.; Akiyama, S.; Yashiro, S.; Michalek, G.; Lepping, R. P.
2009-01-01
One of the figures (Fig. 4) in "Solar sources and geospace consequences of interplanetary magnetic Clouds observed during solar cycle 23 -- Paper 1" by Gopalswamy et al. (2008, JASTP, Vol. 70, Issues 2-4, February 2008, pp. 245-253) is incorrect because of a software error in the routine that was used to make the plot. The source positions of various magnetic cloud (MC) types are therefore not plotted correctly.
Aerosol-Cloud Interactions During Puijo Cloud Experiments - The effects of weather and local sources
NASA Astrophysics Data System (ADS)
Komppula, Mika; Portin, Harri; Leskinen, Ari; Romakkaniemi, Sami; Brus, David; Neitola, Kimmo; Hyvärinen, Antti-Pekka; Kortelainen, Aki; Hao, Liqing; Miettinen, Pasi; Jaatinen, Antti; Ahmad, Irshad; Lihavainen, Heikki; Laaksonen, Ari; Lehtinen, Kari E. J.
2013-04-01
The Puijo measurement station has provided continuous data on aerosol-cloud interactions since 2006. The station is located on top of the Puijo observation tower (306 m a.s.l., 224 m above the surrounding lake level) in Kuopio, Finland. The top of the tower is covered by cloud about 15 % of the time, offering perfect conditions for studying aerosol-cloud interactions. With a twin-inlet setup (total and interstitial inlets) we are able to separate the activated particles from the interstitial (non-activated) particles. The continuous twin-inlet measurements include aerosol size distribution, scattering and absorption. In addition, cloud droplet number and size distributions are measured continuously, together with weather parameters. During the campaigns the twin-inlet system was additionally equipped with an aerosol mass spectrometer (AMS) and a Single Particle Soot Photometer (SP-2). This way we were able to determine the differences in chemical composition between the activated and non-activated particles. Potential cloud condensation nuclei (CCN) at different supersaturations were measured with two CCN counters (CCNCs). One of the CCNCs was operated with a Differential Mobility Analyzer (DMA) to obtain size-selected CCN spectra. Additional measurements included a Hygroscopic Tandem Differential Mobility Analyzer (HTDMA) for particle hygroscopicity. Vertical wind profiles (updraft velocities) are also available from a Halo Doppler lidar for the 2011 campaign. Cloud properties (droplet number and effective radius) from the MODIS instrument onboard the Terra and Aqua satellites were retrieved and compared with the measured values. This work summarizes the two latest intensive campaigns, Puijo Cloud Experiments (PuCE) 2010 & 2011. We especially study the effect of local sources on the cloud activation behaviour of the aerosol particles. The main local sources include a paper mill, a heating plant, traffic and residential areas. The sources can be categorized and identified by wind direction. Clear changes can be seen in the aerosol and cloud properties under the influence of a local pollutant source. Differences in the chemical composition of aerosol activated to cloud droplets and of aerosol staying interstitial have also been observed. For example, the light absorption by cloud interstitial particles is higher when the wind blows from the local pollutant sources compared to a cleaner sector. This may be due to the fact that the absorptive material, e.g. fresh soot, is generally hydrophobic and therefore inhibits activation. Another point of interest is the occasional freezing conditions during the campaign (temperatures below zero), which also affect the activation behaviour. The full usage of this special data set will provide new information on the properties and differences of activating and non-activating aerosol particles, as well as on the variables affecting the activation.
OpenFDA: an innovative platform providing access to a wealth of FDA's publicly available data.
Kass-Hout, Taha A; Xu, Zhiheng; Mohebbi, Matthew; Nelsen, Hans; Baker, Adam; Levine, Jonathan; Johanson, Elaine; Bright, Roselie A
2016-05-01
The objective of openFDA is to facilitate access and use of big important Food and Drug Administration public datasets by developers, researchers, and the public through harmonization of data across disparate FDA datasets provided via application programming interfaces (APIs). Using cutting-edge technologies deployed on FDA's new public cloud computing infrastructure, openFDA provides open data for easier, faster (over 300 requests per second per process), and better access to FDA datasets; open source code and documentation shared on GitHub for open community contributions of examples, apps and ideas; and infrastructure that can be adopted for other public health big data challenges. Since its launch on June 2, 2014, openFDA has developed four APIs for drug and device adverse events, recall information for all FDA-regulated products, and drug labeling. There have been more than 20 million API calls (more than half from outside the United States), 6000 registered users, 20,000 connected Internet Protocol addresses, and dozens of new software (mobile or web) apps developed. A case study demonstrates a use of openFDA data to understand an apparent association of a drug with an adverse event. With easier and faster access to these datasets, consumers worldwide can learn more about FDA-regulated products.
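The openFDA endpoints are plain HTTPS APIs, so a query fits in a few lines; the sketch below uses the documented public drug adverse-event endpoint with an arbitrary example search term, not a query from the paper.

```python
# Querying the public openFDA drug adverse-event API.
# The search term is an arbitrary example of the API's query syntax.
import requests

url = "https://api.fda.gov/drug/event.json"
params = {
    "search": 'patient.drug.openfda.generic_name:"aspirin"',
    "limit": 1,
}
resp = requests.get(url, params=params)
resp.raise_for_status()
record = resp.json()["results"][0]
print(record["receivedate"])  # each record carries structured event fields
```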
Laboratory simulations of cumulus cloud flows explain the entrainment anomaly
NASA Astrophysics Data System (ADS)
Narasimha, Roddam; Diwan, Sourabh S.; Subrahmanyam, Duvvuri; Sreenivas, K. R.; Bhat, G. S.
2010-11-01
In the present laboratory experiments, cumulus cloud flows are simulated by starting plumes and jets subjected to off-source heat addition in amounts that are dynamically similar to latent heat release due to condensation in real clouds. The setup permits incorporation of features like atmospheric inversion layers and the active control of off-source heat addition. Herein we report, for the first time, simulation of five different cumulus cloud types (and many shapes), including three genera and three species (WMO Atlas 1987), which show striking resemblance to real clouds. It is known that the rate of entrainment in cumulus cloud flows is much less than that in classical plumes - the main reason for the failure of early entrainment models. Some of the previous studies on steady-state jets and plumes (done in a similar setup) have attributed this anomaly to the disruption of the large-scale turbulent structures upon the addition of off-source heat. We present estimates of entrainment coefficients from these measurements which show a qualitatively consistent variation with height. We propose that this explains the observed entrainment anomaly in cumulus clouds; further experiments are planned to address this question in the context of starting jets and plumes.
Design and verification of a cloud field optical simulator
NASA Technical Reports Server (NTRS)
Davis, J. M.; Cox, S. K.; Mckee, T. B.
1982-01-01
A concept and an apparatus designed to investigate the reflected and transmitted distributions of light from optically thick clouds is presented. The Cloud Field Optical Simulator (CFOS) is a laboratory device which utilizes an array of incandescent lamps as a source, simulated clouds made from cotton or styrofoam as targets, and an array of silicon photodiodes as detectors. The device allows virtually any source-target-detector geometry to be examined. Similitude between real clouds and their CFOS cotton or styrofoam counterparts is established by relying on a linear relationship between optical depth and the ratio of reflected to transmitted light for a semi-infinite layer. Comparisons of principal plane radiances observed by the CFOS with Monte Carlo computations for a water cloud at 0.7 microns show excellent agreement.
Use of ARM Products in Reanalysis Applications and IPCC Model Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Walsh, John E; Chapman, William L
2011-09-30
Year-3 of the project was spent developing an observed cloud climatology for Barrow, AK and relating the observed cloud fractions to the surface circulation patterns and locally observed winds. Armed with this information, we identified errors and sources of errors in cloud fraction simulations by numerical models in the Arctic. Specifically, we compared the cloud simulations output by the North American Regional Reanalysis (NARR) to corresponding observed cloud fractions obtained by the Department of Energy's Atmospheric Radiation Measurement (ARM) program for four mid-season months (January, April, July, and October). Reanalyses are obtained from numerical weather prediction models that are not run in real-time. Instead, a reanalysis model ingests a wide variety of historical observations for the purpose of producing a gridded dataset of many model-derived quantities that are as temporally homogeneous as possible. Therefore, reanalysis output can be used as a proxy for observations, although some biases and other errors are inevitable because of model parameterizations and observational gaps. In the observational analysis we documented the seasonality of cloudiness at the North Slope, including cloud base height and dependence on synoptic regime. We followed this with an evaluation of the associations of wind speed and direction with cloud amounts in both the observational record and the reanalysis model. The Barrow cloud fraction data show that clear conditions are most often associated with anomalous high pressure to the north of Barrow, especially in spring and early summer. Overcast skies are most commonly associated with anomalous low pressure to the south. The observational analysis shows that low, boundary layer clouds are the most common type of cloud observed at the North Slope ARM observing site. However, these near-surface clouds are a major source of errors in the NARR simulations. When compared to observations, the NARR oversimulates the fraction of low clouds during the winter months, and undersimulates the fraction of low clouds during the summer months. The NARR wind speeds at the North Slope are correlated with the observed ARM wind speeds at Barrow. The following correlations were obtained using the 3-hourly data: Jan (0.84); Apr (0.83); Jul (0.69); Oct (0.79). A negative bias (undersimulation) exists in the reanalysis wind speeds for January through July, but it is typically 3 m s-1 or less in magnitude. Overall, the magnitude of the wind vector is undersimulated approximately 74% of the time in the cold season months and 85% of the time in July, but only about half of the time in October. Wind direction biases in the model are generally small (10-20 degrees), but they are generally in the leftward-turning direction in all months. We also synthesized NARR atmospheric output into a composite analysis of the synoptic conditions that are present when the reanalysis model fails in its simulations of Arctic cloud fractions, and similarly, those conditions present when the model simulates accurate cloud fractions. Cold season errors were highest when high pressure was located north of Barrow, favoring anomalous winds and longer fetches from the northeast. In addition, larger cloud fraction biases were found on days with relatively calm winds (2-5 m/s). The most pronounced oversimulation biases associated with poorly simulated clouds occur during conditions with very low cloud-base heights (< 50 m).
In contrast, the model appears more adept at capturing cloudless conditions in the spring than in the winter, with oversimulations occurring just 5% of the time in spring compared to 20% in the winter months. During the warm season, low-level clouds are present 32% of the time with onshore flow and less than half as frequently in offshore wind conditions. Composite sea level pressure fields indicate that clear sky conditions typically result when high pressure is centered at or near Barrow, AK. Overcast days are associated with generally lower sea level pressures near the North Slope and onshore flow from the NW in most months. Warm season errors were highest when high pressure was persistent to the north of Barrow, AK. This synoptic situation results in onshore flow for the North Slope with persistent winds from the east and northeast. In these situations, the predominant climatological synoptic situation, the NARR model undersimulates summer clouds on the North Slope. In general, the NARR often fails to capture clouds in the lowest 200 meters of the atmosphere. We conclude that the cloud model parameterization fails to capture boundary layer clouds like Arctic stratus and fog, which are observed in 65% of the undersimulations. These NARR undersimulations occur most often during onshore flow environments, such as when high pressure is located north of Barrow and the prevailing winds are from the northeast. In these cases, the airflow is along a fetch of scattered sea ice and open ocean (ice concentrations between 0 and 100%). NARR treats sea ice as a binary function: grid cells are either considered a slab of ice cover or totally open ocean. We note that implementing provisions for partial sea ice concentrations in the reanalysis model may help in more accurately depicting surface moisture fluxes and associated model-derived low cloud amounts.
Metric Evaluation Pipeline for 3d Modeling of Urban Scenes
NASA Astrophysics Data System (ADS)
Bosch, M.; Leichtman, A.; Chilcott, D.; Goldberg, H.; Brown, M.
2017-05-01
Publicly available benchmark data and metric evaluation approaches have been instrumental in enabling research to advance state-of-the-art methods for remote sensing applications in urban 3D modeling. Most publicly available benchmark datasets have consisted of high resolution airborne imagery and lidar suitable for 3D modeling on a relatively modest scale. To enable research in larger scale 3D mapping, we have recently released a public benchmark dataset with multi-view commercial satellite imagery and metrics to compare 3D point clouds with lidar ground truth. We now define a more complete metric evaluation pipeline, developed as publicly available open source software, to assess semantically labeled 3D models of complex urban scenes derived from multi-view commercial satellite imagery. Evaluation metrics in our pipeline include horizontal and vertical accuracy and completeness, volumetric completeness and correctness, perceptual quality, and model simplicity. Sources of ground truth include airborne lidar and overhead imagery, and we demonstrate a semi-automated process for producing accurate ground truth shape files to characterize building footprints. We validate our current metric evaluation pipeline using 3D models produced using open source multi-view stereo methods. Data and software are made publicly available to enable further research and planned benchmarking activities.
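Two of the listed metrics, completeness and correctness, reduce to nearest-neighbour tests under a simple assumption: a point counts as matched when it lies within a distance threshold of the other cloud. The sketch below implements that reading with scipy; it is an illustration, not the released evaluation pipeline.

```python
# Hedged sketch of point cloud completeness/correctness under a distance
# threshold. This is our simplified reading, not the pipeline's exact code.
import numpy as np
from scipy.spatial import cKDTree

def completeness_correctness(model_pts, truth_pts, threshold=1.0):
    """model_pts, truth_pts: (N, 3) arrays; threshold in the data's units."""
    d_truth_to_model, _ = cKDTree(model_pts).query(truth_pts)
    d_model_to_truth, _ = cKDTree(truth_pts).query(model_pts)
    completeness = np.mean(d_truth_to_model <= threshold)  # truth recovered
    correctness = np.mean(d_model_to_truth <= threshold)   # model supported
    return completeness, correctness

# Example with synthetic data: a noisy copy of the ground truth.
rng = np.random.default_rng(0)
truth = rng.uniform(0, 100, size=(1000, 3))
model = truth + rng.normal(0, 0.3, size=truth.shape)
print(completeness_correctness(model, truth, threshold=1.0))
```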
NASA Astrophysics Data System (ADS)
Tjernström, Michael; Sotiropoulou, Georgia; Sedlar, Joseph; Achtert, Peggy; Brooks, Barbara; Brooks, Ian; Persson, Ola; Prytherch, John; Salsbury, Dominic; Shupe, Matthew; Johnston, Paul; Wolfe, Dan
2016-04-01
With more open water present in the Arctic summer, an understanding of atmospheric processes over open-water and sea-ice surfaces as summer turns into autumn and ice starts forming becomes increasingly important. The Arctic Clouds in Summer Experiment (ACSE) was conducted in a mix of open water and sea ice in the eastern Arctic along the Siberian shelf during late summer and early autumn 2014, providing detailed observations of the seasonal transition from melt to freeze. Measurements were taken over both ice-free and ice-covered surfaces, offering insight into the role of the surface state in shaping the lower troposphere and the boundary-layer conditions as summer turned into autumn. During summer, strong surface inversions persisted over sea ice, while well-mixed boundary layers capped by elevated inversions were frequent over open water. The former were often associated with advection of warm air from adjacent open-water or land surfaces, whereas the latter were due to a positive buoyancy flux from the warm ocean surface. Fog and stratus clouds often persisted over the ice, whereas low-level liquid-water clouds developed over open water. These differences largely disappeared in autumn, when mixed-phase clouds capped by elevated inversions dominated in both ice-free and ice-covered conditions. Low-level jets occurred ~20-25% of the time in both seasons. The observations indicate that these jets were typically initiated at air-mass boundaries or along the ice edge in autumn, while in summer they appeared to be inertial oscillations initiated by partial frictional decoupling as warm air was advected in over the sea ice. The start of the autumn season was related to an abrupt change in atmospheric conditions, rather than to the gradual change in solar radiation. The autumn onset appeared as a rapid cooling of the whole atmosphere, and the freeze-up followed as the warm surface lost heat to the atmosphere. While the surface type had a pronounced impact on boundary-layer structure in summer, the surface was often warmer than the atmosphere in autumn, regardless of surface type. Hence the autumn boundary-layer structure was more dependent on synoptic-scale meteorology.
Atmospheric State, Cloud Microphysics and Radiative Flux
Mace, Gerald
2008-01-15
Atmospheric thermodynamics, cloud properties, radiative fluxes and radiative heating rates for the ARM Southern Great Plains (SGP) site. The data represent a characterization of the physical state of the atmospheric column compiled on a five-minute temporal and 90 m vertical grid. Sources for this information include raw measurements, cloud property and radiative retrievals, retrievals and derived variables from other third-party sources, and radiative calculations using the derived quantities.
Using cloud computing infrastructure with CloudBioLinux, CloudMan, and Galaxy.
Afgan, Enis; Chapman, Brad; Jadan, Margita; Franke, Vedran; Taylor, James
2012-06-01
Cloud computing has revolutionized availability and access to computing and storage resources, making it possible to provision a large computational infrastructure with only a few clicks in a Web browser. However, those resources are typically provided in the form of low-level infrastructure components that need to be procured and configured before use. In this unit, we demonstrate how to utilize cloud computing resources to perform open-ended bioinformatic analyses, with fully automated management of the underlying cloud infrastructure. By combining three projects, CloudBioLinux, CloudMan, and Galaxy, into a cohesive unit, we have enabled researchers to gain access to more than 100 preconfigured bioinformatics tools and gigabytes of reference genomes on top of the flexible cloud computing infrastructure. The protocol demonstrates how to set up the available infrastructure and how to use the tools via a graphical desktop interface, a parallel command-line interface, and the Web-based Galaxy interface.
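Beyond the interfaces described in the unit, a Galaxy instance launched this way can also be driven programmatically. The hedged sketch below uses the BioBlend library (a separate project, not part of the protocol itself); the URL, API key, and input file name are placeholders.

```python
# Hedged sketch: scripting a Galaxy instance with BioBlend.
# URL, API key, and the input file are placeholders, not protocol values.
from bioblend.galaxy import GalaxyInstance

gi = GalaxyInstance(url="http://127.0.0.1:8080", key="YOUR_API_KEY")

# Create a history and upload a FASTA file into it.
history = gi.histories.create_history(name="cloud-analysis")
gi.tools.upload_file("reads.fasta", history["id"])

# List a few of the preconfigured tools available on the instance.
for tool in gi.tools.get_tools()[:5]:
    print(tool["id"], "-", tool["name"])
```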
Volcanic eruption source parameters from active and passive microwave sensors
NASA Astrophysics Data System (ADS)
Montopoli, Mario; Marzano, Frank S.; Cimini, Domenico; Mereu, Luigi
2016-04-01
It is well known in the volcanology community that precise information on the source parameters characterising an eruption is of predominant interest for the initialization of Volcanic Transport and Dispersion Models (VTDM). Source parameters of main interest are the top altitude of the volcanic plume, the flux of the mass ejected at the emission source, which is strictly related to the cloud-top altitude, the distribution of volcanic mass concentration along the vertical column, as well as the duration of the eruption and the erupted volume. Usually, the combination of a-posteriori field and numerical studies allows constraining the eruption source parameters for a given volcanic event, thus making possible the forecast of ash dispersion and deposition from future volcanic eruptions. So far, remote sensors working at visible and infrared channels (cameras and radiometers) have mainly been used to detect, track and provide estimates of the concentration content and the prevailing size of the particles propagating within the ash clouds up to several thousand kilometres from the source, as well as to check, a-posteriori, the accuracy of the VTDM outputs, thus testing the initial choice made for the source parameters. Acoustic waves (infrasound) and microwave fixed-scan radar (Voldorad) have also been used to infer source parameters. In this work we draw attention to the role of sensors operating at microwave wavelengths as complementary tools for real-time estimation of source parameters. Microwaves benefit from operability during night and day and a relatively negligible sensitivity to the presence of clouds (non-precipitating weather clouds), at the cost of limited coverage and coarser spatial resolution compared with infrared sensors. Thanks to the aforementioned advantages, the products from microwave sensors are expected to be sensitive to the whole path traversed through the tephra cloud, making microwaves particularly appealing for estimates close to the volcanic emission source. Near the source, the cloud optical thickness is expected to be large enough to induce saturation effects at the infrared sensor receiver, thus defeating the brightness temperature difference methods for ash cloud identification. In light of the above, some case studies at Eyjafjallajökull (Iceland), Etna (Italy) and Calbuco (Chile), on 5-10 May 2010, 23 November 2013 and 23 April 2015, respectively, are analysed in terms of source parameter estimates (mainly the cloud-top altitude and mass flux rate) from ground-based microwave weather radar (9.6 GHz) and satellite Low Earth Orbit microwave radiometers (50-183 GHz). A special highlight is given to the advantages and limitations of microwave-related products with respect to more conventional tools.
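One widely used form of the plume-height/mass-flux tie mentioned above is the empirical relation of Mastin et al. (2009), H = 2.00 V^0.241, with H the plume height above the vent in km and V the dense-rock-equivalent (DRE) volume flux in m^3/s. The sketch below inverts it to estimate a mass eruption rate from an observed cloud-top altitude; applying it here is our illustration, not the authors' stated retrieval method.

```python
# Inverting the Mastin et al. (2009) plume-height relation to estimate a
# mass eruption rate. The DRE density value is a typical assumption.

DRE_DENSITY = 2500.0  # kg/m^3, typical dense-rock-equivalent magma density

def mass_eruption_rate(plume_height_km: float) -> float:
    """Mass eruption rate (kg/s) from plume height above the vent (km)."""
    volume_flux = (plume_height_km / 2.00) ** (1.0 / 0.241)  # m^3/s, DRE
    return volume_flux * DRE_DENSITY

# A plume top near 8.5 km, roughly that reported for Eyjafjallajokull in
# May 2010, corresponds to a mass flux on the order of 10^6 kg/s.
print(f"{mass_eruption_rate(8.5):.2e} kg/s")
```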
NASA Astrophysics Data System (ADS)
Steer, Adam; Trenham, Claire; Druken, Kelsey; Evans, Benjamin; Wyborn, Lesley
2017-04-01
High resolution point clouds and other topology-free point data sources are widely utilised for research, management and planning activities. A key goal for research and management users is making these data and common derivatives available in a way which is seamlessly interoperable with other observed and modelled data. The Australian National Computational Infrastructure (NCI) stores point data from a range of disciplines, including terrestrial and airborne LiDAR surveys, 3D photogrammetry, airborne and ground-based geophysical observations, bathymetric observations and 4D marine tracers. These data are stored alongside a significant store of Earth systems data including climate and weather, ecology, hydrology, geoscience and satellite observations, and are available from NCI's National Environmental Research Data Interoperability Platform (NERDIP) [1]. Because of the NERDIP requirement for interoperability with gridded datasets, the data models required to store these data may not conform to the LAS/LAZ format - the widely accepted community standard for point data storage and transfer. The goal for NCI is making point data discoverable, accessible and useable in ways which allow seamless integration with Earth observation datasets and model outputs - in turn assisting researchers and decision-makers in the often-convoluted process of handling and analyzing massive point datasets. With a use-case of providing a web data service and supporting a derived product workflow, NCI has implemented and tested a web-based point cloud service using the Open Geospatial Consortium (OGC) Web Processing Service [2] as a transaction handler between a web-based client and server-side computing tools based on a native Linux operating system. Using this model, the underlying toolset for driving a data service is flexible and can take advantage of NCI's highly scalable research cloud. Present work focusses on the Point Data Abstraction Library (PDAL) [3] as a logical choice for efficiently handling LAS/LAZ-based point workflows, and on native HDF5 libraries for handling point data kept in HDF5-based structures (e.g. NetCDF4, SPDlib [4]). Points stored in database tables (e.g. postgres-pointcloud [5]) will be considered as testing continues. Visualising and exploring massive point datasets in a web browser alongside multiple datasets has been demonstrated by the entwine-3D tiles project [6]. This is a powerful interface which enables users to investigate and select appropriate data, and is also being investigated as a potential front-end to a WPS-based point data service. In this work we show preliminary results for a WPS-based point data access system, in preparation for demonstration at FOSS4G 2017, Boston (http://2017.foss4g.org/). [1] http://nci.org.au/data-collections/nerdip/ [2] http://www.opengeospatial.org/standards/wps [3] http://www.pdal.io [4] http://www.spdlib.org/doku.php [5] https://github.com/pgpointcloud/pointcloud [6] http://cesium.entwine.io
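As a sketch of the kind of server-side job a WPS request could hand to PDAL, the pipeline below crops a LAZ file to a bounding box and writes compressed output; the file names and bounds are placeholders, and this is not NCI's service code.

```python
# Hedged sketch of a PDAL pipeline run through its Python bindings.
# Input/output names and the crop bounds are illustrative placeholders.
import json
import pdal

pipeline_def = {
    "pipeline": [
        "input.laz",  # reader type inferred from the file suffix
        {
            "type": "filters.crop",
            "bounds": "([440000, 441000], [6090000, 6091000])",
        },
        {
            "type": "writers.las",
            "filename": "subset.laz",
            "compression": "laszip",
        },
    ]
}

pipeline = pdal.Pipeline(json.dumps(pipeline_def))
count = pipeline.execute()  # returns the number of points processed
print(f"{count} points written")
```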
OT1_mputman_1: ASCII: All Sky observations of Galactic CII
NASA Astrophysics Data System (ADS)
Putman, M.
2010-07-01
The Milky Way and other galaxies require a significant source of ongoing star formation fuel to explain their star formation histories. A new ubiquitous population of discrete, cold clouds has recently been discovered at the disk-halo interface of our Galaxy that could potentially provide this source of fuel. We propose to observe a small sample of these disk-halo clouds with HIFI to determine if the level of [CII] emission detected suggests they represent the cooling of warm clouds at the interface between the star-forming disk and halo. These cooling clouds are predicted by simulations of warm clouds moving into the disk-halo interface region. We target 5 clouds in this proposal for which we have high resolution HI maps and can observe the densest core of each cloud. The results of our observations will also be used to interpret the surprisingly high detections of [CII] for low HI column density clouds in the Galactic Plane by the GOT C+ Key Program, by extending the clouds probed to high-latitude environments.
Integration of XRootD into the cloud infrastructure for ALICE data analysis
NASA Astrophysics Data System (ADS)
Kompaniets, Mikhail; Shadura, Oksana; Svirin, Pavlo; Yurchenko, Volodymyr; Zarochentsev, Andrey
2015-12-01
Cloud technologies allow easy load balancing between different tasks and projects. From the viewpoint of data analysis in the ALICE experiment, a cloud makes it possible to deploy software using the CERN Virtual Machine (CernVM) and CernVM File System (CVMFS), to run different (including outdated) versions of software for long-term data preservation, and to dynamically allocate resources for different computing activities, e.g. a grid site, the ALICE Analysis Facility (AAF), and possible usage for local projects or other LHC experiments. We present a cloud solution for Tier-3 sites based on OpenStack and Ceph distributed storage with an integrated XRootD-based storage element (SE). One of the key features of the solution is that Ceph is used as the backend for the OpenStack Cinder Block Storage service and, at the same time, as the storage backend for XRootD, with the redundancy and availability of data preserved by the Ceph settings. For faster and easier OpenStack deployment, the Packstack solution was applied, which is based on the Puppet configuration management system. Ceph installation and configuration operations are structured, converted to Puppet manifests describing node configurations, and integrated into Packstack. This solution can be easily deployed, maintained and used even by small groups with limited computing resources and by small organizations, which usually lack IT support. The proposed infrastructure has been tested on two different clouds (SPbSU & BITP) and integrates successfully with the ALICE data analysis model.
GenomeVIP: a cloud platform for genomic variant discovery and interpretation
Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li
2017-01-01
Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
A Comparative Study of Point Cloud Data Collection and Processing
NASA Astrophysics Data System (ADS)
Pippin, J. E.; Matheney, M.; Gentle, J. N., Jr.; Pierce, S. A.; Fuentes-Pineda, G.
2016-12-01
Over the past decade, there has been dramatic growth in the acquisition of publicly funded high-resolution topographic data for scientific, environmental, engineering and planning purposes. These data sets are valuable for applications of interest across a large and varied user community. However, because of the large volumes of data produced by high-resolution mapping technologies and the expense of aerial data collection, it is often difficult to collect and distribute these datasets. Furthermore, the data can be technically challenging to process, requiring software and computing resources not readily available to many users. This study presents a comparison of advanced computing hardware and software used to collect and process point cloud datasets, such as LIDAR scans. Activities included implementation and testing of open source libraries and applications for point cloud data processing, such as Meshlab, Blender, PDAL, and PCL. Additionally, a suite of commercial-scale applications, Skanect and CloudCompare, was applied to raw datasets. Handheld hardware solutions, a Structure Scanner and an Xbox 360 Kinect V1, were tested for their ability to scan at three field locations. The resulting projects successfully scanned and processed subsurface karst features ranging from small stalactites to large rooms, as well as a surface waterfall feature. Outcomes support the feasibility of rapid sensing in 3D at field scales.
Photogrammetric point cloud compression for tactical networks
NASA Astrophysics Data System (ADS)
Madison, Andrew C.; Massaro, Richard D.; Wayant, Clayton D.; Anderson, John E.; Smith, Clint B.
2017-05-01
We report progress toward the development of a compression schema suitable for use in the Army's Common Operating Environment (COE) tactical network. The COE facilitates the dissemination of information across all Warfighter echelons through the establishment of data standards and networking methods that coordinate the readout and control of a multitude of sensors in a common operating environment. When integrated with a robust geospatial mapping functionality, the COE enables force tracking, remote surveillance, and heightened situational awareness for Soldiers at the tactical level. Our work establishes a point cloud compression algorithm through image-based deconstruction and photogrammetric reconstruction of three-dimensional (3D) data that is suitable for dissemination within the COE. An open source visualization toolkit was used to deconstruct 3D point cloud models based on ground mobile light detection and ranging (LiDAR) into a series of images and associated metadata that can be easily transmitted on a tactical network. Stereo photogrammetric reconstruction is then conducted on the received image stream to reveal the transmitted 3D model. The reported method boasts nominal compression ratios typically on the order of 250 while retaining tactical information and accurate georegistration. Our work advances the scope of persistent intelligence, surveillance, and reconnaissance through the development of 3D visualization and data compression techniques relevant to the tactical operations environment.
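The deconstruct/transmit/reconstruct idea can be illustrated with a much simpler stand-in: quantize a point cloud into a single depth image (the compact, image-like form that travels over the network) and recover points from it. The numpy sketch below uses an orthographic projection of our own devising; the reported schema's photogrammetric (stereo) pipeline is considerably more sophisticated.

```python
# Toy sketch of carrying a point cloud as an image. Orthographic projection
# is our simplifying assumption; it is not the photogrammetric method above.
import numpy as np

def to_depth_image(points, res=0.5):
    """Project (N, 3) points onto an XY grid, keeping the highest z per cell."""
    origin = points[:, :2].min(axis=0)
    ij = ((points[:, :2] - origin) / res).astype(int)
    depth = np.full(ij.max(axis=0) + 1, np.nan)
    for (i, j), z in zip(ij, points[:, 2]):
        if np.isnan(depth[i, j]) or z > depth[i, j]:
            depth[i, j] = z
    return depth, origin

def from_depth_image(depth, origin, res=0.5):
    """Invert the projection: one point per occupied cell, at cell centers."""
    i, j = np.nonzero(~np.isnan(depth))
    xy = np.stack([i, j], axis=1) * res + origin + res / 2
    return np.column_stack([xy, depth[i, j]])

pts = np.random.default_rng(1).uniform(0, 10, size=(5000, 3))
depth, origin = to_depth_image(pts)
recovered = from_depth_image(depth, origin)
print(pts.shape, "->", depth.shape, "->", recovered.shape)
```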
Monte Carlo simulation of photon migration in a cloud computing environment with MapReduce
Pratx, Guillem; Xing, Lei
2011-01-01
Monte Carlo simulation is considered the most reliable method for modeling photon migration in heterogeneous media. However, its widespread use is hindered by the high computational cost. The purpose of this work is to report on our implementation of a simple MapReduce method for performing fault-tolerant Monte Carlo computations in a massively parallel cloud computing environment. We ported the MC321 Monte Carlo package to Hadoop, an open-source MapReduce framework. In this implementation, Map tasks compute photon histories in parallel while a Reduce task scores photon absorption. The distributed implementation was evaluated on a commercial compute cloud. The simulation time was found to be linearly dependent on the number of photons and inversely proportional to the number of nodes. For a cluster size of 240 nodes, the simulation of 100 billion photon histories took 22 min, a 1258× speed-up compared to the single-threaded Monte Carlo program. The overall computational throughput was 85,178 photon histories per node per second, with a latency of 100 s. The distributed simulation produced the same output as the original implementation and was resilient to hardware failure: the correctness of the simulation was unaffected by the shutdown of 50% of the nodes. PMID:22191916
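The map/reduce split described above can be miniaturized in a few lines: map tasks simulate independent batches of photon histories, and the reduce step sums their absorption scores. The sketch below substitutes Python's multiprocessing for Hadoop and a homogeneous, absorption-only toy medium for the MC321 physics, so it illustrates the decomposition only.

```python
# Miniature map/reduce Monte Carlo: map = independent photon batches,
# reduce = summing absorbed weight. multiprocessing stands in for Hadoop,
# and the toy medium is our simplification, not the MC321 physics.
import random
from multiprocessing import Pool

MU_A, MU_S = 0.1, 10.0         # absorption/scattering coefficients (1/cm)
ALBEDO = MU_S / (MU_A + MU_S)  # fraction of weight surviving each step

def map_task(args):
    seed, n_photons = args
    rng = random.Random(seed)   # independent random stream per task
    absorbed = 0.0
    for _ in range(n_photons):
        weight = 1.0
        while weight > 1e-4:                     # simplified termination
            absorbed += weight * (1.0 - ALBEDO)  # score absorbed fraction
            weight *= ALBEDO                     # remaining photon weight
            if weight < 1e-2 and rng.random() < 0.5:
                break  # randomly drop low-weight photons (crude roulette)
    return absorbed

if __name__ == "__main__":
    tasks = [(seed, 1000) for seed in range(8)]
    with Pool() as pool:
        total = sum(pool.map(map_task, tasks))   # the "reduce" step
    print(f"total absorbed weight: {total:.1f}")
```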
BioBlocks: Programming Protocols in Biology Made Easier.
Gupta, Vishal; Irimia, Jesús; Pau, Iván; Rodríguez-Patón, Alfonso
2017-07-21
The methods to execute biological experiments are evolving. Affordable fluid handling robots and on-demand biology enterprises are making automating entire experiments a reality. Automation offers the benefit of high-throughput experimentation, rapid prototyping, and improved reproducibility of results. However, learning to automate and codify experiments is a difficult task as it requires programming expertise. Here, we present a web-based visual development environment called BioBlocks for describing experimental protocols in biology. It is based on Google's Blockly and Scratch, and requires little or no experience in computer programming to automate the execution of experiments. The experiments can be specified, saved, modified, and shared between multiple users in an easy manner. BioBlocks is open-source and can be customized to execute protocols on local robotic platforms or remotely, that is, in the cloud. It aims to serve as a de facto open standard for programming protocols in Biology.
Fairchild C-82 Packet Destroyed in NACA Crash Fire Tests
1952-09-21
A Fairchild C-82 Packet is purposely destroyed by researchers at the National Advisory Committee for Aeronautics (NACA) Lewis Flight Propulsion Laboratory. In response to an escalating number of transport aircraft crashes in the mid-1940s, NACA researchers undertook a decade-long investigation into a number of issues surrounding low-altitude aircraft crashes. The tests were conducted at the Ravenna Arsenal, approximately 60 miles south of the Lewis laboratory in Cleveland, Ohio. The aircraft were excess military transports from World War II. Each aircraft was guided down the runway at speeds of 80 to 105 miles per hour. It came into contact with poles which tore open the 1500-gallon fuel tanks in the wings before reaching the barriers at the end of the runway. Fuel poured from the tanks and supply lines, resulting in the spread of both liquid fuel and a large cloud of spray. Solomon Weiss developed a method of dyeing the fuel red to improve its visibility during the crashes. This red fuel cloud trailed slightly behind the skidding aircraft, then rushed forward when the aircraft stopped. The nine-crash initial phase of testing used Lockheed C-56 Lodestar and C-82 transport aircraft to identify potential ignition sources and analyze the spread of flammable materials. The researchers were able to identify different classes of ignition sources, fuel disbursement patterns, the time when a particular ignition source might appear, the rate of fire spread, cabin survival times, and deceleration rates.
A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures
NASA Astrophysics Data System (ADS)
Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.
2017-10-01
An auto-installing tool on a USB drive allows quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP Collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making a highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.
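To make the flow concrete, here is a hypothetical sketch of the unattended deployment such a USB tool implies; the package names follow standard OpenNebula packaging, while the VMDIRAC registration endpoint and site name are invented placeholders, since the paper does not publish its scripts.

    import subprocess

    def deploy_micro_cloud(vmdirac_url="https://vmdirac.example.org"):  # hypothetical URL
        """Unattended micro-cloud bring-up: install, start, then register centrally."""
        # 1) unattended install of the cloud stack (Debian-style packaging assumed)
        subprocess.run(["apt-get", "install", "-y", "opennebula", "opennebula-node"],
                       check=True)
        # 2) bring up the OpenNebula daemons
        subprocess.run(["systemctl", "enable", "--now", "opennebula"], check=True)
        # 3) announce the new site to the central VMDIRAC instance (hypothetical API)
        subprocess.run(["curl", "-X", "POST", vmdirac_url + "/register",
                        "-d", "site=micro-cloud-01"], check=True)

    if __name__ == "__main__":
        deploy_micro_cloud()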
Vertical profiling of aerosol particles and trace gases over the central Arctic Ocean during summer
NASA Astrophysics Data System (ADS)
Kupiszewski, P.; Leck, C.; Tjernström, M.; Sjogren, S.; Sedlar, J.; Graus, M.; Müller, M.; Brooks, B.; Swietlicki, E.; Norris, S.; Hansel, A.
2013-12-01
Unique measurements of vertical size-resolved aerosol particle concentrations, trace gas concentrations and meteorological data were obtained during the Arctic Summer Cloud Ocean Study (ASCOS, www.ascos.se), an International Polar Year project aimed at establishing the processes responsible for the formation and evolution of low-level clouds over the high Arctic summer pack ice. The experiment was conducted from on board the Swedish icebreaker Oden, and provided both ship- and helicopter-based measurements. This study focuses on the vertical helicopter profiles and onboard measurements obtained during a three-week period when Oden was anchored to a drifting ice floe, and sheds light on the characteristics of Arctic aerosol particles and their distribution throughout the lower atmosphere. Distinct differences in aerosol particle characteristics within defined atmospheric layers are identified. Within the lowermost couple of hundred metres, transport from the marginal ice zone (MIZ), condensational growth and cloud processing shape the aerosol population. Such influence is shown during two of the four representative periods defined in this study. At altitudes above about 1 km, long-range transport occurs frequently. However, only infrequently does large-scale subsidence descend such air masses to become entrained into the mixed layer in the high Arctic, and therefore long-range transport plumes are unlikely to directly influence low-level stratiform cloud formation. Nonetheless, such plumes can influence the radiative balance of the planetary boundary layer (PBL) by influencing the formation and evolution of higher clouds, as well as through precipitation transport of particles downwards. New particle formation was occasionally observed, particularly in the near-surface layer. We hypothesize that these ultrafine particles could originate in biological processes, both primary and secondary, within the open leads between the pack ice and/or along the MIZ. In general, local sources, in combination with upstream boundary-layer transport of precursor gases from the MIZ, are considered to constitute the origin of cloud condensation nuclei (CCN) particles and thus to be of importance for the formation of interior Arctic low-level clouds during summer, and subsequently, through cloud influences, for the melting and freezing of sea ice.
Design and verification of a cloud field optical simulator
NASA Technical Reports Server (NTRS)
Davis, J. M.; Cox, S. K.; Mckee, T. B.
1983-01-01
A concept and an apparatus designed to investigate the reflected and transmitted distributions of light from optically thick clouds are presented. The Cloud Field Optical Simulator (CFOS) is a laboratory device which utilizes an array of incandescent lamps as a source, simulated clouds made from cotton or styrofoam as targets, and an array of silicon photodiodes as detectors. The device allows virtually any source-target-detector geometry to be examined. Similitude between real clouds and their CFOS cotton or styrofoam counterparts is established by relying on a linear relationship between optical depth and the ratio of reflected to transmitted light for a semi-infinite layer. Comparisons of principal-plane radiances observed by the CFOS with Monte Carlo computations for a water cloud at 0.7 micron show excellent agreement. Initial applications of the CFOS are discussed.
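The similitude argument can be made explicit with the standard two-stream result for a conservative (non-absorbing) scattering layer, in which the reflected-to-transmitted ratio grows linearly with optical depth; the relation below is the textbook form, quoted as a plausible reading of the scaling the CFOS relies on rather than the authors' exact calibration.

    % two-stream transmittance/reflectance of a conservative layer of
    % optical depth \tau with asymmetry parameter g
    T \approx \frac{1}{1 + \tfrac{1}{2}(1 - g)\,\tau}, \qquad R = 1 - T
    \quad\Longrightarrow\quad
    \frac{R}{T} \approx \tfrac{1}{2}(1 - g)\,\tau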
2017-12-29
Oddly enough, an elongated coronal hole (the darker area near the center) seems to shape itself into a single, recognizable question mark over the period of one day (Dec. 21-22, 2017). Coronal holes are areas of open magnetic field that appear darker in extreme ultraviolet light, as seen here. These holes are the source of the streaming plasma we call the solar wind. While this exercise is akin to seeing shapes in clouds, it is fun to consider what the sun might be asking. Perhaps what the new year will bring? Guess what I am going to do next? Movies are available at https://photojournal.jpl.nasa.gov/catalog/PIA22197
Implementation and performance test of cloud platform based on Hadoop
NASA Astrophysics Data System (ADS)
Xu, Jingxian; Guo, Jianhong; Ren, Chunlan
2018-01-01
Hadoop, an open-source project of the Apache foundation, is a distributed computing framework for processing large amounts of data that is widely used in the Internet industry, so studying how to deploy a Hadoop platform and test its performance is worthwhile. This paper presents a method for building a Hadoop platform together with a method for testing the platform's performance. Experimental results show that the proposed performance-test method is effective and can measure the performance of a Hadoop platform.
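As one concrete way to run such a test (the abstract does not name its benchmark), the sketch below drives the standard TestDFSIO job that ships with Hadoop and extracts the throughput figure from its report; the jar path is an assumption that varies between distributions.

    import glob
    import subprocess

    # Locate the jobclient tests jar bundled with Hadoop (path assumes a stock
    # Apache Hadoop layout under /opt/hadoop; adjust for other distributions).
    jar = glob.glob("/opt/hadoop/share/hadoop/mapreduce/"
                    "hadoop-mapreduce-client-jobclient-*-tests.jar")[0]

    # Standard HDFS I/O benchmark: write 10 files of 128 MB each.
    subprocess.run(["hadoop", "jar", jar, "TestDFSIO",
                    "-write", "-nrFiles", "10", "-fileSize", "128MB"], check=True)

    # TestDFSIO appends its report to TestDFSIO_results.log in the working directory.
    with open("TestDFSIO_results.log") as fh:
        for line in fh:
            if "Throughput" in line:
                print(line.strip())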
RAIN: A Bio-Inspired Communication and Data Storage Infrastructure.
Monti, Matteo; Rasmussen, Steen
2017-01-01
We summarize the results and perspectives from a companion article, where we presented and evaluated an alternative architecture for data storage in distributed networks. We name the bio-inspired architecture RAIN; it offers a file storage service that, in contrast with current centralized cloud storage, has privacy by design, is open source, is more secure, is scalable, is more sustainable, has community ownership, is inexpensive, and is potentially faster, more efficient, and more reliable. We propose that a RAIN-style architecture could form the backbone of the Internet of Things, which will likely integrate multiple current and future infrastructures ranging from online services and cryptocurrency to parts of government administration.
NASA Technical Reports Server (NTRS)
Creamean, Jessie M.; White, Allen B.; Minnis, Patrick; Palikonda, Rabindra; Spangenberg, Douglas A.; Prather, Kimberly A.
2016-01-01
Ice formation in orographic mixed-phase clouds can enhance precipitation and depends on the type of aerosols that serve as ice nucleating particles (INP). The resulting precipitation from these clouds is a viable source of water, especially for regions such as the California Sierra Nevada. Thus, a better understanding of the sources of INP that impact orographic clouds is important for assessing water availability in California. This study presents a multi-site, multi-year analysis of single-particle insoluble residues in precipitation samples that likely influenced cloud ice and precipitation formation above Yosemite National Park. Dust and biological particles represented the dominant fraction of the residues (64% on average). Cloud glaciation, determined using GOES satellite observations, depended not only on high cloud tops (greater than 6.2 km) and low temperatures (less than -26 C), but also on the composition of the dust and biological residues. The greatest prevalence of ice-phase clouds occurred in conjunction with biologically-rich residues and mineral dust rich in calcium, followed by iron and aluminosilicates. Dust and biological particles are known to be efficient INP; these residues therefore likely influenced ice formation in clouds above the sites and the subsequent precipitation quantities reaching the surface during events with similar meteorology. The goal of this study is to use precipitation chemistry information to gain a better understanding of the potential sources of INP in the south-central Sierra Nevada, where cloud-aerosol-precipitation interactions are under-studied and where mixed-phase orographic clouds represent a key element in the generation of precipitation and thus the water supply in California.
Analysis, Thematic Maps and Data Mining from Point Cloud to Ontology for Software Development
NASA Astrophysics Data System (ADS)
Nespeca, R.; De Luca, L.
2016-06-01
The primary purpose of a survey for the restoration of Cultural Heritage is the interpretation of the state of building preservation. For this, the advantages of remote sensing systems that generate dense point clouds (range-based or image-based) are not limited to the acquired data alone. This paper shows that it is possible to extract very useful diagnostic information through spatial annotation, using algorithms already implemented in open-source software. Generally, the drawing of degradation maps is manual work, and therefore dependent on the subjectivity of the operator. This paper describes a method of extraction and visualization of information obtained by mathematical procedures that are quantitative, repeatable and verifiable. The case study is a part of the east facade of the Eglise collégiale Saint-Maurice, also called Notre Dame des Grâces, in Caromb, in southern France. The work was conducted on the matrix of information contained in the point cloud in ASCII format. The first result is the extraction of new geometric descriptors. First, we created digital maps of the calculated quantities. Subsequently, we moved to semi-quantitative analyses that transform the new data into useful information. We wrote algorithms for accurate selection, for segmentation of the point cloud, and for automatic calculation of the real surface area and volume. Furthermore, we created graphs of the spatial distribution of the descriptors. This work shows that by working at the data-processing stage we can transform the point cloud into an enriched database: its use, management and mining become easy, fast and effective for everyone involved in the restoration process.
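A minimal sketch of the kind of repeatable, quantitative processing described above, assuming an ASCII point cloud with one x y z row per point; the descriptor (local point density) and the threshold are illustrative choices standing in for the authors' algorithms, and the file names are hypothetical.

    import numpy as np

    # Load an ASCII point cloud: one "x y z" row per point (file name hypothetical).
    pts = np.loadtxt("facade.xyz", usecols=(0, 1, 2))

    # Descriptor: number of neighbours within radius r of each point. Brute force
    # keeps the sketch short; a k-d tree (scipy.spatial.cKDTree) is the practical
    # choice for clouds with millions of points.
    r = 0.05  # metres, illustrative
    d2 = ((pts[:, None, :] - pts[None, :, :]) ** 2).sum(axis=2)
    density = (d2 < r * r).sum(axis=1) - 1          # exclude the point itself

    # Segment by thresholding the descriptor and export the selection for mapping.
    mask = density < np.percentile(density, 10)     # sparsest 10%, e.g. eroded areas
    np.savetxt("sparse_regions.xyz", pts[mask])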
The dynamics behind Titan's methane clouds.
Mitchell, Jonathan L; Pierrehumbert, Raymond T; Frierson, Dargan M W; Caballero, Rodrigo
2006-12-05
We present results of an axisymmetric global circulation model of Titan with a simplified suite of atmospheric physics forced by seasonally varying insolation. The recent discovery of midlatitude tropospheric clouds on Titan has caused much excitement about the roles of surface sources of methane and the global circulation in forming clouds. Although localized surface sources, such as methane geysers or "cryovolcanoes," have been invoked to explain these clouds, we find in this work that clouds appear in regions of convergence by the mean meridional circulation and over the poles during solstices, where the solar forcing reaches its seasonal maximum. Other regions are inhibited from forming clouds because of dynamical transports of methane and strong subsidence. We find that for a variety of moist regimes, i.e., with the effect of methane thermodynamics included, the observed cloud features can be explained by the large-scale dynamics of the atmosphere. Clouds at the solsticial pole are found to be a robust feature of Titan's dynamics, whereas isolated midlatitude clouds are present exclusively in a variety of moist dynamical regimes. In all cases, even without including methane thermodynamics, our model ceases to produce polar clouds approximately 4-6 terrestrial years after solstices.
Crowded: a crowd-sourced perspective of events as they happen
NASA Astrophysics Data System (ADS)
Brantingham, Richard; Hossain, Aleem
2013-05-01
`Crowded' is a web-based application developed by the Defence Science & Technology Laboratory (Dstl) that collates imagery of a particular location from a variety of media sources to provide an operator with real-time situational awareness. Emergency services and other relevant agencies have detected or become aware of an event - a riot or an explosion, for instance - along with its location or text associated with it. The ubiquity of mobile devices allows people to collect and upload media of the incident to the Internet in real time. Crowded manages the interactions with online sources of media - Flickr, Instagram, YouTube, Twitter, and Transport for London traffic cameras - to retrieve imagery that is being uploaded at that point in time. In doing so, it aims to provide human operators with near-instantaneous `eyes-on' from a variety of different perspectives. The first instantiation of Crowded was implemented as a series of integrated web services with the aim of rapidly understanding whether the approach was viable. In doing so, it demonstrated how non-traditional, open sources can be used to provide a richer current intelligence picture than can be obtained from classified sources alone. The development of Crowded also explored how open-source technology and cloud-based services can be used in the modern intelligence and security environment to provide a multi-agency Common Operating Picture to help achieve a co-ordinated response. The lessons learned in building the prototype are currently being used to design and develop a second version, and to identify options and priorities for future development.
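The collation step reduces to a fan-out-and-merge over the source APIs; the sketch below illustrates that pattern with hypothetical endpoints and response fields, since each real service (Flickr, Instagram, YouTube, Twitter, the TfL cameras) has its own authenticated API.

    import concurrent.futures
    import requests

    # Hypothetical search endpoints keyed by source name (placeholders only).
    SOURCES = {
        "flickr":  "https://media.example.org/flickr/search",
        "twitter": "https://media.example.org/twitter/search",
        "youtube": "https://media.example.org/youtube/search",
    }

    def search(source, url, lat, lon):
        """Query one source for imagery uploaded near the incident just now."""
        resp = requests.get(url, params={"lat": lat, "lon": lon, "minutes": 10},
                            timeout=5)
        return [(item["timestamp"], source, item["media_url"])  # hypothetical fields
                for item in resp.json()["items"]]

    def collate(lat, lon):
        """Fan out to every source in parallel and merge the hits newest-first."""
        with concurrent.futures.ThreadPoolExecutor() as pool:
            futures = [pool.submit(search, s, u, lat, lon)
                       for s, u in SOURCES.items()]
            hits = [h for f in concurrent.futures.as_completed(futures)
                    for h in f.result()]
        return sorted(hits, reverse=True)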
Contrasting cloud composition between coupled and decoupled marine boundary layer clouds
NASA Astrophysics Data System (ADS)
Wang, Zhen; Mora Ramirez, Marco; Dadashazar, Hossein; MacDonald, Alex B.; Crosbie, Ewan; Bates, Kelvin H.; Coggon, Matthew M.; Craven, Jill S.; Lynch, Peng; Campbell, James R.; Azadi Aghdam, Mojtaba; Woods, Roy K.; Jonsson, Haflidi; Flagan, Richard C.; Seinfeld, John H.; Sorooshian, Armin
2016-10-01
Marine stratocumulus clouds often become decoupled from the vertical layer immediately above the ocean surface. This study contrasts cloud chemical composition between coupled and decoupled marine stratocumulus clouds for dissolved nonwater substances. Cloud water and droplet residual particle composition were measured in clouds off the California coast during three airborne experiments in July-August of separate years (Eastern Pacific Emitted Aerosol Cloud Experiment 2011, Nucleation in California Experiment 2013, and Biological and Oceanic Atmospheric Study 2015). Decoupled clouds exhibited significantly lower air-equivalent mass concentrations in both cloud water and droplet residual particles, consistent with reduced cloud droplet number concentration and subcloud aerosol (D_p > 100 nm) number concentration, owing to detachment from surface sources. Nonrefractory submicrometer aerosol measurements show that coupled clouds exhibit higher sulfate mass fractions in droplet residual particles, owing to more abundant precursor emissions from the ocean and ships. Consequently, decoupled clouds exhibited higher mass fractions of organics, nitrate, and ammonium in droplet residual particles, owing to effects of long-range transport from more distant sources. Sodium and chloride dominated in terms of air-equivalent concentration in cloud water for coupled clouds, and their mass fractions and concentrations exceeded those in decoupled clouds. Conversely, with the exception of sea-salt constituents (e.g., Cl, Na, Mg, and K), cloud water mass fractions of all species examined were higher in decoupled clouds relative to coupled clouds. Satellite and Navy Aerosol Analysis and Prediction System-based reanalysis data are compared with each other, and the airborne data to conclude that limitations in resolving boundary layer processes in a global model prevent it from accurately quantifying observed differences between coupled and decoupled cloud composition.
Atmospheric Science Data Center
2018-06-07
Clouds and the Earth's Radiant Energy System Flight Model 6 (CERES FM6) opened its cover on Jan. 5, 2018. The instrument is a radiometer flown by NASA/NOAA that measures the solar energy reflected by Earth, the heat the planet emits, and the role of clouds in that process.
Homomorphic encryption experiments on IBM's cloud quantum computing platform
NASA Astrophysics Data System (ADS)
Huang, He-Liang; Zhao, You-Wei; Li, Tan; Li, Feng-Guang; Du, Yu-Tao; Fu, Xiang-Qun; Zhang, Shuo; Wang, Xiang; Bao, Wan-Su
2017-02-01
Quantum computing has undergone rapid development in recent years. Owing to limitations on scalability, personal quantum computers still seem slightly unrealistic in the near future. The first practical quantum computer for ordinary users is likely to be on the cloud. However, the adoption of cloud computing is possible only if security is ensured. Homomorphic encryption is a cryptographic protocol that allows computation to be performed on encrypted data without decrypting them, so it is well suited to cloud computing. Here, we first applied homomorphic encryption on IBM's cloud quantum computer platform. In our experiments, we successfully implemented a quantum algorithm for linear equations while protecting our privacy. This demonstration opens a feasible path to the next stage of development of cloud quantum information technology.
Single-particle characterization of the high-Arctic summertime aerosol
NASA Astrophysics Data System (ADS)
Sierau, B.; Chang, R. Y.-W.; Leck, C.; Paatero, J.; Lohmann, U.
2014-07-01
Single-particle mass-spectrometric measurements were carried out in the high Arctic north of 80° during summer 2008. The campaign took place onboard the icebreaker Oden and was part of the Arctic Summer Cloud Ocean Study (ASCOS). The instrument deployed was an aerosol time-of-flight mass spectrometer (ATOFMS) that provides information on the chemical composition of individual particles and their mixing state in real time. Aerosols were sampled in the marine boundary layer at stations in the open ocean, in the marginal ice zone, and in the pack ice region. The largest fraction of particles detected for subsequent analysis in the size range of the ATOFMS between approximately 200 and 3000 nm in diameter showed mass-spectrometric patterns, indicating an internal mixing state and a biomass burning and/or biofuel source. The majority of these particles were connected to an air mass layer of elevated particle concentration mixed into the surface mixed layer from the upper part of the marine boundary layer. The second largest fraction was represented by sea salt particles. The chemical analysis of the over-ice sea salt aerosol revealed tracer compounds that reflect chemical aging of the particles during their long-range advection from the marginal ice zone, or open waters south thereof prior to detection at the ship. From our findings we conclude that long-range transport of particles is one source of aerosols in the high Arctic. To assess the importance of long-range particle sources for aerosol-cloud interactions over the inner Arctic in comparison to local and regional biogenic primary aerosol sources, the chemical composition of the detected particles was analyzed for indicators of marine biological origin. Only a minor fraction showed chemical signatures of potentially ocean-derived primary particles of that kind. However, a chemical bias in the ATOFMS's detection capabilities observed during ASCOS might suggest the presence of a particle type of unknown composition and source. In general, the study suffered from low counting statistics due to the overall small number of particles found in this pristine environment, the small sizes of the prevailing aerosol below the detection limit of the ATOFMS, and its low hit rate. To our knowledge, this study reports on the first in situ single-particle mass-spectrometric measurements in the marine boundary layer of the high-Arctic pack ice region.
NASA Astrophysics Data System (ADS)
Creamean, J.; Ault, A. P.; White, A. B.; Neiman, P. J.; Minnis, P.; Prather, K. A.
2014-12-01
Aerosols that serve as cloud condensation nuclei (CCN) and ice nuclei (IN) have the potential to profoundly influence precipitation processes. Furthermore, changes in orographic precipitation have broad implications for reservoir storage and flood risks. As part of the CalWater I field campaign (2009-2011), the impacts of aerosol sources on precipitation were investigated in the California Sierra Nevada Mountains. In 2009, the precipitation collected on the ground was influenced by both local biomass burning and long-range transported dust and biological particles; in 2010, mostly by local sources of biomass burning and pollution; and in 2011, mostly by long-range transport of dust and biological particles from distant sources. Although vast differences in the sources of residues were observed from year to year, dust and biological residues were omnipresent (on average, 55% of the total residues combined) and were associated with storms consisting of deep convective cloud systems and larger quantities of precipitation initiated in the ice phase. Further, biological residues were dominant during storms with relatively warm cloud temperatures (up to -15°C), suggesting biological components were more efficient IN than mineral dust. On the other hand, when precipitation quantities were lower, local biomass burning and pollution residues were observed (on average 31% and 9%, respectively), suggesting these residues potentially served as CCN at the base of shallow cloud systems and that lower-level polluted clouds of storm systems produced less precipitation than non-polluted (i.e., marine) clouds. The direct connection between the sources of aerosols within clouds and precipitation type and quantity can be used in models to better assess how local emissions versus long-range transported dust and biological aerosols impact regional weather and climate, ultimately with the goal of more accurate predictive weather forecast models and better water resource management.
NASA Technical Reports Server (NTRS)
Wu, Dongliang L.
2017-01-01
Clouds, ice clouds in particular, are a major source of uncertainty in climate models. Submm-wave sensors fill the sensitivity gap between MW and IR. Cloud microphysical properties (particle size and shape) account for large measurement uncertainty (200 and 40).
A Hybrid Cloud Computing Service for Earth Sciences
NASA Astrophysics Data System (ADS)
Yang, C. P.
2016-12-01
Cloud computing is becoming the norm for providing computing capabilities for advancing Earth sciences, including big Earth data management, processing, analytics, model simulations, and many other aspects. A hybrid spatiotemporal cloud computing service has been built at the George Mason NSF spatiotemporal innovation center to meet these demands. This paper will report on several aspects of the service: 1) the hardware includes 500 computing servers and close to 2 PB of storage, as well as connections to the XSEDE Jetstream and Caltech experimental cloud computing environments for sharing resources; 2) the cloud service is geographically distributed across the east coast, west coast, and central region; 3) the cloud includes private clouds managed using OpenStack and Eucalyptus, and DC2 is used to bridge these with the public AWS cloud for interoperability and for sharing computing resources when high demand surges; 4) the cloud service is used to support the NSF EarthCube program through the ECITE project, and ESIP through the ESIP cloud computing cluster, the semantics testbed cluster, and other clusters; 5) the cloud service is also available to the earth science communities for conducting geoscience research. A brief introduction on how to use the cloud service will be included.
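A minimal sketch of the burst-to-public-cloud decision that bridging tools such as DC2 automate, using the real boto3 EC2 call; the capacity check is a stub and the image/instance parameters are placeholders.

    import boto3

    def has_local_capacity(cores_needed):
        """Stub: a real check would query the private OpenStack/Eucalyptus scheduler."""
        return False  # pretend the private cloud is full, to exercise the burst path

    def provision(cores_needed):
        if has_local_capacity(cores_needed):
            print("scheduling on the private cloud")
            return
        # Burst to the public cloud: launch an EC2 instance (parameters illustrative).
        ec2 = boto3.client("ec2", region_name="us-east-1")
        resp = ec2.run_instances(ImageId="ami-0123456789abcdef0",  # hypothetical AMI
                                 InstanceType="c5.xlarge",
                                 MinCount=1, MaxCount=1)
        print("burst to AWS:", resp["Instances"][0]["InstanceId"])

    provision(cores_needed=8)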
High-resolution stable isotope signature of a land-falling atmospheric river in Southern Norway
NASA Astrophysics Data System (ADS)
Weng, Yongbiao; Sodemann, Harald
2017-04-01
Gathering observational evidence of long-range moisture versus local source contributions remains a scientific challenge, but is critical for understanding how hydrological extremes develop. Moisture transport to the west coast of Norway is often connected to elongated meridional structures of high water vapour flux known as Atmospheric Rivers. It is still an open question how well moisture sources estimated by different numerical models for such events of long-range transport correspond with reality. In this study, we present high-resolution stable isotope information collected during a land-falling Atmospheric River in Southern Norway during winter 2016, and analyse the data with the aim of differentiating between moisture source signatures and below-cloud processes affecting the stable isotope composition. The precipitation, characterised by a pronounced warm front, was sampled manually on a rooftop platform at 10-20 minute intervals during the 24 h of the event and later measured by a laser spectrometer (Picarro L2140-i) in the lab for δ18O, δD, and d-excess. Simultaneously, the stable isotope composition of water vapour was measured continuously at high resolution. To that end, ambient air was pumped from a nearby inlet at 25 m above the ground and measured by another laser spectrometer (Picarro L2130-i). The stable water isotope measurements were supplemented by detailed precipitation parameters from a laser disdrometer (OTT Parsivel2), a Micro Rain Radar (MRR-2), a Total Precipitation Sensor (TPS-3100), and a nearby weather station. The measurements show a signature of two depletion periods in the main stable isotope parameters that is not apparent in the precipitation amount and atmospheric temperature measurements. The deuterium excess in rainfall responds differently, with first an increase and then a decrease during these depletion periods. We interpret this as a combined consequence of air mass change, cloud microphysics, and below-cloud effects. Moisture sources identified during the Atmospheric River event show a clear transition that points to the need to constrain this kind of analysis with additional stable water isotope observations en route and upstream.
Scalable and Resilient Middleware to Handle Information Exchange during Environment Crisis
NASA Astrophysics Data System (ADS)
Tao, R.; Poslad, S.; Moßgraber, J.; Middleton, S.; Hammitzsch, M.
2012-04-01
The EU FP7 TRIDEC project focuses on enabling real-time, intelligent information management of collaborative, complex, critical decision processes for earth management. A key challenge is to provide a communication infrastructure that facilitates interoperable environment information services during environmental events and crises such as tsunamis and drilling incidents, during which increasing volumes and dimensionality of disparate information sources, both sensor-based and human-based, can result and need to be managed. Such a system needs to support: scalable, distributed messaging; asynchronous messaging; open messaging that handles changing clients, such as new or retired automated-system and human information sources coming online or going offline; flexible data filtering; and heterogeneous access networks (e.g., GSM, WLAN and LAN). In addition, the system needs to be resilient to ICT system failures, e.g. outright failure, degradation and overload, during environmental events. There are several system middleware choices for TRIDEC, based upon a Service-Oriented Architecture (SOA), Event-Driven Architecture (EDA), Cloud computing, and an Enterprise Service Bus (ESB). In an SOA, everything is a service (e.g. data access, processing and exchange); clients can request services on demand or subscribe to services registered by providers; interaction is more often synchronous. In an EDA system, events that represent significant changes in state can be processed simply, as streams, or in more complex ways. Cloud computing is a virtualized, interoperable and elastic resource allocation model. An ESB, a fundamental component for enterprise messaging, supports synchronous and asynchronous message exchange models and has inbuilt resilience against ICT failure. Our middleware proposal is an ESB-based hybrid architecture model: an SOA extension supports more synchronous workflows; the EDA assists the ESB in handling more complex event processing; and Cloud computing can be used to increase and decrease the ESB resources on demand. To reify this hybrid ESB-centric architecture, we will adopt two complementary approaches: an open-source one to improve scalability and resilience, and a commercial one for ultra-fast messaging, with a bridge between the two to support interoperability. In TRIDEC, to manage such a hybrid messaging system, overlay and underlay management techniques will be adopted. The managers (both global and local) will collect, store and update status information (e.g. CPU utilization, free space, number of clients) and balance usage, throughput, and delays to improve resilience and scalability. The expected resilience improvements include dynamic failover, self-healing, pre-emptive load balancing, and bottleneck prediction, while the expected scalability improvements include capacity estimation, an HTTP bridge, and automatic configuration and reconfiguration (e.g. adding or deleting clients and servers).
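A toy sketch of the topic-filtered publish/subscribe core that such a hybrid ESB exposes to its clients; a real deployment would sit on a broker with the failover and load-balancing hooks described above, so this in-process stand-in only illustrates the message-exchange pattern, and all names are invented.

    from collections import defaultdict

    class MiniBus:
        """In-process stand-in for an ESB: topic-filtered publish/subscribe."""
        def __init__(self):
            self.subscribers = defaultdict(list)   # topic -> list of callbacks

        def subscribe(self, topic, callback):
            self.subscribers[topic].append(callback)

        def publish(self, topic, message):
            # Asynchronous delivery, failover and load balancing would hook in here.
            for callback in self.subscribers[topic]:
                callback(message)

    bus = MiniBus()
    bus.subscribe("sensor/tsunami", lambda m: print("warning centre got:", m))
    bus.publish("sensor/tsunami", {"station": "buoy-07", "wave_height_m": 1.8})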
Sources and Variability of Aerosols and Aerosol-Cloud Interactions in the Arctic
NASA Astrophysics Data System (ADS)
Liu, H.; Zhang, B.; Taylor, P. C.; Moore, R.; Barahona, D.; Fairlie, T. D.; Chen, G.; Ham, S. H.; Kato, S.
2017-12-01
Arctic sea ice has significantly declined in recent decades. Understanding this decline requires understanding of the Arctic surface energy balance, of which clouds are a major driver. However, the mechanisms for the formation and evolution of clouds in the Arctic, and the roles of aerosols therein, are highly uncertain. Here we conduct data analysis and global model simulations to examine the sources and variability of aerosols and aerosol-cloud interactions in the Arctic. We use the MERRA-2 reanalysis data (2006-present) from the NASA Global Modeling and Assimilation Office (GMAO) to (1) quantify contributions of different aerosol types to the aerosol budget and aerosol optical depths in the Arctic, (2) examine aerosol distributions and variability and diagnose the major pathways for mid-latitude pollution transport to the Arctic, including their seasonal and interannual variability, and (3) characterize the distribution and variability of clouds (cloud optical depth, cloud fraction, cloud liquid and ice water path, cloud top height) in the Arctic. We compare MERRA-2 aerosol and cloud properties with those from C3M, a 3-D aerosol and cloud data product developed at NASA Langley Research Center and merged from multiple A-Train satellite (CERES, CloudSat, CALIPSO, and MODIS) observations. We also conduct perturbation experiments using the NASA GEOS-5 chemistry-climate model (with the GOCART aerosol module coupled with two-moment cloud microphysics), and discuss the roles of various types of aerosols in the formation and evolution of clouds in the Arctic.
The role of ice nuclei recycling in the maintenance of cloud ice in Arctic mixed-phase stratocumulus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Solomon, Amy; Feingold, G.; Shupe, M. D.
2015-09-25
This study investigates the maintenance of cloud ice production in Arctic mixed-phase stratocumulus in large eddy simulations that include a prognostic ice nuclei (IN) formulation and a diurnal cycle. Balances derived from a mixed-layer model and phase analyses are used to provide insight into buffering mechanisms that maintain ice in these cloud systems. We find that, for the case under investigation, IN recycling through subcloud sublimation considerably prolongs ice production over a multi-day integration. This effective source of IN to the cloud dominates over mixing sources from above or below the cloud-driven mixed layer. Competing feedbacks between dynamical mixing and recycling are found to slow the rate of ice lost from the mixed layer when a diurnal cycle is simulated. Furthermore, the results of this study have important implications for maintaining phase partitioning of cloud ice and liquid that determine the radiative forcing of Arctic mixed-phase clouds.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Fan; Ovchinnikov, Mikhail; Shaw, Raymond A.
Mixed-phase stratiform clouds can persist even with steady ice precipitation fluxes, and the origin and microphysical properties of the ice crystals are of interest. Vapor deposition growth and sedimentation of ice particles, along with a uniform volume source of ice nucleation, lead to a power-law relation between ice water content w_i and ice number concentration n_i with exponent 2.5. The result is independent of assumptions about the vertical velocity structure of the cloud and is therefore more general than the related expression of Yang et al. [2013]. The sensitivity of the w_i-n_i relationship to the spatial distribution of ice nucleation is confirmed by Lagrangian tracking and ice growth with cloud-volume, cloud-top, and cloud-base sources of ice particles through a time-dependent cloud field. Based on observed w_i and n_i from ISDAC, a lower bound of 0.006 m^-3 s^-1 is obtained for the ice crystal formation rate.
A Sensitive Cloud Chamber without Radioactive Sources
ERIC Educational Resources Information Center
Zeze, Syoji; Itoh, Akio; Oyama, Ayu; Takahashi, Haruka
2012-01-01
We present a sensitive diffusion cloud chamber which does not require any radioactive sources. A major difference from commonly used chambers is the use of a heat sink as its bottom plate. The result of a performance test of the chamber is given. (Contains 8 figures.)
Exploration of cloud computing late start LDRD #149630 : Raincoat. v. 2.1.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Echeverria, Victor T.; Metral, Michael David; Leger, Michelle A.
This report contains documentation from an interoperability study conducted under the Late Start LDRD 149630, Exploration of Cloud Computing. A small late-start LDRD from last year resulted in a study (Raincoat) on using Virtual Private Networks (VPNs) to enhance security in a hybrid cloud environment. Raincoat initially explored the use of OpenVPN on IPv4 and demonstrates that it is possible to secure the communication channel between two small 'test' clouds (a few nodes each) at New Mexico Tech and Sandia. We extended the Raincoat study to add IPSec support via Vyatta routers, to interface with a public cloud (Amazon Elastic Compute Cloud (EC2)), and to be significantly more scalable than the previous iteration. The study contributed to our understanding of interoperability in a hybrid cloud.
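The report does not publish Raincoat's configurations, but the kind of point-to-point tunnel its first iteration describes can be sketched with OpenVPN's documented static-key setup; the peer address and tunnel IPs below are placeholders.

    # Write a minimal static-key point-to-point OpenVPN config for one endpoint.
    CONFIG = "\n".join([
        "dev tun",
        "proto udp",
        "remote cloud-b.example.org 1194",   # hypothetical peer endpoint
        "ifconfig 10.8.0.1 10.8.0.2",        # local/remote tunnel addresses
        "secret static.key",                 # pre-shared key from "openvpn --genkey"
        "keepalive 10 60",
        "persist-tun",
    ]) + "\n"

    with open("cloud-a.conf", "w") as fh:
        fh.write(CONFIG)
    # The peer mirrors the tunnel addresses ("ifconfig 10.8.0.2 10.8.0.1")
    # and shares the same static key.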
The EarthServer Federation: State, Role, and Contribution to GEOSS
NASA Astrophysics Data System (ADS)
Merticariu, Vlad; Baumann, Peter
2016-04-01
The intercontinental EarthServer initiative has established a European datacube platform with proven scalability: known databases exceed 100 TB, and single queries have been split across more than 1,000 cloud nodes. Its service interface being rigorously based on the OGC "Big Geo Data" standards, Web Coverage Service (WCS) and Web Coverage Processing Service (WCPS), a series of clients can dock into the services, ranging from open-source OpenLayers and QGIS over open-source NASA WorldWind to proprietary ESRI ArcGIS. Datacube fusion in a "mix and match" style is supported by the platform technology, the rasdaman Array Database System, which transparently federates queries so that users simply approach any node of the federation to access any data item, internally optimized for minimal data transfer. Notably, rasdaman is part of the GEOSS GCI. NASA is contributing its Web WorldWind virtual globe for user-friendly data extraction, navigation, and analysis. Integrated datacube / metadata queries are contributed by CITE. Current federation members include ESA (managed by MEEO s.r.l.), Plymouth Marine Laboratory (PML), the European Centre for Medium-Range Weather Forecasts (ECMWF), Australia's National Computational Infrastructure, and Jacobs University (adding in Planetary Science). Further data centers have expressed interest in joining. We present the EarthServer approach, discuss its underlying technology, and illustrate the contribution this datacube platform can make to GEOSS.
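A sketch of how a client pushes a WCPS query to a federation node through the WCS processing interface; the endpoint and coverage name are hypothetical, while the query syntax follows the published WCPS standard.

    import requests

    ENDPOINT = "https://datacube.example.org/rasdaman/ows"  # hypothetical node

    # WCPS: average a (hypothetical) temperature datacube over a lat/long box for
    # one month, evaluated server-side so only the scalar result travels back.
    query = """
    for $c in (AvgLandTemp)
    return avg($c[Lat(40:50), Long(0:10), ansi("2014-07")])
    """

    resp = requests.get(ENDPOINT, params={
        "service": "WCS", "version": "2.0.1",
        "request": "ProcessCoverages", "query": query,
    })
    print(resp.text)  # the aggregate, computed next to the data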
NASA Astrophysics Data System (ADS)
Hammitzsch, M.; Spazier, J.; Reißland, S.
2014-12-01
Usually, tsunami early warning and mitigation systems (TWS or TEWS) are based on several software components deployed in a client-server based infrastructure. The vast majority of systems importantly include desktop-based clients with a graphical user interface (GUI) for the operators in early warning centers. However, in times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by continuously evolving approaches in information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experiences and the knowledge gained in three research projects - 'German Indonesian Tsunami Early Warning System' (GITEWS), 'Distant Early Warning System' (DEWS), and 'Collaborative, Complex, and Critical Decision-Support in Evolving Crises' (TRIDEC) - new technologies are exploited to implement a cloud-based and web-based prototype that opens up new prospects for EWS. This prototype, named 'TRIDEC Cloud', merges several complementary external and in-house cloud-based services into one platform for automated background computation with graphics processing units (GPU), for web-mapping of hazard-specific geospatial data, and for serving relevant functionality to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The prototype in its current version addresses tsunami early warning and mitigation. The integration of GPU-accelerated tsunami simulation computations has been an integral part of this prototype to foster early warning with on-demand tsunami predictions based on actual source parameters. However, the platform is meant for researchers around the world to make use of the cloud-based GPU computation to analyze other types of geohazards and natural hazards, and to react to the computed situation picture with a web-based GUI in a web browser at remote sites. The current website is an early alpha version for demonstration purposes, intended to give the concept a whirl and to shape science's future. Further functionality, improvements and possibly profound changes will have to be implemented successively, based on the users' evolving needs.
Natural Aerosols Explain Seasonal and Spatial Patterns of Southern Ocean Cloud Albedo
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCoy, Daniel; Burrows, Susannah M.; Wood, R.
2015-07-17
Small particles called aerosols act as nucleation sites for cloud drop formation, affecting clouds and cloud properties - ultimately influencing the cloud dynamics, lifetime, water path and areal extent that determine the reflectivity (albedo) of clouds. The concentration N_d of droplets in clouds that influences planetary albedo is sensitive to the availability of aerosol particles on which the droplets form. Natural aerosol concentrations not only affect cloud properties themselves, but also modulate the sensitivity of clouds to changes in anthropogenic aerosols. Here, it is shown that modeled natural aerosols, principally marine biogenic primary and secondary aerosol sources, explain more than half of the spatiotemporal variability in satellite-observed N_d. Enhanced N_d over regions of high biological activity is found to be driven primarily by high concentrations of sulfate aerosol at lower Southern Ocean latitudes (35-45°S) and by organic matter in sea spray aerosol at higher latitudes (45-55°S). Biogenic sources are estimated to increase the summertime mean reflected solar radiation in excess of 10 W m^-2 over parts of the Southern Ocean, which is comparable to the annual mean increases expected from anthropogenic aerosols over heavily polluted regions of the Northern Hemisphere.
SeqWare Query Engine: storing and searching sequence data in the cloud.
O'Connor, Brian D; Merriman, Barry; Nelson, Stanley F
2010-12-21
Since the introduction of next-generation DNA sequencers, the rapid increase in sequencer throughput, and the associated drop in costs, has resulted in more than a dozen human genomes being resequenced over the last few years. These efforts are merely a prelude for a future in which genome resequencing will be commonplace for both biomedical research and clinical applications. The dramatic increase in sequencer output strains all facets of computational infrastructure, especially databases and query interfaces. The advent of cloud computing, and a variety of powerful tools designed to process petascale datasets, provide a compelling solution to these ever-increasing demands. In this work, we present the SeqWare Query Engine, which has been created using modern cloud computing technologies and designed to support databasing information from thousands of genomes. Our backend implementation was built using the highly scalable, NoSQL HBase database from the Hadoop project. We also created a web-based frontend that provides both a programmatic and an interactive query interface and integrates with widely used genome browsers and tools. Using the query engine, users can load and query variants (SNVs, indels, translocations, etc.) with a rich level of annotations including coverage and functional consequences. As a proof of concept we loaded several whole genome datasets, including the U87MG cell line. We also used a glioblastoma multiforme tumor/normal pair to both profile performance and provide an example of using the Hadoop MapReduce framework within the query engine. This software is open source and freely available from the SeqWare project (http://seqware.sourceforge.net). The SeqWare Query Engine provided an easy way to make the U87MG genome accessible to programmers and non-programmers alike. This enabled a faster and more open exploration of results, quicker tuning of parameters for heuristic variant calling filters, and a common data interface to simplify development of analytical tools. The range of data types supported, the ease of querying and integrating with existing tools, and the robust scalability of the underlying cloud-based technologies make the SeqWare Query Engine a natural fit for storing and searching ever-growing genome sequence datasets. PMID:21210981
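A small sketch of the HBase access pattern such a backend implies, using the Python happybase client against HBase's Thrift gateway; the host, table layout and row-key scheme are illustrative guesses, not SeqWare's actual schema.

    import happybase

    conn = happybase.Connection("hbase-thrift.example.org")  # hypothetical host
    table = conn.table("variants")                           # hypothetical table

    # Hypothetical row-key scheme genome.chromosome.position, with one column
    # family for the call itself and one for annotations.
    table.put(b"u87mg.chr7.055249071", {
        b"call:ref": b"C", b"call:alt": b"T",
        b"ann:consequence": b"missense", b"ann:coverage": b"42",
    })

    # A region query becomes a lexicographic range scan over the keys
    # (hence the zero-padded positions above).
    for key, data in table.scan(row_start=b"u87mg.chr7.055000000",
                                row_stop=b"u87mg.chr7.056000000"):
        print(key, data[b"ann:consequence"])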
Private Cloud Communities for Faculty and Students
ERIC Educational Resources Information Center
Tomal, Daniel R.; Grant, Cynthia
2015-01-01
Massive open online courses (MOOCs) and public and private cloud communities continue to flourish in the field of higher education. However, MOOCs have received criticism in recent years and offer little benefit to students already enrolled at an institution. This article advocates for the collaborative creation and use of institutional, program…
ERIC Educational Resources Information Center
Fredette, Michelle
2012-01-01
"Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…
ERIC Educational Resources Information Center
Malleus, Elina; Kikas, Eve; Kruus, Sigrid
2016-01-01
This study describes primary school students' knowledge about rainfall, clouds and rainbow formation together with teachers' predictions about students' performance. In our study, primary school students' (N = 177) knowledge about rainfall and rainbow formation was examined using structured interviews with open-ended questions. Primary school…
Local Atmospheric Response to an Open-Ocean Polynya in a High-Resolution Climate Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weijer, Wilbert; Veneziani, Milena; Stössel, Achim; ...
2017-03-01
We study the atmospheric response to an open-ocean polynya in the Southern Ocean by analyzing the results from an atmospheric and oceanic synoptic-scale resolving Community Earth System Model (CESM) simulation. While coarser-resolution versions of CESM generally do not produce open-ocean polynyas in the Southern Ocean, they do emerge and disappear on interannual timescales in the synoptic-scale simulation. This provides an ideal opportunity to study the polynya's impact on the overlying and surrounding atmosphere. This has been pursued here by investigating the seasonal cycle of differences of surface and air-column variables between polynya and non-polynya years. Our results indicate significant local impacts on turbulent heat fluxes, precipitation, cloud characteristics, and radiative fluxes. In particular, we find that clouds over polynyas are optically thicker and higher than clouds over sea ice during non-polynya years. Although the lower albedo of polynyas significantly increases the net shortwave absorption, the enhanced cloud brightness tempers this increase by almost 50%. Also, in this model, enhanced longwave radiation emitted from the warmer surface of polynyas is balanced by stronger downwelling fluxes from the thicker cloud deck. Impacts are found to be sensitive to the synoptic wind direction. Strongest regional impacts are found when northeasterly winds cross the polynya and interact with katabatic winds. Finally, surface air pressure anomalies over the polynya are only found to be significant when cold, dry air masses strike over the polynya, i.e. in case of southerly winds.
NASA Astrophysics Data System (ADS)
Lopez Garcia, Alvaro; Zangrando, Lisa; Sgaravatto, Massimo; Llorens, Vincent; Vallero, Sara; Zaccolo, Valentina; Bagnasco, Stefano; Taneja, Sonia; Dal Pra, Stefano; Salomoni, Davide; Donvito, Giacinto
2017-10-01
Performing efficient resource provisioning is a fundamental aspect for any resource provider. Local Resource Management Systems (LRMS) have been used in data centers for decades to obtain the best usage of the resources, providing fair usage and partitioning for the users. In contrast, current cloud schedulers are normally based on the immediate allocation of resources on a first-come, first-served basis: a request fails if there are no resources (e.g. OpenStack), or it is trivially queued in order of arrival (e.g. OpenNebula). Moreover, these scheduling strategies rely on a static partitioning of the resources, meaning that existing quotas cannot be exceeded even if there are idle resources allocated to other projects. This is a consequence of the fact that cloud instances are not associated with a maximum execution time, and it leads to a situation where the resources are under-utilized. The INDIGO-DataCloud project has identified these strategies as too simplistic for accommodating scientific workloads efficiently, leading to an undesirable underutilization of resources in scientific data centers. In this work, we present the work done in the scheduling area during the first year of the INDIGO project and the foreseen evolutions.
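A toy sketch of the fair-share idea this work pursues: queued requests are dispatched by how far each project is below its target share rather than by arrival time, so idle quota gets used instead of requests failing; the shares and the usage charge are illustrative, not INDIGO's actual policy.

    SHARES = {"astro": 0.5, "bio": 0.3, "hep": 0.2}  # illustrative target shares
    usage = {p: 0.0 for p in SHARES}                 # accumulated normalized usage

    pending = ["astro", "bio", "hep", "astro", "astro"]  # queued requests

    while pending:
        # Re-rank at every dispatch: the project furthest below its share goes first.
        nxt = min(pending, key=lambda p: usage[p] / SHARES[p])
        pending.remove(nxt)
        usage[nxt] += 1.0          # charge one slot-hour (illustrative)
        print("dispatch request from", nxt)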
Cloud-Scale Numerical Modeling of the Arctic Boundary Layer
NASA Technical Reports Server (NTRS)
Krueger, Steven K.
1998-01-01
The interactions between sea ice, open ocean, atmospheric radiation, and clouds over the Arctic Ocean exert a strong influence on global climate. Uncertainties in the formulation of interactive air-sea-ice processes in global climate models (GCMs) result in large differences between the Arctic, and global, climates simulated by different models. Arctic stratus clouds are not well-simulated by GCMs, yet exert a strong influence on the surface energy budget of the Arctic. Leads (channels of open water in sea ice) have significant impacts on the large-scale budgets during the Arctic winter, when they contribute about 50 percent of the surface fluxes over the Arctic Ocean, but cover only 1 to 2 percent of its area. Convective plumes generated by wide leads may penetrate the surface inversion and produce condensate that spreads up to 250 km downwind of the lead, and may significantly affect the longwave radiative fluxes at the surface and thereby the sea ice thickness. The effects of leads and boundary layer clouds must be accurately represented in climate models to allow possible feedbacks between them and the sea ice thickness. The FIRE III Arctic boundary layer clouds field program, in conjunction with the SHEBA ice camp and the ARM North Slope of Alaska and Adjacent Arctic Ocean site, will offer an unprecedented opportunity to greatly improve our ability to parameterize the important effects of leads and boundary layer clouds in GCMs.
Impact of Biomass Burning Aerosols on Cloud Formation in Coastal Regions
NASA Astrophysics Data System (ADS)
Nair, U. S.; Wu, Y.; Reid, J. S.
2017-12-01
In the tropics, shallow and deep convective cloud structures organize in hierarchy of spatial scales ranging from meso-gamma (2-20 km) to planetary scales (40,000km). At the lower end of the spectrum is shallow convection over the open ocean, whose upscale growth is dependent upon mesoscale convergence triggers. In this context, cloud systems associated with land breezes that propagate long distances into open ocean areas are important. We utilized numerical model simulations to examine the impact of biomass burning on such cloud systems in the maritime continent, specifically along the coastal regions of Sarawak. Numerical model simulations conducted using the Weather Research and Forecasting Chemistry (WRF-Chem) model show spatial patterns of smoke that show good agreement to satellite observations. Analysis of model simulations show that, during daytime the horizontal convective rolls (HCRs) that form over land play an important role in organizing transport of smoke in the coastal regions. Alternating patterns of low and high smoke concentrations that are well correlated to the wavelengths of HCRs are found in both the simulations and satellite observations. During night time, smoke transport is modulated by the land breeze circulation and a band of enhanced smoke concentration is found along the land breeze front. Biomass burning aerosols are ingested by the convective clouds that form along the land breeze and leads to changes in total water path, cloud structure and precipitation formation.
Szyrkowiec, Thomas; Autenrieth, Achim; Gunning, Paul; Wright, Paul; Lord, Andrew; Elbers, Jörg-Peter; Lumb, Alan
2014-02-10
For the first time, we demonstrate the orchestration of elastic datacenter and inter-datacenter transport network resources using a combination of OpenStack and OpenFlow. Programmatic control allows a datacenter operator to dynamically request optical lightpaths from a transport network operator to accommodate rapid changes of inter-datacenter workflows.
NASA Astrophysics Data System (ADS)
Zheng, X.; Albrecht, B.; Jonsson, H. H.; Khelif, D.; Feingold, G.; Minnis, P.; Ayers, K.; Chuang, P.; Donaher, S.; Rossiter, D.; Ghate, V.; Ruiz-Plancarte, J.; Sun-Mack, S.
2011-05-01
Aircraft observations made off the coast of northern Chile in the Southeastern Pacific (20° S, 72° W; named Point Alpha) from 16 October to 13 November 2008 during the VAMOS Ocean-Cloud-Atmosphere-Land Study-Regional Experiment (VOCALS-REx), combined with meteorological reanalysis, satellite measurements, and radiosonde data, are used to investigate the boundary layer (BL) and aerosol-cloud-drizzle variations in this region. The BL at Point Alpha was typical of a non-drizzling stratocumulus-topped BL on days without predominant synoptic and mesoscale influences. The BL had a depth of 1140 ± 120 m, was well-mixed and capped by a sharp inversion. The wind direction generally switched from southerly within the BL to northerly above the inversion. The cloud liquid water path (LWP) varied between 15 g m-2 and 160 g m-2. From 29 October to 4 November, when a synoptic system affected conditions at Point Alpha, the cloud LWP was higher than on the other days by around 40 g m-2. On 1 and 2 November, a moist layer above the inversion moved over Point Alpha. The total-water specific humidity above the inversion was larger than that within the BL during these days. Entrainment rates (average of 1.5 ± 0.6 mm s-1) calculated from the near cloud-top fluxes and turbulence (vertical velocity variance) in the BL at Point Alpha appeared to be weaker than those in the BL over the open ocean west of Point Alpha and the BL near the coast of the northeast Pacific. The accumulation mode aerosol varied from 250 to 700 cm-3 within the BL, and CCN at 0.2% supersaturation within the BL ranged between 150 and 550 cm-3. The main aerosol source at Point Alpha was horizontal advection within the BL from the south. The average cloud droplet number concentration ranged between 80 and 400 cm-3, which was consistent with the satellite-derived values. The relationship of cloud droplet number concentration and CCN at 0.2% supersaturation from 18 flights is N_d = 4.6 × CCN^0.71. While the mean LWP retrieved from GOES was in good agreement with the in situ measurements, the GOES-derived cloud droplet effective radius tended to be larger than that from the aircraft in situ observations near cloud top. The aerosol and cloud LWP relationship reveals that during the typical well-mixed BL days the cloud LWP increased with the CCN concentrations. On the other hand, meteorological factors and the decoupling processes have large influences on the cloud LWP variation as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fatuzzo, Gabriele; Mangiameli, Michele, E-mail: amichele.mangiameli@dica.unict.it; Mussumeci, Giuseppe
Laser scanning is a technology that allows the rapid surveying of objects' geometry with a high level of detail and completeness, based on the signal emitted by the laser and the corresponding return signal. When the incident laser radiation hits the object to be detected, the radiation is reflected. The purpose is to build a three-dimensional digital model that allows the reality of the object to be reconstructed and studies to be conducted regarding its design, restoration and/or conservation. When the laser scanner is equipped with a digital camera, the result of the measurement process is a set of points in XYZ coordinates with high density and accuracy, together with radiometric and RGB tones. In this case, the set of measured points is called a "point cloud" and allows the reconstruction of the Digital Surface Model. While post-processing is usually performed with closed-source software, whose copyright restricts free use, free and open source software can compete strongly on performance. Indeed, the latter can be freely used and provides the possibility to inspect and even customize the source code. The experience started at the Faculty of Engineering in Catania is aimed at evaluating a valuable free and open source tool, MeshLab (an Italian software package for data processing), against a reference closed-source software for data processing, RapidForm. In this work, we compare the results obtained with MeshLab and RapidForm through the planning of the survey and the acquisition of the point cloud of a morphologically complex statue.
The impact of radiatively active water-ice clouds on Martian mesoscale atmospheric circulations
NASA Astrophysics Data System (ADS)
Spiga, A.; Madeleine, J.-B.; Hinson, D.; Navarro, T.; Forget, F.
2014-04-01
Background and Goals. Water ice clouds are a key component of the Martian climate [1]. Understanding the properties of the Martian water ice clouds is crucial to constrain the Red Planet's climate and hydrological cycle, both in the present and in the past [2]. In recent years, this statement has become all the more true as it was shown that the radiative effects of water ice clouds are far from being as negligible as hitherto believed; water ice clouds instead play a key role in the large-scale thermal structure and dynamics of the Martian atmosphere [3, 4, 5]. Nevertheless, the radiative effect of water ice clouds at scales smaller than the large synoptic scale (the so-called mesoscales) remains to be explored. Here we use for the first time mesoscale modeling with radiatively active water ice clouds to address this open question.
Towards Cloud-based Asynchronous Elasticity for Iterative HPC Applications
NASA Astrophysics Data System (ADS)
da Rosa Righi, Rodrigo; Facco Rodrigues, Vinicius; André da Costa, Cristiano; Kreutz, Diego; Heiss, Hans-Ulrich
2015-10-01
Elasticity is one of the key features of cloud computing. It allows applications to dynamically scale computing and storage resources, avoiding over- and under-provisioning. In high performance computing (HPC), initiatives are normally modeled to handle bag-of-tasks or key-value applications through a load balancer and a loosely-coupled set of virtual machine (VM) instances. In the joint field of Message Passing Interface (MPI) and tightly-coupled HPC applications, we observe the need for rewriting source code, prior knowledge of the application and/or stop-reconfigure-and-go approaches to address cloud elasticity. Besides, there are problems related to how to profit from this new feature in the HPC scope, since in MPI 2.0 applications the programmers need to handle communicators by themselves, and a sudden consolidation of a VM, together with a process, can compromise the entire execution. To address these issues, we propose a PaaS-based elasticity model, named AutoElastic. It acts as a middleware that allows iterative HPC applications to take advantage of dynamic resource provisioning of cloud infrastructures without any major modification. AutoElastic provides a new concept denoted here as asynchronous elasticity, i.e., it provides a framework to allow applications to either increase or decrease their computing resources without blocking the current execution. The feasibility of AutoElastic is demonstrated through a prototype that runs a CPU-bound numerical integration application on top of the OpenNebula middleware. The results showed a saving of about 3 minutes at each scaling-out operation, emphasizing the contribution of the new concept in contexts where seconds are precious.
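A minimal sketch of the asynchronous-elasticity idea described above: a controller thread scales the worker pool while the main iteration loop keeps running, so scaling never blocks execution. This is an illustration only, not AutoElastic's actual code; `cpu_load`, the thresholds and the VM names are hypothetical stand-ins for real monitoring and middleware calls.

```python
import threading
import time
import random

UPPER, LOWER = 0.8, 0.3          # assumed load thresholds for scaling decisions

workers = ["vm-0", "vm-1"]       # the current pool of (hypothetical) VM workers
lock = threading.Lock()

def cpu_load():
    # Stand-in for real monitoring, e.g. querying the cloud middleware.
    return random.random()

def elasticity_controller(stop):
    """Runs beside the application: all scaling happens here, so the main
    computation never blocks on resource reconfiguration."""
    while not stop.is_set():
        load = cpu_load()
        with lock:
            if load > UPPER:
                workers.append(f"vm-{len(workers)}")   # scale out
            elif load < LOWER and len(workers) > 1:
                workers.pop()                          # consolidate
        time.sleep(1)

stop = threading.Event()
threading.Thread(target=elasticity_controller, args=(stop,), daemon=True).start()

# The "application": keeps iterating regardless of scaling activity.
for step in range(3):
    time.sleep(1.2)
    with lock:
        print(f"iteration {step} running on {len(workers)} workers")
stop.set()
```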
NASA Astrophysics Data System (ADS)
Jiang, Guodong; Fan, Ming; Li, Lihua
2016-03-01
Mammography is the gold standard for breast cancer screening, reducing mortality by about 30%. The application of a computer-aided detection (CAD) system to assist a single radiologist is important to further improve mammographic sensitivity for breast cancer detection. In this study, the design and realization of a prototype remote diagnosis system for mammography based on a cloud platform are proposed. To build this system, technologies were utilized including medical image information construction, cloud infrastructure and a human-machine diagnosis model. Specifically, on one hand, the web platform for remote diagnosis was established with J2EE web technology, and the back end was realized through the Hadoop open-source framework. On the other hand, the storage system was built on Hadoop Distributed File System (HDFS) technology, which enables users to easily develop and run applications on massive data, exploiting the advantages of cloud computing: high efficiency, scalability and low cost. In addition, the CAD system was realized through the MapReduce framework. The diagnosis module in this system implemented algorithms fusing machine and human intelligence. Specifically, we combined diagnoses drawn from doctors' experience with traditional CAD results by using a man-machine intelligent fusion model based on Alpha-Integration and a multi-agent algorithm. Finally, the applications of this system at different levels of the platform are also discussed. This diagnosis system will be of great importance for balancing health resources, lowering medical expenses and improving diagnostic accuracy in primary medical institutions.
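Since the abstract names MapReduce as the execution model for the CAD step, a minimal Hadoop-streaming-style mapper/reducer pair in Python may make the pattern concrete. The per-region scoring is a hypothetical stand-in for the real CAD feature extraction, and the input format (tab-separated patient ID and a pixel statistic) is assumed.

```python
# mapper.py -- emits (patient_id, region_score) pairs; the scoring here is a
# hypothetical stand-in for the actual CAD feature extraction.
import sys

for line in sys.stdin:
    if not line.strip():
        continue
    patient_id, pixel_sum = line.strip().split("\t")
    score = float(pixel_sum) / 255.0          # toy "suspiciousness" score
    print(f"{patient_id}\t{score}")
```

```python
# reducer.py -- Hadoop streaming delivers mapper output sorted by key, so
# consecutive lines belonging to the same patient can be grouped directly.
import sys
from itertools import groupby

for patient_id, lines in groupby(sys.stdin, key=lambda l: l.split("\t", 1)[0]):
    scores = [float(l.split("\t")[1]) for l in lines]
    print(f"{patient_id}\t{max(scores):.3f}")  # keep the most suspicious region
```

In a streaming job these two scripts would be passed as the mapper and reducer (e.g. via hadoop-streaming); the exact job configuration is assumed here.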
Contaminant transport from point source on water surface in open channel flow with bed absorption
NASA Astrophysics Data System (ADS)
Guo, Jinlan; Wu, Xudong; Jiang, Weiquan; Chen, Guoqian
2018-06-01
Studying solute dispersion in channel flows is of significance for environmental and industrial applications. The two-dimensional concentration distribution for a most typical case, a point source released at the free water surface in a channel flow with bed absorption, is presented by means of Chatwin's long-time asymptotic technique. Five basic characteristics of Taylor dispersion and the vertical mean concentration distribution with skewness and kurtosis modifications are also analyzed. The results reveal that bed absorption affects both the longitudinal and vertical concentration distributions and causes the contaminant cloud to concentrate in the upper layer. Additionally, the cross-sectional concentration distribution shows an asymptotically Gaussian distribution at large time, which is unaffected by the bed absorption. The vertical concentration distribution is found to be nonuniform even at large time. The obtained results are essential for practical applications with strict environmental standards.
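As a sketch of the governing setup (the notation is generic and reconstructed from the abstract, not taken from the paper: depth h, longitudinal velocity u(z), vertical diffusivity D, bed-absorption rate k):

```latex
% Advection-diffusion of concentration C(x,z,t) in a channel of depth h:
\[
\frac{\partial C}{\partial t} + u(z)\,\frac{\partial C}{\partial x}
   = \frac{\partial}{\partial z}\!\left( D \,\frac{\partial C}{\partial z} \right),
\]
% first-order absorption at the bed z = 0, no flux through the free surface
% z = h, and an instantaneous point source released at the surface:
\[
\left. D\,\frac{\partial C}{\partial z}\right|_{z=0} = k\,C\big|_{z=0},
\qquad
\left. \frac{\partial C}{\partial z}\right|_{z=h} = 0,
\qquad
C(x,z,0) = M\,\delta(x)\,\delta(z-h).
\]
```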
NASA Technical Reports Server (NTRS)
Koratkar, Anuradha P.; Macalpine, Gordon M.
1992-01-01
Well-constrained photoionization models for the Seyfert 1 galaxy NGC 3783 are developed. Both cross-correlation analyses and line variability trends with varying ionizing radiation flux require a multicomponent picture. All the data for He II 1640 A, C IV 1549 A, and semiforbidden C III 1909 A can be reasonably well reproduced by two cloud components. One has a source-cloud distance of 24 lt-days, gas density around 3 × 10^10 cm^-3, ionization parameter range of 0.04-0.2, and cloud thickness such that about half of the carbon is doubly ionized and about half is triply ionized. The other component is located approximately 96 lt-days from the source, is shielded from the source by the inner cloud, has a density of about 3 × 10^9 cm^-3, and is characterized by an ionization parameter range of 0.001-0.03. The cloud thickness is such that about 45 percent of the carbon is doubly ionized and about 55 percent is singly ionized.
The effect of a hot, spherical scattering cloud on quasi-periodic oscillation behavior
NASA Astrophysics Data System (ADS)
Bussard, R. W.; Weisskopf, M. C.; Elsner, R. F.; Shibazaki, N.
1988-04-01
A Monte Carlo technique is used to investigate the effects of a hot electron scattering cloud surrounding a time-dependent X-ray source. Results are presented for the time-averaged emergent energy spectra and the mean residence time in the cloud as a function of energy. Moreover, after Fourier transforming the scattering Green's function, it is shown how the cloud affects both the observed power spectrum of a time-dependent source and the cross spectrum (Fourier transform of a cross correlation between energy bands). It is found that the power spectra intrinsic to the source are related to those observed by a relatively simple frequency-dependent multiplicative factor (a transmission function). The cloud can severely attenuate high frequencies in the power spectra, depending on optical depth, and, at lower frequencies, the transmission function has roughly a Lorentzian shape. It is also found that if the intrinsic energy spectrum is constant in time, the phase of the cross spectrum is determined entirely by scattering. Finally, the implications of the results for studies of the X-ray quasi-periodic oscillators are discussed.
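As a worked restatement of the relation described in the abstract (the residence-time parameterization t_c is an illustrative assumption, not a value from the paper):

```latex
% Observed vs. intrinsic power spectrum through the scattering cloud:
\[
P_{\mathrm{obs}}(f) = |T(f)|^{2}\, P_{\mathrm{int}}(f),
\]
% with a roughly Lorentzian transmission at low frequencies, whose width is
% set by a characteristic residence time t_c that grows with the cloud's
% optical depth:
\[
|T(f)|^{2} \approx \frac{1}{1 + (2\pi f\, t_c)^{2}}.
\]
```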
Vertical profiling of aerosol particles and trace gases over the central Arctic Ocean during summer
NASA Astrophysics Data System (ADS)
Kupiszewski, P.; Leck, C.; Tjernström, M.; Sjogren, S.; Sedlar, J.; Graus, M.; Müller, M.; Brooks, B.; Swietlicki, E.; Norris, S.; Hansel, A.
2013-04-01
Unique measurements of vertically resolved, size-resolved aerosol particle concentrations, trace gas concentrations and meteorological data were obtained during the Arctic Summer Cloud Ocean Study (ASCOS, http://www.ascos.se), an International Polar Year project aimed at establishing the processes responsible for the formation and evolution of low-level clouds over the high Arctic summer pack ice. The experiment was conducted from onboard the Swedish icebreaker Oden, and provided both ship- and helicopter-based measurements. This study focuses on the vertical helicopter profiles and onboard measurements obtained during a three-week period when Oden was anchored to a drifting ice floe, and sheds light on the characteristics of Arctic aerosol particles and their distribution throughout the lower atmosphere. Distinct differences in aerosol particle characteristics within defined atmospheric layers are identified. Near the surface (the lowermost couple of hundred meters), transport from the marginal ice zone (MIZ), when sufficiently short (less than ca. 2 days), together with condensational growth and cloud processing, develops the aerosol population. Such influence is shown during two of the four representative periods defined in this study. At altitudes above about 1 km, long-range transport occurs frequently. However, only infrequently does large-scale subsidence descend such air masses to become entrained into the mixed layer in the high Arctic, and therefore they are unlikely to directly influence low-level stratiform cloud formation. Nonetheless, long-range transport plumes can influence the radiative balance of the PBL by influencing the formation and evolution of higher clouds, as well as through precipitation transport of particles downwards. New particle formation was occasionally observed, particularly in the near-surface layer. We hypothesize that these ultrafine particles may originate from biological processes, both primary and secondary, within the open leads between the pack ice and/or along the MIZ. In general, local sources, in combination with upstream boundary layer transport of precursor gases from the MIZ, are suggested to constitute the origin of CCN particles and thus to be of importance for the formation of interior Arctic low-level clouds during summer and subsequently, through cloud influences, for the melting and freezing of sea ice.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Modest, Michael
The effects of radiation in particle-laden flows were the object of the present research. The presence of particles increases optical thickness substantially, making the use of the "optically thin" approximation in most cases a very poor assumption. However, since radiation fluxes peak at intermediate optical thicknesses, overall radiative effects may not necessarily be stronger than in gas combustion. Also, the spectral behavior of particle radiation properties is much more benign, making spectral models simpler (and making the assumption of a gray radiator halfway acceptable, at least for fluidized beds when gas radiation is not large). On the other hand, particles scatter radiation, making the radiative transfer equation (RTE) much more difficult to solve. The research carried out in this project encompassed three general areas: (i) assessment of relevant radiation properties of particle clouds encountered in fluidized bed and pulverized coal combustors, (ii) development of proper spectral models for gas-particulate mixtures for various types of two-phase combustion flows, and (iii) development of a Radiative Transfer Equation (RTE) solution module for such applications. The resulting models were validated against artificial cases since open literature experimental data were not available. The final models are in modular form tailored toward maximum portability, and were incorporated into two research codes: (i) the open-source CFD code OpenFOAM, which we have extensively used in our previous work, and (ii) the open-source multi-phase flow code MFIX, which is maintained by NETL.
Arctic PBL Cloud Height and Motion Retrievals from MISR and MINX
NASA Technical Reports Server (NTRS)
Wu, Dong L.
2012-01-01
How Arctic clouds respond and feed back to sea ice loss is key to understanding the rapid climate change seen in the polar region. As more open water becomes available in the Arctic Ocean, cold air outbreaks (i.e., off-ice flow from polar lows) produce a vast sheet of roll clouds in the planetary boundary layer (PBL). The cold air temperature and wind velocity are the critical parameters to determine and understand the PBL structure formed under these roll clouds. It has been challenging for nadir visible/IR sensors to detect Arctic clouds due to the lack of contrast between clouds and snowy/icy surfaces. In addition, the PBL temperature inversion creates a further problem for IR sensors in relating cloud top temperature to cloud top height. Here we explore a new method with the Multi-angle Imaging SpectroRadiometer (MISR) instrument to measure cloud height and motion over the Arctic Ocean. Employing a stereoscopic technique, MISR is able to measure cloud top height accurately and distinguish between clouds and snowy/icy surfaces with the measured height. We will use the MISR INteractive eXplorer (MINX) to quantify roll cloud dynamics during cold-air outbreak events and characterize PBL structures over water and over sea ice.
IRAS images of nearby dark clouds
NASA Technical Reports Server (NTRS)
Wood, Douglas O. S.; Myers, Philip C.; Daugherty, Debra A.
1994-01-01
We have investigated approximately 100 nearby molecular clouds using the extensive, all-sky database of IRAS. The clouds in this study cover a wide range of physical properties including visual extinction, size, mass, degree of isolation, homogeneity and morphology. IRAS 100 and 60 micron co-added images were used to calculate the 100 micron optical depth of dust in the clouds. These images of dust optical depth compare very well with (12)CO and (13)CO observations, and can be related to H2 column density. From the optical depth images we locate the edges of dark clouds and the dense cores inside them. We have identified a total of 43 'IRAS clouds' (regions with A_v greater than 2) which contain a total of 255 'IRAS cores' (regions with A_v greater than 4) and we catalog their physical properties. We find that the clouds are remarkably filamentary, and that the cores within the clouds are often distributed along the filaments. The largest cores are usually connected to other large cores by filaments. We have developed selection criteria to search the IRAS Point Source Catalog for stars that are likely to be associated with the clouds and we catalog the IRAS sources in each cloud or core. Optically visible stars associated with the clouds have been identified from the Herbig and Bell catalog. From these data we characterize the physical properties of the clouds including their star-formation efficiency.
Multi-Level Secure Information Sharing Between Smart Cloud Systems of Systems
2014-03-01
NASA Technical Reports Server (NTRS)
Caillault, Jean-Pierre; Magnani, Loris; Fryer, Chris
1995-01-01
In order to discern whether the high-latitude molecular clouds are regions of ongoing star formation, we have used X-ray emission as a tracer of youthful stars. The entire Einstein database yields 18 images which overlap 10 of the clouds mapped partially or completely in the CO (1-0) transition, providing a total of approximately 6 deg squared of overlap. Five previously unidentified X-ray sources were detected: one has an optical counterpart which is a pre-main-sequence (PMS) star, and two have normal main-sequence stellar counterparts, while the other two are probably extragalactic sources. The PMS star is located in a high Galactic latitude Lynds dark cloud, so this result is not too surprising. The translucent clouds, though, have yet to reveal any evidence of star formation.
Discovery of interstellar ketenyl (HCCO), a surprisingly abundant radical
NASA Astrophysics Data System (ADS)
Agúndez, Marcelino; Cernicharo, José; Guélin, Michel
2015-05-01
We conducted radioastronomical observations of 9 dark clouds with the IRAM 30 m telescope. We present the first identification in space of the ketenyl radical (HCCO) toward the starless core Lupus-1A and the molecular cloud L483, and the detection of the related molecules ketene (H2CCO) and acetaldehyde (CH3CHO) in these two sources and 3 additional dark clouds. We also report the detection of the formyl radical (HCO) in the 9 targeted sources and of propylene (CH2CHCH3) in 4 of the observed sources, which significantly extends the number of dark clouds where these molecules are known to be present. We have derived a beam-averaged column density of HCCO of ~5 × 10^11 cm-2 in both Lupus-1A and L483, which means that the ketenyl radical is just ~10 times less abundant than ketene in these sources. The non-negligible abundance of HCCO found implies that there must be a powerful formation mechanism able to counterbalance the efficient destruction of this radical through reactions with neutral atoms. The column densities derived for HCO, (0.5-2.7) × 10^12 cm-2, and CH2CHCH3, (1.9-4.2) × 10^13 cm-2, are remarkably uniform across the sources where these species are detected, confirming their ubiquity in dark clouds. Gas phase chemical models of cold dark clouds can reproduce the observed abundances of HCO, but cannot explain the presence of HCCO in Lupus-1A and L483 and the high abundances derived for propylene. The chemistry of cold dark clouds needs to be revised in light of these new observational results. Based on observations carried out with the IRAM 30 m Telescope. IRAM is supported by INSU/CNRS (France), MPG (Germany) and IGN (Spain). Tables 3-6 are available in electronic form at http://www.aanda.org
Synchrotron radiation and diffusive shock acceleration - A short review and GRB perspective
NASA Astrophysics Data System (ADS)
Karlica, Mile
2015-12-01
In this talk we present the "sponge" model and its possible implications for GRB afterglow light curves. The "sponge" model describes the source of GRB afterglow radiation as fragmented GRB ejecta in which bubbles move through a rarefied medium. In the first part of the talk, a short introduction to synchrotron radiation and Fermi acceleration is presented. Under the assumption that the X-ray luminosity of the GRB afterglow phase comes from the kinetic energy losses of clouds in the ejecta medium, radiated as synchrotron radiation, we solved a currently very simple equation of motion to find which combination of cloud and medium regimes best describes the afterglow light curve. As a first step, we propose examining simple combinations of expansion regimes for both the bubbles and the surrounding medium. The closest case to the numerical fit of GRB 150403A, with time power-law index k = 1.38, is the combination of constant bubbles and a Sedov-like expanding medium, with time power-law index k = 1.25. Of course, the question of a possible mixture of various regime combinations is still open within this model.
Reconstructing evolutionary trees in parallel for massive sequences.
Zou, Quan; Wan, Shixiang; Zeng, Xiangxiang; Ma, Zhanshan Sam
2017-12-14
Building evolutionary trees for massive unaligned DNA sequences is challenging and crucial, but reconstructing an evolutionary tree for ultra-large sequence sets is hard, and massive multiple sequence alignment is also challenging and time/space consuming. Hadoop and Spark, developed recently, bring new opportunities to classical computational biology problems. In this paper, we address multiple sequence alignment and evolutionary reconstruction in parallel. HPTree, which is developed in this paper, can deal with big DNA sequence files quickly. It works well on files larger than 1 GB and achieves better performance than other evolutionary reconstruction tools. Users can apply HPTree to reconstruct evolutionary trees on computer clusters or cloud platforms (e.g. Amazon Cloud). HPTree could help in population evolution research and metagenomics analysis. In this paper, we employ the Hadoop and Spark platforms and design an evolutionary tree reconstruction software tool for unaligned massive DNA sequences. Clustering and multiple sequence alignment are done in parallel, and a neighbour-joining model is employed for building the evolutionary tree. We released our software together with the source code at http://lab.malab.cn/soft/HPtree/ .
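A minimal numpy sketch of the neighbour-joining selection criterion the pipeline relies on for tree building; this is a textbook illustration, not HPTree code, and the four-taxon distance matrix is a standard example.

```python
import numpy as np

def nj_q_matrix(d):
    """Neighbour-joining Q-matrix: the pair (i, j) minimizing
    Q(i,j) = (n-2) d(i,j) - sum_k d(i,k) - sum_k d(j,k)
    is joined next. d is an n x n symmetric distance matrix."""
    n = d.shape[0]
    row_sums = d.sum(axis=1)
    q = (n - 2) * d - row_sums[:, None] - row_sums[None, :]
    np.fill_diagonal(q, np.inf)   # never join a taxon with itself
    return q

# Four taxa with a classic additive-tree distance matrix.
d = np.array([[0,  5,  9,  9],
              [5,  0, 10, 10],
              [9, 10,  0,  8],
              [9, 10,  8,  0]], dtype=float)
i, j = np.unravel_index(np.argmin(nj_q_matrix(d)), d.shape)
print(f"join taxa {i} and {j} first")   # the closest neighbours under Q
```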
Remote Numerical Simulations of the Interaction of High Velocity Clouds with Random Magnetic Fields
NASA Astrophysics Data System (ADS)
Santillan, Alfredo; Hernandez-Cervantes, Liliana; Gonzalez-Ponce, Alejandro; Kim, Jongsoo
The numerical simulations associated with the interaction of High Velocity Clouds (HVC) with the Magnetized Galactic Interstellar Medium (ISM) are a powerful tool to describe the evolution of the interaction of these objects in our Galaxy. In this work we present a new project, referred to as Theoretical Virtual Observatories, oriented toward performing numerical simulations in real time through a Web page. This is a powerful astrophysical computational tool that consists of an intuitive graphical user interface (GUI) and a database produced by numerical calculations. On this website the user can make use of the existing numerical simulations from the database or run a new simulation by introducing initial conditions such as temperatures, densities, velocities, and magnetic field intensities for both the ISM and the HVC. The prototype is programmed using Linux, Apache, MySQL, and PHP (LAMP), based on the open source philosophy. All simulations were performed with the MHD code ZEUS-3D, which solves the ideal MHD equations by finite differences on a fixed Eulerian mesh. Finally, we present typical results that can be obtained with this tool.
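The lookup-or-enqueue workflow the site implements (serve a run from the database if present, otherwise queue a new ZEUS-3D run) can be sketched as follows. The actual site uses PHP/MySQL, so this Python/SQLite version with a hypothetical schema is only an illustration.

```python
import sqlite3

# Hypothetical schema mirroring the described workflow: a run is keyed by
# its initial conditions; if absent, it is queued for the MHD code.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE runs (
    temp_ism REAL, dens_hvc REAL, v_hvc REAL, b_field REAL,
    status TEXT, result_path TEXT)""")

def request_simulation(temp_ism, dens_hvc, v_hvc, b_field):
    """Return a cached result if these initial conditions were already
    simulated; otherwise enqueue a new run."""
    row = db.execute(
        "SELECT status, result_path FROM runs WHERE temp_ism=? AND "
        "dens_hvc=? AND v_hvc=? AND b_field=?",
        (temp_ism, dens_hvc, v_hvc, b_field)).fetchone()
    if row:
        return row                       # existing simulation found
    db.execute("INSERT INTO runs VALUES (?,?,?,?, 'queued', NULL)",
               (temp_ism, dens_hvc, v_hvc, b_field))
    return ("queued", None)

print(request_simulation(1e4, 0.1, 100.0, 5.0))   # first call: run is queued
print(request_simulation(1e4, 0.1, 100.0, 5.0))   # second call: database hit
```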
Generating DEM from LIDAR data - comparison of available software tools
NASA Astrophysics Data System (ADS)
Korzeniowska, K.; Lacka, M.
2011-12-01
In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods motivated the aim of this study: to assess the algorithms available in various software tools for classifying LIDAR "point cloud" data, through a careful examination of the Digital Elevation Models (DEMs) generated from LIDAR data using these algorithms. The work focused on the most important available software tools, both commercial and open source. Two sites in a mountain area were selected for the study; the area of each site is 0.645 sq km. The DEMs generated with the analysed software tools were compared with a reference dataset generated using manual methods to eliminate non-ground points. The surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
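The comparison statistics listed are straightforward to compute on raster differences; a minimal numpy sketch, with toy 3 x 3 DEMs standing in for the real rasters:

```python
import numpy as np

def dem_comparison_stats(reference, candidate):
    """Raster difference statistics of the kind used to rank the tools:
    min, max and mean difference plus RMSE against the reference DEM."""
    diff = candidate - reference
    return {"min": float(diff.min()),
            "max": float(diff.max()),
            "mean": float(diff.mean()),
            "rmse": float(np.sqrt(np.mean(diff ** 2)))}

# Toy elevation grids (meters) standing in for the 0.645 sq km test rasters.
ref = np.array([[510.2, 511.0, 512.3],
                [509.8, 510.5, 511.9],
                [509.1, 510.0, 511.2]])
cand = ref + np.array([[0.1, -0.2, 0.0],
                       [0.3,  0.0, -0.1],
                       [0.0,  0.2,  0.1]])
print(dem_comparison_stats(ref, cand))
```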
Low cost sensing of vegetation volume and structure with a Microsoft Kinect sensor
NASA Astrophysics Data System (ADS)
Azzari, G.; Goulden, M.
2011-12-01
The market for videogames and digital entertainment has decreased the cost of advanced technology to affordable levels. The Microsoft Kinect sensor for Xbox 360 is an infrared time of flight camera designed to track body position and movement at a single-articulation level. Using open source drivers and libraries, we acquired point clouds of vegetation directly from the Kinect sensor. The data were filtered for outliers, co-registered, and cropped to isolate the plant of interest from the surroundings and soil. The volume of single plants was then estimated with several techniques, including fitting with solid shapes (cylinders, spheres, boxes), voxel counts, and 3D convex/concave hulls. Preliminary results are presented here. The volume of a series of wild artichoke plants was measured from nadir using a Kinect on a 3m-tall tower. The calculated volumes were compared with harvested biomass; comparisons and derived allometric relations will be presented, along with examples of the acquired point clouds. This Kinect sensor shows promise for ground-based, automated, biomass measurement systems, and possibly for comparison/validation of remotely sensed LIDAR.
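Two of the volume-estimation techniques mentioned (3D convex hull and voxel counting) can be sketched briefly; the point cloud below is random stand-in data, not a Kinect capture, and the voxel size is an arbitrary assumption.

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_volume(points):
    """Convex-hull volume of an XYZ point cloud (m^3 if points are in m)."""
    return ConvexHull(points).volume

def voxel_volume(points, voxel=0.05):
    """Voxel-count volume: occupied cubes times cube volume. Unlike the
    hull, this does not bridge concavities in the canopy."""
    occupied = {tuple(v) for v in np.floor(points / voxel).astype(int)}
    return len(occupied) * voxel ** 3

rng = np.random.default_rng(0)
cloud = rng.uniform(0.0, 1.0, size=(2000, 3))   # stand-in for a filtered capture
print(f"hull:  {hull_volume(cloud):.3f} m^3")
print(f"voxel: {voxel_volume(cloud):.3f} m^3")
```

The gap between the two estimates is itself informative: the hull bounds the plant envelope from above, while voxel counts track the occupied canopy more closely.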
Galactic cold cores. IV. Cold submillimetre sources: catalogue and statistical analysis
NASA Astrophysics Data System (ADS)
Montillaud, J.; Juvela, M.; Rivera-Ingraham, A.; Malinen, J.; Pelkonen, V.-M.; Ristorcelli, I.; Montier, L.; Marshall, D. J.; Marton, G.; Pagani, L.; Toth, L. V.; Zahorecz, S.; Ysard, N.; McGehee, P.; Paladini, R.; Falgarone, E.; Bernard, J.-P.; Motte, F.; Zavagno, A.; Doi, Y.
2015-12-01
Context. For the project Galactic cold cores, Herschel photometric observations were carried out as a follow-up of cold regions of interstellar clouds previously identified with the Planck satellite. The aim of the project is to derive the physical properties of the population of cold sources and to study its connection to ongoing and future star formation. Aims: We build a catalogue of cold sources within the clouds in 116 fields observed with the Herschel PACS and SPIRE instruments. We wish to determine the general physical characteristics of the cold sources and to examine the correlations with their host cloud properties. Methods: From Herschel data, we computed colour temperature and column density maps of the fields. We estimated the distance to the target clouds and provide both uncertainties and reliability flags for the distances. The getsources multiwavelength source extraction algorithm was employed to build a catalogue of several thousand cold sources. Mid-infrared data were used, along with colour and position criteria, to separate starless and protostellar sources. We also propose another classification method based on submillimetre temperature profiles. We analysed the statistical distributions of the physical properties of the source samples. Results: We provide a catalogue of ~4000 cold sources within or near star-forming clouds, most of which are located either in nearby molecular complexes (≲1 kpc) or in star-forming regions of the nearby galactic arms (~2 kpc). About 70% of the sources have a size compatible with an individual core, and 35% of those sources are likely to be gravitationally bound. Significant statistical differences in physical properties are found between starless and protostellar sources, in column density versus dust temperature, mass versus size, and mass versus dust temperature diagrams. The core mass functions are very similar to those previously reported for other regions. On statistical grounds we find that gravitationally bound sources have higher background column densities (median Nbg(H2) ~ 5 × 10^21 cm-2) than unbound sources (median Nbg(H2) ~ 3 × 10^21 cm-2). These values of Nbg(H2) are higher for higher dust temperatures of the external layers of the parent cloud. However, only in a few cases do we find clear Nbg(H2) thresholds for the presence of cores. The dust temperatures of cloud external layers show clear variations with galactic location, as may the source temperatures. Conclusions: Our data support a more complex view of star formation than the simple idea of a column density threshold. They show a clear influence of the surrounding UV-visible radiation on how cores distribute in their host clouds, with possible variations on the Galactic scale. Planck (http://www.esa.int/Planck) is a project of the European Space Agency - ESA - with instruments provided by two scientific consortia funded by ESA member states (in particular the lead countries: France and Italy) with contributions from NASA (USA), and telescope reflectors provided in a collaboration between ESA and a scientific consortium led and funded by Denmark. Herschel is an ESA space observatory with science instruments provided by European-led Principal Investigator consortia and with important participation from NASA. Full Table B.1 is only available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (ftp://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/584/A92
ERIC Educational Resources Information Center
Karamete, Aysen
2015-01-01
This study aims to show the present conditions about the usage of cloud computing in the department of Computer Education and Instructional Technology (CEIT) amongst teacher trainees in School of Necatibey Education, Balikesir University, Turkey. In this study, a questionnaire with open-ended questions was used. 17 CEIT teacher trainees…
Secure and Resilient Cloud Computing for the Department of Defense
2015-11-16
Three cloud service models: infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS), target system administrators, developers, and end-users respectively (see Table 2 of the report). Cited examples include Amazon Elastic MapReduce, MathWorks Cloud and Red Hat OpenShift for PaaS (medium user control) and Google Gmail for SaaS (low user control).
Metabolizing Data in the Cloud.
Warth, Benedikt; Levin, Nadine; Rinehart, Duane; Teijaro, John; Benton, H Paul; Siuzdak, Gary
2017-06-01
Cloud-based bioinformatic platforms address the fundamental demand for a flexible scientific environment, facilitating data processing and general accessibility independent of a country's affluence. These platforms have a multitude of advantages, as demonstrated by omics technologies, helping to support both governmental and scientific mandates for a more open environment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boss, Alan P.; Keiser, Sandra A., E-mail: boss@dtm.ciw.edu, E-mail: keiser@dtm.ciw.edu
2013-06-10
A variety of stellar sources have been proposed for the origin of the short-lived radioisotopes that existed at the time of the formation of the earliest solar system solids, including Type II supernovae (SNe), asymptotic giant branch (AGB) and super-AGB stars, and Wolf-Rayet star winds. Our previous adaptive mesh hydrodynamics models with the FLASH2.5 code have shown which combinations of shock wave parameters are able to simultaneously trigger the gravitational collapse of a target dense cloud core and inject significant amounts of shock wave gas and dust, showing that thin SN shocks may be uniquely suited for the task. However, recent meteoritical studies have weakened the case for a direct SN injection to the presolar cloud, motivating us to re-examine a wider range of shock wave and cloud core parameters, including rotation, in order to better estimate the injection efficiencies for a variety of stellar sources. We find that SN shocks remain as the most promising stellar source, though planetary nebulae resulting from AGB star evolution cannot be conclusively ruled out. Wolf-Rayet (WR) star winds, however, are likely to lead to cloud core shredding, rather than to collapse. Injection efficiencies can be increased when the cloud is rotating about an axis aligned with the direction of the shock wave, by as much as a factor of ~10. The amount of gas and dust accreted from the post-shock wind can exceed that injected from the shock wave, with implications for the isotopic abundances expected for a SN source.
1.0 mm Maps and Radial Density Distributions of Southern H II/Molecular Cloud Complexes
NASA Technical Reports Server (NTRS)
Cheung, L. H.; Frogel, J. A.; Gezari, D. Y.; Hauser, M. G.
1980-01-01
1.0 mm continuum mapping observations were made of seven southern-hemisphere H II/molecular cloud complexes with 65 arcsec resolution. The radial density distributions of the clouds with central luminosity sources were determined observationally. Strong similarities in morphology and general physical conditions were found to exist among all of the southern clouds in the sample.
What good is SWIR? Passive day comparison of VIS, NIR, and SWIR
NASA Astrophysics Data System (ADS)
Driggers, Ronald G.; Hodgkin, Van; Vollmerhausen, Richard
2013-06-01
This paper is the first of three papers associated with the military benefits of SWIR imaging. This paper describes the benefits associated with passive daytime operations with comparisons of SWIR, NIR, and VIS bands and sensors. This paper includes quantitative findings from previously published papers, analysis of open source data, summaries of various expert analyses, and calculations of notional system performance. We did not accept anecdotal findings as acceptable benefits. Topics include haze and fog penetration, atmospheric transmission, cloud and smoke penetration, target and background contrasts, spectral discrimination, turbulence degradation, and long range target identification. The second and third papers will address passive night imaging and active night imaging.
The SMAT fiber laser for industrial applications
NASA Astrophysics Data System (ADS)
Ding, Jianwu; Liu, Jinghui; Wei, Xi; Xu, Jun
2017-02-01
With the increasing adoption of high power fiber lasers for various industrial applications, the downtime and reliability of fiber lasers become more and more important. Here we present our approach toward a more reliable and more intelligent laser source for industrial applications: the SMAT fiber laser, with an extensive sensor network and multi-level protection mechanism, a mobile connection and mobile app, and the Smart Cloud. The proposed framework, the first IoT (Internet of Things) approach integrated into an industrial laser, not only improves the reliability of the laser but also opens up enormous potential for value-adding services by gathering and analyzing the big data from the connected SMAT lasers.
A pilot biomedical engineering course in rapid prototyping for mobile health.
Stokes, Todd H; Venugopalan, Janani; Hubbard, Elena N; Wang, May D
2013-01-01
Rapid prototyping of medically assistive mobile devices promises to fuel innovation and provides opportunity for hands-on engineering training in biomedical engineering curricula. This paper presents the design and outcomes of a course offered during a 16-week semester in Fall 2011 with 11 students enrolled. The syllabus covered a mobile health design process from end-to-end, including storyboarding, non-functional prototypes, integrated circuit programming, 3D modeling, 3D printing, cloud computing database programming, and developing patient engagement through animated videos describing the benefits of a new device. Most technologies presented in this class are open source and thus provide unlimited "hackability". They are also cost-effective and easily transferrable to other departments.
Shared Medical Imaging Repositories.
Lebre, Rui; Bastião, Luís; Costa, Carlos
2018-01-01
This article describes the implementation of a solution for integrating an ownership concept and access control over medical imaging resources, making possible the centralization of multiple repository instances. The proposed architecture allows the association of permissions with repository resources and the delegation of rights to third entities. It includes a programmatic interface for the management of the proposed services, made available through web services, with the ability to create, read, update and remove all components resulting from the architecture. The resulting work is a role-based access control mechanism that was integrated with the Dicoogle open-source project. The solution has several application scenarios, for instance collaborative platforms for research and tele-radiology services deployed in the cloud.
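A minimal sketch of a role-based access check of the kind described; the roles, permissions and resource names are hypothetical, not those used in the Dicoogle integration.

```python
# Role -> permission sets; a user holds one or more roles, and an action on
# a resource is allowed if any held role grants it.
ROLE_PERMISSIONS = {
    "researcher":  {"read"},
    "radiologist": {"read", "update"},
    "repo_admin":  {"read", "create", "update", "remove", "delegate"},
}

user_roles = {"alice": {"radiologist"}, "bob": {"researcher"}}

def authorize(user, action, resource):
    """Grant the action only if one of the user's roles permits it."""
    allowed = any(action in ROLE_PERMISSIONS[r]
                  for r in user_roles.get(user, ()))
    print(f"{user} -> {action} on {resource}: "
          f"{'granted' if allowed else 'denied'}")
    return allowed

authorize("alice", "update", "study/CT-2018-001")   # granted
authorize("bob", "remove", "study/CT-2018-001")     # denied
```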
Far-infrared observations of the evolved H II region M16
NASA Technical Reports Server (NTRS)
Mcbreen, B.; Fazio, G. G.; Jaffe, D. T.
1982-01-01
The results of far infrared (FIR) observations of the large H II region M16, associated with the young open star cluster NGC 6611, are discussed. Three FIR sources detected on an extended ridge of FIR emission within the scanned region are described. The observations confirm that M16 is an H II region in a late stage of evolution. The H II region has expanded and is now extremely density bounded, consisting of an extended region of ionized gas and a series of ionization fronts located at the surrounding molecular cloud boundaries nearest to the exciting OB star cluster. The FIR radiation arises from heated dust at these boundaries.
New particle formation leads to cloud dimming
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sullivan, Ryan C.; Crippa, Paola; Matsui, Hitoshi
New particle formation (NPF), nucleation of condensable vapors to the solid or liquid phase, is a significant source of atmospheric aerosol particle number concentrations. With sufficient growth, these nucleated particles may be a significant source of cloud condensation nuclei (CCN), thus altering cloud albedo, structure, and lifetimes, and insolation reaching the Earth's surface. Herein we present one of the first numerical experiments to quantify the impact of NPF on cloud radiative properties that is conducted at a convection-permitting resolution and that explicitly simulates cloud droplet number concentrations. Consistent with observations, these simulations suggest that in spring over the Midwestern U.S.A., NPF occurs frequently and on regional scales. However, the simulations suggest that NPF is not associated with enhancement of regional cloud albedos as would be expected from an increase of CCN. These simulations indicate that NPF reduces ambient sulfuric acid concentrations sufficiently to inhibit growth of preexisting particles to CCN sizes. This reduction in CCN-sized particles reduces cloud albedo, resulting in a domain average positive top of atmosphere cloud radiative forcing of 10 W m-2 and up to ~50 W m-2 in individual grid cells relative to a simulation in which NPF is excluded.
X-ray and IR Surveys of the Orion Molecular Clouds and the Cepheus OB3b Cluster
NASA Astrophysics Data System (ADS)
Megeath, S. Thomas; Wolk, Scott J.; Pillitteri, Ignazio; Allen, Tom
2014-08-01
X-ray and IR surveys of molecular clouds between 400 and 700 pc provide complementary means to map the spatial distribution of young low mass stars associated with the clouds. We overview an XMM survey of the Orion Molecular Clouds, at a distance of 400 pc. By using the fraction of X-ray sources with disks as a proxy for age, this survey has revealed three older clusters rich in diskless X-ray sources. Two are smaller clusters found at the northern and southern edges of the Orion A molecular cloud. The third cluster surrounds the O-star Iota Ori (the point of Orion's sword) and is in the foreground to the Orion molecular cloud. In addition, we present a Chandra and Spitzer survey of the Cep OB3b cluster at 700 pc. These data show a spatially variable disk fraction indicative of age variations within the cluster. We discuss the implication of these results for understanding the spread of ages in young clusters and the star formation histories of molecular clouds.
Cloud Infrastructures for In Silico Drug Discovery: Economic and Practical Aspects
Clematis, Andrea; Quarati, Alfonso; Cesini, Daniele; Milanesi, Luciano; Merelli, Ivan
2013-01-01
Cloud computing opens new perspectives for small-medium biotechnology laboratories that need to perform bioinformatics analysis in a flexible and effective way. This seems particularly true for hybrid clouds that couple the scalability offered by general-purpose public clouds with the greater control and ad hoc customizations supplied by the private ones. A hybrid cloud broker, acting as an intermediary between users and public providers, can support customers in the selection of the most suitable offers, optionally adding the provisioning of dedicated services with higher levels of quality. This paper analyses some economic and practical aspects of exploiting cloud computing in a real research scenario for the in silico drug discovery in terms of requirements, costs, and computational load based on the number of expected users. In particular, our work is aimed at supporting both the researchers and the cloud broker delivering an IaaS cloud infrastructure for biotechnology laboratories exposing different levels of nonfunctional requirements. PMID:24106693
Natural aerosols explain seasonal and spatial patterns of Southern Ocean cloud albedo
McCoy, Daniel T.; Burrows, Susannah M.; Wood, Robert; Grosvenor, Daniel P.; Elliott, Scott M.; Ma, Po-Lun; Rasch, Phillip J.; Hartmann, Dennis L.
2015-01-01
Atmospheric aerosols, suspended solid and liquid particles, act as nucleation sites for cloud drop formation, affecting clouds and cloud properties—ultimately influencing the cloud dynamics, lifetime, water path, and areal extent that determine the reflectivity (albedo) of clouds. The concentration Nd of droplets in clouds that influences planetary albedo is sensitive to the availability of aerosol particles on which the droplets form. Natural aerosol concentrations affect not only cloud properties themselves but also modulate the sensitivity of clouds to changes in anthropogenic aerosols. It is shown that modeled natural aerosols, principally marine biogenic primary and secondary aerosol sources, explain more than half of the spatiotemporal variability in satellite-observed Nd. Enhanced Nd is spatially correlated with regions of high chlorophyll a, and the spatiotemporal variability in Nd is found to be driven primarily by high concentrations of sulfate aerosol at lower Southern Ocean latitudes (35o to 45oS) and by organic matter in sea spray aerosol at higher latitudes (45o to 55oS). Biogenic sources are estimated to increase the summertime mean reflected solar radiation in excess of 10 W m–2 over parts of the Southern Ocean, which is comparable to the annual mean increases expected from anthropogenic aerosols over heavily polluted regions of the Northern Hemisphere. PMID:26601216
NASA Astrophysics Data System (ADS)
Griessbach, Sabine; Hoffmann, Lars; Höpfner, Michael; Riese, Martin; Spang, Reinhold
2013-09-01
The viability of a spectrally averaging model to perform radiative transfer calculations in the infrared including scattering by atmospheric particles is examined for the application of infrared limb remote sensing measurements. Here we focus on the Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) aboard the European Space Agency's Envisat. Various spectra for clear air and cloudy conditions were simulated with a spectrally averaging radiative transfer model and a line-by-line radiative transfer model for three atmospheric window regions (825-830, 946-951, 1224-1228 cm-1) and compared to each other. The results are rated in terms of the MIPAS noise equivalent spectral radiance (NESR). The clear air simulations generally agree within one NESR. The cloud simulations neglecting the scattering source term agree within two NESR. The differences between the cloud simulations including the scattering source term are generally below three and always below four NESR. We conclude that the spectrally averaging approach is well suited for fast and accurate infrared radiative transfer simulations including scattering by clouds. We found that the main source for the differences between the cloud simulations of both models is the cloud edge sampling. Furthermore we reasoned that this model comparison for clouds is also valid for atmospheric aerosol in general.
An integrated multi-sensors approach for volcanic cloud retrievals and source characterization
NASA Astrophysics Data System (ADS)
Corradini, Stefano; Merucci, Luca
2017-04-01
Volcanic eruptions are one of the most important sources of natural pollution. In particular, volcanic clouds represent a severe threat to aviation safety. Worldwide, volcanic activity is monitored using satellite and ground-based instruments working at different spectral ranges, with different spatial resolutions and sensitivities. Here the complementarity between geostationary and polar satellites and ground-based measurements is exploited in order to significantly improve volcanic cloud detection and retrievals and to fully characterize the eruption source. The integration procedure, named MACE (Multi-platform volcanic Ash Cloud Estimation), was developed during the EU-FP7 APhoRISM project, aimed at developing innovative products to support the management and mitigation of volcanic and seismic crises. The proposed method integrates in a novel manner the volcanic ash retrievals at the space-time scale of typical geostationary observations using both polar satellite estimations and in-situ measurements. In MACE, the typical volcanic cloud retrievals in the thermal infrared are integrated by using a wider spectral range, from visible to microwave. Moreover, the volcanic cloud detection is extended to the case of a cloudy atmosphere or steam plumes. As an example, the integrated approach is tested on different recent eruptions that occurred at Etna (Italy) in 2013 and 2015 and at Calbuco (Chile) in 2015.
SparkClouds: visualizing trends in tag clouds.
Lee, Bongshin; Riche, Nathalie Henry; Karlson, Amy K; Carpendale, Sheelagh
2010-01-01
Tag clouds have proliferated over the web over the last decade. They provide a visual summary of a collection of texts by depicting tag frequency through font size. In use, tag clouds can evolve as the associated data source changes over time. Interesting discussions around tag clouds often include a series of tag clouds and consider how they evolve over time. However, since tag clouds do not explicitly represent trends or support comparisons, the cognitive demands placed on the person perceiving trends across multiple tag clouds are high. In this paper, we introduce SparkClouds, which integrate sparklines into a tag cloud to convey trends between multiple tag clouds. We present results from a controlled study that compares SparkClouds with two traditional trend visualizations—multiple line graphs and stacked bar charts—as well as Parallel Tag Clouds. Results show that SparkClouds' ability to show trends compares favourably to the alternative visualizations.
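A small sketch of the two encodings SparkClouds combines: log-scaled font sizes for the current frequency plus a per-tag history series for the sparkline. The scaling function and the data are illustrative assumptions, not taken from the paper.

```python
import math

def font_sizes(counts, min_pt=10, max_pt=36):
    """Map tag frequencies to font sizes on a log scale, the usual
    tag-cloud encoding that SparkClouds builds on."""
    lo = math.log(min(counts.values()))
    hi = math.log(max(counts.values()))
    span = (hi - lo) or 1.0                     # guard against equal counts
    return {tag: round(min_pt + (math.log(c) - lo) / span * (max_pt - min_pt))
            for tag, c in counts.items()}

# A SparkCloud additionally keeps the per-time-slice history of each tag,
# rendered as a small sparkline next to the word.
history = {"cloud": [3, 8, 21, 34], "grid": [30, 22, 9, 4]}
latest = {tag: series[-1] for tag, series in history.items()}
for tag, size in font_sizes(latest).items():
    print(f"{tag}: {size}pt, sparkline data {history[tag]}")
```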
NASA Astrophysics Data System (ADS)
Perez, G. L.; Larour, E. Y.; Halkides, D. J.; Cheng, D. L. C.
2015-12-01
The Virtual Ice Sheet Laboratory (VISL) is a Cryosphere outreach effort by scientists at the Jet Propulsion Laboratory (JPL) in Pasadena, CA, Earth and Space Research (ESR) in Seattle, WA, and the University of California at Irvine (UCI), with the goal of providing interactive lessons for K-12 and college level students, while conforming to STEM guidelines. At the core of VISL is the Ice Sheet System Model (ISSM), an open-source project developed jointly at JPL and UCI whose main purpose is to model the evolution of the polar ice caps in Greenland and Antarctica. By using ISSM, VISL students have access to state-of-the-art modeling software that is being used to conduct scientific research by users all over the world. However, providing this functionality is by no means simple. The modeling of ice sheets in response to sea and atmospheric temperatures, among many other possible parameters, requires significant computational resources. Furthermore, this service needs to be responsive and capable of handling burst requests produced by classrooms of students. Cloud computing providers represent a burgeoning industry. With major investments by tech giants like Amazon, Google and Microsoft, it has never been easier or more affordable to deploy computational elements on-demand. This is exactly what VISL needs and ISSM is capable of. Moreover, this is a promising alternative to investing in expensive and rapidly devaluing hardware.
Spatially Resolved Metal Gas Clouds
NASA Astrophysics Data System (ADS)
Péroux, C.; Rahmani, H.; Arrigoni Battaia, F.; Augustin, R.
2018-05-01
We now have mounting evidence that the circumgalactic medium (CGM) of galaxies is polluted with metals processed through stars. The fate of these metals is, however, still an open question, and several findings indicate that they remain poorly mixed. A powerful tool to study the low-density gas of the CGM is offered by absorption lines in quasar spectra, although the information retrieved is limited to 1D along the sightline. We report the serendipitous discovery of two close-by bright zgal=1.148 extended galaxies with a fortuitous intervening zabs=1.067 foreground absorber. MUSE IFU observations spatially probe kpc scales in absorption in the plane of the sky over a total area spanning ~30 kpc². We identify two [O II] emitters at zabs down to 21 kpc with SFR~2 M⊙/yr. We measure small fractional variations (<30%) in the equivalent widths of Fe II and Mg II cold gas absorbers on coherence scales of 8 kpc but stronger variation on larger scales (25 kpc). We compute the corresponding cloud gas mass to be <2 × 10⁹ M⊙. Our results indicate efficient metal mixing on kpc scales in the CGM of a typical z~1 galaxy. This study showcases new prospects for mapping the distribution and sizes of metal clouds observed in absorption against extended background sources with 3D spectroscopy.
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly-available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results that offers interactive exploration of quality control files, read alignment and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on Amazon cloud.
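The multi-caller integration step can be pictured with a toy consensus filter. This is a minimal sketch under invented caller names and variants, not ExScalibur's actual code: a variant is kept only if a minimum number of callers report it.

```python
from collections import Counter

def consensus(callsets, min_callers=2):
    """Keep variants reported by at least `min_callers` callers.
    callsets: one set of (chrom, pos, ref, alt) tuples per caller."""
    counts = Counter(v for calls in callsets for v in set(calls))
    return {v for v, n in counts.items() if n >= min_callers}

# Invented example call sets from three hypothetical callers.
caller_a = {("chr1", 12345, "A", "T"), ("chr2", 500, "G", "C")}
caller_b = {("chr1", 12345, "A", "T")}
caller_c = {("chr1", 12345, "A", "T"), ("chr3", 42, "T", "G")}
print(consensus([caller_a, caller_b, caller_c]))  # {('chr1', 12345, 'A', 'T')}
```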
A model to determine open or closed cellular convection
NASA Technical Reports Server (NTRS)
Helfand, H. M.; Kalnay, E.
1981-01-01
A simple mechanism is proposed to explain the observed presence in the atmosphere of open or closed cellular convection. If convection is produced by cooling concentrated near the top of the cloud layer, as in radiative cooling of stratus clouds, it develops strong descending currents which are compensated by weak ascent over most of the horizontal area, and closed cells result. Conversely, heating concentrated near the bottom of a layer, as when an air mass is heated by warm water, results in strong ascending currents compensated by weak descent over most of the area, or open cells. This mechanism is similar to the one suggested by Stommel (1962) to explain the smallness of the oceans' sinking regions. The mechanism is studied numerically by means of a two-dimensional, nonlinear Boussinesq model.
A Multi-Year Data Set of Cloud Properties Derived for CERES from Aqua, Terra, and TRMM
NASA Technical Reports Server (NTRS)
Minnis, Patrick; Sunny Sun-Mack; Trepte, Qing Z.; Yan Chen; Brown, Richard R.; Gibson, Sharon C.; Heck, Michael L.; Dong, Xiquan; Xi, Baike
2007-01-01
The Clouds and Earth's Radiant Energy System (CERES) Project is producing a suite of cloud properties from high-resolution imagers on several satellites and matching them precisely with broadband radiance data to study the influence of clouds and radiation on climate. The cloud properties generally compare well with independent validation sources. Distinct differences are found between the CERES cloud properties and those derived with other algorithms from the same imager data. CERES products will be updated beginning in late 2006.
Cloud and Radiation Studies during SAFARI 2000
NASA Technical Reports Server (NTRS)
Platnick, Steven; King, M. D.; Hobbs, P. V.; Osborne, S.; Piketh, S.; Bruintjes, R.; Lau, William K. M. (Technical Monitor)
2001-01-01
Though the emphasis of the Southern Africa Regional Science Initiative 2000 (SAFARI-2000) dry season campaign was largely on emission sources and transport, the assemblage of aircraft (including the high-altitude NASA ER-2 remote sensing platform and the University of Washington CV-580, UK MRF C130, and South African Weather Bureau JRA in situ aircraft) provided a unique opportunity for cloud studies. Therefore, as part of the SAFARI initiative, investigations were undertaken to assess regional aerosol-cloud interactions and cloud remote sensing algorithms. In particular, the latter part of the experiment concentrated on marine boundary layer stratocumulus clouds off the southwest coast of Africa. Associated with cold water upwelling along the Benguela current, the Namibian stratocumulus regime has received limited attention but appears to be unique for several reasons. During the dry season, outflow of continental fires and industrial pollution over this area can be extreme. From below, upwelling provides a rich nutrient source for phytoplankton (a source of atmospheric sulphur through DMS production as well as from decay processes). The impact of these natural and anthropogenic sources on the microphysical and optical properties of the stratocumulus is unknown. Continental and Indian Ocean cloud systems of opportunity were also studied during the campaign. Aircraft flights were coordinated with NASA Terra satellite overpasses for synergy with the Moderate Resolution Imaging Spectroradiometer (MODIS) and other Terra instruments. An operational MODIS algorithm for the retrieval of cloud optical and physical properties (including optical thickness, effective particle radius, and water path) has been developed. Pixel-level MODIS retrievals (1 km spatial resolution at nadir) and gridded statistics of clouds in the SAFARI region will be presented. In addition, the MODIS Airborne Simulator flown on the ER-2 provided high spatial resolution retrievals (50 m at nadir). These retrievals will be discussed and compared with in situ observations.
The Cloud Area Padovana: from pilot to production
NASA Astrophysics Data System (ADS)
Andreetto, P.; Costa, F.; Crescente, A.; Dorigo, A.; Fantinel, S.; Fanzago, F.; Sgaravatto, M.; Traldi, S.; Verlato, M.; Zangrando, L.
2017-10-01
The Cloud Area Padovana has been running for almost two years. This is an OpenStack-based scientific cloud, spread across two different sites: the INFN Padova Unit and the INFN Legnaro National Labs. The hardware resources have been scaled horizontally and vertically, by upgrading some hypervisors and by adding new ones: currently it provides about 1100 cores. Some in-house developments were also integrated into the OpenStack dashboard, such as a tool for user and project registration with direct support for the INFN-AAI Identity Provider as a new option for user authentication. In collaboration with the EU-funded INDIGO-DataCloud project, integration with Docker-based containers has been tested and will be available in production soon. This computing facility now satisfies the computational and storage demands of more than 70 users affiliated with about 20 research projects. We present here the architecture of this cloud infrastructure and the tools and procedures used to operate it. We also focus on the lessons learnt in these two years, describing the problems that were found and the corrective actions that had to be applied. We discuss the chosen strategy for upgrades, which balances the need to promptly integrate new OpenStack developments, the demand to reduce infrastructure downtime, and the need to limit the effort required for such updates. We also discuss how this cloud infrastructure is being used, focusing on two big physics experiments which are intensively exploiting this computing facility: CMS and SPES. CMS deployed on the cloud a complex computational infrastructure, composed of several user interfaces for job submission to the Grid environment/local batch queues or for interactive processes; this is fully integrated with the local Tier-2 facility. To avoid a static allocation of the resources, an elastic cluster based on cernVM has been configured: it allows virtual machines to be created and deleted automatically according to user needs. SPES, using a client-server system called TraceWin, exploits INFN's virtual resources, performing a very large number of simulations on about a thousand elastically managed nodes.
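The elastic-cluster behaviour described above can be pictured as a simple queue-driven controller. This is a minimal sketch with invented thresholds and provision/retire hooks, not the Cloud Area Padovana implementation:

```python
def rescale(queued_jobs, idle_nodes, provision, retire,
            jobs_per_node=4, max_new_nodes=10):
    """Grow the cluster when jobs queue up; shrink it when nodes sit idle."""
    if queued_jobs > 0:
        wanted = min(max_new_nodes, -(-queued_jobs // jobs_per_node))  # ceil division
        for _ in range(wanted):
            provision()      # e.g. boot a cernVM worker and join it to the batch system
    else:
        for node in idle_nodes:
            retire(node)     # drain the node, then delete the virtual machine

# Toy usage with print-based hooks standing in for real cloud API calls.
rescale(queued_jobs=10, idle_nodes=[],
        provision=lambda: print("boot worker VM"),
        retire=lambda n: print("retire", n))
```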
AstroCloud, a Cyber-Infrastructure for Astronomy Research: Overview
NASA Astrophysics Data System (ADS)
Cui, C.; Yu, C.; Xiao, J.; He, B.; Li, C.; Fan, D.; Wang, C.; Hong, Z.; Li, S.; Mi, L.; Wan, W.; Cao, Z.; Wang, J.; Yin, S.; Fan, Y.; Wang, J.
2015-09-01
AstroCloud is a cyber-infrastructure for astronomy research initiated by the Chinese Virtual Observatory (China-VO) under funding support from the NDRC (National Development and Reform Commission) and CAS (Chinese Academy of Sciences). Tasks such as proposal submission, proposal peer review, data archiving, data quality control, data release and open access, and cloud-based data processing and analysis will all be supported on the platform. It will act as a full lifecycle management system for astronomical data and telescopes. Achievements from international Virtual Observatories and cloud computing are heavily adopted. In this paper, the background of the project, key features of the system, and the latest progress are introduced.
Integrating Cloud-Computing-Specific Model into Aircraft Design
NASA Astrophysics Data System (ADS)
Zhimin, Tian; Qi, Lin; Guangwen, Yang
Cloud computing is becoming increasingly relevant, as it will enable companies involved in spreading this technology to open the door to Web 3.0. The new categories of services introduced in this paper will slowly replace many types of computational resources currently used. In this perspective, grid computing, the basic element for the large-scale supply of cloud services, will play a fundamental role in defining how those services will be provided. The paper tries to integrate a cloud-computing-specific model into aircraft design. This work has achieved good results in sharing licenses for large-scale and expensive software, such as CFD (Computational Fluid Dynamics) tools, UG, CATIA, and so on.
Interstellar Explorer Observations of the Solar System's Debris Disks
NASA Astrophysics Data System (ADS)
Lisse, C. M.; McNutt, R. L., Jr.; Brandt, P. C.
2017-12-01
Planetesimal belts and debris disks full of dust are known as the "signposts of planet formation" in exosystems. The overall brightness of a disk provides information on the amount of sourcing planetesimal material, while asymmetries in the shape of the disk can be used to search for perturbing planets. The solar system is known to house two such belts, the Asteroid belt and the Kuiper Belt, and at least one debris cloud, the Zodiacal Cloud, sourced by planetesimal collisions and Kuiper Belt comet evaporative sublimation. However, these are poorly understood in toto because we live inside of them. For example, while we know of the two planetesimal belt systems, it is not clear how much, if any, dust is produced from the Kuiper belt, since the near-Sun comet contributions dominate near-Earth space. Understanding how much dust is produced in the Kuiper belt would give us a much better idea of the total number of bodies in the belt, especially the smallest ones, and their dynamical collisional state. Even for the close-in Zodiacal cloud, questions remain concerning its overall shape and orientation with respect to the ecliptic and invariable planes of the solar system; they are not explainable from the perturbations caused by the known planets alone. In this paper we explore the possibilities of using an Interstellar Explorer telescope placed at 200 AU from the Sun to observe the brightness, shape, and extent of the solar system's debris disk(s). We should be able to measure the entire extent of the inner, near-Earth zodiacal cloud, and whether it connects smoothly into an outer cloud, or if there is a second outer cloud sourced by the Kuiper belt and isolated by the outer planets, as predicted by Stark & Kuchner (2009, 2010) and Poppe et al. (2012, 2016; Figure 1). VIS/NIR imagery will provide information on the dust cloud's density, while MIR cameras will provide thermal imaging photometry related to the cloud's dust particle size and composition. Observing at high phase angle by looking back towards the Sun from 200 AU, we will be able to perform deep searches for the presence of rings and dust clouds around discrete sources, and thus we will be able to search for possible strong individual sources of the debris clouds, like the Haumea family collisional fragments, the rings of the Centaur Chariklo, or dust emitted from spallation off the six known bodies of the Pluto system.
The EPOS Vision for the Open Science Cloud
NASA Astrophysics Data System (ADS)
Jeffery, Keith; Harrison, Matt; Cocco, Massimo
2016-04-01
Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, and so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications so as to allow cloud middleware to optimize their deployment and execution. From CERN, the Helix Nebula group has proposed the architecture for the European Open Science Cloud. They are in discussion with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation) and AARC (network authentication and authorisation), and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategy Forum on Research Infrastructures) RIs, including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered within the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science), effectively an ICS-d specializing in high-performance computation, analytics, simulation or visualization offered by a TCS for others to use. Discussions are already underway between EPOS and EGI, EUDAT, AARC and Helix Nebula for those offerings to be considered as ICS-ds by EPOS. Provision of access to ICS-ds from ICS-C concerns several aspects: (a) technical: it may be more or less difficult to connect and pass from ICS-C to the ICS-d/CES the 'package' (probably a virtual machine) of data and software; (b) security/privacy: including passing personal information, e.g. related to AAAI (Authentication, Authorization, Accounting Infrastructure); (c) financial and legal: such as payment and licence conditions. Appropriate interfaces from ICS-C to ICS-d are being designed to accommodate these aspects. The Open Science Cloud is timely because it provides a framework to discuss governance and sustainability for computational resource provision as well as an effective interpretation of a federated approach to HPC (High Performance Computing) and HTC (High Throughput Computing). It will be a unique opportunity to share and adopt procurement policies to provide access to computational resources for RIs. The current state of discussions and the expected roadmap for the EPOS-Open Science Cloud relationship are presented.
A Digital Knowledge Preservation Platform for Environmental Sciences
NASA Astrophysics Data System (ADS)
Aguilar Gómez, Fernando; de Lucas, Jesús Marco; Pertinez, Esther; Palacio, Aida; Perez, David
2017-04-01
The Digital Knowledge Preservation Platform is the evolution of a pilot project for Open Data supporting the full research data life cycle. It is currently being developed at IFCA (Instituto de Física de Cantabria) as a combination of different open tools that have been extended: DMPTool (https://dmptool.org/) with pilot semantic features (RDF export, parameter definitions), a customized version of INVENIO (http://invenio-software.org/) to integrate the entire research data life cycle, and Jupyter (http://jupyter.org/) as a processing tool and reproducibility environment. This complete platform aims to provide an integrated environment for research data management following the FAIR+R principles. Findable: the web portal based on Invenio provides a search engine, and all elements include metadata to make them easily findable. Accessible: both data and software are available online with internal PIDs and DOIs (provided by DataCite). Interoperable: datasets can be combined to perform new analyses; the OAI-PMH standard is also integrated. Re-usable: different license types and embargo periods can be defined. +Reproducible: directly integrated with cloud computing resources. The deployment of the entire system over a cloud framework helps to build a dynamic and scalable solution, not only for managing open datasets but also as a useful tool for the final user, who is able to directly process and analyse the open data. In parallel, the direct use of semantics and metadata is being explored and integrated in the framework. Ontologies, being a knowledge representation, can contribute to defining the elements and relationships of the research data life cycle, including DMPs, datasets, software, etc. The first advantage of developing an ontology of a knowledge domain is that it provides a common vocabulary hierarchy (i.e. a conceptual schema) that can be used and standardized by all the agents interested in the domain (either humans or machines). This way of using ontologies is one of the bases of the Semantic Web, where ontologies are set to play a key role in establishing a common terminology between agents. To develop the ontology we are using Protégé, a graphical ontology-development tool which supports a rich knowledge model and is open-source and freely available. However, in order to process and manage the ontology from the web framework, we are using Semantic MediaWiki, an extension of MediaWiki that is able to process queries, perform semantic search, and export data in RDF and CSV formats. This system is used as a testbed for the potential use of semantics in a more general environment. The Digital Knowledge Preservation Platform is closely related to the INDIGO-DataCloud project (https://www.indigo-datacloud.eu), since the same data life cycle approach (Plan, Collect, Curate, Analyze, Publish, Preserve) is taken into account. INDIGO-DataCloud solutions will be able to support all the different elements in the system, as we showed at the last Research Data Alliance Plenary. This presentation will show the different elements of the system and how they work, as well as the roadmap for their continuous integration.
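To make the RDF-export idea concrete, here is a minimal sketch using the rdflib Python library; the dataset URI and the vocabulary choices are illustrative assumptions, not the platform's actual schema:

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import DCTERMS, RDF

DCAT = Namespace("http://www.w3.org/ns/dcat#")

g = Graph()
ds = URIRef("https://example.org/dataset/42")  # hypothetical record identifier
g.add((ds, RDF.type, DCAT.Dataset))
g.add((ds, DCTERMS.title, Literal("Sample environmental dataset")))
g.add((ds, DCTERMS.license, URIRef("https://creativecommons.org/licenses/by/4.0/")))

# Serialize the record as Turtle, one of the formats a semantic wiki can ingest.
print(g.serialize(format="turtle"))
```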
NASA Astrophysics Data System (ADS)
Lengert, W.; Mondon, E.; Bégin, M. E.; Ferrer, M.; Vallois, F.; DelaMar, J.
2015-12-01
Helix Nebula, a European cross-domain science initiative building on an active PPP, aims to implement the concept of an open science commons [1] using a hybrid cloud model [2] as the proposed implementation solution. This approach allows leveraging and merging of complementary data-intensive Earth Science disciplines (e.g. instrumentation [3] and modeling) without introducing significant changes in the contributors' operational set-up. Considering the seamless integration with life science (e.g. EMBL), scientific exploitation of meteorological, climate, and Earth Observation data and models opens an enormous potential for new big-data science. The work of Helix Nebula has shown that it is feasible to interoperate publicly funded infrastructures, such as EGI [5] and GEANT [6], with commercial cloud services. Such hybrid systems are in the interest of the existing users of publicly funded infrastructures and funding agencies because they will provide "freedom and choice" over the type of computing resources to be consumed and the manner in which they can be obtained. But to offer such freedom and choice across a spectrum of suppliers, various issues such as intellectual property, legal responsibility, service quality agreements and related issues need to be addressed. Finding solutions to these issues is one of the goals of the Helix Nebula initiative. [1] http://www.egi.eu/news-and-media/publications/OpenScienceCommons_v3.pdf [2] http://www.helix-nebula.eu/events/towards-the-european-open-science-cloud [3] e.g. https://sentinel.esa.int/web/sentinel/sentinel-data-access [5] http://www.egi.eu/ [6] http://www.geant.net/
Venus Clouds: A dirty hydrochloric acid model
NASA Technical Reports Server (NTRS)
Hapke, B.
1971-01-01
The spectral and polarization data for Venus are consistent with micron-sized, aerosol cloud particles of hydrochloric acid containing soluble and insoluble iron compounds, whose source could be volcanic or crustal dust. The ultraviolet features could arise from variations in the Fe-HCl concentration in the cloud particles.
Efficient Open Source Lidar for Desktop Users
NASA Astrophysics Data System (ADS)
Flanagan, Jacob P.
Lidar (Light Detection and Ranging) is a remote sensing technology that utilizes a device similar to a rangefinder to determine the distance to a target. A laser pulse is shot at an object and the time it takes for the pulse to return is measured. The distance to the object is easily calculated using the speed of light. For lidar, this laser is moved (primarily in a rotational movement, usually accompanied by a translational movement) and the distances to objects are recorded several thousand times per second. From this, a three-dimensional structure can be procured in the form of a point cloud. A point cloud is a collection of three-dimensional points with at least an x, a y and a z attribute; these three attributes represent the position of a single point in three-dimensional space. Other attributes can be associated with the points, such as the intensity of the return pulse, the color of the target, or even the time the point was recorded. Another very useful, post-processed attribute is point classification, where a point is associated with the type of object it represents (e.g., ground). Lidar has gained popularity, and advancements in the technology have made its collection easier and cheaper, creating larger and denser datasets. The need to handle this data more efficiently has become pressing; processing, visualizing or even simply loading lidar data can be computationally intensive due to its very large size. Standard remote sensing and geographical information systems (GIS) software (ENVI, ArcGIS, etc.) was not originally built for optimized point cloud processing; its implementation is an afterthought and therefore inefficient. Newer, more optimized software for point cloud processing (QTModeler, TopoDOT, etc.) usually lacks more advanced processing tools, requires higher-end computers, and is very costly. Existing open source lidar software approaches the loading and processing of lidar in an iterative fashion that requires batch coding and processing times that can stretch to months for a standard lidar dataset. This project attempts to build software with the best approach for creating, importing and exporting, manipulating and processing lidar, especially in the environmental field. Development of this software is described in three sections: (1) explanation of the search methods for efficiently extracting the "area of interest" (AOI) data from disk (file space), illustrated in the sketch below; (2) using file space (for storage), budgeting memory space (for efficient processing) and moving between the two; and (3) method development for creating lidar products (usually raster based) used in environmental modeling and analysis (e.g., hydrology feature extraction, geomorphological studies, ecological modeling).
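A minimal sketch of the AOI extraction in section (1), assuming the laspy library and a LAS file small enough to read into memory (laspy also offers chunked readers for out-of-core work):

```python
import numpy as np
import laspy  # pip install laspy

def extract_aoi(path, xmin, xmax, ymin, ymax):
    """Return an (n, 3) array of the points inside a rectangular AOI."""
    las = laspy.read(path)                  # laspy 2.x in-memory read
    x = np.asarray(las.x)                   # scaled, real-world coordinates
    y = np.asarray(las.y)
    z = np.asarray(las.z)
    mask = (x >= xmin) & (x <= xmax) & (y >= ymin) & (y <= ymax)
    return np.column_stack((x[mask], y[mask], z[mask]))

# Hypothetical usage: clip a tile to a 100 m x 100 m study area.
# pts = extract_aoi("tile.las", 500000, 500100, 4400000, 4400100)
```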
The StratusLab cloud distribution: Use-cases and support for scientific applications
NASA Astrophysics Data System (ADS)
Floros, E.
2012-04-01
The StratusLab project is integrating an open cloud software distribution that enables organizations to set up and provide their own private or public IaaS (Infrastructure as a Service) computing clouds. The StratusLab distribution capitalizes on popular infrastructure virtualization solutions like KVM, the OpenNebula virtual machine manager, the Claudia service manager and the SlipStream deployment platform, which are further enhanced and expanded with additional components developed within the project. The StratusLab distribution covers the core aspects of a cloud IaaS architecture, namely computing (life-cycle management of virtual machines), storage, appliance management and networking. The resulting software stack provides a packaged turn-key solution for deploying cloud computing services. The cloud computing infrastructures deployed using StratusLab can support a wide range of scientific and business use cases. Grid computing has been the primary use case pursued by the project, and for this reason the initial priority has been support for the deployment and operation of fully virtualized production-level grid sites, a goal that has already been achieved by operating such a site as part of EGI's (European Grid Initiative) pan-European grid infrastructure. In this area the project is currently working to provide non-trivial capabilities like elastic and autonomic management of grid site resources. Although grid computing has been the motivating paradigm, StratusLab's cloud distribution can support a wider range of use cases. In this direction, we have developed and currently provide support for setting up general-purpose computing solutions like Hadoop, MPI and Torque clusters. As concerns scientific applications, the project is collaborating closely with the bioinformatics community in order to prepare VM appliances and deploy optimized services for bioinformatics applications. In a similar manner, additional scientific disciplines like Earth Science can take advantage of StratusLab cloud solutions. Interested users are welcome to join StratusLab's user community by getting access to the reference cloud services deployed by the project and offered to the public.
NASA Astrophysics Data System (ADS)
Wilson, T. L.; Hanson, M. M.; Muders, D.
2003-06-01
We present fully sampled images in the C18O J=2-1 line extending over 13'×23', made with the Heinrich Hertz Telescope (HHT) on Mount Graham, AZ. The HHT has a resolution of 35" at the line frequency. This region includes two molecular clouds. Cloud A, to the north, is more compact, while cloud B is to the west of the H II region M17. Cloud B contains the well-known source M17SW. In C18O we find 13 maxima in cloud A and 39 in cloud B. Sixteen sources in cloud B are in M17SW, mapped previously with higher resolution. In cloud B, sources outside M17SW have line widths comparable to those in M17SW. In comparison, cloud A has lower C18O line intensities and smaller line widths but comparable densities and sizes. Maps of the cores of these clouds were also obtained in the J=5-4 line of CS, which traces higher H2 densities. Our images of the cores of clouds A and B show that for V_LSR ≤ 20 km s⁻¹, the peaks of the CS emission are shifted closer to the H II region than the C18O maxima, so higher densities are found toward the H II region. Our CS data give additional support to the already strong evidence that M17SW and nearby regions are heated and compressed by the H II region. Our data show that cloud A has a smaller interaction with the H II region. We surmise that M17SW was an initially denser region, and the turn-on of the H II region will make this the next region of massive star formation. Outside of M17SW, the only other obvious star formation region may be in cloud A, since there is an intense millimeter dust continuum peak found by Henning et al. (1998) but no corresponding C18O maximum. If the CO/H2 ratio is constant, the dust must have a temperature of ~100 K or the H2 density is greater than 10⁶ cm⁻³ or both to reconcile the C18O and dust data. Alternatively, if the CO/H2 ratio is low, perhaps much of the CO is depleted.
Biogenic influence on cloud microphysics in the 'clean' oceanic atmosphere
NASA Astrophysics Data System (ADS)
Lana, A.; Simó, R.; Vallina, S. M.; Jurado, E.; Dachs, J.
2009-12-01
A 20-year-old hypothesis postulates a feedback relationship between marine biota and climate through the emission of dimethylsulfide (DMS) as the principal natural source of sulfate secondary aerosols (S-DMS), which are very efficient as cloud condensation nuclei (CCN). In recent years, the biological influence on cloud microphysics has been extended to other potential biogenic cloud precursors: (i) volatile organic compounds produced by plankton and emitted to the atmosphere to form Secondary Organic Aerosols (SOA); (ii) biological particles and biogenic polymers, lifted with the sea spray by wind friction and bubble-bursting processes, that act as Primary Organic Aerosols (POA). Besides these biogenic aerosols, sea-spray-associated Sea Salt (SS) emissions, which are the dominant contribution to aerosol mass in the remote mixed boundary layer, also contribute to cloud condensation. All these aerosols affect cloud microphysics by providing new CCN, reducing the size of cloud droplets, and increasing cloud albedo. We have compared the seasonalities of the parameterized source functions of these natural cloud precursors with that of the satellite-derived cloud droplet effective radius (CLEFRA) over large regions of the ocean. Regions where large loads of continental aerosols (including anthropogenic industrial, urban, and biomass burning aerosols) dominate during a significant part of the year were identified by use of remote sensing aerosol optical properties and excluded from our analysis. Our results show that the seasonality of cloud droplet effective radius matches those of S-DMS and SOA in the clean marine atmosphere, whereas SS and chlorophyll-associated POA on their own do not seem to play a major role in driving cloud droplet size.
MASYS: The AKARI Spectroscopic Survey of Symbiotic Stars in the Magellanic Clouds
NASA Astrophysics Data System (ADS)
Angeloni, R.; Ciroi, S.; Marigo, P.; Contini, M.; Di Mille, F.; Rafanelli, P.
2009-12-01
MASYS is the AKARI spectroscopic survey of Symbiotic Stars in the Magellanic Clouds, and one of the European Open Time Observing Programmes approved for the AKARI (Post-Helium) Phase-3. It is providing the first ever near-IR spectra of extragalactic symbiotic stars. The observations are scheduled to be completed in July 2009.
FOODIE: Farm-Oriented Open Data in Europe
NASA Astrophysics Data System (ADS)
Ángel Esbri Palomares, Miguel; Charvat, Karel; Campos, Antonio Manuel; Palma, Raúl
2014-05-01
The agriculture sector is unique due to its strategic importance for both European citizens (consumers) and the European economy (regional and global), which, ideally, should make the whole sector a network of interacting organizations. Rural areas are of particular importance with respect to the agro-food sector and should be specifically addressed within this scope. The different groups of stakeholders involved in agricultural activities have to manage many different and heterogeneous sources of information that need to be combined in order to make economically and environmentally sound decisions, which include (among others) the definition of policies (subsidies, standardisation and regulation, national strategies for rural development, climate change), valuation of ecological performance, development of sustainable agriculture, crop recollection timing and pricing, pest detection, etc. Such processes are very labour intensive because most parts have to be executed manually and the necessary information is not always available or easily accessible. In this context, future agriculture knowledge management systems have to support not only the direct profitability of agriculture or environmental protection, but also the activities of individuals and groups, allowing effective collaboration among groups in the agri-food industry, consumers, public administrations and wider stakeholder communities, especially in the rural domain. To that end, the FOODIE project aims at building an open and interoperable specialized agricultural platform hub on the cloud for: the management of spatial and non-spatial data relevant for farming production; the discovery of spatial and non-spatial agriculture-related data from heterogeneous sources; the integration of existing and valuable European open datasets related to agriculture; and the publication and linking of external agriculture data sources contributed by different public and private stakeholders, making it possible to provide specific, high-value applications and services to support the planning and decision-making processes of the different stakeholder groups related to the agricultural and environmental domains.
Alternative IT Sourcing Strategies: From the Campus to the Cloud. ECAR Key Findings
ERIC Educational Resources Information Center
Goldstein, Philip J.
2009-01-01
This document presents the key findings from the 2009 ECAR (EDUCAUSE Center for Applied Research) study, "Alternative IT Sourcing Strategies: From the Campus to the Cloud," by Philip J. Goldstein. The study explores a multitude of strategies used by college and university information technology organizations to deliver the breadth of technologies…
NASA Astrophysics Data System (ADS)
Gopalan, A.; Doelling, D. R.; Scarino, B. R.; Chee, T.; Haney, C.; Bhatt, R.
2016-12-01
The CERES calibration group at NASA/LaRC has developed and deployed a suite of online data exploration and visualization tools targeted towards a range of spaceborne VIS/IR imager calibration applications for the Earth Science community. These web-based tools are driven by the open-source R language for statistical computing and visualization, with a web interface for the user to customize the results according to their application. The tool contains a library of geostationary and sun-synchronous imager spectral response functions (SRF), incoming solar spectra, SCIAMACHY and Hyperion Earth-reflected visible hyper-spectral data, and IASI IR hyper-spectral data. The suite of six specific web-based tools was designed to provide critical information necessary for sensor cross-calibration. One of the challenges of sensor cross-calibration is accounting for spectral band differences, which may introduce biases if not handled properly. The spectral band adjustment factors (SBAF) are a function of the Earth target, the atmospheric and cloud conditions or scene type, and the angular conditions when obtaining sensor radiance pairs; the SBAF therefore needs to be customized for each inter-calibration target and sensor pair. The advantages of having a community open source tool are: (1) only one archive of SCIAMACHY, Hyperion, and IASI datasets, on the order of 50 TB, needs to be maintained; (2) the framework allows easy incorporation of new satellite SRFs and hyper-spectral datasets and associated coincident atmospheric and cloud properties, such as PW; (3) web tool or SBAF algorithm improvements and suggestions, once incorporated, benefit the community at large; and (4) the customization effort is on the user rather than on the host. In this paper we discuss each of these tools in detail and explore the variety of advanced options that can be used to constrain the results, along with specific use cases that highlight the value added by these datasets.
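The SBAF computation itself can be sketched as SRF-weighted band radiances followed by a forced-through-origin regression over many hyper-spectral scenes. This is a minimal sketch of that general approach with synthetic inputs, not the tool's actual algorithm:

```python
import numpy as np

def band_radiance(wl, spectrum, srf_wl, srf):
    """SRF-weighted mean radiance for one sensor band (wavelengths in nm)."""
    srf_i = np.interp(wl, srf_wl, srf, left=0.0, right=0.0)
    return np.trapz(spectrum * srf_i, wl) / np.trapz(srf_i, wl)

def sbaf(wl, spectra, srf_a, srf_b):
    """Slope a in r_A ~ a * r_B, fitted through the origin over many scenes.
    spectra: (n_scenes, n_wavelengths); srf_a/srf_b: (srf_wl, srf) tuples."""
    r_a = np.array([band_radiance(wl, s, *srf_a) for s in spectra])
    r_b = np.array([band_radiance(wl, s, *srf_b) for s in spectra])
    return np.sum(r_a * r_b) / np.sum(r_b * r_b)

# Toy example: two slightly offset triangular SRFs over the same region.
wl = np.linspace(400.0, 900.0, 501)
spectra = np.random.default_rng(0).uniform(0.2, 1.0, (50, wl.size))
srf_a = (np.array([550.0, 650.0, 750.0]), np.array([0.0, 1.0, 0.0]))
srf_b = (np.array([560.0, 660.0, 760.0]), np.array([0.0, 1.0, 0.0]))
print(sbaf(wl, spectra, srf_a, srf_b))
```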
Hybrid Cloud Computing Environment for EarthCube and Geoscience Community
NASA Astrophysics Data System (ADS)
Yang, C. P.; Qin, H.
2016-12-01
The NSF EarthCube Integration and Test Environment (ECITE) has built a hybrid cloud computing environment that provides cloud resources from private cloud environments using the cloud system software OpenStack and Eucalyptus, and also manages a public cloud, Amazon Web Services, allowing resource synchronization and bursting between the private and public clouds. On the ECITE hybrid cloud platform, the EarthCube and geoscience communities can deploy and manage applications using base virtual machine images or customized virtual machines, analyze big datasets using virtual clusters, and monitor virtual resource usage on the cloud in real time. Currently, a number of EarthCube projects have deployed or started migrating their projects to this platform, such as CHORDS, BCube, CINERGI, OntoSoft, and some other EarthCube building blocks. To accomplish the deployment or migration, the administrator of the ECITE hybrid cloud platform prepares for the specific needs (e.g. images, port numbers, usable cloud capacity, etc.) of each project in advance, based on communications between ECITE and the participant projects; the scientists or IT technicians in those projects then launch one or multiple virtual machines, access the virtual machine(s) to set up the computing environment if need be, and migrate their code, documents or data without having to deal with the heterogeneity in structure and operations among different cloud platforms.
Aerosols in polluted versus nonpolluted air masses: Long-range transport and effects on clouds
NASA Technical Reports Server (NTRS)
Pueschel, R. F.; Van Valin, C. C.; Castillo, R. C.; Kadlecek, J. A.; Ganor, E.
1986-01-01
To assess the influence of anthropogenic aerosols on the physics and chemistry of clouds in the northeastern United States, aerosol and cloud-drop size distributions, elemental composition of aerosols as a function of size, and ionic content of cloud water were measured on Whiteface Mountain, NY, during the summers of 1981 and 1982. In several case studies, the data were cross-correlated with different air mass types - background continental, polluted continental, and maritime - that were advected to the sampling site. The results are the following: (1) Anthropogenic sources hundreds of kilometers upwind cause the small-particle (accumulation) mode number to increase from hundreds to thousands per cubic centimeter and the mass loading to increase from a few to several tens of micrograms per cubic meter, mostly in the form of sulfur aerosols. (2) A significant fraction of anthropogenic sulfur appears to act as cloud condensation nuclei (CCN) to affect the cloud drop concentration. (3) Clouds in Atlantic maritime air masses have cloud drop spectra that are markedly different from those measured in continental clouds. The drop concentration is significantly lower, and the drop size spectra are heavily skewed toward large drops. (4) Effects of anthropogenic pollutants on cloud water ionic composition are an increase of nitrate by a factor of 50, an increase of sulfate by more than one order of magnitude, and an increase of ammonium ion by a factor of 7. The net effect of the changes in ionic concentrations is an increase in cloud water acidity. An anion deficit even in maritime clouds suggests an unknown, possibly biogenic, source that could be responsible for a pH below neutral, which is frequently observed in nonpolluted clouds.
NASA Astrophysics Data System (ADS)
Cook, Ryan D.; Lin, Ying-Hsuan; Peng, Zhuoyu; Boone, Eric; Chu, Rosalie K.; Dukett, James E.; Gunsch, Matthew J.; Zhang, Wuliang; Tolic, Nikola; Laskin, Alexander; Pratt, Kerri A.
2017-12-01
Organic aerosol formation and transformation occurs within aqueous aerosol and cloud droplets, yet little is known about the composition of high molecular weight organic compounds in cloud water. Cloud water samples collected at Whiteface Mountain, New York, during August-September 2014 were analyzed by ultra-high-resolution mass spectrometry to investigate the molecular composition of dissolved organic carbon, with a focus on sulfur- and nitrogen-containing compounds. Organic molecular composition was evaluated in the context of cloud water inorganic ion concentrations, pH, and total organic carbon concentrations to gain insights into the sources and aqueous-phase processes of the observed high molecular weight organic compounds. Cloud water acidity was positively correlated with the average oxygen : carbon ratio of the organic constituents, suggesting the possibility for aqueous acid-catalyzed (prior to cloud droplet activation or during/after cloud droplet evaporation) and/or radical (within cloud droplets) oxidation processes. Many tracer compounds recently identified in laboratory studies of bulk aqueous-phase reactions were identified in the cloud water. Organosulfate compounds, with both biogenic and anthropogenic volatile organic compound precursors, were detected for cloud water samples influenced by air masses that had traveled over forested and populated areas. Oxidation products of long-chain (C10-12) alkane precursors were detected during urban influence. Influence of Canadian wildfires resulted in increased numbers of identified sulfur-containing compounds and oligomeric species, including those formed through aqueous-phase reactions involving methylglyoxal. Light-absorbing aqueous-phase products of syringol and guaiacol oxidation were observed in the wildfire-influenced samples, and dinitroaromatic compounds were observed in all cloud water samples (wildfire, biogenic, and urban-influenced). Overall, the cloud water molecular composition depended on air mass source influence and reflected aqueous-phase reactions involving biogenic, urban, and biomass burning precursors.
NASA Astrophysics Data System (ADS)
Ratner, Jacqueline; Pyle, David; Mather, Tamsin
2015-04-01
Structure-from-motion (SfM) techniques are now widely available to quickly and cheaply generate digital terrain models (DTMs) from optical imagery. Topography can change rapidly during disaster scenarios and alter the nature of local hazards, making ground-based SfM a particularly useful tool in hazard studies due to its low cost, accessibility, and potential for immediate deployment. Our study is designed to serve as an analogue to potential real-world use of the SfM method if employed for disaster risk reduction purposes. Experiments at a volcanic crater in Santorini, Greece, used crowd-sourced data collection to demonstrate the impact of user expertise and randomization of SfM data on the resultant DTM. Three groups of participants representing variable expertise levels utilized 16 different camera models, including four camera phones, to collect 1001 total photos in one hour of data collection. Datasets collected by each group were processed using the free and open source software VisualSFM. The point densities and overall quality of the resultant SfM point clouds were compared against each other and against a LiDAR dataset as a reference to the industry standard. Our results show that the point clouds are resilient to changes in user expertise and collection method and are comparable or even preferable to LiDAR in data density. We find that "crowd-sourced" data collected by a moderately informed general public yields topography results comparable to those produced with data collected by experts. This means that in a real-world scenario involving participants with a diverse range of expertise levels, topography models could be produced from crowd-sourced data quite rapidly and to a very high standard. This could be beneficial to disaster risk reduction as a relatively quick, simple, and low-cost method to attain rapidly updated knowledge of terrain attributes, useful for the prediction and mitigation of many natural hazards.
An Approach of Web-based Point Cloud Visualization without Plug-in
NASA Astrophysics Data System (ADS)
Ye, Mengxuan; Wei, Shuangfeng; Zhang, Dongmei
2016-11-01
With advances in three-dimensional laser scanning technology, the demand for visualization of massive point clouds is increasingly urgent. Until a few years ago, point cloud visualization was limited to desktop-based solutions; since the introduction of WebGL, several web renderers have become available. This paper addresses the current issues in web-based point cloud visualization and proposes a method for web-based point cloud visualization without plug-ins. The method combines ASP.NET and WebGL technologies, using the spatial database PostgreSQL to store data and the open web technologies HTML5 and CSS3 to implement the user interface; an online visualization system for 3D point clouds is developed in JavaScript with web interactions. Finally, the method is applied to a real case. Experiments show that the new model is of great practical value and avoids the shortcomings of existing WebGIS solutions.
High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Design and Development of ChemInfoCloud: An Integrated Cloud Enabled Platform for Virtual Screening.
Karthikeyan, Muthukumarasamy; Pandit, Deepak; Bhavasar, Arvind; Vyas, Renu
2015-01-01
The power of cloud computing and distributed computing has been harnessed to handle the vast and heterogeneous data required to be processed in any virtual screening protocol. A cloud computing platform, ChemInfoCloud, was built and integrated with several chemoinformatics and bioinformatics tools. The robust engine performs the core chemoinformatics tasks of lead generation, lead optimisation and property prediction in a fast and efficient manner. It has also been provided with some bioinformatics functionalities, including sequence alignment, active site pose prediction and protein-ligand docking. Text mining, NMR chemical shift (1H, 13C) prediction and reaction fingerprint generation modules for efficient lead discovery are also implemented in this platform. We have developed an integrated problem-solving cloud environment for virtual screening studies that also provides workflow management, better usability and interaction with end users, using container-based virtualization (OpenVZ).
NASA Astrophysics Data System (ADS)
Bikkina, Srinivas; Kawamura, Kimitaka; Miyazaki, Yuzo; Fu, Pingqing
2014-05-01
Atmospheric dicarboxylic acids (DCA) are a ubiquitous water-soluble component of secondary organic aerosols (SOA), which can act as cloud condensation nuclei (CCN), affecting the Earth's climate. Despite the high abundances of oxalic acid and related compounds in marine aerosols, there is no consensus on what controls their distributions over the open ocean. Marine biological productivity could play a role in the production of DCA, but there has been no substantial evidence to support this hypothesis. Here we present latitudinal distributions of DCA, oxoacids and α-dicarbonyls in marine aerosols from the remote Pacific. Their concentrations were found to be several times higher in more biologically influenced aerosols (MBA) than in less biologically influenced aerosols. We propose isoprene and unsaturated fatty acids as sources of DCA, as inferred from the significantly higher abundances of isoprene-SOA tracers and azelaic acid in MBA. These results have implications for the reassessment of the climate forcing feedbacks of marine-derived SOA.
From GCode to STL: Reconstruct Models from 3D Printing as a Service
NASA Astrophysics Data System (ADS)
Baumann, Felix W.; Schuermann, Martin; Odefey, Ulrich; Pfeil, Markus
2017-12-01
The authors present a method to reverse engineer 3D-printer-specific machine instructions (GCode) into a point cloud representation and then an STL (Stereolithography) file format. GCode is a machine code that is used for 3D printing among other applications, such as CNC routers. Such code files contain instructions for the 3D printer to move and control its actuator; in the case of Fused Deposition Modeling (FDM), this is the printhead that extrudes semi-molten plastic. The reverse engineering method presented here is based on the digital simulation of the extrusion process of FDM-type 3D printing. The reconstructed models and point clouds do not account for hollow structures, such as holes or cavities. The implementation is performed in Python and relies on open source software and libraries, such as Matplotlib and OpenCV. The reconstruction is performed on the model's extrusion boundary and considers mechanical imprecision. The complete reconstruction mechanism is available as a RESTful (Representational State Transfer) web service.
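The first stage of such a reconstruction, recovering points from extruding moves, can be sketched as below. This minimal parser assumes absolute positioning (G90) and an E axis that increases while material is deposited, and it omits the boundary extraction and imprecision handling described in the paper:

```python
import re

MOVE = re.compile(r"^G[01]\b")  # linear moves carry the deposition path

def gcode_to_points(path):
    """Collect (x, y, z) for every extruding G0/G1 move in a GCode file."""
    x = y = z = 0.0
    last_e = 0.0
    points = []
    with open(path) as fh:
        for line in fh:
            line = line.split(";", 1)[0].strip()  # strip comments
            if not MOVE.match(line):
                continue
            words = {w[0]: float(w[1:]) for w in line.split()[1:]}
            x = words.get("X", x)
            y = words.get("Y", y)
            z = words.get("Z", z)
            e = words.get("E", last_e)
            if e > last_e:        # filament advanced: this move deposits material
                points.append((x, y, z))
            last_e = e
    return points

# Hypothetical usage: points = gcode_to_points("model.gcode")
```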
Visualization and Quality Control Web Tools for CERES Products
NASA Astrophysics Data System (ADS)
Mitrescu, C.; Doelling, D. R.; Rutan, D. A.
2016-12-01
The CERES project continues to provide the scientific community a wide variety of satellite-derived data products, such as observed TOA broadband shortwave and longwave fluxes, computed TOA and surface fluxes, as well as cloud, aerosol, and other atmospheric parameters. They encompass a wide range of temporal and spatial resolutions, suited to specific applications. Now in its 16th year, the CERES project's products are mostly used by climate modeling communities that focus on global mean energetics, meridional heat transport, and climate trend studies. In order to serve all our users, we developed a web-based Ordering and Visualization Tool (OVT). Using open source software such as Eclipse, Java, JavaScript, OpenLayers, Flot, Google Maps, Python, and others, the OVT team developed a series of specialized functions to be used in the process of CERES data quality control (QC). These include 1- and 2-D histograms, anomalies, deseasonalization, temporal and spatial averaging, side-by-side parameter comparison, and others that made the QC process far easier and faster, but more importantly far more portable. We are now in the process of integrating ground-site observed surface fluxes to further help the CERES project QC the CERES computed surface fluxes. These features will give users the opportunity to perform their own comparisons of the CERES computed surface fluxes and observed ground-site fluxes. An overview of the basic functions of the CERES OVT using open source software, as well as future steps in expanding its capabilities, will be presented at the meeting.
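Of the QC functions listed, deseasonalization is the easiest to make concrete: subtract the mean annual cycle from a monthly series. A minimal sketch follows (the OVT itself is web-based and built around R and JavaScript, so this Python version is only illustrative):

```python
import numpy as np

def deseasonalize(values, months):
    """Remove the mean annual cycle from a monthly time series.
    values: 1-D array of monthly means; months: matching month numbers 1..12."""
    values = np.asarray(values, dtype=float)
    months = np.asarray(months)
    climatology = np.array([values[months == m].mean() for m in range(1, 13)])
    return values - climatology[months - 1]

# Toy usage: three years of a seasonal signal plus a small trend.
months = np.tile(np.arange(1, 13), 3)
values = 10 + 3 * np.sin(2 * np.pi * (months - 1) / 12) + 0.05 * np.arange(36)
print(deseasonalize(values, months).round(2))
```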
Study of Molecular Clouds, Variable Stars and Related Topics at NUU and UBAI
NASA Astrophysics Data System (ADS)
Hojaev, A. S.
2017-07-01
The search for young PMS stars made by our team at the Maidanak, Lulin and Beijing observatories, especially in the NGC 6820/23 area, as well as the monitoring of a sample of open clusters, will be described and results will be presented. We consider the physical conditions in different star forming regions, particularly in TDC and around Vul OB1, and estimate the SFE and SFR, energy balance and instability processes in these regions. We also reviewed all data on molecular clouds in the Galaxy and in other galaxies where clouds have been observed, in order to prepare a general catalog of molecular clouds, to study the physical conditions, instability and possible star formation in them, and the formation and evolution of molecular cloud systems, and to analyze their role in the formation of different types of galaxies and structural features therein.
Supernova Remnant Kesteven 27: Interaction with A Neighbor HI Cloud Viewed by Fermi
NASA Astrophysics Data System (ADS)
Xing, Yi; Wang, Zhongxiang; Zhang, Xiao; Chen, Yang
2015-05-01
We report on the likely detection of γ-ray emission from the supernova remnant (SNR) Kesteven 27 (Kes 27). We analyze 5.7 yr Fermi Large Area Telescope data of the SNR region and find an unresolved source at a position consistent with the radio brightness peak and the X-ray knot of Kes 27, which is located in the eastern region of the SNR and caused by its interaction with a nearby H I cloud. The source's emission is best fit with a power-law spectrum with a photon index of 2.5 ± 0.1 and a >0.2 GeV luminosity of 5.8 × 10³⁴ erg s⁻¹, assuming a distance of 4.3 kpc as derived from radio observations of the nearby H I cloud. Comparing the properties of the source with that of other SNRs that are known to be interacting with nearby high-density clouds, we discuss the origin of the source's emission. The spectral energy distribution of the source can be described by a hadronic model that considers the interaction of energetic protons escaping from the shock front of Kes 27 with a high-density cloud.
Soft X-ray observations of pre-main sequence stars in the chamaeleon dark cloud
NASA Technical Reports Server (NTRS)
Feigelson, Eric D.; Kriss, Gerard A.
1987-01-01
Einstein IPC observations of the nearby Chamaeleon I star forming cloud show 22 well-resolved soft X-ray sources in a 1 × 2 deg region. Twelve are associated with H-alpha emission line pre-main sequence (PMS) stars, and four with optically selected PMS stars. Several X-ray sources have two or more PMS stars in their error circles. Optical spectra were obtained at CTIO of possible stellar counterparts of the remaining X-ray sources. They reveal 5 probable new cloud members, K7-M0 stars with weak or absent emission lines. These naked X-ray selected PMS stars are similar to those found in the Taurus-Auriga cloud. The spatial distributions and H-R diagrams of the X-ray and optically selected PMS stars in the cloud are very similar. Luminosity functions indicate the Chamaeleon stars are on average approximately 5 times more X-ray luminous than Pleiades dwarfs. A significant correlation between L_X and optical magnitude suggests this trend may continue within the PMS phase of stellar evolution. The relation of increasing X-ray luminosity with decreasing stellar age is thus extended to stellar ages as young as 1 million years.
Latent Heating Retrieval from TRMM Observations Using a Simplified Thermodynamic Model
NASA Technical Reports Server (NTRS)
Grecu, Mircea; Olson, William S.
2003-01-01
A procedure for the retrieval of hydrometeor latent heating from TRMM active and passive observations is presented. The procedure is based on current methods for estimating multiple-species hydrometeor profiles from TRMM observations. The species include cloud water, cloud ice, rain, and graupel (or snow). A three-dimensional wind field is prescribed based on the retrieved hydrometeor profiles, and, assuming a steady state, the sources and sinks in the hydrometeor conservation equations are determined. Then, the momentum and thermodynamic equations, in which the heating and cooling are derived from the hydrometeor sources and sinks, are integrated one step forward in time. The hydrometeor sources and sinks are reevaluated based on the new wind field, and the momentum and thermodynamic equations are integrated one more step. The reevaluation-integration process is repeated until a steady state is reached. The procedure is tested using cloud model simulations. Cloud-model-derived fields are used to synthesize TRMM observations, from which hydrometeor profiles are derived. The procedure is applied to the retrieved hydrometeor profiles, and the latent heating estimates are compared to the actual latent heating produced by the cloud model. Examples of the procedure's application to real TRMM data are also provided.
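The reevaluation-integration loop has a simple fixed-point structure, sketched below with toy stand-ins for the source/sink evaluation and the forward time step (the real procedure integrates the 3-D momentum and thermodynamic equations):

```python
import numpy as np

def toy_sources(q, w):
    """Toy stand-in for evaluating hydrometeor sources and sinks."""
    return q * np.tanh(w)

def toy_step(w, heating, dt=0.1):
    """Toy stand-in for one forward step of the momentum/thermodynamic equations."""
    return w + dt * (heating - w)

def iterate_to_steady_state(q, w, tol=1e-6, max_iter=500):
    heating = toy_sources(q, w)
    for _ in range(max_iter):
        w = toy_step(w, heating)            # integrate one step forward in time
        new_heating = toy_sources(q, w)     # reevaluate sources and sinks
        if np.max(np.abs(new_heating - heating)) < tol:
            break                           # steady state reached
        heating = new_heating
    return heating

q = np.ones(10)        # retrieved hydrometeor profile (toy values)
w = np.full(10, 0.5)   # prescribed initial wind (toy values)
print(iterate_to_steady_state(q, w))
```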
Implementation of Grid Tier 2 and Tier 3 facilities on a Distributed OpenStack Cloud
NASA Astrophysics Data System (ADS)
Limosani, Antonio; Boland, Lucien; Coddington, Paul; Crosby, Sean; Huang, Joanna; Sevior, Martin; Wilson, Ross; Zhang, Shunde
2014-06-01
The Australian Government is making a AUD 100 million investment in Compute and Storage for the academic community. The Compute facilities are provided in the form of 30,000 CPU cores located at 8 nodes around Australia in a distributed virtualized Infrastructure as a Service facility based on OpenStack. The storage will eventually consist of over 100 petabytes located at 6 nodes. All will be linked via a 100 Gb/s network. This proceeding describes the development of a fully connected WLCG Tier-2 grid site as well as a general purpose Tier-3 computing cluster based on this architecture. The facility employs an extension to Torque to enable dynamic allocations of virtual machine instances. A base Scientific Linux virtual machine (VM) image is deployed in the OpenStack cloud and automatically configured as required using Puppet. Custom scripts are used to launch multiple VMs, integrate them into the dynamic Torque cluster and to mount remote file systems. We report on our experience in developing this nation-wide ATLAS and Belle II Tier 2 and Tier 3 computing infrastructure using the national Research Cloud and storage facilities.
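The pattern described, launching VM instances on the OpenStack cloud and joining them to the dynamic Torque cluster, might look roughly like the following sketch. The image, flavor, and key names are placeholders, and the Puppet configuration is assumed to run at first boot; this is not the facility's actual tooling.

```python
import subprocess

def launch_worker(name: str, image: str = "sl6-worker",
                  flavor: str = "m1.large", key: str = "cloud-key") -> None:
    """Boot a worker VM on the OpenStack cloud; the base image is assumed
    to pull its role configuration from Puppet at first boot."""
    subprocess.run(["openstack", "server", "create",
                    "--image", image, "--flavor", flavor,
                    "--key-name", key, "--wait", name], check=True)

def register_with_torque(hostname: str) -> None:
    """Add the new VM to the dynamic Torque cluster (requires qmgr rights)."""
    subprocess.run(["qmgr", "-c", f"create node {hostname}"], check=True)

for i in range(4):  # launch a small batch of workers
    node = f"worker-{i:02d}"
    launch_worker(node)
    register_with_torque(node)
```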
GC31G-1182: OpenNEX, a Private-Public Partnership in Support of the National Climate Assessment
NASA Technical Reports Server (NTRS)
Nemani, Ramakrishna R.; Wang, Weile; Michaelis, Andrew; Votava, Petr; Ganguly, Sangram
2016-01-01
The NASA Earth Exchange (NEX) is a collaborative computing platform developed with the objective of bringing scientists together with the software tools, massive global datasets, and supercomputing resources necessary to accelerate research in Earth systems science and global change. NEX is funded as an enabling tool for sustaining the National Climate Assessment. Over the past five years, researchers have used the NEX platform to produce a number of datasets highly relevant to the National Climate Assessment, including high-resolution climate projections using different downscaling techniques and trends in historical climate from satellite data. To enable a broader community to exploit these datasets, the NEX team partnered with public cloud providers to create the OpenNEX platform. OpenNEX provides ready access to NEX data holdings on a number of public cloud platforms, along with pertinent analysis tools and workflows in the form of machine images and Docker containers, and lectures and tutorials by experts. We will showcase some of the applications of OpenNEX data and tools by the community on Amazon Web Services, Google Cloud, and the NEX Sandbox.
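Instantiating one of the OpenNEX-style machine images on Amazon Web Services reduces to a single API call. A minimal sketch with boto3 follows; the AMI ID, region, and instance type are hypothetical placeholders, not actual OpenNEX identifiers.

```python
import boto3

# Launch an analysis environment from a machine image; the AMI ID,
# region, and instance type are hypothetical placeholders.
ec2 = boto3.client("ec2", region_name="us-west-2")
resp = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical OpenNEX workshop AMI
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
)
print(resp["Instances"][0]["InstanceId"])
```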
NASA Astrophysics Data System (ADS)
Schwantes, Adam Christopher
Stratocumuli are a type of low cloud composed of individual convective elements that together form a continuous cloud layer. Stratocumuli cover large regions of the Earth's surface, which makes them important components of the Earth's radiation budget: they strongly reflect solar shortwave radiation while only weakly affecting outgoing longwave radiation, producing a strong net radiative cooling effect. It is therefore important to investigate the mechanisms that affect the longevity of stratocumuli so that their impact on the Earth's radiation budget can be fully understood. One mechanism currently being studied as influencing the lifetime of such cloud layers is boundary layer/surface coupling. It has been shown that in some regions (e.g., off the west coast of South America) stratocumuli tend to break up when the boundary layer is decoupled from the surface, because they are cut off from their moisture source. This study investigates the macro- and micro-physical properties of stratocumuli when boundary layers are either coupled to or decoupled from the surface, to help advance understanding of the effects these properties have on the lifetime of stratocumuli under different boundary layer conditions. This study used the Department of Energy Atmospheric Radiation Measurement (DOE ARM) mobile facility (AMF) at the Azores site from June 2009 to December 2010. The measurements used include temperature profiles from radiosondes, cloud liquid water path (LWP) retrieved from the microwave radiometer, and cloud base and top heights derived from the W-band ARM Cloud Radar and lidar. Satellite images provided by the NASA Langley Research Center were also used to visually identify cloud types over the region so that only single-layered stratocumulus cases were used in the study. To differentiate between coupled and decoupled cloud layers, two methods are used. The first method compares cloud base height with the lifting condensation level (LCL) for surface air parcels. The second method uses potential temperature profiles to indicate whether a boundary layer is coupled to or decoupled from the surface. The results from these two methods were then compared using select cases/samples for which both methods classified a sample as coupled or decoupled. In this study, a total of seven coupled or decoupled cases (2-3 days long each) were selected from the 19-month AMF dataset. Characteristics of the coupled and decoupled cases were studied to identify similarities and differences. Furthermore, comparisons from this study show parallels between drizzling/non-drizzling and decoupled/coupled stratocumulus clouds: drizzling/decoupled stratocumuli tend to have higher LWP, cloud-droplet effective radius (re), cloud-top height, and cloud thickness values, while non-drizzling/coupled stratocumuli have higher cloud-droplet number concentration (Nd) and cloud condensation nuclei concentration (NCCN) values. It was also determined that during daytime hours decoupled stratocumuli tend to be open cells, while coupled stratocumuli tend to be closed cells. Finally, decoupled nighttime stratocumuli were found to have higher LWPs than decoupled daytime stratocumuli, which resulted in a significant number of heavy drizzle events occurring at night.
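The first classification method can be reduced to a few lines: estimate the lifting condensation level from surface temperature and dewpoint and compare it with the observed cloud base. The sketch below uses Lawrence's (2005) approximation of roughly 125 m of lift per kelvin of dewpoint depression; the 150 m agreement threshold is an illustrative choice, not a value from the study.

```python
def lcl_height_m(t_surface_c: float, td_surface_c: float) -> float:
    """LCL via Lawrence's (2005) approximation: ~125 m of lift per
    kelvin of surface dewpoint depression."""
    return 125.0 * (t_surface_c - td_surface_c)

def classify_coupling(cloud_base_m: float, t_surface_c: float,
                      td_surface_c: float, threshold_m: float = 150.0) -> str:
    """Method 1: coupled if the cloud base sits near the surface-parcel LCL.
    The threshold is an illustrative choice, not a value from the study."""
    gap = cloud_base_m - lcl_height_m(t_surface_c, td_surface_c)
    return "coupled" if abs(gap) < threshold_m else "decoupled"

# Example: LCL = 125 * (18 - 12) = 750 m; a 980 m cloud base is 230 m
# above it, so the layer is classified as decoupled.
print(classify_coupling(cloud_base_m=980.0, t_surface_c=18.0, td_surface_c=12.0))
```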
A study on strategic provisioning of cloud computing services.
Whaiduzzaman, Md; Haque, Mohammad Nazmul; Rejaul Karim Chowdhury, Md; Gani, Abdullah
2014-01-01
Cloud computing is currently emerging as an ever-changing, growing paradigm that models "everything-as-a-service." Virtualised physical resources, infrastructure, and applications are supplied by service provisioning in the cloud. The evolution in the adoption of cloud computing is driven by clear and distinct promising features for both cloud users and cloud providers. However, the increasing number of cloud providers and the variety of service offerings have made it difficult for customers to choose the best services. Successful service provisioning can guarantee the essential services required by customers, such as agility and availability, pricing, security and trust, and user metrics. Hence, continuous service provisioning that satisfies user requirements is a mandatory feature for the cloud user and vitally important in cloud computing service offerings. Therefore, we aim to review the state-of-the-art service provisioning objectives, essential services, topologies, user requirements, necessary metrics, and pricing mechanisms. We synthesize and summarize different provisioning techniques, approaches, and models through a comprehensive literature review. A thematic taxonomy of cloud service provisioning is presented after the systematic review. Finally, future research directions and open research issues are identified.
Molecular Anions in Protostars, Prestellar Cores and Dark Clouds
NASA Technical Reports Server (NTRS)
Cordiner, Martin; Charnley, Steven; Buckle, Jane; Walsh, Catherine; Millar, Tom
2011-01-01
From our recent survey work using the Green Bank Telescope, microwave emission lines from the hydrocarbon anion C6H(-) and its parent neutral C6H have been detected in six new sources. Using HC3N = 10^-9 emission maps, we targeted the most carbon-chain-rich sources for our anion survey, which included the low-mass Class 0 protostar L1251A-IRS3, the prestellar cores L1389-SMM1 and L1512, and the interstellar clouds L1172A, TMC-1C and L1495B. Derived [C6H(-)]/[C6H] anion-to-neutral ratios are approximately 1-10. The greatest C6H(-) column densities are found in the quiescent clouds TMC-1C and L1495B, but the anion-to-neutral ratios are greatest in the prestellar cores and protostars. These results are interpreted in terms of the physical and chemical properties of the sources, and the implications for molecular cloud chemistry are discussed.
NASA Astrophysics Data System (ADS)
Michaelis, A.; Ganguly, S.; Nemani, R. R.; Votava, P.; Wang, W.; Lee, T. J.; Dungan, J. L.
2014-12-01
Sharing community-valued codes, intermediary datasets, and results from individual efforts with others who are not in a directly funded collaboration can be a challenge. Cross-organization collaboration is often impeded by infrastructure security constraints, rigid financial controls, bureaucracy, workforce nationalities, etc., which can force groups to work in a segmented fashion and/or through awkward and suboptimal web services. We show how a focused community may come together and share modeling and analysis codes, computing configurations, scientific results, knowledge, and expertise on a public cloud platform: diverse groups of researchers working together at "arm's length". Through the OpenNEX experimental workshop, users can view short technical "how-to" videos and explore encapsulated working environments. Workshop participants can easily instantiate Amazon Machine Images (AMIs) or launch full cluster and data-processing configurations within minutes. Enabling users to instantiate computing environments from configuration templates on large public cloud infrastructures, such as Amazon Web Services, may provide a mechanism for groups to easily use each other's work and collaborate indirectly. Moreover, using the public cloud for this workshop allowed a single group to host a large read-only data archive, making datasets of interest widely available on the public cloud, enabling other groups to connect directly to the data and reducing the costs of the collaborative work by freeing individual groups from redundantly retrieving, integrating, or financing the storage of those datasets.
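Launching a full cluster or data-processing configuration from a template, as the workshop participants did, can be done with one call against a template repository. A minimal boto3 sketch follows; the stack name, region, and template URL are hypothetical placeholders.

```python
import boto3

# Stand up a full cluster/data-processing configuration from a shared
# template; the stack name, region, and template URL are hypothetical.
cfn = boto3.client("cloudformation", region_name="us-west-2")
cfn.create_stack(
    StackName="opennex-workshop-cluster",
    TemplateURL="https://example-bucket.s3.amazonaws.com/cluster.template",
    Capabilities=["CAPABILITY_IAM"],  # the template may create IAM roles
)
```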
NASA Astrophysics Data System (ADS)
Corsaro, Enrico; Lee, Yueh-Ning; García, Rafael A.; Hennebelle, Patrick; Mathur, Savita; Beck, Paul G.; Mathis, Stéphane; Stello, Dennis; Bouvier, Jérôme
2017-10-01
Stars originate from the gravitational collapse of a turbulent molecular cloud of diffuse medium, and are often observed to form clusters. Stellar clusters therefore play an important role in our understanding of star formation and of the dynamical processes at play. However, investigating cluster formation is difficult because the density of the molecular cloud undergoes a change of many orders of magnitude. Hierarchical-step approaches that decompose the problem into different stages are therefore required, as well as reliable assumptions on the initial conditions in the clouds. We report for the first time the use of the full potential of NASA Kepler asteroseismic observations, coupled with 3D numerical simulations, to put strong constraints on the early formation stages of open clusters. Thanks to a Bayesian peak-bagging analysis of about 50 red giant members of NGC 6791 and NGC 6819, the two most populated open clusters observed in the nominal Kepler mission, we derive a complete set of detailed oscillation mode properties for each star, with thousands of oscillation modes characterized. We show how these asteroseismic properties lead us to a discovery about the rotation history of stellar clusters. Finally, our observational findings are compared with hydrodynamical simulations of stellar cluster formation to constrain the physical processes of turbulence, rotation, and magnetic fields at play during the collapse of the progenitor cloud into a proto-cluster.
LESTO: an Open Source GIS-based toolbox for LiDAR analysis
NASA Astrophysics Data System (ADS)
Franceschi, Silvia; Antonello, Andrea; Tonon, Giustino
2015-04-01
During the last five years different research institutes and private companies started to implement new algorithms to analyze and extract features from LiDAR data, but only a few of them also released publicly available software. In the field of forestry there are several examples of software that can be used to extract vegetation parameters from LiDAR data; unfortunately most of them are closed source (even if free), meaning that the source code is not shared with the public for anyone to inspect or modify. In 2014 we started the development of the library LESTO (LiDAR Empowered Sciences Toolbox Opensource): a set of modules for the analysis of LiDAR point clouds with an Open Source approach, with the aim of improving the extraction of biomass volume and other vegetation parameters over large areas with mixed forest structures. LESTO contains a set of modules for data handling and analysis implemented within the JGrassTools spatial processing library. The main subsections are dedicated to: 1) preprocessing of LiDAR raw data, mainly in LAS format (utilities and filtering); 2) creation of raster derived products; 3) flight-line identification and normalization of intensity values; 4) tools for extraction of vegetation and buildings. The core of the LESTO library is the extraction of vegetation parameters. We decided to follow the single-tree approach, starting with the implementation of some of the most widely used algorithms in the literature. These have been tweaked and applied to LiDAR-derived raster datasets (DTM, DSM) as well as to point clouds of raw data. The methods range from the simple extraction of tops and crowns from local maxima, through region growing and watershed methods, to individual tree segmentation on point clouds. The validation procedure consists of matching field and LiDAR-derived measurements at the individual-tree and plot level. An automatic validation procedure has been developed using a Particle Swarm (PS) optimizer and a matching procedure that compares the position and height of the extracted trees with the measured ones and iteratively improves the candidate solution by changing the models' parameters. Examples of the application of the LESTO tools on test sites will be presented. The test area consists of a series of circular sampling plots randomly selected from a 50x50 m regular grid within a buffer zone of 150 m from the forest road. Other studies on the same sites provide reference measurements of position, diameter, species and height, and propose allometric relationships. These allometric relationships were obtained for each species by deriving the stem volume of single trees from height and diameter at breast height. LESTO is integrated in the JGrassTools project and available for download at www.jgrasstools.org. A simple and easy-to-use graphical interface for running the models is available at https://github.com/moovida/STAGE/releases.
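Of the single-tree methods listed, local-maxima extraction on a canopy height model is the simplest. The sketch below illustrates the idea in Python (LESTO itself is implemented within the Java-based JGrassTools library); the window size and minimum-height threshold are illustrative defaults, not LESTO's parameters.

```python
import numpy as np
from scipy.ndimage import maximum_filter

def detect_treetops(dsm: np.ndarray, dtm: np.ndarray,
                    window: int = 5, min_height: float = 2.0):
    """Local-maxima treetop detection on the canopy height model
    (CHM = DSM - DTM). A pixel is a treetop candidate if it equals the
    maximum of its window and clears a minimum-height threshold."""
    chm = dsm - dtm
    local_max = maximum_filter(chm, size=window) == chm
    tops = local_max & (chm >= min_height)  # reject ground/low vegetation
    rows, cols = np.nonzero(tops)
    return list(zip(rows.tolist(), cols.tolist(), chm[tops].tolist()))
```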
Global circulation as the main source of cloud activity on Titan
Rodriguez, S.; Le Mouelic, S.; Rannou, P.; Tobie, G.; Baines, K.H.; Barnes, J.W.; Griffith, C.A.; Hirtzig, M.; Pitman, K.M.; Sotin, Christophe; Brown, R.H.; Buratti, B.J.; Clark, R.N.; Nicholson, P.D.
2009-01-01
Clouds on Titan result from the condensation of methane and ethane and, as on other planets, are primarily structured by circulation of the atmosphere. At present, cloud activity mainly occurs in the southern (summer) hemisphere, arising near the pole and at mid-latitudes from cumulus updrafts triggered by surface heating and/or local methane sources, and at the north (winter) pole, resulting from the subsidence and condensation of ethane-rich air into the colder troposphere. General circulation models predict that this distribution should change with the seasons on a 15-year timescale, and that clouds should develop under certain circumstances at temperate latitudes (40°) in the winter hemisphere. The models, however, have hitherto been poorly constrained and their long-term predictions have not yet been observationally verified. Here we report that the global spatial cloud coverage on Titan is in general agreement with the models, confirming that cloud activity is mainly controlled by the global circulation. The non-detection of clouds at latitude 40° N and the persistence of the southern clouds while the southern summer is ending are, however, both contrary to predictions. This suggests that Titan's equator-to-pole thermal contrast is overestimated in the models and that its atmosphere responds to the seasonal forcing with greater inertia than expected.
VizieR Online Data Catalog: Very Low-Luminosity Objects (VeLLOs) from 1.25-850um (Kim+, 2016)
NASA Astrophysics Data System (ADS)
Kim, M.-R.; Lee, C. W.; Dunham, M. M.; Evans, N. J., II; Kim, G.; Allen, L. E.
2016-10-01
The Spitzer Gould Belt Survey (GBS) is a project to survey about 21 square degrees of 11 nearby molecular clouds at 3.6-160um to provide a census of star formation in nearby large clouds (P.I. L. Allen). Spitzer mapped a total of 11 molecular clouds, CMC, Chamaeleon I, Chamaeleon III, Musca, Lupus V, Lupus VI, Ophiuchus North, Aquila, CrA, Cepheus, and IC 5146, with IRAC and MIPS between 2004 March and 2008 October. We utilized the data provided by the c2d/GBS projects (Evans et al. 2009, J/ApJS/181/321; Dunham et al. 2015, J/ApJS/220/11). Two cloud complexes were not listed in the c2d/GBS projects but were observed by other programs: the Taurus molecular clouds and the Orion molecular clouds. The Taurus molecular clouds were observed over an area of ~44 square degrees by one of the GTO programs (P.I. D. Padgett) with the IRAC and MIPS instruments. The Orion molecular clouds were surveyed over a ~9 deg^2 area by Spitzer (P.I. T. Megeath). See section 2.1 for further details. Complementary archival infrared data were retrieved from 2MASS, Herschel PACS and SPIRE, and JCMT SCUBA-2; see section 2.2. We observed our sources in the N2H+(1-0) line with the Korean Very Long Baseline Interferometry Network (KVN) 21m radio telescopes from 2011 October to 2016 May for the northern hemisphere sources, and with the Mopra 22m telescope in 2012 April for the southern hemisphere sources. See section 2.3 for further explanations. (8 data files).
NASA Technical Reports Server (NTRS)
Mahesh, Ashwin; Spinhirne, James D.; Duda, David P.; Eloranta, Edwin W.; Starr, David O'C (Technical Monitor)
2001-01-01
The altimetry bias in GLAS (Geoscience Laser Altimeter System) or other laser altimeters resulting from atmospheric multiple scattering is studied in relation to current knowledge of cloud properties over the Antarctic Plateau. Estimates of seasonal and interannual changes in the bias are presented. Results show that the altitude bias from multiple scattering in clouds would be a significant error source without correction. The selective use of low optical depth clouds or cloud-free observations, as well as improved analysis of the return pulse, such as the Gaussian method used here, is necessary to minimize surface altitude errors. The magnitude of the bias is affected by variations in cloud height, cloud effective particle size, and optical depth. Interannual variations in these properties, as well as in cloud cover fraction, could lead to significant year-to-year variations in the altitude bias. Although cloud-free observations reduce biases in surface elevation measurements from space, over Antarctica these may often include near-surface blowing snow, also a source of scattering-induced delay. With careful selection and analysis of data, laser altimetry specifications can be met.
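The Gaussian analysis of the return pulse mentioned above can be illustrated on synthetic data: multiple scattering adds a delayed tail that pulls a simple centroid estimate late, while a Gaussian fit to the main peak is less biased. All pulse parameters below are invented for the demonstration.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(t, amp, t0, sigma):
    """Idealized laser return pulse."""
    return amp * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

# Synthetic return: a sharp surface echo plus a weak, delayed
# multiple-scattering tail (parameters chosen for illustration).
t = np.linspace(0.0, 100.0, 500)  # ns
pulse = gaussian(t, 1.0, 40.0, 3.0) + 0.15 * gaussian(t, 1.0, 48.0, 8.0)

centroid = np.sum(t * pulse) / np.sum(pulse)  # tail-biased estimate
(amp, t0, sigma), _ = curve_fit(gaussian, t, pulse,
                                p0=[1.0, t[np.argmax(pulse)], 3.0])
print(f"centroid {centroid:.1f} ns vs Gaussian-fit peak {t0:.1f} ns")
```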
TRIDEC Cloud - a Web-based Platform for Tsunami Early Warning tested with NEAMWave14 Scenarios
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Spazier, Johannes; Reißland, Sven; Necmioglu, Ocal; Comoglu, Mustafa; Ozer Sozdinler, Ceren; Carrilho, Fernando; Wächter, Joachim
2015-04-01
In times of cloud computing and ubiquitous computing, the use of concepts and paradigms introduced by information and communications technology (ICT) has to be considered even for early warning systems (EWS). Based on the experience and knowledge gained in research projects, new technologies are exploited to implement a cloud-based and web-based platform, the TRIDEC Cloud, to open up new prospects for EWS. The platform in its current version addresses tsunami early warning and mitigation. It merges several complementary external and in-house cloud-based services for instant tsunami propagation calculations and automated background computation with graphics processing units (GPUs), for web mapping of hazard-specific geospatial data, and for serving the functionality needed to handle, share, and communicate threat-specific information in a collaborative and distributed environment. The TRIDEC Cloud can be accessed in two different modes: the monitoring mode and the exercise-and-training mode. The monitoring mode provides the functionality required to act in a real event. So far, the monitoring mode integrates historic and real-time sea level data and the latest earthquake information; the integration of sources is supported by a simple and secure interface. The exercise-and-training mode enables training and exercises with virtual scenarios. This mode disconnects real-world systems and connects to a virtual environment that receives virtual earthquake information and virtual sea level data re-played by a scenario player. Operators and other stakeholders are thus able to train skills and prepare for real events and large exercises. The GFZ German Research Centre for Geosciences (GFZ), the Kandilli Observatory and Earthquake Research Institute (KOERI), and the Portuguese Institute for the Sea and Atmosphere (IPMA) used the opportunity provided by NEAMWave14 to test the TRIDEC Cloud as a collaborative activity based on previous partnership and commitments at the European scale. The TRIDEC Cloud was not involved officially in Part B of the NEAMWave14 scenarios; however, the scenarios were used by GFZ, KOERI, and IPMA for testing in exercise runs on October 27-28, 2014. Additionally, the Greek NEAMWave14 scenario was tested in an exercise run by GFZ alone on October 29, 2014 (see ICG/NEAMTWS-XI/13). The exercise runs demonstrated that operators in warning centres and stakeholders of other involved parties need only a standard web browser to access a full-fledged tsunami early warning system. The integration of GPU-accelerated tsunami simulation computations has been an integral part of fostering early warning with on-demand tsunami predictions based on actual source parameters, so that tsunami travel times, estimated times of arrival, and estimated wave heights are available immediately for visualization and for further analysis and processing. The generation of warning messages is based on internationally agreed message structures and includes static and dynamic information drawn from earthquake information, instant tsunami simulation computations, and actual measurements. Generated messages are served for review, modification, and addressing in one simple form for dissemination via Cloud Messages, Shared Maps, e-mail, FTP/GTS, SMS, and fax. Cloud Messages and Shared Maps are complementary channels that integrate interactive event and simulation data, enabling recipients to interact dynamically with a map and diagrams beyond traditional text information.
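The scenario player at the heart of the exercise-and-training mode is conceptually simple: re-play time-stamped virtual measurements at an accelerated clock into the same handler a live feed would use. A minimal sketch follows; the record structure and station code are hypothetical, not the TRIDEC Cloud's actual interfaces.

```python
import time
from dataclasses import dataclass

@dataclass
class SeaLevelSample:
    station: str    # hypothetical station code
    epoch_s: float  # seconds since scenario start
    height_m: float

def replay(scenario, emit, speedup: float = 300.0):
    """Re-play pre-recorded virtual sea-level samples at an accelerated
    clock, feeding them to the same handler a live feed would use."""
    start = time.monotonic()
    for sample in sorted(scenario, key=lambda s: s.epoch_s):
        wait = sample.epoch_s / speedup - (time.monotonic() - start)
        if wait > 0:
            time.sleep(wait)
        emit(sample)

replay([SeaLevelSample("STN01", 0.0, 0.02),
        SeaLevelSample("STN01", 300.0, 0.45)], emit=print)
```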
Experience of the JPL Exploratory Data Analysis Team at validating HIRS2/MSU cloud parameters
NASA Technical Reports Server (NTRS)
Kahn, Ralph; Haskins, Robert D.; Granger-Gallegos, Stephanie; Pursch, Andrew; Del Genio, Anthony
1992-01-01
Validation of the HIRS2/MSU cloud parameters began with the cloud/climate feedback problem. The derived effective cloud amount is less sensitive to surface temperature for higher clouds: as the cloud elevation increases, the difference between surface temperature and cloud temperature increases, so only a small change in cloud amount is needed to effect a large change in radiance at the detector. By "validating" the cloud parameters we mean developing a quantitative sense for the physical meaning of the measured parameters, by: (1) identifying the assumptions involved in deriving parameters from the measured radiances, (2) testing the input data and derived parameters for statistical error, sensitivity, and internal consistency, and (3) comparing with similar parameters obtained from other sources using other techniques.
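The sensitivity argument above follows from the single-layer radiance model R = (1-N)·B(Ts) + N·B(Tc): the colder (higher) the cloud, the larger the contrast B(Ts) - B(Tc), and so the smaller the change in effective cloud amount N needed for a given radiance change. The sketch below is an illustrative reconstruction of that model, not the HIRS2/MSU algorithm.

```python
import numpy as np

H, C, K = 6.626e-34, 2.998e8, 1.381e-23  # Planck, light speed, Boltzmann (SI)

def planck(wl_m: float, temp_k: float) -> float:
    """Planck spectral radiance (W m^-2 sr^-1 m^-1)."""
    return 2 * H * C**2 / wl_m**5 / np.expm1(H * C / (wl_m * K * temp_k))

def effective_cloud_amount(radiance, t_surf, t_cloud, wl=11e-6):
    """Invert R = (1-N)*B(Ts) + N*B(Tc) for N (illustrative single-layer
    model, not the HIRS2/MSU retrieval)."""
    b_s, b_c = planck(wl, t_surf), planck(wl, t_cloud)
    return (b_s - radiance) / (b_s - b_c)

# |dN/dR| = 1/(B(Ts) - B(Tc)) shrinks as the cloud gets higher (colder),
# so the derived cloud amount is less sensitive for high clouds.
for t_cloud in (280.0, 250.0, 220.0):
    b_s, b_c = planck(11e-6, 300.0), planck(11e-6, t_cloud)
    print(f"Tc={t_cloud:.0f} K  |dN/dR|={1.0 / (b_s - b_c):.3e}")
```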
Matsu: An Elastic Cloud Connected to a SensorWeb for Disaster Response
NASA Technical Reports Server (NTRS)
Mandl, Daniel
2011-01-01
This slide presentation reviews the use of cloud computing combined with the SensorWeb in aiding disaster response planning. Included are an overview of the SensorWeb architecture, an overview of phase 1 of the EO-1 system, and the steps taken to transform it into an on-demand product cloud as part of the Open Cloud Consortium (OCC). The effectiveness of this system was demonstrated during the 2010 Namibia flood: by blending information from MODIS, TRMM, river gauge data, and a Google Earth view of Namibia, the SensorWeb enabled river surge predictions and could support planning for future disaster responses.
NASA Astrophysics Data System (ADS)
Rasch, Philip J.; Wood, Robert; Ackerman, Thomas P.
2017-04-01
Anthropogenic aerosol impacts on clouds constitute the largest source of uncertainty in the radiative forcing of climate, confounding estimates of climate sensitivity to increases in greenhouse gases. Projections of future warming are thus also strongly dependent on estimates of aerosol effects on clouds. I will discuss the opportunities for improving estimates of aerosol effects on clouds through controlled field experiments in which aerosol with well-understood size, composition, amount, and injection altitude could be introduced to deliberately change cloud properties. This would allow scientific investigation to be performed in a manner much closer to a laboratory environment, and facilitate the use of models to predict cloud responses ahead of time, testing our understanding of aerosol-cloud interactions.
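The magnitude of the cloud response to an injected aerosol perturbation can be illustrated with the textbook Twomey relations: for a non-absorbing cloud, albedo is approximately tau/(tau + 7.7), and at fixed liquid water path the optical depth scales as droplet number concentration to the 1/3 power. This is a standard illustration of the first aerosol indirect effect, not the authors' model.

```python
def albedo(tau: float) -> float:
    """Two-stream approximation for a non-absorbing cloud: A ~ tau/(tau + 7.7)."""
    return tau / (tau + 7.7)

def perturbed_albedo(tau: float, n_ratio: float) -> float:
    """Twomey adjustment: at fixed liquid water path, optical depth scales
    as the 1/3 power of droplet number concentration."""
    return albedo(tau * n_ratio ** (1.0 / 3.0))

# Illustrative: doubling droplet number in a tau = 10 cloud.
a0, a1 = albedo(10.0), perturbed_albedo(10.0, 2.0)
print(f"albedo {a0:.3f} -> {a1:.3f} (delta {a1 - a0:+.3f})")
```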