High-performance scientific computing in the cloud
NASA Astrophysics Data System (ADS)
Jorissen, Kevin; Vila, Fernando; Rehr, John
2011-03-01
Cloud computing has the potential to open up high-performance computational science to a much broader class of researchers, owing to its ability to provide on-demand, virtualized computational resources. However, before such approaches can become commonplace, user-friendly tools must be developed that hide the unfamiliar cloud environment and streamline the management of cloud resources for many scientific applications. We have recently shown that high-performance cloud computing is feasible for parallelized x-ray spectroscopy calculations. We now present benchmark results for a wider selection of scientific applications focusing on electronic structure and spectroscopic simulation software in condensed matter physics. These applications are driven by an improved portable interface that can manage virtual clusters and run various applications in the cloud. We also describe a next generation of cluster tools, aimed at improved performance and a more robust cluster deployment. Supported by NSF grant OCI-1048052.
Evaluating the Efficacy of the Cloud for Cluster Computation
NASA Technical Reports Server (NTRS)
Knight, David; Shams, Khawaja; Chang, George; Soderstrom, Tom
2012-01-01
Computing requirements vary by industry, and it follows that NASA and other research organizations have computing demands that fall outside the mainstream. While cloud computing made rapid inroads for tasks such as powering web applications, performance issues on highly distributed tasks hindered early adoption for scientific computation. One venture to address this problem is Nebula, NASA's homegrown cloud project tasked with delivering science-quality cloud computing resources. Another industry development is Amazon's high-performance computing (HPC) instances on Elastic Compute Cloud (EC2), which promise improved performance for cluster computation. This paper presents results from a series of benchmarks run on Amazon EC2 and discusses the efficacy of current commercial cloud technology for running scientific applications across a cluster. In particular, a 240-core cluster of cloud instances achieved 2 TFLOPS on High-Performance Linpack (HPL) at 70% of theoretical computational performance. The cluster's local network also demonstrated sub-100 µs inter-process latency with sustained inter-node throughput in excess of 8 Gbps. Beyond HPL, a real-world Hadoop image processing task from NASA's Lunar Mapping and Modeling Project (LMMP) was run on a 29-instance cluster to process lunar and Martian surface images with sizes on the order of tens of gigapixels. These results demonstrate that, while not a rival of dedicated supercomputing clusters, commercial cloud technology is now a feasible option for moderately demanding scientific workloads.
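The 70% efficiency figure above is the usual HPL ratio of achieved performance (Rmax) to theoretical peak (Rpeak = cores × clock × FLOPs per cycle). A minimal sketch, with illustrative per-core numbers (the abstract does not give the instance specifications):

```python
def theoretical_peak_tflops(cores, ghz, flops_per_cycle):
    """Rpeak in TFLOPS: cores x clock (GHz) x FLOPs retired per cycle."""
    return cores * ghz * flops_per_cycle / 1000.0

def hpl_efficiency(rmax_tflops, rpeak_tflops):
    """Fraction of theoretical peak achieved on the HPL benchmark."""
    return rmax_tflops / rpeak_tflops

# Illustrative only: 240 hypothetical 2.97 GHz cores retiring
# 4 FLOPs/cycle give Rpeak ~2.85 TFLOPS, so a 2.0 TFLOPS HPL
# run corresponds to ~70% efficiency.
rpeak = theoretical_peak_tflops(240, 2.97, 4)
print(round(hpl_efficiency(2.0, rpeak), 2))  # → 0.7
```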
RAPPORT: running scientific high-performance computing applications on the cloud.
Cohen, Jeremy; Filippis, Ioannis; Woodbridge, Mark; Bauer, Daniela; Hong, Neil Chue; Jackson, Mike; Butcher, Sarah; Colling, David; Darlington, John; Fuchs, Brian; Harvey, Matt
2013-01-28
Cloud computing infrastructure is now widely used in many domains, but one area where there has been more limited adoption is research computing, in particular for running scientific high-performance computing (HPC) software. The Robust Application Porting for HPC in the Cloud (RAPPORT) project took advantage of existing links between computing researchers and application scientists in the fields of bioinformatics, high-energy physics (HEP) and digital humanities, to investigate running a set of scientific HPC applications from these domains on cloud infrastructure. In this paper, we focus on the bioinformatics and HEP domains, describing the applications and target cloud platforms. We conclude that, while there are many factors that need consideration, there is no fundamental impediment to the use of cloud infrastructure for running many types of HPC applications and, in some cases, there is potential for researchers to benefit significantly from the flexibility offered by cloud platforms.
Model-as-a-service (MaaS) using the cloud service innovation platform (CSIP)
USDA-ARS's Scientific Manuscript Database
Cloud infrastructures for modelling activities such as data processing, performing environmental simulations, or conducting model calibrations/optimizations provide a cost-effective alternative to traditional high-performance computing approaches. Cloud-based modelling examples emerged into the more...
HPC on Competitive Cloud Resources
NASA Astrophysics Data System (ADS)
Bientinesi, Paolo; Iakymchuk, Roman; Napper, Jeff
Computing as a utility has reached the mainstream. Scientists can now easily rent time on large commercial clusters that can be expanded and reduced on demand in real time. However, current commercial cloud computing performance falls short of systems specifically designed for scientific applications. Scientific computing needs are quite different from those of the web applications that have been the focus of cloud computing vendors. In this chapter we demonstrate through empirical evaluation the computational efficiency of high-performance numerical applications in a commercial cloud environment when resources are shared under high contention. Using the Linpack benchmark as a case study, we show that cache utilization becomes highly unpredictable and correspondingly affects computation time. For some problems, not only is it more efficient to underutilize resources, but the solution can be reached sooner in real time (wall time). We also show that the smallest, cheapest (64-bit) instance in the studied environment is the best in terms of price-to-performance ratio. In light of the high contention we witnessed, we believe that alternative definitions of efficiency should be introduced for commercial cloud environments where strong performance guarantees do not exist. Concepts like average and expected performance and execution time, expected cost to completion, and variance measures--traditionally ignored in the high-performance computing context--should now complement or even substitute the standard definitions of efficiency.
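The price-to-performance comparison the chapter describes reduces to a ratio of sustained throughput to hourly price. A toy sketch; the instance names and figures below are made up for illustration, not measurements from the chapter:

```python
def gflops_per_dollar_hour(gflops, usd_per_hour):
    """Price-to-performance: sustained GFLOPS per dollar-hour; higher is better."""
    return gflops / usd_per_hour

# Hypothetical instance catalogue (names and numbers are illustrative only).
instances = {
    "small-64bit": {"gflops": 8.0,  "usd_per_hour": 0.10},
    "large":       {"gflops": 60.0, "usd_per_hour": 1.00},
}

# The small instance wins: 80 GFLOPS/$-hr versus 60 GFLOPS/$-hr.
best = max(instances,
           key=lambda name: gflops_per_dollar_hour(**instances[name]))
print(best)  # → small-64bit
```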
Arctic Boreal Vulnerability Experiment (ABoVE) Science Cloud
NASA Astrophysics Data System (ADS)
Duffy, D.; Schnase, J. L.; McInerney, M.; Webster, W. P.; Sinno, S.; Thompson, J. H.; Griffith, P. C.; Hoy, E.; Carroll, M.
2014-12-01
The effects of climate change are being revealed at alarming rates in the Arctic and Boreal regions of the planet. NASA's Terrestrial Ecology Program has launched a major field campaign to study these effects over the next 5 to 8 years. The Arctic Boreal Vulnerability Experiment (ABoVE) will challenge scientists to take measurements in the field, study remote observations, and run models to better understand the impacts of a rapidly changing climate on areas of Alaska and western Canada. The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center (GSFC) has partnered with the Terrestrial Ecology Program to create a science cloud designed for this field campaign - the ABoVE Science Cloud. The cloud combines traditional high performance computing with emerging technologies to create an environment specifically designed for large-scale climate analytics. The ABoVE Science Cloud utilizes (1) virtualized high-speed InfiniBand networks, (2) a combination of high-performance file systems and object storage, and (3) virtual system environments tailored for data-intensive science applications. At the center of the architecture is a large object storage environment, much like a traditional high-performance file system, that supports data-proximal processing using technologies like MapReduce on a Hadoop Distributed File System (HDFS). Surrounding the storage is a cloud of high performance compute resources with many processing cores and large memory, coupled to the storage through an InfiniBand network. Virtual systems can be tailored to a specific scientist and provisioned on the compute resources with extremely high-speed network connectivity to the storage and to other virtual systems. In this talk, we will present the architectural components of the science cloud and examples of how it is being used to meet the needs of the ABoVE campaign.
In our experience, the science cloud approach significantly lowers the barriers and risks to organizations that require high performance computing solutions and provides the NCCS with the agility required to meet our customers' rapidly increasing and evolving requirements.
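The data-proximal MapReduce processing mentioned above follows the standard map/shuffle/reduce pattern. A minimal in-process sketch of that pattern (the records and counting task are hypothetical, not ABoVE data):

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-process sketch of MapReduce: map emits (key, value)
    pairs, the shuffle groups values by key, and reduce folds each group.
    On HDFS the map step runs where the data blocks live."""
    groups = defaultdict(list)
    for rec in records:              # map + shuffle
        for key, value in mapper(rec):
            groups[key].append(value)
    return {key: reducer(key, vals)  # reduce
            for key, vals in groups.items()}

# Toy example: count observations per (made-up) field site.
obs = ["alaska", "yukon", "alaska"]
counts = map_reduce(obs, lambda r: [(r, 1)], lambda k, v: sum(v))
print(counts["alaska"])  # → 2
```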
A high performance scientific cloud computing environment for materials simulations
NASA Astrophysics Data System (ADS)
Jorissen, K.; Vila, F. D.; Rehr, J. J.
2012-09-01
We describe the development of a scientific cloud computing (SCC) platform that offers high performance computation capability. The platform consists of a scientific virtual machine prototype containing a UNIX operating system and several materials science codes, together with essential interface tools (an SCC toolset) that offer functionality comparable to local compute clusters. In particular, our SCC toolset provides automatic creation of virtual clusters for parallel computing, including tools for execution and monitoring performance, as well as efficient I/O utilities that enable seamless connections to and from the cloud. Our SCC platform is optimized for the Amazon Elastic Compute Cloud (EC2). We present benchmarks for prototypical scientific applications and demonstrate performance comparable to local compute clusters. To facilitate code execution and provide user-friendly access, we have also integrated cloud computing capability in a Java-based GUI. Our SCC platform may be an alternative to traditional HPC resources for materials science or quantum chemistry applications.
Automating NEURON Simulation Deployment in Cloud Resources.
Stockton, David B; Santamaria, Fidel
2017-01-01
Simulations in neuroscience are performed on local servers or High Performance Computing (HPC) facilities. Recently, cloud computing has emerged as a potential computational platform for neuroscience simulation. In this paper we compare and contrast HPC and cloud resources for scientific computation, then report how we deployed NEURON, a widely used simulator of neuronal activity, in three clouds: Chameleon Cloud, a hybrid private academic cloud for cloud technology research based on the OpenStack software; Rackspace, a public commercial cloud, also based on OpenStack; and Amazon Elastic Compute Cloud, based on Amazon's proprietary software. We describe the manual procedures and how to automate cloud operations. We describe extending our simulation automation software, NeuroManager (Stockton and Santamaria, Frontiers in Neuroinformatics, 2015), so that the user can recruit private cloud, public cloud, HPC, and local servers simultaneously through a simple common interface. We conclude by performing several studies in which we examine speedup, efficiency, total session time, and cost for sets of simulations of a published NEURON model.
Scaling predictive modeling in drug development with cloud computing.
Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola
2015-01-26
Growing data sets and the longer analysis times they entail are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computing clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources, where computations are parallelized and run on the Amazon Elastic Compute Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.
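For embarrassingly parallel model building of the kind described, the speed-versus-economy choice can be quantified with a trivial cost model. A sketch under assumed, made-up prices, not the paper's actual cost figures:

```python
import math

def wall_time_and_cost(n_tasks, hours_per_task, n_instances, usd_per_hour):
    """Wall time (hours) and total cost for an embarrassingly parallel
    job: each instance works through its share of tasks in rounds.
    A toy cost model, not the paper's accounting."""
    wall = math.ceil(n_tasks / n_instances) * hours_per_task
    cost = wall * n_instances * usd_per_hour
    return wall, cost

# 100 one-hour model fits at a hypothetical $0.10 per instance-hour:
print(wall_time_and_cost(100, 1, 10, 0.10))   # → (10, 10.0)
print(wall_time_and_cost(100, 1, 100, 0.10))  # → (1, 10.0)  same cost, 10x faster
print(wall_time_and_cost(100, 1, 30, 0.10))   # → (4, 12.0)  idle slots cost money
```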
GPU based cloud system for high-performance arrhythmia detection with parallel k-NN algorithm.
Tae Joon Jun; Hyun Ji Park; Hyuk Yoo; Young-Hak Kim; Daeyoung Kim
2016-08-01
In this paper, we propose a GPU-based cloud system for high-performance arrhythmia detection. The Pan-Tompkins algorithm is used for QRS detection, and the beat classification algorithm is optimized with K-Nearest Neighbors (K-NN). To support high-performance beat classification on the system, we parallelized the classification algorithm with CUDA so that it executes on virtualized GPU devices in the cloud system. The MIT-BIH Arrhythmia Database is used to validate the algorithm. The system achieved a detection rate of about 93.5%, comparable to previous studies, while executing 2.5 times faster than a CPU-only detection algorithm.
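The K-NN beat classifier at the core of the system can be sketched serially; the CUDA version parallelizes the distance computations across GPU threads. A plain-Python stand-in with made-up feature vectors (not actual MIT-BIH features):

```python
import math
from collections import Counter

def knn_classify(train, labels, query, k=3):
    """Majority vote among the k nearest beats (Euclidean distance).
    A serial CPU stand-in for the paper's CUDA-parallelised K-NN."""
    dists = sorted(
        (math.dist(x, query), y) for x, y in zip(train, labels)
    )
    votes = Counter(y for _, y in dists[:k])
    return votes.most_common(1)[0][0]

# Toy beats: two synthetic feature clusters (0 = normal, 1 = arrhythmic).
train = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1),
         (1.0, 1.0), (0.9, 1.0), (1.0, 0.9)]
labels = [0, 0, 0, 1, 1, 1]
print(knn_classify(train, labels, (0.95, 0.95)))  # → 1
```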
Understanding the Performance and Potential of Cloud Computing for Scientific Applications
Sadooghi, Iman; Martin, Jesus Hernandez; Li, Tonglin; ...
2015-02-19
Commercial clouds bring a great opportunity to the scientific computing area. Scientific applications usually require significant resources, yet not all scientists have access to sufficient high-end computing systems, many of which can be found in the Top500 list. Cloud computing has gained the attention of scientists as a competitive resource to run HPC applications at a potentially lower cost. But as a different infrastructure, it is unclear whether clouds are capable of running scientific applications with reasonable performance per money spent. This work studies the performance of public clouds and places this performance in the context of price. We evaluate the raw performance of different services of the AWS cloud in terms of the basic resources, such as compute, memory, network and I/O. We also evaluate the performance of scientific applications running in the cloud. This paper aims to assess the ability of the cloud to perform well, as well as to evaluate the cost of running scientific applications in the cloud. We developed a full set of metrics and conducted a comprehensive performance evaluation over the Amazon cloud. We evaluated EC2, S3, EBS and DynamoDB among the many Amazon AWS services. We evaluated the memory sub-system performance with CacheBench, the network performance with iperf, processor and network performance with the HPL benchmark application, and shared storage with NFS and PVFS in addition to S3. We also evaluated a real scientific computing application through the Swift parallel scripting system at scale. Armed with both detailed benchmarks to gauge expected performance and a detailed monetary cost analysis, we expect this paper will be a recipe cookbook for scientists to help them decide where to deploy and run their scientific applications: public clouds, private clouds, or hybrid clouds.
NASA Astrophysics Data System (ADS)
Blanchard, Yann; Royer, Alain; O'Neill, Norman T.; Turner, David D.; Eloranta, Edwin W.
2017-06-01
Multiband downwelling thermal measurements of zenith sky radiance, along with cloud boundary heights, were used in a retrieval algorithm to estimate cloud optical depth and effective particle diameter of thin ice clouds in the Canadian High Arctic. Ground-based thermal infrared (IR) radiances for 150 semitransparent ice cloud cases were acquired at the Polar Environment Atmospheric Research Laboratory (PEARL) in Eureka, Nunavut, Canada (80° N, 86° W). We analyzed and quantified the sensitivity of downwelling thermal radiance to several cloud parameters including optical depth, effective particle diameter and shape, water vapor content, cloud geometric thickness and cloud base altitude. A lookup table retrieval method was used to successfully extract, through an optimal estimation method, cloud optical depth up to a maximum value of 2.6 and to separate thin ice clouds into two classes: (1) TIC1 clouds characterized by small crystals (effective particle diameter ≤ 30 µm), and (2) TIC2 clouds characterized by large ice crystals (effective particle diameter > 30 µm). The retrieval technique was validated using data from the Arctic High Spectral Resolution Lidar (AHSRL) and Millimeter Wave Cloud Radar (MMCR). Inversions were performed over three polar winters and results showed a significant correlation (R2 = 0.95) for cloud optical depth retrievals and an overall accuracy of 83 % for the classification of TIC1 and TIC2 clouds. A partial validation relative to an algorithm based on high spectral resolution downwelling IR radiance measurements between 8 and 21 µm was also performed. It confirms the robustness of the optical depth retrieval and the fact that the broadband thermal radiometer retrieval was sensitive to small particle (TIC1) sizes.
Liu, Zheng; Muhlbauer, Andreas; Ackerman, Thomas
2015-11-05
In this paper, we evaluate high-level clouds in a cloud-resolving model during two convective cases, ARM9707 and KWAJEX. The simulated joint histograms of cloud occurrence and radar reflectivity compare well with cloud radar and satellite observations when using a two-moment microphysics scheme. However, simulations performed with a single-moment microphysics scheme exhibit low biases of approximately 20 dB. During convective events, the two-moment microphysics overestimates the amount of high-level cloud, while the one-moment microphysics precipitates too readily and underestimates the amount and height of high-level cloud. For ARM9707, persistent large positive biases in high-level cloud are found, which are not sensitive to changes in ice particle fall velocity and ice nuclei number concentration in the two-moment microphysics. These biases are caused by biases in the large-scale forcing and maintained by the periodic lateral boundary conditions. The combined effects include significant biases in high-level cloud amount and radiation, and high sensitivity of cloud amount to the nudging time scale in both convective cases. The high sensitivity of high-level cloud amount to the thermodynamic nudging time scale suggests that thermodynamic nudging can be a powerful "tuning" parameter for the simulated cloud and radiation but should be applied with caution. The role of the periodic lateral boundary conditions in reinforcing the biases in cloud and radiation suggests that reducing the uncertainty in the large-scale forcing at high levels is important for similar convective cases and has far-reaching implications for simulating high-level clouds in super-parameterized global climate models such as the multiscale modeling framework.
Security and Cloud Outsourcing Framework for Economic Dispatch
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarker, Mushfiqur R.; Wang, Jianhui; Li, Zuyi
The computational complexity and problem sizes of power grid applications have increased significantly with the advent of renewable resources and smart grid technologies. The current paradigm for solving these problems consists of in-house high performance computing infrastructures, which have the drawbacks of high capital expenditure, maintenance costs, and limited scalability. Cloud computing is an ideal alternative due to its powerful computational capacity, rapid scalability, and high cost-effectiveness. A major challenge, however, remains: the highly confidential grid data is susceptible to cyberattacks when outsourced to the cloud. In this work, a security and cloud outsourcing framework is developed for the Economic Dispatch (ED) linear programming application. The security framework transforms the ED linear program into a confidentiality-preserving linear program that masks both the data and the problem structure, thus enabling secure outsourcing to the cloud. Results show that for large grid test cases the performance gains and costs outperform those of the in-house infrastructure.
A simple biota removal algorithm for 35 GHz cloud radar measurements
NASA Astrophysics Data System (ADS)
Kalapureddy, Madhu Chandra R.; Sukanya, Patra; Das, Subrata K.; Deshpande, Sachin M.; Pandithurai, Govindan; Pazamany, Andrew L.; Ambuj K., Jha; Chakravarty, Kaustav; Kalekar, Prasad; Krishna Devisetty, Hari; Annam, Sreenivas
2018-03-01
Cloud radar reflectivity profiles are an important measurement for investigating cloud vertical structure (CVS). However, extracting the intended meteorological cloud content from the measurement often demands an effective technique or algorithm that can reduce error and observational uncertainties in the recorded data. In this work, a technique is proposed to identify and separate cloud and non-hydrometeor echoes using profile measurements of the radar Doppler spectral moments. Theoretical radar sensitivity curves for point and volume targets are used to remove the receiver noise floor, and the identified radar echoes are scrutinized according to the signal decorrelation period. Here, it is hypothesized that cloud echoes are temporally more coherent and homogeneous and have a longer correlation period than biota. This can be checked statistically using a ~4 s sliding mean and the standard deviation of reflectivity profiles. This step helps screen out cloud echoes by filtering out the biota. The final step is the retrieval of cloud height. The proposed algorithm identifies cloud height through the systematic characterization of Z variability, using knowledge of the local atmospheric vertical structure in addition to theoretical, statistical and echo-tracing tools. Thus, high-resolution cloud radar reflectivity profile measurements are characterized with the theoretical echo sensitivity curves and observed echo statistics for true cloud height tracking (TEST). TEST showed superior performance in screening out clouds and filtering out isolated insects. TEST constrained with polarimetric measurements was found to be more promising under high-density biota, whereas TEST combined with the linear depolarization ratio and spectral width can potentially filter out biota within the highly turbulent shallow cumulus clouds of the convective boundary layer (CBL).
This TEST technique is simple to implement but powerful in performance, owing to its flexibility in constraining, identifying and filtering out the biota and screening out the true cloud content, especially CBL clouds. The TEST algorithm is therefore well suited to screening out the low-level clouds that are strongly linked to the rain-making mechanism of the Indian summer monsoon region's CVS.
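The ~4 s sliding mean and standard deviation screening described above can be illustrated on a toy reflectivity time series; the window length and threshold here are illustrative stand-ins for the paper's tuned values:

```python
import statistics

def coherence_mask(z_series, window=4, std_max=3.0):
    """Flag samples whose reflectivity is temporally coherent.
    Sketch of the TEST idea: cloud echoes decorrelate slowly, so a
    small sliding-window standard deviation (here over ~4 samples)
    marks cloud, while erratic, fast-decorrelating biota echoes
    exceed std_max. Window and threshold are illustrative only."""
    mask = []
    for i in range(len(z_series) - window + 1):
        segment = z_series[i:i + window]
        mask.append(statistics.pstdev(segment) <= std_max)
    return mask

z = [-20.0, -20.5, -19.8, -20.2,   # coherent, cloud-like echo (dBZ)
     -5.0, -30.0, -12.0, -28.0]    # erratic, biota-like echo
print(coherence_mask(z))  # → [True, False, False, False, False]
```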
An Application-Based Performance Evaluation of NASA's Nebula Cloud Computing Platform
NASA Technical Reports Server (NTRS)
Saini, Subhash; Heistand, Steve; Jin, Haoqiang; Chang, Johnny; Hood, Robert T.; Mehrotra, Piyush; Biswas, Rupak
2012-01-01
The high performance computing (HPC) community has shown tremendous interest in exploring cloud computing because of its high potential. In this paper, we examine the feasibility, performance, and scalability of production-quality scientific and engineering applications of interest to NASA on NASA's cloud computing platform, Nebula, hosted at Ames Research Center. This work represents a comprehensive evaluation of Nebula using NUTTCP, HPCC, NPB, I/O, and MPI function benchmarks as well as four applications representative of the NASA HPC workload. Specifically, we compare Nebula performance on some of these benchmarks and applications to that of NASA's Pleiades supercomputer, a traditional HPC system. We also investigate the impact of virtIO and jumbo frames on interconnect performance. Overall, results indicate that on Nebula (i) virtIO and jumbo frames improve network bandwidth by a factor of 5x, (ii) there is a significant virtualization layer overhead of about 10% to 25%, (iii) write performance is lower by a factor of 25x, (iv) latency for short MPI messages is very high, and (v) overall performance is 15% to 48% lower than that on Pleiades for NASA HPC applications. We also comment on the usability of the cloud platform.
Added Value of Far-Infrared Radiometry for Ice Cloud Remote Sensing
NASA Astrophysics Data System (ADS)
Libois, Q.; Blanchet, J. P.; Ivanescu, L.; S Pelletier, L.; Laurence, C.
2017-12-01
Several cloud retrieval algorithms based on satellite observations in the infrared have been developed over recent decades. However, most of these observations cover only the midinfrared (MIR, λ < 15 μm) part of the spectrum, and none are available in the far-infrared (FIR, λ ≥ 15 μm). Recent developments in FIR sensor technology, though, now make it possible to consider spaceborne remote sensing in the FIR. Here we show that adding a few FIR channels with realistic radiometric performance to existing spaceborne narrowband radiometers would significantly improve their ability to retrieve ice cloud radiative properties. For clouds encountered in the polar regions and the upper troposphere, where the atmosphere above the clouds is sufficiently transparent in the FIR, using FIR channels would reduce by more than 50% the uncertainties in retrieved values of optical thickness, effective particle diameter, and cloud top altitude. This would effectively extend the range of applicability of current infrared retrieval methods to the polar regions and to clouds with large optical thickness, where MIR algorithms perform poorly. The high performance of solar-reflection-based algorithms would thus be reached in nighttime conditions. Using FIR observations is a promising avenue for studying ice cloud microphysics and precipitation processes, which is highly relevant for cirrus clouds and convective towers, and for investigating the water cycle in the driest regions of the atmosphere.
HEPCloud, a New Paradigm for HEP Facilities: CMS Amazon Web Services Investigation
Holzman, Burt; Bauerdick, Lothar A. T.; Bockelman, Brian; ...
2017-09-29
Historically, high energy physics computing has been performed on large purpose-built computing systems. These began as single-site compute facilities, but have evolved into the distributed computing grids used today. Recently, there has been an exponential increase in the capacity and capability of commercial clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is a growing interest among the cloud providers to demonstrate the capability to perform large-scale scientific computing. In this paper, we discuss results from the CMS experiment using the Fermilab HEPCloud facility, which utilized both local Fermilab resources and virtual machines in the Amazon Web Services Elastic Compute Cloud. We discuss the planning, technical challenges, and lessons learned involved in performing physics workflows on a large-scale set of virtualized resources. Additionally, we discuss the economics and operational efficiencies when executing workflows both in the cloud and on dedicated resources.
Translational bioinformatics in the cloud: an affordable alternative
2010-01-01
With the continued exponential expansion of publicly available genomic data and access to low-cost, high-throughput molecular technologies for profiling patient populations, computational technologies and informatics are becoming vital considerations in genomic medicine. Although cloud computing technology is being heralded as a key enabling technology for the future of genomic research, available case studies are limited to applications in the domain of high-throughput sequence data analysis. The goal of this study was to evaluate the computational and economic characteristics of cloud computing in performing a large-scale data integration and analysis representative of research problems in genomic medicine. We find that the cloud-based analysis compares favorably in both performance and cost in comparison to a local computational cluster, suggesting that cloud computing technologies might be a viable resource for facilitating large-scale translational research in genomic medicine. PMID:20691073
AceCloud: Molecular Dynamics Simulations in the Cloud.
Harvey, M J; De Fabritiis, G
2015-05-26
We present AceCloud, an on-demand service for molecular dynamics simulations. AceCloud is designed to facilitate the secure execution of large ensembles of simulations on an external cloud computing service (currently Amazon Web Services). The AceCloud client, integrated into the ACEMD molecular dynamics package, provides an easy-to-use interface that abstracts all aspects of interaction with the cloud services. This gives the user the experience that all simulations are running on their local machine, minimizing the learning curve typically associated with the transition to using high performance computing services.
NASA Astrophysics Data System (ADS)
Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos
2014-05-01
Mobile cloud computing is gaining worldwide momentum for ubiquitous, on-demand cloud services offered to mobile users by Amazon, Google, and others at low capital cost. However, Internet-centric clouds introduce wide area network (WAN) delays that are often intolerable for real-time applications such as video streaming. One promising approach to addressing this challenge is to deploy decentralized mini-cloud facilities, known as cloudlets, to enable localized cloud services. When supported by local wireless connectivity, a wireless cloudlet is expected to offer low-cost, high-performance cloud services for its users. In this work, we implement a realistic framework that comprises both a popular Internet cloud (Amazon Cloud) and a real-world cloudlet (based on Ubuntu Enterprise Cloud (UEC)) for mobile cloud users in a wireless mesh network. We focus on real-time video streaming over the HTTP standard and implement a typical application. We further perform a comprehensive comparative analysis and empirical evaluation of the application's performance when it is delivered over the Internet cloud and the cloudlet, respectively. The study quantifies the influence of the two cloud networking architectures on supporting real-time video streaming. We also enable movement of the users in the wireless mesh network and investigate the effect of user mobility on mobile cloud computing over the cloudlet and the Amazon cloud, respectively. Our experimental results demonstrate the advantages of the cloudlet paradigm over its Internet cloud counterpart in supporting the quality of service of real-time applications.
NASA Technical Reports Server (NTRS)
Wind, Galina; Riedi, Jerome; Platnick, Steven; Heidinger, Andrew
2014-01-01
The Cross-platform HIgh resolution Multi-instrument AtmosphEric Retrieval Algorithms (CHIMAERA) system allows us to perform MODIS-like cloud-top, optical, and microphysical property retrievals on any sensor that possesses a minimum set of common spectral channels. The CHIMAERA system uses a shared-core architecture that takes the retrieval method out of the equation when intercomparisons are made. Here we show an example of such a retrieval and a comparison of simultaneous retrievals done using the SEVIRI, MODIS, and VIIRS sensors. All sensor retrievals are performed using the CLAVR-x (or CLAVR-x-based) cloud top properties algorithm. SEVIRI uses the SAF_NWC cloud mask; MODIS and VIIRS use the IFF-based cloud mask, an algorithm shared between MODIS and VIIRS. The MODIS and VIIRS retrievals are performed using a VIIRS branch of CHIMAERA that limits the available MODIS channel set. Even though certain MODIS products, such as the multilayer cloud map, are not available in that mode, the cloud retrieval remains fully equivalent to operational Data Collection 6.
Evaluation of the Huawei UDS cloud storage system for CERN specific data
NASA Astrophysics Data System (ADS)
Zotes Resines, M.; Heikkila, S. S.; Duellmann, D.; Adde, G.; Toebbicke, R.; Hughes, J.; Wang, L.
2014-06-01
Cloud storage is an emerging architecture that aims to provide increased scalability and access performance compared to more traditional solutions. CERN is evaluating this promise using Huawei UDS and OpenStack SWIFT storage deployments, focusing on the needs of high-energy physics. Both deployed setups implement S3, one of the protocols emerging as a standard in the cloud storage market. A set of client machines is used to generate I/O load patterns to evaluate the storage system performance. The presented read and write test results indicate scalability from both metadata and data perspectives. Further, the Huawei UDS cloud storage is shown to be able to recover from a major failure involving the loss of 16 disks. Both cloud storage systems are finally demonstrated to function as back-end storage for a filesystem, which is used to deliver high energy physics software.
NASA Astrophysics Data System (ADS)
Torii, K.; Hattori, Y.; Hasegawa, K.; Ohama, A.; Haworth, T. J.; Shima, K.; Habe, A.; Tachihara, K.; Mizuno, N.; Onishi, T.; Mizuno, A.; Fukui, Y.
2017-02-01
Understanding high-mass star formation is one of the top-priority issues in astrophysics. Recent observational studies have revealed that cloud-cloud collisions may play a role in high-mass star formation at several places in the Milky Way and the Large Magellanic Cloud. The Trifid Nebula M20 is a well-known Galactic H II region ionized by a single O7.5 star. In 2011, based on CO observations with NANTEN2, we reported that the O star was formed by the collision of two molecular clouds ˜0.3 Myr ago. Those observations identified two molecular clouds toward M20, traveling at a relative velocity of 7.5 km s-1. This velocity separation implies that the clouds cannot be gravitationally bound to M20; yet since the clouds show signs of heating by the stars there, they must be spatially coincident with it. A collision is therefore highly probable. In this paper we present new CO J = 1-0 and J = 3-2 observations of the colliding clouds in M20 performed with the Mopra and ASTE telescopes. The high-resolution observations revealed that the two molecular clouds have peculiar spatial and velocity structures, i.e., a spatially complementary distribution between the two clouds and a bridge feature that connects them in velocity space. Based on a new comparison with numerical models, we find that this complementary distribution is an expected outcome of cloud-cloud collisions, and that the bridge feature can be interpreted as turbulent gas excited at the interface of the collision. Our results reinforce the cloud-cloud collision scenario in M20.
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-07
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach to high performance computing and is implemented here to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results are aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of the cloud-based MC simulation is identical to that produced by a single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes, and the parallelization overhead is negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed-up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
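The reported scaling can be checked with simple arithmetic (a back-of-the-envelope sketch, not code from the study): the quoted 2.58 h serial run and 3.3 min cloud run on 100 nodes correspond to the stated 47× speed-up, at roughly half of ideal linear scaling.

```python
# Back-of-the-envelope check of the reported Monte Carlo speed-up figures.

def speedup(t_serial_hours, t_parallel_minutes):
    """Ratio of serial to parallel wall-clock time."""
    return (t_serial_hours * 60.0) / t_parallel_minutes

def parallel_efficiency(s, n_nodes):
    """Fraction of ideal linear scaling achieved on n_nodes workers."""
    return s / n_nodes

s = speedup(2.58, 3.3)             # matches the abstract's 47x
eff = parallel_efficiency(s, 100)  # fraction of ideal 100x scaling
```

The sub-unity efficiency is consistent with the abstract's note that overhead becomes negligible only for large simulations.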
Multiview 3D sensing and analysis for high quality point cloud reconstruction
NASA Astrophysics Data System (ADS)
Satnik, Andrej; Izquierdo, Ebroul; Orjesek, Richard
2018-04-01
Multiview 3D reconstruction techniques enable digital reconstruction of 3D objects from the real world by fusing different viewpoints of the same object into a single 3D representation. This process is by no means trivial, and the acquisition of high-quality point cloud representations of dynamic 3D objects is still an open problem. In this paper, an approach for high-fidelity 3D point cloud generation using low-cost 3D sensing hardware is presented. The proposed approach runs in an efficient, low-cost hardware setting based on several Kinect v2 scanners connected to a single PC. It performs autocalibration and runs in real time, exploiting an efficient composition of several filtering methods including Radius Outlier Removal (ROR), Weighted Median filtering (WM), and Weighted Inter-Frame Average filtering (WIFA). The performance of the proposed method has been demonstrated through efficient acquisition of dense 3D point clouds of moving objects.
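The Radius Outlier Removal step named above can be sketched as follows. This is an illustrative brute-force version, not the authors' implementation: a point survives only if at least `min_neighbors` other points lie within `search_radius` of it, which prunes isolated sensor noise from the merged cloud.

```python
# Sketch of Radius Outlier Removal (ROR): keep a point only if enough
# neighboring points lie within a given radius. O(n^2) brute force,
# adequate for small clouds; real pipelines use a spatial index.

def radius_outlier_removal(points, search_radius, min_neighbors):
    """Filter a list of (x, y, z) tuples, dropping isolated points."""
    r2 = search_radius ** 2
    kept = []
    for i, p in enumerate(points):
        neighbors = sum(
            1 for j, q in enumerate(points)
            if i != j and sum((a - b) ** 2 for a, b in zip(p, q)) <= r2
        )
        if neighbors >= min_neighbors:
            kept.append(p)
    return kept
```

For a tight cluster plus one stray point, only the cluster survives; radius and neighbor count are tuning parameters chosen per sensor.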
Fractal Analyses of High-Resolution Cloud Droplet Measurements.
NASA Astrophysics Data System (ADS)
Malinowski, Szymon P.; Leclerc, Monique Y.; Baumgardner, Darrel G.
1994-02-01
Fractal analyses of individual cloud droplet distributions are performed using aircraft measurements along one-dimensional horizontal cross sections through clouds. Box counting and cluster analyses are used to determine spatial scales of inhomogeneity in cloud droplet spacing. These analyses reveal that droplet spatial distributions do not exhibit fractal behavior. High variability in local droplet concentration was found in cloud volumes undergoing mixing. In these regions, thin filaments of cloudy air were found with droplet concentrations close to those observed in cloud cores. Results suggest that these filaments may be anisotropic. Additional box counting analyses performed for various classes of cloud droplet diameter indicate that large and small droplets are similarly distributed, except for the larger characteristic spacing of large droplets. A cloud-clear air interface defined by a certain threshold of total droplet count (TDC) was investigated. There are indications that this interface is a convoluted surface of a fractal nature, at least in actively developing cumuliform clouds. In contrast, TDC in the cloud interior does not have fractal or multifractal properties. Finally, a random Cantor set (RCS) was introduced as a model of a fractal process with an ill-defined internal scale. A uniform measure associated with the RCS after several generations was introduced to simulate the TDC records. Comparison of the model with real TDC records indicates similar properties for both types of data series.
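The box-counting method used above can be sketched for a one-dimensional point set such as droplet positions along a flight track (an illustrative sketch, not the paper's analysis code): count the boxes of size ε needed to cover the points, and estimate the dimension from the slope of log N(ε) versus log(1/ε).

```python
# Minimal box-counting dimension estimate for a 1-D point set.
# A space-filling (non-fractal) set of positions yields a slope near 1,
# consistent with the paper's finding for droplet distributions.
import math

def box_count(points, eps):
    """Number of boxes of size eps needed to cover the point set."""
    return len({math.floor(p / eps) for p in points})

def box_dimension(points, epsilons):
    """Least-squares slope of log N(eps) against log(1/eps)."""
    xs = [math.log(1.0 / e) for e in epsilons]
    ys = [math.log(box_count(points, e)) for e in epsilons]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
           sum((x - mx) ** 2 for x in xs)
```

For uniformly spaced points the estimate approaches 1; a genuinely fractal set (e.g. a Cantor-like construction) would yield a non-integer slope.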
Integration of High-Performance Computing into Cloud Computing Services
NASA Astrophysics Data System (ADS)
Vouk, Mladen A.; Sills, Eric; Dreher, Patrick
High-Performance Computing (HPC) projects span a spectrum of computer hardware implementations, ranging from peta-flop supercomputers and high-end tera-flop facilities running a variety of operating systems and applications, to mid-range and smaller computational clusters used for HPC application development, pilot runs, and prototype staging. What they all have in common is that they operate as stand-alone systems rather than as scalable, shared, user-reconfigurable resources. The advent of cloud computing has changed the traditional HPC implementation. In this article, we discuss a very successful production-level architecture and policy framework for supporting HPC services within a more general cloud computing infrastructure. This integrated environment, called the Virtual Computing Lab (VCL), has been operating at NC State since fall 2004. Nearly 8,500,000 HPC CPU-hours were delivered by this environment to NC State faculty and students during 2009. In addition, we present and discuss operational data showing that integration of HPC and non-HPC (or general VCL) services in a cloud can substantially reduce the cost of delivering cloud services (down to cents per CPU-hour).
Liu, Kui; Wei, Sixiao; Chen, Zhijiang; Jia, Bin; Chen, Genshe; Ling, Haibin; Sheaff, Carolyn; Blasch, Erik
2017-01-01
This paper presents the first attempt at combining the cloud with Graphics Processing Units (GPUs) in a complementary manner within the framework of a real-time, high-performance computation architecture for detecting and tracking multiple moving targets based on Wide Area Motion Imagery (WAMI). More specifically, the GPU and Cloud Moving Target Tracking (GC-MTT) system applies a front-end web-based server to perform the interaction with Hadoop and highly parallelized computation functions based on the Compute Unified Device Architecture (CUDA©). The introduced multiple moving target detection and tracking method can be extended to other applications such as pedestrian tracking, group tracking, and Patterns of Life (PoL) analysis. Cloud- and GPU-based computing provides an efficient real-time target recognition and tracking approach compared to workflows using only central processing units (CPUs). The simultaneous tracking and recognition results demonstrate that a GC-MTT-based approach provides drastically improved tracking at low frame rates over realistic conditions. PMID:28208684
A Weibull distribution accrual failure detector for cloud computing.
Liu, Jiaxi; Wu, Zhibo; Wu, Jin; Dong, Jian; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, is proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions of cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing.
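An accrual failure detector outputs a continuously growing suspicion level rather than a binary alive/dead verdict, typically φ(t) = -log10(1 - F(t)), where F is the distribution fitted to heartbeat inter-arrival times. The sketch below is illustrative only: it uses a Weibull CDF with fixed, assumed shape and scale parameters, whereas the detector described above fits these to observed network conditions.

```python
# Illustrative accrual suspicion computation with a Weibull inter-arrival
# model. Shape and scale are assumed constants here; a real detector would
# fit them online to the observed heartbeat history.
import math

def weibull_cdf(t, shape, scale):
    """P(inter-arrival time <= t) for a Weibull(shape, scale) distribution."""
    if t <= 0:
        return 0.0
    return 1.0 - math.exp(-((t / scale) ** shape))

def suspicion(t_since_heartbeat, shape=2.0, scale=1.0):
    """Accrual suspicion phi; grows without bound as the silence lengthens."""
    p = weibull_cdf(t_since_heartbeat, shape, scale)
    return -math.log10(max(1.0 - p, 1e-300))
```

The application layer then chooses its own φ threshold, trading detection speed against false suspicions, which is what lets one detector serve multiple applications.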
Making Cloud Computing Available For Researchers and Innovators (Invited)
NASA Astrophysics Data System (ADS)
Winsor, R.
2010-12-01
High Performance Computing (HPC) facilities exist in most academic institutions but are almost invariably over-subscribed. Access is allocated based on academic merit, the only practical method of assigning valuable finite compute resources. Cloud computing on the other hand, and particularly commercial clouds, draw flexibly on an almost limitless resource as long as the user has sufficient funds to pay the bill. How can the commercial cloud model be applied to scientific computing? Is there a case to be made for a publicly available research cloud and how would it be structured? This talk will explore these themes and describe how Cybera, a not-for-profit non-governmental organization in Alberta Canada, aims to leverage its high speed research and education network to provide cloud computing facilities for a much wider user base.
NASA Astrophysics Data System (ADS)
Urbanek, Benedikt; Groß, Silke; Wirth, Martin
2017-04-01
Cirrus clouds impose high uncertainties on weather and climate prediction, as knowledge of important processes is still incomplete. For instance, it remains unclear how cloud optical, microphysical, and radiative properties change as a cirrus cloud evolves. To gain a better understanding of cirrus clouds, their optical and microphysical properties, and how these change with cirrus cloud evolution, the ML-CIRRUS campaign was conducted in March and April 2014. Measurements with a combined in-situ and remote sensing payload were performed with the German research aircraft HALO, based in Oberpfaffenhofen. Sixteen research flights totaling 88 flight hours were performed over the North Atlantic and western and central Europe to probe different cirrus cloud regimes and cirrus clouds at different stages of evolution. One of the key remote sensing instruments during ML-CIRRUS was the airborne differential absorption and high spectral resolution lidar system WALES. It measures the two-dimensional distribution of water vapor inside and outside of cirrus clouds as well as the optical properties of the clouds. Based on these airborne lidar measurements, a novel classification scheme to derive the stage of cirrus cloud evolution was developed. It identifies regions of ice nucleation, particle growth by deposition of water vapor, and ice sublimation. This method is used to investigate differences in the distribution and value of optical properties, as well as in the distribution of water vapor and relative humidity, depending on the stage of evolution of the cloud. We will present the lidar-based classification scheme and its application to a wave-driven cirrus cloud case, and we will show first results on the dependence of optical cloud properties and relative humidity distributions on the determined stage of evolution.
NASA Astrophysics Data System (ADS)
Sirch, Tobias; Bugliaro, Luca; Zinner, Tobias; Möhrlein, Matthias; Vazquez-Navarro, Margarita
2017-02-01
A novel approach for the nowcasting of clouds and direct normal irradiance (DNI), based on the Spinning Enhanced Visible and Infrared Imager (SEVIRI) aboard the geostationary Meteosat Second Generation (MSG) satellite, is presented for a forecast horizon of up to 120 min. The basis of the algorithm is an optical flow method used to derive cloud motion vectors for all cloudy pixels. To facilitate forecasts over a relevant time period, a classification of clouds into objects and a weighted triangular interpolation of clear-sky regions are used. Low- and high-level clouds are forecast separately because they show different velocities and motion directions. Additionally, a distinction between advective and convective clouds, together with an intensity correction for quickly thinning convective clouds, is integrated. The DNI is calculated from the forecast optical thickness of the low- and high-level clouds. In order to quantitatively assess the performance of the algorithm, a forecast validation against MSG/SEVIRI observations is performed over a period of 2 months. Error rates and Hanssen-Kuiper skill scores are derived for the forecast cloud masks. For a 5 min forecast, in most cloud situations more than 95% of all pixels are predicted correctly as cloudy or clear. This number decreases to 80-95% for a 2 h forecast, depending on cloud type and vertical cloud level. Hanssen-Kuiper skill scores for the cloud mask go down to 0.6-0.7 for a 2 h forecast. Compared to persistence, an improvement of the forecast horizon by a factor of 2 is reached for all forecasts up to 2 h. A comparison of forecast optical thickness distributions and DNI against observations yields correlation coefficients larger than 0.9 for 15 min forecasts and around 0.65 for 2 h forecasts.
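The Hanssen-Kuiper skill score used in the validation above is a standard contingency-table measure (a generic sketch, not the study's verification code): the probability of detection minus the probability of false detection, so a perfect cloud-mask forecast scores 1 and a constant or random forecast scores 0.

```python
# Hanssen-Kuiper (true skill) score from a 2x2 forecast/observation
# contingency table for a binary cloud mask:
#   HK = POD - POFD
#      = hits/(hits + misses) - false_alarms/(false_alarms + correct_negatives)

def hanssen_kuiper(hits, misses, false_alarms, correct_negatives):
    pod = hits / (hits + misses)                              # prob. of detection
    pofd = false_alarms / (false_alarms + correct_negatives)  # false-alarm rate
    return pod - pofd
```

A score of 0.6-0.7 for the 2 h forecast, as reported, thus means the mask detects cloudy pixels far more often than it falsely flags clear ones.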
Properties of the electron cloud in a high-energy positron and electron storage ring
Harkay, K. C.; Rosenberg, R. A.
2003-03-20
Low-energy background electrons are ubiquitous in high-energy particle accelerators. Under certain conditions, interactions between this electron cloud and the high-energy beam can give rise to numerous effects that can seriously degrade accelerator performance, ranging from vacuum degradation to collective beam instabilities and emittance blowup. Although electron-cloud effects were first observed two decades ago in a few proton storage rings, they have in recent years been widely observed and intensely studied in positron and proton rings. Electron-cloud diagnostics developed at the Advanced Photon Source enabled, for the first time, detailed, direct characterization of the electron-cloud properties in a positron and electron storage ring. From in situ measurements of the electron flux and energy distribution at the vacuum chamber wall, electron-cloud production mechanisms and details of the beam-cloud interaction can be inferred. A significant longitudinal variation of the electron cloud is also observed, due primarily to geometrical details of the vacuum chamber. Furthermore, such experimental data can be used to provide realistic limits on key input parameters in modeling efforts, leading ultimately to greater confidence in predicting electron-cloud effects in future accelerators.
NASA Astrophysics Data System (ADS)
de Laat, Adrianus; Defer, Eric; Delanoë, Julien; Dezitter, Fabien; Gounou, Amanda; Grandin, Alice; Guignard, Anthony; Fokke Meirink, Jan; Moisselin, Jean-Marc; Parol, Frédéric
2017-04-01
We present an evaluation of the ability of passive broadband geostationary satellite measurements to detect high ice water content (IWC > 1 g m-3) as part of the European High Altitude Ice Crystals (HAIC) project for detection of upper-atmospheric high IWC, which can be a hazard for aviation. We developed a high IWC mask based on measurements of cloud properties using the Cloud Physical Properties (CPP) algorithm applied to the geostationary Meteosat Second Generation (MSG) Spinning Enhanced Visible and Infrared Imager (SEVIRI). Evaluation of the high IWC mask with satellite measurements of active remote sensors of cloud properties (CLOUDSAT/CALIPSO combined in the DARDAR (raDAR-liDAR) product) reveals that the high IWC mask is capable of detecting high IWC values > 1 g m-3 in the DARDAR profiles with a probability of detection of 60-80 %. The best CPP predictors of high IWC were the condensed water path, cloud optical thickness, cloud phase, and cloud top height. The evaluation of the high IWC mask against DARDAR provided indications that the MSG-CPP high IWC mask is more sensitive to cloud ice or cloud water in the upper part of the cloud, which is relevant for aviation purposes. Biases in the CPP results were also identified, in particular a solar zenith angle (SZA) dependence that reduces the performance of the high IWC mask for SZAs > 60°. Verification statistics show that for the detection of high IWC a trade-off has to be made between better detection of high IWC scenes and more false detections, i.e., scenes identified by the high IWC mask that do not contain IWC > 1 g m-3. However, the large majority of these detections still contain IWC values between 0.1 and 1 g m-3. 
Comparison of the high IWC mask against results from the Rapidly Developing Thunderstorm (RDT) algorithm applied to the same geostationary SEVIRI data showed that there are similarities and differences with the high IWC mask: the RDT algorithm is very capable of detecting young/new convective cells and areas, whereas the high IWC mask appears to be better capable of detecting more mature and ageing convection as well as cirrus remnants. The lack of detailed understanding of what causes aviation hazards related to high IWC, as well as the lack of clearly defined user requirements, hampers further tuning of the high IWC mask. Future evaluation of the high IWC mask against field campaign data, as well as obtaining user feedback and user requirements from the aviation industry, should provide more information on the performance of the MSG-CPP high IWC mask and contribute to improving the practical use of the high IWC mask.
Validation of VIIRS Cloud Base Heights at Night Using Ground and Satellite Measurements over Alaska
NASA Astrophysics Data System (ADS)
NOH, Y. J.; Miller, S. D.; Seaman, C.; Forsythe, J. M.; Brummer, R.; Lindsey, D. T.; Walther, A.; Heidinger, A. K.; Li, Y.
2016-12-01
Knowledge of Cloud Base Height (CBH) is critical to describing cloud radiative feedbacks in numerical models and is of practical significance to aviation communities. We have developed a new CBH algorithm constrained by Cloud Top Height (CTH) and Cloud Water Path (CWP) by performing a statistical analysis of A-Train satellite data. It includes an extinction-based method for thin cirrus. In the algorithm, cloud geometric thickness is derived from upstream CTH and CWP input and subtracted from CTH to generate the topmost-layer CBH. The CBH information is a key parameter for an improved Cloud Cover/Layers product. The algorithm has been applied to the Visible Infrared Imaging Radiometer Suite (VIIRS) onboard the Suomi NPP spacecraft. Nighttime cloud optical properties for CWP are retrieved with the Nighttime Lunar Cloud Optical and Microphysical Properties (NLCOMP) algorithm, based on a lunar reflectance model for the VIIRS Day/Night Band (DNB), which measures nighttime visible light such as moonlight. The DNB's innovative capabilities fill the polar-winter and nighttime gap in cloud observations, which has been an important shortfall of conventional radiometers. The CBH products have been intensively evaluated against CloudSat data. The results showed that the new algorithm yields significantly improved performance over the original VIIRS CBH algorithm. However, since CloudSat now operates during daytime only due to a battery anomaly, the nighttime performance has not been fully assessed. This presentation will show our approach to assessing the performance of the CBH algorithm at night. VIIRS CBHs are retrieved over the Alaska region from October 2015 to April 2016 using the Clouds from AVHRR Extended (CLAVR-x) processing system. Ground-based measurements from the ceilometer and micropulse lidar at the Atmospheric Radiation Measurement (ARM) site on the North Slope of Alaska are used for the analysis.
Local weather conditions are checked using temperature and precipitation observations at the site. CALIPSO data with near-simultaneous collocation are added for multi-layered cloud cases, which may have high clouds aloft beyond the reach of the ground measurements. Multi-month statistics of performance and case studies will be shown. Additional efforts toward algorithm refinement will also be discussed.
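The core of the retrieval described above, subtracting a CWP-derived geometric thickness from CTH, can be sketched in a few lines. The square-root fit and its coefficient below are hypothetical placeholders, not the A-Train-derived statistical relationship of the paper (which is also constrained by CTH regime):

```python
import math

def cloud_base_height(cth_km, cwp_gm2, a=0.2, max_frac=0.9):
    """Topmost-layer CBH (km) as CTH minus an estimated cloud geometric
    thickness. The thickness fit is an ILLUSTRATIVE stand-in for the
    paper's statistical lookup derived from A-Train data."""
    thickness_km = a * math.sqrt(cwp_gm2)                 # hypothetical fit
    thickness_km = min(thickness_km, max_frac * cth_km)   # keep base above ground
    return cth_km - thickness_km
```

As expected of any such relationship, a larger water path implies a geometrically thicker cloud and hence a lower base for the same cloud top.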
Aerosols and polar stratospheric clouds measurements during the EASOE campaign
NASA Technical Reports Server (NTRS)
Haner, D.; Godin, S.; Megie, G.; David, C.; Mitev, V.
1992-01-01
Preliminary results are presented from observations performed with two different lidar systems during EASOE (European Arctic Stratospheric Ozone Experiment), which took place in the winter of 1991-1992 in the high-latitude regions of the northern hemisphere. The first system is a ground-based multiwavelength lidar intended to measure the ozone vertical distribution in the 5 km to 40 km altitude range. It was located in Sodankyla (67 degrees N, 27 degrees E) as part of the ELSA experiment. The objectives of the ELSA cooperative project are to study the relation between polar stratospheric cloud events and ozone depletion with high vertical resolution and temporal continuity, and the evolution of the ozone distribution in relation to the position of the polar vortex. The second system is an airborne backscatter lidar (Leandre) which allows for the study of the 3-D structure and the optical properties of polar stratospheric clouds. The Leandre instrument is a dual-polarization lidar system, emitting at 532 nm, which allows for the determination of the type of clouds observed, according to the usual classification of polar stratospheric clouds. More than 60 hours of flight were performed in Dec. 1991, and Jan. and Feb. 1992 from Kiruna, Sweden. The operation of the Leandre instrument has led to the observation of the short-scale variability of the Pinatubo volcanic cloud in the high-latitude regions and of several episodes of polar stratospheric clouds. A preliminary analysis of the data is presented.
Exploiting GPUs in Virtual Machine for BioCloud
Jo, Heeseung; Jeong, Jinkyu; Lee, Myoungho; Choi, Dong Hoon
2013-01-01
Recently, biological applications have begun to be reimplemented to exploit the many cores of GPUs for better computational performance. Therefore, by providing virtualized GPUs to VMs in a cloud computing environment, many biological applications can move into the cloud to enhance their computational performance and utilize the effectively unlimited computing resources of the cloud while reducing computation expenses. In this paper, we propose a BioCloud system architecture that enables VMs to use GPUs in a cloud environment. Because much of the previous research has focused on mechanisms for sharing GPUs among VMs, it cannot achieve sufficient performance for biological applications, for which computational throughput is more crucial than sharing. The proposed system exploits the pass-through mode of the PCI Express (PCI-E) channel. By letting each VM access the underlying GPUs directly, applications achieve almost the same performance as in a native environment. In addition, our scheme multiplexes GPUs by using the hot plug-in/out device features of the PCI-E channel. By adding or removing GPUs from each VM on demand, VMs on the same physical host can time-share the GPUs. We implemented the proposed system using the Xen VMM and NVIDIA GPUs and showed that our prototype is highly effective for biological GPU applications in a cloud environment. PMID:23710465
Jungle Computing: Distributed Supercomputing Beyond Clusters, Grids, and Clouds
NASA Astrophysics Data System (ADS)
Seinstra, Frank J.; Maassen, Jason; van Nieuwpoort, Rob V.; Drost, Niels; van Kessel, Timo; van Werkhoven, Ben; Urbani, Jacopo; Jacobs, Ceriel; Kielmann, Thilo; Bal, Henri E.
In recent years, the application of high-performance and distributed computing in scientific practice has become increasingly widespread. Among the most widely available platforms to scientists are clusters, grids, and cloud systems. Such infrastructures currently are undergoing revolutionary change due to the integration of many-core technologies, providing orders-of-magnitude speed improvements for selected compute kernels. With high-performance and distributed computing systems thus becoming more heterogeneous and hierarchical, programming complexity is vastly increased. Further complexities arise because the urgent desire for scalability and issues including data distribution, software heterogeneity, and ad hoc hardware availability commonly force scientists into simultaneous use of multiple platforms (e.g., clusters, grids, and clouds used concurrently). A true computing jungle.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pais Pitta de Lacerda Ruivo, Tiago; Bernabeu Altayo, Gerard; Garzoglio, Gabriele
2014-11-11
It has been widely accepted that software virtualization has a large negative impact on high-performance computing (HPC) application performance. This work explores the potential use of InfiniBand hardware virtualization in an OpenNebula cloud towards the efficient support of MPI-based workloads. We have implemented, deployed, and tested an InfiniBand network on the FermiCloud private Infrastructure-as-a-Service (IaaS) cloud. To avoid software virtualization and minimize the virtualization overhead, we employed a technique called Single Root Input/Output Virtualization (SR-IOV). Our solution spanned modifications to the Linux hypervisor as well as the OpenNebula manager. We evaluated the performance of the hardware virtualization on up to 56 virtual machines connected by up to 8 DDR InfiniBand network links, with micro-benchmarks (latency and bandwidth) as well as with an MPI-intensive application (the HPL Linpack benchmark).
The application of cloud computing to scientific workflows: a study of cost and performance.
Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S
2013-01-28
The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.
A Weibull distribution accrual failure detector for cloud computing
Wu, Zhibo; Wu, Jin; Zhao, Yao; Wen, Dongxin
2017-01-01
Failure detectors are a fundamental component for building high-availability distributed systems. To meet the requirements of complicated large-scale distributed systems, accrual failure detectors that can adapt to multiple applications have been studied extensively. However, several implementations of accrual failure detectors do not adapt well to the cloud service environment. To solve this problem, a new accrual failure detector based on the Weibull distribution, called the Weibull Distribution Failure Detector, has been proposed specifically for cloud computing. It can adapt to the dynamic and unexpected network conditions of cloud computing. The performance of the Weibull Distribution Failure Detector is evaluated and compared using public classical experiment data and cloud computing experiment data. The results show that the Weibull Distribution Failure Detector has better performance in terms of speed and accuracy in unstable scenarios, especially in cloud computing. PMID:28278229
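The accrual idea behind such detectors can be sketched directly: model heartbeat inter-arrival times with a Weibull distribution and output a suspicion level phi(t) = -log10 P(T > t), which grows continuously with the time since the last heartbeat instead of giving a binary verdict. This is a minimal sketch of the concept, not the paper's detector; in practice the shape and scale parameters would be estimated online from a sliding window of observed inter-arrival times:

```python
import math

def weibull_suspicion(t_since_last, shape, scale):
    """Accrual suspicion level under a Weibull(shape, scale) model of
    heartbeat inter-arrivals. The survival function is
    exp(-(t/scale)**shape), so phi(t) = (t/scale)**shape / ln(10)."""
    return (t_since_last / scale) ** shape / math.log(10)

def suspect(t_since_last, shape, scale, threshold=1.0):
    """Flag the monitored process once phi crosses an application-chosen
    threshold; higher thresholds trade detection speed for accuracy."""
    return weibull_suspicion(t_since_last, shape, scale) >= threshold
```

Each application can pick its own threshold on the same phi stream, which is what lets one accrual detector serve multiple applications.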
Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure
NASA Astrophysics Data System (ADS)
Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei
2011-09-01
Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. This work was presented in part at the 2010 Annual Meeting of the American Association of Physicists in Medicine (AAPM), Philadelphia, PA.
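Because the simulated particle histories are independent, the wall time of the cloud deployment above scales roughly as serial time divided by node count plus a fixed overhead (cluster allocation, data staging, aggregation). A toy model reproduces the reported numbers; the 1.75-minute overhead below is a hypothetical value chosen to match the 2.58 h to 3.3 min figures, not a measurement from the paper:

```python
def speedup(t_serial_min, n_nodes, overhead_min):
    """Wall time and speed-up when n_nodes workers each simulate an
    equal share of the histories, plus a fixed parallel overhead."""
    t_parallel = t_serial_min / n_nodes + overhead_min
    return t_parallel, t_serial_min / t_parallel
```

With t_serial = 2.58 h = 154.8 min, 100 nodes, and ~1.75 min overhead, this gives roughly 3.3 min and a ~47x speed-up, consistent with the abstract; the overhead term also shows why the speed-up stays below the ideal 100x.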
Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.
Trudgian, David C; Mirzaei, Hamid
2012-12-07
We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.
Atmospheric circulations required for thick high-altitude clouds and featureless transit spectra
NASA Astrophysics Data System (ADS)
Wang, H.; Wordsworth, R. D.
2017-12-01
The transmission spectra of the exoplanets GJ 1214b and GJ 436b are featureless as measured by current instruments. Given the measured densities of these planets, we have reason to believe they host atmospheres, yet the spectroscopic features of those atmospheres are, unexpectedly, absent from the transit spectra. One explanation is that high-altitude clouds or hazes are optically thick enough to flatten the transit spectra across the currently observed wavelength range. We analyze the atmospheric circulations and vertical mixing that are crucial for the possible existence of such thick high-altitude clouds. We perform a series of GCM simulations with different atmospheric compositions and planetary parameters to reveal the conditions required to produce featureless spectra, and we study the underlying dynamical processes. We also study the role of cloud particles of different sizes, compositions and spectral characteristics with a radiative transfer model and cloud physics models. Varying the compositions and sizes of the cloud particles results in different requirements for the atmospheric circulations.
Improved Modeling Tools Development for High Penetration Solar
DOE Office of Scientific and Technical Information (OSTI.GOV)
Washom, Byron; Meagher, Kevin
2014-12-11
One of the significant objectives of the High Penetration Solar research is to help the DOE understand, anticipate, and minimize grid operation impacts as more solar resources are added to the electric power system. For Task 2.2, an effective, reliable approach to predicting solar energy availability for energy generation forecasts using the University of California, San Diego (UCSD) Sky Imager technology has been demonstrated. Granular cloud and ramp forecasts for the next 5 to 20 minutes over an area of 10 square miles were developed. Sky images taken every 30 seconds are processed to determine cloud locations and cloud motion vectors, yielding future cloud shadow locations with respect to distributed generation or utility solar power plants in the area. The performance of the method depends on cloud characteristics. On days with more advective cloud conditions, the developed method outperforms persistence forecasts by up to 30% (based on mean absolute error). On days with dynamic conditions, the method performs worse than persistence. Sky Imagers hold promise for ramp forecasting and ramp mitigation in conjunction with inverter controls and energy storage. The pre-commercial Sky Imager solar forecasting algorithm was documented with licensing information and was a SunShot website highlight.
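The "up to 30% better than persistence, based on mean absolute error" metric above is a standard forecast skill score and is easy to compute. This sketch uses generic irradiance-like series as a made-up illustration, not the project's data:

```python
def mae(pred, obs):
    """Mean absolute error of a forecast series against observations."""
    return sum(abs(p - o) for p, o in zip(pred, obs)) / len(obs)

def skill_vs_persistence(forecast, persistence, obs):
    """MAE-based skill: 1 - MAE(forecast)/MAE(persistence).
    Positive values mean the forecast beats persistence; 0.30 would
    correspond to the 30% improvement reported on advective days."""
    return 1.0 - mae(forecast, obs) / mae(persistence, obs)
```

A persistence baseline simply repeats the last observed value, which is why it is hard to beat on days when cloud cover barely changes and easy to beat when clouds advect steadily across the imager's field of view.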
Continuous All-Sky Cloud Measurements: Cloud Fraction Analysis Based on a Newly Developed Instrument
NASA Astrophysics Data System (ADS)
Aebi, C.; Groebner, J.; Kaempfer, N.; Vuilleumier, L.
2017-12-01
Clouds play an important role in the climate system and are also a crucial parameter for the Earth's surface energy budget. Ground-based measurements of clouds provide data at high temporal resolution for quantifying their influence on radiation. The newly developed all-sky cloud camera at PMOD/WRC in Davos (Switzerland), the infrared cloud camera (IRCCAM), is a microbolometer sensitive in the 8 - 14 μm wavelength range. To obtain all-sky information, the camera is mounted on top of a frame looking downward onto a spherical gold-plated mirror. The IRCCAM has been measuring continuously (day and night) at one-minute resolution in Davos since September 2015. To assess the performance of the IRCCAM, two different visible all-sky cameras (Mobotix Q24M and Schreder VIS-J1006), which can only operate during daytime, are installed in Davos. All three camera systems use different software for calculating fractional cloud coverage from images. Our study mainly analyzes the fractional cloud coverage from the IRCCAM and compares it with the fractional cloud coverage calculated from the two visible cameras. Preliminary results on the measurement accuracy of the IRCCAM compared to the visible cameras indicate that 78% of the data are within ±1 okta and 93% within ±2 oktas. An uncertainty of 1-2 oktas corresponds to the measurement uncertainty of human observers. Therefore, the IRCCAM shows similar performance in detecting cloud coverage to the visible cameras and human observers, with the advantage that continuous measurements at high temporal resolution are possible.
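The agreement statistic used above (the fraction of paired observations falling within a tolerance of 1 or 2 oktas) can be computed in a couple of lines; the sample values below are made up for illustration:

```python
def okta_agreement(cam_a, cam_b, tolerance):
    """Fraction of paired cloud-fraction observations (oktas, 0-8)
    that agree within +/- tolerance oktas."""
    hits = sum(1 for a, b in zip(cam_a, cam_b) if abs(a - b) <= tolerance)
    return hits / len(cam_a)
```

Running it over coincident one-minute retrievals from the infrared and visible cameras would directly reproduce the 78% (±1 okta) and 93% (±2 oktas) style of statistic reported.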
Added value of far-infrared radiometry for remote sensing of ice clouds
NASA Astrophysics Data System (ADS)
Libois, Quentin; Blanchet, Jean-Pierre
2017-06-01
Several cloud retrieval algorithms based on satellite observations in the infrared have been developed in recent decades. However, these observations only cover the midinfrared (MIR, λ < 15 μm) part of the spectrum, and none are available in the far-infrared (FIR, λ ≥ 15 μm). Using the optimal estimation method, we show that adding a few FIR channels to existing spaceborne radiometers would significantly improve their ability to retrieve ice cloud radiative properties. For clouds encountered in the polar regions and the upper troposphere, where the atmosphere is sufficiently transparent in the FIR, using FIR channels would reduce by more than 50% the uncertainties on retrieved values of optical thickness, effective particle diameter, and cloud top altitude. Notably, this would extend the range of applicability of current retrieval methods to the polar regions and to clouds with large optical thickness, where MIR algorithms perform poorly. The high performance of solar reflection-based algorithms would thus be reached in nighttime conditions. Since the sensitivity of ice cloud thermal emission to effective particle diameter is approximately 5 times larger in the FIR than in the MIR, using FIR observations is a promising avenue for studying ice cloud microphysics and precipitation processes. This is highly relevant for cirrus clouds and convective towers. It is also essential for studying precipitation in the driest regions of the atmosphere, where strong feedbacks are at play between clouds and water vapor. The deployment in the near future of a FIR spaceborne radiometer is technologically feasible and should be strongly supported.
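The mechanism by which extra channels shrink retrieval uncertainty falls directly out of optimal estimation. In the simplest scalar case, the posterior variance of one retrieved quantity is 1 / (1/sigma_a^2 + sum_j k_j^2 / sigma_e^2), so channels with larger Jacobians k_j (like the roughly 5x larger FIR sensitivity quoted above) tighten the retrieval faster. This is a toy scalar sketch with illustrative numbers, not the paper's full multi-variable retrieval:

```python
def posterior_std(prior_std, jacobians, noise_std):
    """1-D optimal-estimation posterior uncertainty for a single
    retrieved quantity seen through channels with sensitivities
    (Jacobians) k_j and a common measurement noise. Any channel with
    nonzero sensitivity shrinks the posterior standard deviation."""
    inv_var = 1.0 / prior_std**2 + sum(k**2 for k in jacobians) / noise_std**2
    return inv_var ** -0.5
```

With a hypothetical MIR Jacobian of 0.5 and two added FIR channels at 2.5 (the 5x ratio), the posterior uncertainty drops by well over the 50% figure cited for FIR-transparent scenes.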
Simplified ISCCP cloud regimes for evaluating cloudiness in CMIP5 models
NASA Astrophysics Data System (ADS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin
2017-01-01
We take advantage of ISCCP simulator data available for many models that participated in CMIP5, in order to introduce a framework for comparing model cloud output with corresponding ISCCP observations based on the cloud regime (CR) concept. Simplified global CRs are employed, derived from the co-variations of three variables, namely cloud optical thickness, cloud top pressure and cloud fraction (τ, p_c, CF). Following evaluation criteria established in a companion paper of ours (Jin et al. 2016), we assess model cloud simulation performance based on how well the simplified CRs are simulated in terms of similarity of centroids, global values and map correlations of relative frequency of occurrence, and long-term total cloud amounts. Mirroring prior results, modeled clouds tend to be too optically thick and not as extensive as in observations. CRs with high-altitude clouds from storm activity are not as well simulated here compared to the previous study, but other regimes containing near-overcast low clouds show improvement. Models that performed well in the companion paper against CRs defined by joint τ-p_c histograms distinguish themselves again here, but improvements for previously underperforming models are also seen. Averaging across models does not yield a drastically better picture, except for cloud geographical locations. Cloud evaluation with simplified regimes thus seems more forgiving than that using histogram-based CRs, while still strict enough to reveal model weaknesses.
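A regime framework of this kind boils down to assigning each (τ, p_c, CF) cloud state to its nearest regime centroid and then comparing relative frequencies of occurrence between model and observations. The centroids, normalization constants, and regime names below are hypothetical illustrations, not the paper's CRs:

```python
def assign_regime(state, centroids):
    """Nearest-centroid assignment of a (tau, p_c in hPa, CF) cloud
    state to a simplified cloud regime. Variables are crudely
    normalized first so all three contribute comparably."""
    def norm(tau, pc, cf):
        return (tau / 50.0, pc / 1000.0, cf)   # illustrative scaling
    x = norm(*state)
    best, best_d = None, float("inf")
    for name, centroid in centroids.items():
        c = norm(*centroid)
        d = sum((a - b) ** 2 for a, b in zip(x, c))
        if d < best_d:
            best, best_d = name, d
    return best
```

Mapping every grid point and time step through this assignment yields the relative-frequency-of-occurrence maps whose correlations the evaluation criteria use.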
Temperature characterisation of the CLOUD chamber at CERN
NASA Astrophysics Data System (ADS)
Dias, A. M.; Almeida, J.; Kirkby, J.; Mathot, S.; Onnela, A.; Vogel, A.; Ehrhart, S.
2014-12-01
Temperature stability, uniformity and absolute scale inside the CLOUD (Cosmics Leaving OUtdoor Droplets) chamber at CERN are important for experiments on aerosol particle nucleation and ice/liquid cloud formation. In order to measure the air temperature, a comprehensive set of arrays ("strings") of platinum resistance thermometers, thermocouples and optical sensors have been installed inside the 26 m3 chamber. The thermal sensors must meet several challenging design requirements: ultra-clean materials, 0.01 K measurement sensitivity, high absolute precision (<0.1 K), 200 K - 373 K range, ability to operate in high electric fields (20 kV/m), and fast response in air (~1 s) in order to measure rapid changes of temperature during ice/liquid cloud formation in the chamber by adiabatic pressure reductions. This presentation will focus on the design of the thermometer strings and the thermal performance of the chamber during the CLOUD8 and CLOUD9 campaigns, 2013-2014, together with the planned upgrades of the CLOUD thermal system.
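Platinum resistance thermometers like those in the strings described above convert resistance to temperature via the Callendar-Van Dusen equation. This sketch uses the standard IEC 60751 coefficients for the above-0 °C branch; below 0 °C (much of the chamber's 200-373 K range) a third coefficient enters and the inversion is no longer a simple quadratic:

```python
import math

# Standard IEC 60751 coefficients for platinum RTDs (t >= 0 deg C).
A, B = 3.9083e-3, -5.775e-7

def pt_resistance(t_c, r0=100.0):
    """Resistance (ohms) of a PT100-style sensor at t_c deg C, t >= 0."""
    return r0 * (1.0 + A * t_c + B * t_c**2)

def pt_temperature(r, r0=100.0):
    """Invert the quadratic for the t >= 0 deg C branch."""
    return (-A + math.sqrt(A**2 - 4.0 * B * (1.0 - r / r0))) / (2.0 * B)
```

The quadratic's curvature term B is tiny, which is why a 0.01 K sensitivity target translates into resolving resistance changes of only a few milliohms on a 100-ohm element.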
PACE: Proactively Secure Accumulo with Cryptographic Enforcement
2017-05-27
Abstract—Cloud-hosted databases have many compelling benefits, including high availability, flexible resource allocation, and resiliency to attack... infrastructure to the cloud. This move is motivated by the cloud's increased availability, flexibility, and resilience [1]. Most importantly, the cloud enables a level of availability and performance that would be impossible for many companies to achieve using their own infrastructure. For example, using a
Challenges and opportunities of cloud computing for atmospheric sciences
NASA Astrophysics Data System (ADS)
Pérez Montes, Diego A.; Añel, Juan A.; Pena, Tomás F.; Wallom, David C. H.
2016-04-01
Cloud computing is an emerging technological solution widely used in many fields. Initially developed as a flexible way of managing peak demand, it has begun to make its way into scientific research. One of the greatest advantages of cloud computing for scientific research is independence from access to large cyberinfrastructure when funding or performing a research project. Cloud computing can avoid the maintenance expenses of large supercomputers and has the potential to 'democratize' access to high-performance computing, giving flexibility to funding bodies for allocating budgets for the computational costs associated with a project. Two of the most challenging problems in atmospheric sciences are computational cost and uncertainty in meteorological forecasting and climate projections. Both problems are closely related. Usually uncertainty can be reduced with the availability of computational resources to better reproduce a phenomenon or to perform a larger number of experiments. Here we present results of the application of cloud computing resources for climate modeling using cloud computing infrastructures of three major vendors and two climate models. We show how the cloud infrastructure compares in performance to traditional supercomputers and how it provides the capability to complete experiments in shorter periods of time. The associated monetary cost is also analyzed. Finally we discuss the future potential of this technology for meteorological and climatological applications, both from the point of view of operational use and research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karthik, Rajasekar
2014-01-01
In this paper, an architecture for building a Scalable And Mobile Environment For High-Performance Computing with spatial capabilities, called SAME4HPC, is described using cutting-edge technologies and standards such as Node.js, HTML5, ECMAScript 6, and PostgreSQL 9.4. Mobile devices are increasingly becoming powerful enough to run high-performance apps. At the same time, there exist a significant number of low-end and older devices that rely heavily on the server or the cloud infrastructure to do the heavy lifting. Our architecture aims to support both of these types of devices to provide high performance and a rich user experience. A cloud infrastructure consisting of OpenStack with Ubuntu, GeoServer, and high-performance JavaScript frameworks are some of the key open-source and industry-standard practices that have been adopted in this architecture.
Exploring Cloud Computing for Large-scale Scientific Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lin, Guang; Han, Binh; Yin, Jian
This paper explores cloud computing for large-scale data-intensive scientific applications. Cloud computing is attractive because it provides hardware and software resources on demand, which relieves the burden of acquiring and maintaining a huge amount of resources that may be used only once by a scientific application. However, unlike typical commercial applications that often require only a moderate amount of ordinary resources, large-scale scientific applications often need to process enormous amounts of data in the terabyte or even petabyte range and require special high-performance hardware with low-latency connections to complete computation in a reasonable amount of time. To address these challenges, we build an infrastructure that can dynamically select high-performance computing hardware across institutions and dynamically adapt the computation to the selected resources to achieve high performance. We have also demonstrated the effectiveness of our infrastructure by building a systems biology application and an uncertainty quantification application for carbon sequestration, which can efficiently utilize data and computation resources across several institutions.
Cloud Computing Boosts Business Intelligence of Telecommunication Industry
NASA Astrophysics Data System (ADS)
Xu, Meng; Gao, Dan; Deng, Chao; Luo, Zhiguo; Sun, Shaoling
Business intelligence has become an attractive topic in today's data-intensive applications, especially in the telecommunication industry. Meanwhile, cloud computing, which provides IT supporting infrastructure with excellent scalability, large-scale storage, and high performance, has become an effective way to implement parallel data processing and data mining algorithms. BC-PDM (Big Cloud based Parallel Data Miner) is a new MapReduce-based parallel data mining platform developed by CMRI (China Mobile Research Institute) to meet the urgent requirements of business intelligence in the telecommunication industry. In this paper, the architecture, functionality and performance of BC-PDM are presented, together with an experimental evaluation and case studies of its applications. The evaluation results demonstrate both the usability and the cost-effectiveness of a cloud-computing-based business intelligence system in applications of the telecommunication industry.
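The MapReduce pattern underlying platforms like BC-PDM can be illustrated in-process in a few lines: map each record to key-value pairs, shuffle by key, then reduce each group. The call-detail-record aggregation in the usage example is a made-up illustration of a telecom BI workload, not BC-PDM's API:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal single-process sketch of the MapReduce pattern.
    A real engine distributes the map and reduce phases across a
    cluster and shuffles intermediate pairs over the network."""
    groups = defaultdict(list)
    for rec in records:                       # map phase
        for key, value in mapper(rec):
            groups[key].append(value)         # shuffle: group by key
    return {k: reducer(k, vs) for k, vs in groups.items()}  # reduce phase
```

For example, summing call minutes per subscriber from (subscriber, minutes) records uses an identity-like mapper and a summing reducer.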
Templet Web: the use of volunteer computing approach in PaaS-style cloud
NASA Astrophysics Data System (ADS)
Vostokin, Sergei; Artamonov, Yuriy; Tsarev, Daniil
2018-03-01
This article presents the Templet Web cloud service. The service is designed for the automation of high-performance scientific computing. The use of high-performance technology is specifically required by new fields of computational science such as data mining, artificial intelligence, machine learning, and others. Cloud technologies provide a significant cost reduction for high-performance scientific applications. The main objectives of the Templet Web service design for achieving this cost reduction are: (a) the implementation of "on-demand" access; (b) source code deployment management; (c) automation of the development of high-performance computing programs. The distinctive feature of the service is an approach mainly used in the field of volunteer computing, whereby a person who has access to a computer system delegates their access rights to the requesting user. We developed an access procedure, algorithms, and software for the utilization of free computational resources of the academic cluster system in line with the methods of volunteer computing. The Templet Web service has been in operation for five years. It has been successfully used for conducting laboratory workshops and solving research problems, some of which are considered in this article. The article also provides an overview of research directions related to service development.
High Performance Molecular Visualization: In-Situ and Parallel Rendering with EGL.
Stone, John E; Messmer, Peter; Sisneros, Robert; Schulten, Klaus
2016-05-01
Large scale molecular dynamics simulations produce terabytes of data that is impractical to transfer to remote facilities. It is therefore necessary to perform visualization tasks in-situ as the data are generated, or by running interactive remote visualization sessions and batch analyses co-located with direct access to high performance storage systems. A significant challenge for deploying visualization software within clouds, clusters, and supercomputers involves the operating system software required to initialize and manage graphics acceleration hardware. Recently, it has become possible for applications to use the Embedded-system Graphics Library (EGL) to eliminate the requirement for windowing system software on compute nodes, thereby eliminating a significant obstacle to broader use of high performance visualization applications. We outline the potential benefits of this approach in the context of visualization applications used in the cloud, on commodity clusters, and supercomputers. We discuss the implementation of EGL support in VMD, a widely used molecular visualization application, and we outline benefits of the approach for molecular visualization tasks on petascale computers, clouds, and remote visualization servers. We then provide a brief evaluation of the use of EGL in VMD, with tests using developmental graphics drivers on conventional workstations and on Amazon EC2 G2 GPU-accelerated cloud instance types. We expect that the techniques described here will be of broad benefit to many other visualization applications.
High-resolution measurement of cloud microphysics and turbulence at a mountaintop station
NASA Astrophysics Data System (ADS)
Siebert, H.; Shaw, R. A.; Ditas, J.; Schmeissner, T.; Malinowski, S. P.; Bodenschatz, E.; Xu, H.
2015-08-01
Mountain research stations are advantageous not only for long-term sampling of cloud properties but also for measurements that are prohibitively difficult to perform on airborne platforms due to the large true air speed or adverse factors such as weight and complexity of the equipment necessary. Some cloud-turbulence measurements, especially Lagrangian in nature, fall into this category. We report results from simultaneous, high-resolution and collocated measurements of cloud microphysical and turbulence properties during several warm cloud events at the Umweltforschungsstation Schneefernerhaus (UFS) on Zugspitze in the German Alps. The data gathered were found to be representative of observations made with similar instrumentation in free clouds. The observed turbulence shared all features known for high-Reynolds-number flows: it exhibited approximately Gaussian fluctuations for all three velocity components, a clearly defined inertial subrange following Kolmogorov scaling (power spectrum, and second- and third-order Eulerian structure functions), and highly intermittent velocity gradients, as well as approximately lognormal kinetic energy dissipation rates. The clouds were observed to have liquid water contents on the order of 1 g m-3 and size distributions typical of continental clouds, sometimes exhibiting long positive tails indicative of large drop production through turbulent mixing or coalescence growth. Dimensionless parameters relevant to cloud-turbulence interactions, the Stokes number and settling parameter, are in the range typically observed in atmospheric clouds. Observed fluctuations in droplet number concentration and diameter suggest a preference for inhomogeneous mixing. Finally, enhanced variance in liquid water content fluctuations is observed at high frequencies, and the scale break occurs at a value consistent with the independently estimated phase relaxation time from microphysical measurements.
NASA Astrophysics Data System (ADS)
Loro, Stephen Lee
This study was designed to examine moon illumination, moon angle, cloud cover, sky glow, and Night Vision Goggle (NVG) flight performance to determine possible effects. The research was a causal-comparative design. The sample consisted of 194 Fort Rucker Initial Entry Rotary Wing NVG flight students being observed by 69 NVG Instructor Pilots. The students participated in NVG flight training from September 1992 through January 1993. Data were collected using a questionnaire. Observations were analyzed using a Kruskal-Wallis one-way analysis of variance and a Wilcoxon matched-pairs signed-ranks test for difference. Correlations were analyzed using Pearson's r. The analyses indicated that performance at high moon illumination levels is superior to zero moon illumination, and in most task maneuvers, superior to >0%--50% moon illumination. No differences were found in performance at moon illumination levels above 50%. Moon angle had no effect on night vision goggle flight performance. Cloud cover and sky glow had selective effects on different maneuvers. For most task maneuvers, cloud cover did not affect performance. Overcast cloud cover had a significant effect on seven of the 14 task maneuvers. Sky glow did not affect eight out of 14 task maneuvers at any level of sky glow.
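The Kruskal-Wallis one-way analysis of variance used above ranks all observations jointly and compares group mean ranks. A minimal sketch of the H statistic (with tie-averaged ranks but omitting the usual tie-correction factor; variable names are mine, not the study's):

```python
from itertools import chain

def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H: rank all observations together (ties get the
    average rank), then compare the rank sums of the groups."""
    data = sorted(chain.from_iterable(groups))
    ranks = {}
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        ranks[data[i]] = (i + 1 + j) / 2  # mean of 1-based ranks i+1..j
        i = j
    n = len(data)
    return 12 / (n * (n + 1)) * sum(
        sum(ranks[x] for x in g) ** 2 / len(g) for g in groups
    ) - 3 * (n + 1)
```

A large H indicates that at least one group's rank distribution differs, which is then compared against a chi-squared reference.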
Enhancing a Simple MODIS Cloud Mask Algorithm for the Landsat Data Continuity Mission
NASA Technical Reports Server (NTRS)
Wilson, Michael J.; Oreopoulos, Lazarous
2011-01-01
The presence of clouds in images acquired by the Landsat series of satellites is usually an undesirable, but generally unavoidable fact. With the emphasis of the program being on land imaging, the suspended liquid/ice particles of which clouds are made fully or partially obscure the desired observational target. Knowing the amount and location of clouds in a Landsat scene is therefore valuable information for scene selection, for making clear-sky composites from multiple scenes, and for scheduling future acquisitions. The two instruments in the upcoming Landsat Data Continuity Mission (LDCM) will include new channels that will enhance our ability to detect high clouds which are often also thin, in the sense that a large fraction of solar radiation can pass through them. This work studies the potential impact of these new channels on enhancing LDCM's cloud detection capabilities compared to previous Landsat missions. We revisit a previously published scheme for cloud detection and add new tests to capture more of the thin clouds that are harder to detect with the more limited arsenal of channels. Since there are no Landsat data yet that include the new LDCM channels, we resort to data from another instrument, MODIS, which has these bands, as well as the other bands of LDCM, to test the capabilities of our new algorithm. By comparing our revised scheme's performance against the performance of the official MODIS cloud detection scheme, we conclude that the new scheme performs better than the earlier scheme, which was not very good at thin cloud detection.
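A multi-test cloud mask of the kind described works by OR-ing independent spectral tests, each of which alone can flag a pixel as cloudy. A toy sketch (the thresholds and band names are illustrative placeholders, not the published LDCM or MODIS values):

```python
def simple_cloud_mask(pixel):
    """Return True if any spectral test flags the pixel as cloudy.
    Thresholds here are illustrative, not operational values."""
    tests = [
        pixel["refl_vis"] > 0.3,       # bright in the visible
        pixel["bt_11um"] < 265.0,      # cold 11 um brightness temperature
        # thin-cirrus test: a 1.38 um band signal survives only if it was
        # reflected above most water vapor, i.e. by a high cloud
        pixel["refl_138um"] > 0.03,
    ]
    return any(tests)
```

The thin-cirrus test is what the new high-cloud channels enable: a clear-sky pixel contributes almost nothing in that band, so even a small reflectance implies high cloud.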
NASA Astrophysics Data System (ADS)
Iwabuchi, Hironobu; Saito, Masanori; Tokoro, Yuka; Putri, Nurfiena Sagita; Sekiguchi, Miho
2016-12-01
Satellite remote sensing of the macroscopic, microphysical, and optical properties of clouds is useful for studying spatial and temporal variations of clouds at various scales and constraining cloud physical processes in climate and weather prediction models. Instead of using separate independent algorithms for different cloud properties, a unified, optimal estimation-based cloud retrieval algorithm is developed and applied to moderate resolution imaging spectroradiometer (MODIS) observations using ten thermal infrared bands. The model considers sensor configurations, background surface and atmospheric profile, microphysical and optical models of ice and liquid cloud particles, and radiative transfer in a plane-parallel, multilayered atmosphere. Measurement and model errors are thoroughly quantified from direct comparisons of clear-sky observations over the ocean with model calculations. Performance tests by retrieval simulations show that ice cloud properties are retrieved with high accuracy when cloud optical thickness (COT) is between 0.1 and 10. Cloud-top pressure is inferred with uncertainty lower than 10 % when COT is larger than 0.3. Applying the method to a tropical cloud system and comparing the results with the MODIS Collection 6 cloud product shows good agreement for ice cloud optical thickness when COT is less than about 5. Cloud-top height agrees well with estimates obtained by the CO2 slicing method used in the MODIS product. The present algorithm can detect optically thin parts at the edges of high clouds well in comparison with the MODIS product, in which these parts are recognized as low clouds by the infrared window method. The cloud thermodynamic phase in the present algorithm is constrained by cloud-top temperature, which tends not to produce results with an ice cloud that is too warm or a liquid cloud that is too cold.
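Optimal estimation combines the measurement with a prior state, weighted by their respective error variances. A single-parameter linear sketch of the update (a scalar stand-in for the multiband MODIS retrieval; symbols follow common optimal-estimation notation, not this paper's code):

```python
def oe_retrieval_scalar(y, k, x_a, s_a, s_e):
    """One-parameter linear optimal estimation.
    y   : measurement, modeled as y = k * x + noise
    x_a : prior (a priori) state estimate
    s_a : prior variance, s_e : measurement-error variance
    Returns the posterior state and posterior variance."""
    gain = s_a * k / (k * k * s_a + s_e)   # Kalman-like gain
    x_hat = x_a + gain * (y - k * x_a)
    s_hat = s_a - gain * k * s_a           # posterior variance <= prior
    return x_hat, s_hat
```

When the measurement error is tiny the retrieval follows the data (x_hat → y/k); when it is huge the retrieval falls back on the prior, which is how the quantified measurement and model errors directly shape the result.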
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Technical Reports Server (NTRS)
Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen
2016-01-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
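The reorganization the abstract describes, sharding data across nodes so a long time series can be read in parallel, can be sketched minimally. The key layout and names below are illustrative, not the schema of Cassandra, Rasdaman, SciDB, or any of the benchmarked systems:

```python
import hashlib

def shard_for(variable, lat_band, time_chunk, n_shards):
    """Deterministically assign a (variable, spatial band, time chunk)
    brick to a shard, so one location's full time series is spread
    evenly across nodes instead of living in one file per time step."""
    key = f"{variable}/{lat_band}/{time_chunk}".encode()
    return int(hashlib.md5(key).hexdigest(), 16) % n_shards
```

Because the mapping is a pure function of the key, every worker can locate a brick without a central index, which is what makes the fine-grained parallelism cheap.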
Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations
NASA Astrophysics Data System (ADS)
Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.
2016-12-01
Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
NASA Astrophysics Data System (ADS)
Siebert, H.; Shaw, R. A.; Ditas, J.; Schmeissner, T.; Malinowski, S. P.; Bodenschatz, E.; Xu, H.
2015-01-01
Mountain research stations are advantageous not only for long-term sampling of cloud properties, but also for measurements that are prohibitively difficult to perform on airborne platforms due to the large true air speed or adverse factors such as weight and complexity of the equipment necessary. Some cloud-turbulence measurements, especially Lagrangian in nature, fall into this category. We report results from simultaneous, high-resolution and collocated measurements of cloud microphysical and turbulence properties during several warm cloud events at the Umweltforschungsstation Schneefernerhaus (UFS) on Zugspitze in the German Alps. The data gathered were found to be representative of observations made with similar instrumentation in free clouds. The turbulence observed shared all features known for high-Reynolds-number flows: it exhibited approximately Gaussian fluctuations for all three velocity components, a clearly defined inertial subrange following Kolmogorov scaling (power spectrum, and second- and third-order Eulerian structure functions), and highly intermittent velocity gradients, as well as approximately lognormal kinetic energy dissipation rates. The clouds were observed to have liquid water contents of order 1 g m-3, and size distributions typical of continental clouds, sometimes exhibiting long positive tails indicative of large drop production through turbulent mixing or coalescence growth. Dimensionless parameters relevant to cloud-turbulence interactions, the Stokes number and settling parameter, are in the range typically observed in atmospheric clouds. Observed fluctuations in droplet number concentration and diameter suggest a preference for inhomogeneous mixing. Finally, enhanced variance in liquid water content fluctuations is observed at high frequencies, and the scale break occurs at a value consistent with the independently estimated phase relaxation time from microphysical measurements.
SSeCloud: Using secret sharing scheme to secure keys
NASA Astrophysics Data System (ADS)
Hu, Liang; Huang, Yang; Yang, Disheng; Zhang, Yuzhen; Liu, Hengchang
2017-08-01
With the use of cloud storage services, one of the concerns is how to protect sensitive data securely and privately. While users enjoy the convenience of data storage provided by semi-trusted cloud storage providers, they are confronted with all kinds of risks at the same time. In this paper, we present SSeCloud, a secure cloud storage system that improves security and usability by applying a secret sharing scheme to keys. The system encrypts files on the client side before upload and splits the encryption keys into three shares, stored separately by the user, the cloud storage provider, and an alternative trusted third party. Any two of the parties can reconstruct the keys. Evaluation results of the prototype system show that SSeCloud provides high security without too much performance penalty.
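The "any two of three parties can reconstruct" property is a (2,3)-threshold scheme. A minimal Shamir-style sketch over a prime field (the field size and key handling are simplifying assumptions, not SSeCloud's implementation):

```python
import random

P = 2**61 - 1  # a Mersenne prime, large enough to hold a key share

def split_2_of_3(secret):
    """Threshold t=2 means a random line f(x) = secret + a*x over GF(P);
    the three shares are points on it at x = 1, 2, 3."""
    a = random.randrange(1, P)
    return [(x, (secret + a * x) % P) for x in (1, 2, 3)]

def reconstruct(share1, share2):
    """Recover f(0) = secret from any two points on the line."""
    (x1, y1), (x2, y2) = share1, share2
    inv = pow(x2 - x1, -1, P)        # modular inverse of (x2 - x1)
    a = (y2 - y1) * inv % P
    return (y1 - a * x1) % P
```

One share reveals nothing about the secret (every candidate secret corresponds to some slope a), which is why a single compromised party, user, provider, or third party, does not leak the key.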
Bent, John M.; Faibish, Sorin; Grider, Gary
2016-04-19
Cloud object storage is enabled for checkpoints of high performance computing applications using a middleware process. A plurality of files, such as checkpoint files, generated by a plurality of processes in a parallel computing system are stored by obtaining said plurality of files from said parallel computing system; converting said plurality of files to objects using a log structured file system middleware process; and providing said objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log structured file system middleware process optionally executes on a burst buffer node.
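The file-to-object conversion the middleware performs can be sketched as chunking each checkpoint file into immutable objects keyed by file and offset. The key format below is illustrative, not PLFS's actual on-disk layout:

```python
def files_to_objects(checkpoint_files, chunk_size=4 * 1024 * 1024):
    """Map each process's checkpoint file onto a flat object namespace:
    one immutable object per (file, offset) chunk, so a parallel object
    store can ingest N-process checkpoints without a shared file system."""
    objects = {}
    for path, data in checkpoint_files.items():
        for offset in range(0, len(data), chunk_size):
            key = f"{path}#{offset:012d}"
            objects[key] = data[offset:offset + chunk_size]
    return objects
```

Because each chunk is written once and never rewritten, the conversion is log-structured: restart is a read of the keys in offset order, and a burst buffer node can drain the objects to the cloud asynchronously.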
NASA Astrophysics Data System (ADS)
Hutchison, Keith D.; Etherton, Brian J.; Topping, Phillip C.
1996-12-01
Quantitative assessments of the performance of automated cloud analysis algorithms require the creation of highly accurate, manual cloud/no-cloud (CNC) images from multispectral meteorological satellite data. In general, the methodology to create ground truth analyses for the evaluation of cloud detection algorithms is relatively straightforward. However, when focus shifts toward quantifying the performance of automated cloud classification algorithms, the task of creating ground truth images becomes much more complicated, since these CNC analyses must differentiate between water and ice cloud tops while ensuring that inaccuracies in automated cloud detection are not propagated into the results of the cloud classification algorithm. The process of creating these ground truth CNC analyses may become particularly difficult when little or no spectral signature is evident between a cloud and its background, as appears to be the case when thin cirrus is present over snow-covered surfaces. In this paper, procedures are described that enhance the researcher's ability to manually interpret and differentiate between thin cirrus clouds and snow-covered surfaces in daytime AVHRR imagery. The methodology uses data in up to six AVHRR spectral bands, including an additional band derived from the daytime 3.7 micron channel, which has proven invaluable for the manual discrimination between thin cirrus clouds and snow. It is concluded that the 1.6 micron channel remains essential for differentiating between thin ice clouds and snow; however, this capability may be lost if the 3.7 micron data switch to a nighttime-only transmission with the launch of future NOAA satellites.
Context-aware distributed cloud computing using CloudScheduler
NASA Astrophysics Data System (ADS)
Seuster, R.; Leavett-Brown, CR; Casteels, K.; Driemel, C.; Paterson, M.; Ring, D.; Sobie, RJ; Taylor, RP; Weldon, J.
2017-10-01
The distributed cloud using the CloudScheduler VM provisioning service is one of the longest running systems for HEP workloads. It has run millions of jobs for ATLAS and Belle II over the past few years using private and commercial clouds around the world. Our goal is to scale the distributed cloud to the 10,000-core level, with the ability to run any type of application (low I/O, high I/O and high memory) on any cloud. To achieve this goal, we have been implementing changes that utilize context-aware computing designs that are currently employed in the mobile communication industry. Context-awareness makes use of real-time and archived data to respond to user or system requirements. In our distributed cloud, we have many opportunistic clouds with no local HEP services, software or storage repositories. A context-aware design significantly improves the reliability and performance of our system by locating the nearest location of the required services. We describe how we are collecting and managing contextual information from our workload management systems, the clouds, the virtual machines and our services. This information is used not only to monitor the system but also to carry out automated corrective actions. We are incrementally adding new alerting and response services to our distributed cloud. This will enable us to scale the number of clouds and virtual machines. Further, a context-aware design will enable us to run analysis or high I/O applications on opportunistic clouds. We envisage an open-source HTTP data federation (for example, the DynaFed system at CERN) as a service that would provide us access to existing storage elements used by the HEP experiments.
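At its simplest, context-aware selection of "the nearest location of the required services" reduces to choosing the healthy replica with the lowest measured latency. A minimal sketch (the data shapes and replica names are assumptions, not CloudScheduler's interfaces):

```python
def pick_nearest(services, latency_ms):
    """Choose the reachable replica with the lowest measured latency.
    `latency_ms` maps replica name -> round-trip time in ms, or None
    when the replica is unreachable (contextual monitoring data)."""
    healthy = [s for s in services if latency_ms.get(s) is not None]
    if not healthy:
        raise RuntimeError("no reachable replica for required service")
    return min(healthy, key=lambda s: latency_ms[s])
```

Feeding this choice with continuously refreshed measurements, rather than a static configuration, is what distinguishes the context-aware design: an opportunistic cloud with no local repositories automatically binds to whichever remote one is currently closest.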
NASA Astrophysics Data System (ADS)
Hoose, C.; Lohmann, U.; Stier, P.; Verheggen, B.; Weingartner, E.
2008-04-01
The global aerosol-climate model ECHAM5-HAM has been extended by an explicit treatment of cloud-borne particles. Two additional modes for in-droplet and in-crystal particles are introduced, which are coupled to the number of cloud droplet and ice crystal concentrations simulated by the ECHAM5 double-moment cloud microphysics scheme. Transfer, production, and removal of cloud-borne aerosol number and mass by cloud droplet activation, collision scavenging, aqueous-phase sulfate production, freezing, melting, evaporation, sublimation, and precipitation formation are taken into account. The model performance is demonstrated and validated with observations of the evolution of total and interstitial aerosol concentrations and size distributions during three different mixed-phase cloud events at the alpine high-altitude research station Jungfraujoch (Switzerland). Although the single-column simulations cannot be compared one-to-one with the observations, the governing processes in the evolution of the cloud and aerosol parameters are captured qualitatively well. High scavenged fractions are found during the presence of liquid water, while the release of particles during the Bergeron-Findeisen process results in low scavenged fractions after cloud glaciation. The observed coexistence of liquid and ice, which might be related to cloud heterogeneity at subgrid scales, can only be simulated in the model when assuming nonequilibrium conditions.
Midekisa, Alemayehu; Holl, Felix; Savory, David J; Andrade-Pacheco, Ricardo; Gething, Peter W; Bennett, Adam; Sturrock, Hugh J W
2017-01-01
Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources.
Holl, Felix; Savory, David J.; Andrade-Pacheco, Ricardo; Gething, Peter W.; Bennett, Adam; Sturrock, Hugh J. W.
2017-01-01
Quantifying and monitoring the spatial and temporal dynamics of the global land cover is critical for better understanding many of the Earth's land surface processes. However, the lack of regularly updated, continental-scale, and high spatial resolution (30 m) land cover data limits our ability to better understand the spatial extent and the temporal dynamics of land surface changes. Despite the free availability of high spatial resolution Landsat satellite data, continental-scale land cover mapping using high resolution Landsat satellite data was not feasible until now due to the need for high-performance computing to store, process, and analyze this large volume of high resolution satellite data. In this study, we present an approach to quantify continental land cover and impervious surface changes over a long period of time (15 years) using high resolution Landsat satellite observations and the Google Earth Engine cloud computing platform. The approach applied here to overcome the computational challenges of handling big earth observation data by using cloud computing can help scientists and practitioners who lack high-performance computational resources. PMID:28953943
2009-03-22
[Fragmented report extract; only partial text recovered:] ...the aerosol indirect effect (AIE) index determined from the slope of the fitted linear equation involving cloud particle size vs. aerosol optical depth is about a... ...raindrop... The model simulations were performed for a 48-hour period, starting at 00Z on 29 March 2007, about 20 hours prior to ABL test flight time. (Performing organization: Ms. Kristen Lund, Univ. of California, Los Angeles, CA 90095.)
Yokohama, Noriya
2013-07-01
This report describes the architectural design and performance measurement of a parallel computing environment for Monte Carlo simulation of particle therapy, using high performance computing (HPC) instances within a public cloud-computing infrastructure. Performance measurements showed an approximately 28 times faster speed than seen with a single-thread architecture, combined with improved stability. A study of methods of optimizing the system operations also indicated lower cost.
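A speedup like the ~28x reported comes from splitting independent Monte Carlo histories across workers and summing the tallies at the end. A toy sketch using a pi estimate as a stand-in for particle-transport histories (the thread pool, seeding, and worker count are illustrative, not the report's setup):

```python
import random
from concurrent.futures import ThreadPoolExecutor

def mc_chunk(seed, n):
    """One worker's share of histories: count hits inside the unit
    quarter-circle, using an independent seeded stream."""
    rng = random.Random(seed)
    return sum(1 for _ in range(n) if rng.random() ** 2 + rng.random() ** 2 < 1)

def parallel_pi(n_total, n_workers=4):
    """Scatter histories across workers, reduce with a sum: the same
    embarrassingly-parallel pattern an HPC cloud instance exploits."""
    per = n_total // n_workers
    with ThreadPoolExecutor(max_workers=n_workers) as ex:
        hits = sum(ex.map(mc_chunk, range(n_workers), [per] * n_workers))
    return 4 * hits / (per * n_workers)
```

Because histories are independent, the only communication is the final reduction, which is why Monte Carlo dose calculations scale nearly linearly until the reduction or I/O dominates.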
Large-scale parallel genome assembler over cloud computing environment.
Das, Arghya Kusum; Koppa, Praveen Kumar; Goswami, Sayan; Platania, Richard; Park, Seung-Jong
2017-06-01
The size of high throughput DNA sequencing data has already reached the terabyte scale. To manage this huge volume of data, many downstream sequencing applications started using locality-based computing over different cloud infrastructures to take advantage of elastic (pay as you go) resources at a lower cost. However, the locality-based programming model (e.g. MapReduce) is relatively new. Consequently, developing scalable data-intensive bioinformatics applications using this model, and understanding the hardware environment that these applications require for good performance, both require further research. In this paper, we present a de Bruijn graph oriented Parallel Giraph-based Genome Assembler (GiGA), as well as the hardware platform required for its optimal performance. GiGA uses the power of Hadoop (MapReduce) and Giraph (large-scale graph analysis) to achieve high scalability over hundreds of compute nodes by collocating the computation and data. GiGA achieves significantly higher scalability with competitive assembly quality compared to contemporary parallel assemblers (e.g. ABySS and Contrail) on a traditional HPC cluster. Moreover, we show that the performance of GiGA is significantly improved by using an SSD-based private cloud infrastructure rather than a traditional HPC cluster. We observe that the performance of GiGA on 256 cores of this SSD-based cloud infrastructure closely matches that of 512 cores of a traditional HPC cluster.
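A de Bruijn graph oriented assembler starts from the structure sketched below: nodes are (k-1)-mers, and each k-mer in a read contributes an edge from its prefix to its suffix. This toy construction is sequential; the real assembler distributes the same graph over Giraph workers:

```python
from collections import defaultdict

def de_bruijn_graph(reads, k):
    """Build a de Bruijn graph: for every k-mer, add an edge from its
    (k-1)-mer prefix to its (k-1)-mer suffix. Contigs correspond to
    unbranched paths through this graph."""
    graph = defaultdict(list)
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            graph[kmer[:-1]].append(kmer[1:])
    return dict(graph)
```

Collocating computation and data, as GiGA does, means each worker builds and walks the subgraph for the k-mers hashed to it, so terabyte-scale read sets never funnel through a single node.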
Signal and image processing algorithm performance in a virtual and elastic computing environment
NASA Astrophysics Data System (ADS)
Bennett, Kelly W.; Robertson, James
2013-05-01
The U.S. Army Research Laboratory (ARL) supports the development of classification, detection, tracking, and localization algorithms using multiple sensing modalities including acoustic, seismic, E-field, magnetic field, PIR, and visual and IR imaging. Multimodal sensors collect large amounts of data in support of algorithm development. The resulting large amount of data, and their associated high-performance computing needs, increases and challenges existing computing infrastructures. Purchasing computer power as a commodity using a Cloud service offers low-cost, pay-as-you-go pricing models, scalability, and elasticity that may provide solutions to develop and optimize algorithms without having to procure additional hardware and resources. This paper provides a detailed look at using a commercial cloud service provider, such as Amazon Web Services (AWS), to develop and deploy simple signal and image processing algorithms in a cloud and run the algorithms on a large set of data archived in the ARL Multimodal Signatures Database (MMSDB). Analytical results will provide performance comparisons with existing infrastructure. A discussion on using cloud computing with government data will discuss best security practices that exist within cloud services, such as AWS.
Enabling Large-Scale Biomedical Analysis in the Cloud
Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen
2013-01-01
Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. The planet-size data brings serious challenges to storage and computing technologies. Cloud computing is a compelling alternative because it jointly provides storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications would facilitate biomedical research by making the vast amount of diverse data meaningful and usable. PMID:24288665
Cloud Computing for Protein-Ligand Binding Site Comparison
2013-01-01
The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery. PMID:23762824
Cloud computing for protein-ligand binding site comparison.
Hung, Che-Lun; Hua, Guan-Jie
2013-01-01
The proteome-wide analysis of protein-ligand binding sites and their interactions with ligands is important in structure-based drug design and in understanding ligand cross reactivity and toxicity. The well-known and commonly used software, SMAP, has been designed for 3D ligand binding site comparison and similarity searching of a structural proteome. SMAP can also predict drug side effects and reassign existing drugs to new indications. However, the computing scale of SMAP is limited. We have developed a high availability, high performance system that expands the comparison scale of SMAP. This cloud computing service, called Cloud-PLBS, combines the SMAP and Hadoop frameworks and is deployed on a virtual cloud computing platform. To handle the vast amount of experimental data on protein-ligand binding site pairs, Cloud-PLBS exploits the MapReduce paradigm as a management and parallelizing tool. Cloud-PLBS provides a web portal and scalability through which biologists can address a wide range of computer-intensive questions in biology and drug discovery.
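The MapReduce decomposition Cloud-PLBS applies to all-vs-all binding-site comparison can be sketched as a map step that emits every pair to compare and a reduce step that filters by similarity score. The function names and score format below are assumptions for illustration, not SMAP's or Cloud-PLBS's API:

```python
from itertools import combinations

def map_pairs(site_ids):
    """Map step: emit every unordered binding-site pair; each pair is an
    independent SMAP comparison task that Hadoop can schedule anywhere."""
    return list(combinations(sorted(site_ids), 2))

def reduce_scores(scored_pairs, threshold):
    """Reduce step: collect the pairs whose similarity score passes the
    threshold (scores here are hypothetical placeholders)."""
    return {pair for pair, score in scored_pairs.items() if score >= threshold}
```

Because the pairwise tasks share no state, the n*(n-1)/2 comparisons parallelize trivially, which is exactly the scale limitation of standalone SMAP that the MapReduce deployment removes.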
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud.
Cianfrocco, Michael A; Leschziner, Andres E
2015-05-08
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available 'off-the-shelf' computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16-480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud computing environment may offer a viable computing environment for cryo-EM.
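The quoted $50 to $1500 per structure is a compute-plus-storage estimate. A back-of-envelope sketch (the prices are placeholder parameters, not current AWS rates):

```python
def structure_cost(n_instances, hours, price_per_hour, storage_gb=0,
                   price_per_gb_month=0.10, months=1):
    """Rough cloud cost of one cryo-EM refinement: instance-hours of
    compute plus GB-months of storage. All prices are placeholders."""
    compute = n_instances * hours * price_per_hour
    storage = storage_gb * price_per_gb_month * months
    return compute + storage
```

For example, a 10-instance cluster run for 24 hours at a hypothetical $0.50/hour lands at $120, inside the quoted range; the spread in the estimate comes mostly from dataset size and how many refinement rounds a structure needs.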
Theoretical Models of Protostellar Binary and Multiple Systems with AMR Simulations
NASA Astrophysics Data System (ADS)
Matsumoto, Tomoaki; Tokuda, Kazuki; Onishi, Toshikazu; Inutsuka, Shu-ichiro; Saigo, Kazuya; Takakuwa, Shigehisa
2017-05-01
We present theoretical models for protostellar binary and multiple systems based on the high-resolution numerical simulation with an adaptive mesh refinement (AMR) code, SFUMATO. The recent ALMA observations have revealed early phases of the binary and multiple star formation with high spatial resolutions. These observations should be compared with theoretical models with high spatial resolutions. We present two theoretical models for (1) a high density molecular cloud core, MC27/L1521F, and (2) a protobinary system, L1551 NE. For the model for MC27, we performed numerical simulations for gravitational collapse of a turbulent cloud core. The cloud core exhibits fragmentation during the collapse, and dynamical interaction between the fragments produces an arc-like structure, which is one of the prominent structures observed by ALMA. For the model for L1551 NE, we performed numerical simulations of gas accretion onto protobinary. The simulations exhibit asymmetry of a circumbinary disk. Such asymmetry has been also observed by ALMA in the circumbinary disk of L1551 NE.
Unraveling the martian water cycle with high-resolution global climate simulations
NASA Astrophysics Data System (ADS)
Pottier, Alizée; Forget, François; Montmessin, Franck; Navarro, Thomas; Spiga, Aymeric; Millour, Ehouarn; Szantai, André; Madeleine, Jean-Baptiste
2017-07-01
Global climate modeling of the Mars water cycle is usually performed at relatively coarse resolution (200-300 km), which may not be sufficient to properly represent the impact of waves, fronts, and topographic effects on the detailed structure of clouds and surface ice deposits. Here, we present new numerical simulations of the annual water cycle performed at a resolution of 1° × 1° (∼60 km in latitude). The model includes the radiative effects of clouds, whose influence on the thermal structure and atmospheric dynamics is significant; we therefore also examine simulations with inactive clouds to distinguish the direct impact of resolution on circulation and winds from its indirect impact via water ice clouds. To first order, we find that high resolution does not dramatically change the behavior of the system, and that simulations performed at ∼200 km resolution capture the behavior of the simulated water cycle and Mars climate well. Nevertheless, a detailed comparison between high- and low-resolution simulations, with reference to observations, reveals several significant changes that affect our understanding of the water cycle active on Mars today. The key northern cap-edge dynamics are affected by an increase in baroclinic wave strength, which complicates northern summer dynamics. Because southern dynamics are also influenced, south polar frost deposition is modified, with a westward longitudinal shift. Baroclinic wave mode transitions are observed, and new transient phenomena appear, such as spiral and streak clouds, already documented in observations. Atmospheric circulation cells in the polar regions exhibit large variability and fine structure, including slope winds. Most modeled phenomena affected by high resolution paint the picture of a more turbulent planet, with additional variability. This is challenging for long-period climate studies.
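The quoted grid spacings follow directly from the planet's radius; a quick back-of-the-envelope sketch (the Mars mean radius of 3389.5 km is our assumed value, not stated in the abstract):

```python
import math

MARS_RADIUS_KM = 3389.5  # assumed mean radius of Mars

def grid_spacing_km(resolution_deg, radius_km=MARS_RADIUS_KM):
    """Meridional distance spanned by one grid cell of the given angular size."""
    return math.pi * radius_km * resolution_deg / 180.0

print(round(grid_spacing_km(1.0), 1))   # 1 deg cell: ~59 km in latitude
print(round(grid_spacing_km(3.75), 1))  # a typical coarse cell: ~222 km
```

The same relation shows why a 200-300 km run corresponds to cells of roughly 3.5°-5° in latitude.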
Performance Evaluation of Cloud Service Considering Fault Recovery
NASA Astrophysics Data System (ADS)
Yang, Bo; Tan, Feng; Dai, Yuan-Shun; Guo, Suchang
In cloud computing, cloud service performance is an important issue. To improve cloud service reliability, fault recovery may be used; however, fault recovery can itself affect the performance of the cloud service. In this paper, we conduct a preliminary study of this issue. Cloud service performance is quantified by the service response time, whose probability density function and mean are derived.
Cloud cover detection combining high dynamic range sky images and ceilometer measurements
NASA Astrophysics Data System (ADS)
Román, R.; Cazorla, A.; Toledano, C.; Olmo, F. J.; Cachorro, V. E.; de Frutos, A.; Alados-Arboledas, L.
2017-11-01
This paper presents a new algorithm for cloud detection based on high dynamic range images from a sky camera and ceilometer measurements. The algorithm is also able to detect obstruction of the sun. This algorithm, called CPC (Camera Plus Ceilometer), is based on the assumption that under cloud-free conditions the sky field must show symmetry; the symmetry criteria are applied depending on ceilometer measurements of the cloud base height. The CPC algorithm is applied at two Spanish locations (Granada and Valladolid). Its performance in retrieving the sun condition (obstructed or unobstructed) is analyzed in detail using pyranometer measurements at Granada as reference. CPC retrievals agree with those derived from the reference pyranometer in 85% of the cases, and this agreement appears independent of aerosol size and optical depth. The agreement drops to only 48% when another algorithm, based on the Red-Blue Ratio (RBR), is applied to the sky camera images. The retrieved cloud cover at Granada and Valladolid is compared with that registered by trained meteorological observers. CPC cloud cover agrees with the reference, showing a slight overestimation and a mean absolute error of around 1 okta. A major advantage of the CPC algorithm over the RBR method is that the determined cloud cover is independent of aerosol properties, whereas the RBR algorithm overestimates cloud cover for coarse aerosols and high aerosol loads. Cloud cover obtained from the ceilometer alone shows results similar to the CPC algorithm, but the horizontal distribution cannot be obtained. In addition, under rapid and strong changes in cloud cover, ceilometer retrievals fit the real cloud cover less well.
Heads in the Cloud: A Primer on Neuroimaging Applications of High Performance Computing.
Shatil, Anwar S; Younas, Sohail; Pourreza, Hossein; Figley, Chase R
2015-01-01
With larger data sets and more sophisticated analyses, it is becoming increasingly common for neuroimaging researchers to push (or exceed) the limitations of standalone computer workstations. Nonetheless, although high-performance computing platforms such as clusters, grids and clouds are already in routine use by a small handful of neuroimaging researchers to increase their storage and/or computational power, the adoption of such resources by the broader neuroimaging community remains relatively uncommon. Therefore, the goal of the current manuscript is to: 1) inform prospective users about the similarities and differences between computing clusters, grids and clouds; 2) highlight their main advantages; 3) discuss when it may (and may not) be advisable to use them; 4) review some of their potential problems and barriers to access; and finally 5) give a few practical suggestions for how interested new users can start analyzing their neuroimaging data using cloud resources. Although the aim of cloud computing is to hide most of the complexity of the infrastructure management from end-users, we recognize that this can still be an intimidating area for cognitive neuroscientists, psychologists, neurologists, radiologists, and other neuroimaging researchers lacking a strong computational background. Therefore, with this in mind, we have aimed to provide a basic introduction to cloud computing in general (including some of the basic terminology, computer architectures, infrastructure and service models, etc.), a practical overview of the benefits and drawbacks, and a specific focus on how cloud resources can be used for various neuroimaging applications.
NASA Astrophysics Data System (ADS)
Schmidt, T.; Kalisch, J.; Lorenz, E.; Heinemann, D.
2015-10-01
Clouds are the dominant source of variability in surface solar radiation and of uncertainty in its prediction. At the same time, the increasing share of solar energy in the worldwide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a shortest-term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A two-month dataset with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over a 10 km by 12 km area is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min ahead with an update interval of 15 s. A cloud type classification is used to separate the time series into different cloud scenarios. Overall, the sky-imager-based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depend strongly on the predominant cloud conditions. Convective-type clouds in particular lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is lower than that introduced by using a single pyranometer as representative of the whole area at distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher for these conditions than for overcast or clear-sky situations, which cause low GHI variability that is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.
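Forecast skill here is measured relative to persistence; a minimal sketch of a conventional RMSE-based skill score (the metric choice and the GHI values below are our illustrative assumptions, not taken from the study):

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between a forecast and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def forecast_skill(pred, obs, reference):
    """Skill relative to a reference forecast (e.g. persistence).

    Positive: better than the reference; zero: equal; negative: worse.
    """
    return 1.0 - rmse(pred, obs) / rmse(reference, obs)

obs = [500, 480, 300, 520, 450]   # GHI in W/m^2 (made-up values)
persistence = [500] * 5           # last observed value carried forward
forecast = [495, 470, 340, 505, 460]
print(round(forecast_skill(forecast, obs, persistence), 2))  # ~0.78
```

With low-variability (clear-sky or overcast) series, persistence is already accurate and the skill of any competing forecast tends toward zero or below, which matches the behavior reported in the abstract.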
ERIC Educational Resources Information Center
Fredette, Michelle
2012-01-01
"Rent or buy?" is a question people ask about everything from housing to textbooks. It is also a question universities must consider when it comes to high-performance computing (HPC). With the advent of Amazon's Elastic Compute Cloud (EC2), Microsoft Windows HPC Server, Rackspace's OpenStack, and other cloud-based services, researchers now have…
NASA Astrophysics Data System (ADS)
Nakatsuji, Noriaki; Matsushima, Kyoji
2017-03-01
Full-parallax high-definition CGHs composed of more than a billion pixels have so far been created only by the polygon-based method, owing to its high performance. However, GPUs now allow us to generate CGHs much faster using the point-cloud method. In this paper, we measure the computation time of object fields for full-parallax high-definition CGHs, composed of 4 billion pixels and reconstructing the same scene, using the point-cloud method with a GPU and the polygon-based method with a CPU. In addition, we compare the optical and simulated reconstructions of CGHs created by these techniques to verify image quality.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Michael J.; Hayes, Daniel J
2014-01-01
Use of Landsat data to answer ecological questions is contingent on the effective removal of cloud and cloud shadow from satellite images. We develop a novel algorithm, SPARCS (Spatial Procedures for Automated Removal of Cloud and Shadow), to identify and classify clouds and cloud shadow. The method uses neural networks to determine cloud, cloud-shadow, water, snow/ice, and clear-sky membership of each pixel in a Landsat scene, and then applies a set of procedures to enforce spatial rules. In a comparison to FMask, a high-quality cloud and cloud-shadow classification algorithm currently available, SPARCS performs favorably, with similar omission errors for clouds (0.8% and 0.9%, respectively), substantially lower omission error for cloud shadow (8.3% and 1.1%), and fewer errors of commission (7.8% and 5.0%). Additionally, SPARCS provides a measure of uncertainty in its classification that can be exploited by other processes that use the cloud and cloud-shadow detection. To illustrate this, we present an application that constructs obstruction-free composites of images acquired on different dates in support of algorithms detecting vegetation change.
Are Cloud Environments Ready for Scientific Applications?
NASA Astrophysics Data System (ADS)
Mehrotra, P.; Shackleford, K.
2011-12-01
Cloud computing environments are becoming widely available both in the commercial and government sectors. They provide flexibility to rapidly provision resources in order to meet dynamic and changing computational needs without the customers incurring capital expenses and/or requiring technical expertise. Clouds also provide reliable access to resources even though the end-user may not have in-house expertise for acquiring or operating such resources. Consolidation and pooling in a cloud environment allow organizations to achieve economies of scale in provisioning or procuring computing resources and services. Because of these and other benefits, many businesses and organizations are migrating their business applications (e.g., websites, social media, and business processes) to cloud environments, as evidenced by the commercial success of offerings such as Amazon EC2. In this paper, we focus on the feasibility of utilizing cloud environments for scientific workloads and workflows of particular interest to NASA scientists and engineers. There is a wide spectrum of such technical computations. These applications range from small workstation-level computations to mid-range computing requiring small clusters to high-performance simulations requiring supercomputing systems with high bandwidth/low latency interconnects. Data-centric applications manage and manipulate large data sets such as satellite observational data and/or data previously produced by high-fidelity modeling and simulation computations. Most of the applications are run in batch mode with static resource requirements. However, there do exist situations that have dynamic demands, particularly ones with public-facing interfaces providing information to the general public, collaborators and partners, as well as to internal NASA users. In the last few months we have been studying the suitability of cloud environments for NASA's technical and scientific workloads.
We have ported several applications to multiple cloud environments including NASA's Nebula environment, Amazon's EC2, Magellan at NERSC, and SGI's Cyclone system. We critically examined the performance of the applications on these systems. We also collected information on the usability of these cloud environments. In this talk we will present the results of our study focusing on the efficacy of using clouds for NASA's scientific applications.
Bent, John M.; Faibish, Sorin; Grider, Gary
2015-06-30
A middleware process enables cloud object storage for the archived data, such as checkpoints and results, of high-performance computing applications. A plurality of archived files, such as checkpoint files and results, generated by a plurality of processes in a parallel computing system are stored by obtaining the plurality of archived files from the parallel computing system; converting the plurality of archived files to objects using a log-structured file system middleware process; and providing the objects for storage in a cloud object storage system. The plurality of processes may run, for example, on a plurality of compute nodes. The log-structured file system middleware process may be embodied, for example, as a Parallel Log-Structured File System (PLFS). The log-structured file system middleware process optionally executes on a burst buffer node.
Improving Individual Acceptance of Health Clouds through Confidentiality Assurance.
Ermakova, Tatiana; Fabian, Benjamin; Zarnekow, Rüdiger
2016-10-26
Cloud computing promises to essentially improve healthcare delivery performance. However, shifting sensitive medical records to third-party cloud providers could create an adoption hurdle because of security and privacy concerns. This study examines the effect of confidentiality assurance in a cloud-computing environment on individuals' willingness to accept the infrastructure for inter-organizational sharing of medical data. We empirically investigate our research question through a survey with over 260 full responses. For the setting with high confidentiality assurance, we build on a recent multi-cloud architecture that provides very high confidentiality assurance through a secret-sharing mechanism: health information is cryptographically encoded and distributed in a way that no single cloud provider, and no small group of providers, is able to decode it. Our results indicate the importance of confidentiality assurance in individuals' acceptance of health clouds for sensitive medical data. Specifically, this finding holds for a variety of practically relevant circumstances, i.e., in the absence and despite the presence of conventional offline alternatives and along with pseudonymization. On the other hand, we do not find support for the effect of confidentiality assurance in individuals' acceptance of health clouds for non-sensitive medical data. These results could support the process of privacy engineering for health-cloud solutions.
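The secret-sharing idea behind such multi-cloud architectures can be illustrated with the simplest possible scheme; the following is an n-of-n XOR sketch of our own (the study's architecture uses a more sophisticated threshold scheme, where even small coalitions of providers learn nothing):

```python
import secrets
from functools import reduce

def xor_bytes(a, b):
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

def split(secret, n):
    """Split a byte string into n shares; all n are needed to reconstruct."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(reduce(xor_bytes, shares, secret))  # last share closes the XOR
    return shares

def combine(shares):
    """XOR all shares back together to recover the secret."""
    return reduce(xor_bytes, shares)

record = b"patient-42: HbA1c 6.1%"     # made-up medical record
shares = split(record, 3)              # e.g. one share per cloud provider
print(combine(shares) == record)       # True: all providers together decode
```

A real deployment would use a threshold scheme such as Shamir's secret sharing, so that any k of n providers suffice for reconstruction while fewer than k learn nothing.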
A Robust Multi-Scale Modeling System for the Study of Cloud and Precipitation Processes
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
During the past decade, numerical weather and global non-hydrostatic models have started using more complex microphysical schemes originally developed for high-resolution cloud-resolving models (CRMs) with horizontal resolutions of 1-2 km or less. These microphysical schemes affect the dynamics through the release of latent heat (buoyancy loading and pressure gradients), the radiation through cloud coverage (the vertical distribution of cloud species), and surface processes through rainfall (both amount and intensity). Recently, several major improvements to ice microphysical processes (or schemes) have been developed for a cloud-resolving model (the Goddard Cumulus Ensemble, GCE, model) and a regional-scale model (the Weather Research and Forecasting, WRF, model). These improvements include an improved 3-ICE (cloud ice, snow and graupel) scheme (Lang et al. 2010), a 4-ICE (cloud ice, snow, graupel and hail) scheme, a spectral bin microphysics scheme, and two different two-moment microphysics schemes. The performance of these schemes has been evaluated using observational data from TRMM and other major field campaigns. In this talk, we will present high-resolution (1 km) GCE and WRF model simulations and compare the simulated results with observations from recent field campaigns [i.e., midlatitude continental spring season (MC3E; 2010), high-latitude cold season (C3VP, 2007; GCPEx, 2012), and tropical oceanic (TWP-ICE, 2006)].
Hao, Liqing; Romakkaniemi, Sami; Kortelainen, Aki; Jaatinen, Antti; Portin, Harri; Miettinen, Pasi; Komppula, Mika; Leskinen, Ari; Virtanen, Annele; Smith, James N; Sueper, Donna; Worsnop, Douglas R; Lehtinen, Kari E J; Laaksonen, Ari
2013-03-19
This study presents results of direct observations of aerosol chemical composition in clouds. A high-resolution time-of-flight aerosol mass spectrometer was used to measure cloud interstitial particles (INT) and mixed cloud interstitial and droplet residual particles (TOT); the difference between the two gives the cloud droplet residuals (RES). Positive matrix factorization analysis of the high-resolution mass spectral data sets and theoretical calculations were performed to yield distributions of the chemical composition of the INT and RES particles. We observed that less-oxidized hydrocarbon-like organic aerosols (HOA) were mainly distributed into the INT particles, whereas more-oxidized low-volatility oxygenated OA (LVOOA) were found mainly in the RES particles. Nitrates existed as organic nitrate and as NH4NO3. Organic nitrates accounted for 45% of total nitrates in the INT particles, in clear contrast to 26% in the RES particles. Meanwhile, sulfates coexisted as acidic NH4HSO4 and neutralized (NH4)2SO4. Acidic sulfate made up 64.8% of total sulfates in the INT particles, much higher than the 10.7% in the RES particles. The results indicate a possible joint effect of the activation ability of aerosol particles, cloud processing, and particle-size effects on cloud formation.
Large-scale high-throughput computer-aided discovery of advanced materials using cloud computing
NASA Astrophysics Data System (ADS)
Bazhirov, Timur; Mohammadi, Mohammad; Ding, Kevin; Barabash, Sergey
Recent advances in cloud computing have made it possible to access large-scale computational resources completely on demand in a rapid and efficient manner. Combined with high-fidelity simulations, they serve as an alternative pathway to enable computational discovery and design of new materials through large-scale high-throughput screening. Here, we present a case study for a cloud platform implemented at Exabyte Inc. We perform calculations to screen lightweight ternary alloys for thermodynamic stability. Due to the lack of experimental data for most such systems, we rely on theoretical approaches based on first-principles pseudopotential density functional theory. We calculate the formation energies for a set of ternary compounds approximated by special quasirandom structures. During an example run we were able to scale to 10,656 CPUs within 7 minutes from the start and obtain results for 296 compounds within 38 hours. The results indicate that the formation enthalpy of ternary systems can be negative for some lightweight alloys, including Li and Mg compounds. We conclude that, compared to the traditional capital-intensive approach requiring investment in on-premises hardware, cloud computing is agile and cost-effective, yet scalable and delivers similar performance.
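The stability screening described above reduces to a simple energy balance per compound; a minimal sketch of the formation-energy bookkeeping (all numerical energies below are made-up placeholders, not Exabyte results):

```python
def formation_energy_per_atom(e_total, composition, e_ref):
    """Formation energy per atom relative to elemental references.

    e_total     : total energy of the compound cell (eV), e.g. from DFT
    composition : dict element -> number of atoms in the cell
    e_ref       : dict element -> energy per atom of the pure element (eV)
    A negative value indicates stability against decomposition into
    the pure elements.
    """
    n_atoms = sum(composition.values())
    e_elements = sum(n * e_ref[el] for el, n in composition.items())
    return (e_total - e_elements) / n_atoms

# Hypothetical numbers for a Li-Mg-Al special quasirandom structure:
e_f = formation_energy_per_atom(
    e_total=-64.2,
    composition={"Li": 8, "Mg": 8, "Al": 8},
    e_ref={"Li": -1.90, "Mg": -1.50, "Al": -3.75},
)
print(f"{e_f:+.3f} eV/atom")  # negative -> stable vs. the pure elements
```

In a high-throughput run this function is evaluated once per candidate compound, with the expensive part (the total energies) computed in parallel across cloud instances.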
A Highly Scalable Data Service (HSDS) using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.; Henderson, J.; Willmore, F.
2017-12-01
Cloud-based infrastructure may offer several key benefits of scalability, built-in redundancy, security mechanisms and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and legacy software systems developed for online data repositories within the federal government were not developed with cloud-based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Moreover, services based on object storage are well established and provided by all the leading cloud service providers (Amazon Web Services, Microsoft Azure, Google Cloud, etc.), and can often provide unmatched "scale-out" capabilities and data availability to a large and growing consumer base at a price point unachievable with in-house solutions. We describe a system that utilizes object storage rather than traditional file-system-based storage to serve Earth science data. The system described is not only cost effective, but shows a performance advantage for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API-compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
An efficient global energy optimization approach for robust 3D plane segmentation of point clouds
NASA Astrophysics Data System (ADS)
Dong, Zhen; Yang, Bisheng; Hu, Pingbo; Scherer, Sebastian
2018-03-01
Automatic 3D plane segmentation is necessary for many applications, including point cloud registration, building information model (BIM) reconstruction, simultaneous localization and mapping (SLAM), and point cloud compression. However, most existing 3D plane segmentation methods still suffer from low precision and recall and from inaccurate and incomplete boundaries, especially for low-quality point clouds collected by RGB-D sensors. To overcome these challenges, this paper formulates the plane segmentation problem as a global energy optimization, which is robust to high levels of noise and clutter. First, the proposed method divides the raw point cloud into multiscale supervoxels, taking planar supervoxels and the individual points of nonplanar supervoxels as basic units. Then, an efficient hybrid region-growing algorithm generates an initial plane set by incrementally merging adjacent basic units with similar features. Next, the initial plane set is further enriched and refined in a mutually reinforcing manner under the framework of global energy optimization. Finally, the performance of the proposed method is evaluated with respect to six metrics (i.e., plane precision, plane recall, under-segmentation rate, over-segmentation rate, boundary precision, and boundary recall) on two benchmark datasets. Comprehensive experiments demonstrate that the proposed method obtains good performance both in high-quality TLS point clouds (i.e., http://SEMANTIC3D.NET)
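The basic unit of any such pipeline is a least-squares plane fit to a group of points; a minimal PCA/SVD sketch of that building block (our own illustration, not the paper's implementation):

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through a set of 3D points.

    Returns (centroid, unit normal, rms residual). The normal is the
    direction of least variance of the centered points, i.e. the last
    right singular vector of the SVD -- the standard PCA plane fit used
    as a building block in region-growing plane segmentation.
    """
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    _, s, vt = np.linalg.svd(pts - centroid)
    normal = vt[-1]                       # smallest-variance direction
    rms = s[-1] / np.sqrt(len(pts))       # out-of-plane RMS residual
    return centroid, normal, rms

# Noiseless points on the plane z = 0 recover the normal (0, 0, 1):
rng = np.random.default_rng(0)
xy = rng.uniform(-1, 1, size=(100, 2))
pts = np.c_[xy, np.zeros(100)]
c, n, r = fit_plane(pts)
print(np.abs(n))  # ~ [0, 0, 1]
```

Region growing would then merge an adjacent unit into a plane when its points keep the fitted residual below a threshold and its normal agrees with the plane's.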
Cloud4Psi: cloud computing for 3D protein structure similarity searching.
Mrozek, Dariusz; Małysiak-Mrozek, Bożena; Kłapciński, Artur
2014-10-01
Popular methods for 3D protein structure similarity searching, especially those that generate high-quality alignments, such as Combinatorial Extension (CE) and Flexible structure Alignment by Chaining Aligned fragment pairs allowing Twists (FATCAT), are still time consuming. As a consequence, performing similarity searches against large repositories of structural data requires increased computational resources that are not always available. Cloud computing provides huge amounts of computational power that can be provisioned on a pay-as-you-go basis. We have developed a cloud-based system, Cloud4Psi (Cloud for Protein Similarity), that allows the similarity searching process to scale vertically and horizontally. Cloud4Psi was tested in the Microsoft Azure cloud environment and provided good, almost linearly proportional acceleration when scaled out onto many computational units. Cloud4Psi is available as Software as a Service for testing purposes at http://cloud4psi.cloudapp.net/. For source code and software availability, please visit the Cloud4Psi project home page at http://zti.polsl.pl/dmrozek/science/cloud4psi.htm. © The Author 2014. Published by Oxford University Press.
Changes in cloud properties over East Asia deduced from the CLARA-A2 satellite data record
NASA Astrophysics Data System (ADS)
Benas, Nikos; Fokke Meirink, Jan; Hollmann, Rainer; Karlsson, Karl-Göran; Stengel, Martin
2017-04-01
Studies on cloud properties and processes, and their role in the Earth's changing climate, have advanced during the past decades. A significant part of this advance was enabled by satellite measurements, which offer global and continuous monitoring. Recently, a new satellite-based cloud data record was released: the CM SAF cLoud, Albedo and surface RAdiation dataset from AVHRR data, second edition (CLARA-A2), which includes high-resolution cloud macro- and micro-physical properties derived from the AVHRR instruments on board the NOAA and MetOp polar orbiters. Based on this data record, an analysis of cloud property changes over East Asia during the 12-year period 2004-2015 was performed. Significant changes were found in both optical and geometric cloud properties, including increases in cloud liquid water path and cloud top height. The cloud droplet number concentration (CDNC) was specifically studied in order to gain further insight into possible connections between aerosol and cloud processes. To this end, aerosol and cloud observations from MODIS, covering the same area and period, were included in the analysis.
CloudMC: a cloud computing application for Monte Carlo simulation.
Miras, H; Jiménez, R; Miras, C; Gomà, C
2013-04-21
This work presents CloudMC, a cloud computing application, developed in Windows Azure® (the Microsoft® cloud platform), for the parallelization of Monte Carlo simulations in a dynamic virtual cluster. CloudMC is a web application designed to be independent of the Monte Carlo code on which the simulations are based: the simulations just need to be of the form input files → executable → output files. To study the performance of CloudMC in Windows Azure®, Monte Carlo simulations with PENELOPE were performed for different instance (virtual machine) sizes and different numbers of instances. The instance size was found to have no effect on the simulation runtime. It was also found that the decrease in time with the number of instances followed Amdahl's law, with a slight deviation due to the increase in the fraction of non-parallelizable time with increasing number of instances. A simulation that would have required 30 h of CPU time on a single instance was completed in 48.6 min when executed on 64 instances in parallel (a speedup of 37×). Furthermore, the use of cloud computing for parallel computing offers some advantages over conventional clusters: high accessibility, scalability and pay-per-usage. Therefore, it is strongly believed that cloud computing will play an important role in making Monte Carlo dose calculation a reality in future clinical practice.
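The reported numbers are consistent with Amdahl's law for an almost fully parallelizable simulation; a quick sketch (the 30 h and 48.6 min figures are from the abstract, the inferred parallel fraction is our back-calculation):

```python
def amdahl_speedup(p, n):
    """Amdahl's law: speedup on n workers for parallelizable fraction p."""
    return 1.0 / ((1.0 - p) + p / n)

def parallel_fraction(speedup, n):
    """Invert Amdahl's law: fraction p implied by an observed speedup."""
    return (1.0 - 1.0 / speedup) / (1.0 - 1.0 / n)

serial_min, parallel_min, n = 30 * 60, 48.6, 64
s = serial_min / parallel_min
print(round(s, 1))                        # ~37.0x observed speedup
print(round(parallel_fraction(s, n), 3))  # ~0.988: only ~1% effectively serial
```

The residual serial fraction is what produces the "slight deviation" from ideal scaling noted in the abstract: as the instance count grows, that fixed 1% increasingly dominates the runtime.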
Supernovae-generated high-velocity compact clouds
NASA Astrophysics Data System (ADS)
Yalinewich, A.; Beniamini, P.
2018-05-01
Context. A previous study claimed the discovery of an intermediate-mass black hole (IMBH). This hypothetical black hole was invoked in order to explain the high velocity dispersion in one of several dense molecular clouds near the Galactic center. The same study considered the possibility that this cloud was due to a supernova explosion, but disqualified this scenario because no X-rays were detected. Aims: Here we check whether a supernova explosion could have produced that cloud, and whether this explanation is more likely than an IMBH. More specifically, we wish to determine whether a supernova inside a dense molecular cloud would emit in X-rays. Methods: We approached this problem from two directions. First, we performed an analytic calculation of the cooling rate by thermal bremsstrahlung and compared the resulting cooling time to the lifetime of the cloud. Second, we estimated the creation rate of these dense clouds in the central molecular zone (CMZ) region near the Galactic center, where they were observed. Based on this rate, we can place lower bounds on the total mass of IMBHs and clouds and compare these to the masses of the components of the CMZ. Results: We find that the cooling time of a supernova remnant inside a molecular cloud is shorter than its dynamical time. This means that the temperature in such a remnant would be much lower than that of a typical supernova remnant; at such a low temperature, the remnant is not expected to emit in X-rays. We also find that explaining the rate at which such dense clouds are created requires fine-tuning the number of IMBHs. Conclusions: We find the supernova model to be a more likely explanation for the formation of high-velocity compact clouds than an IMBH.
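The cooling-time argument can be illustrated with a back-of-the-envelope estimate. The sketch below is not the paper's calculation: the density, temperature, cloud size, and shock speed are illustrative placeholders, and the free-free emissivity is the standard order-of-magnitude scaling for an ionized hydrogen plasma.

```python
import math

K_B = 1.380649e-16   # Boltzmann constant, erg/K

def brems_cooling_time(n_cm3, T_K):
    """Rough free-free (bremsstrahlung) cooling time for a fully ionized
    hydrogen plasma of hydrogen density n: thermal energy density 3*n*k*T
    (electrons plus ions) over the emissivity
    Lambda_ff ~ 1.4e-27 * sqrt(T) * n_e * n_i  erg cm^-3 s^-1."""
    thermal = 3.0 * n_cm3 * K_B * T_K
    lam = 1.4e-27 * math.sqrt(T_K) * n_cm3 * n_cm3
    return thermal / lam  # seconds

def dynamical_time(radius_cm, v_cm_s):
    """Crossing time of the remnant: size over expansion speed."""
    return radius_cm / v_cm_s

# Illustrative (assumed) parameters, not values from the paper:
n = 1e4     # cm^-3, dense molecular-cloud gas
T = 1e7     # K, shocked gas
t_cool = brems_cooling_time(n, T)
t_dyn = dynamical_time(1.0 * 3.086e18, 1e7)  # ~1 pc cloud, ~100 km/s shock
print(t_cool / 3.156e7, t_dyn / 3.156e7)     # both in years
```

Even with these placeholder numbers, the cooling time comes out shorter than the dynamical time, which is the qualitative behavior the paper relies on.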
Spontaneous Ad Hoc Mobile Cloud Computing Network
Lacuesta, Raquel; Lloret, Jaime; Sendra, Sandra; Peñalver, Lourdes
2014-01-01
Cloud computing helps users and companies share computing resources instead of relying on local servers or personal devices to run their applications. Smart devices are becoming some of the main information-processing devices, and their computing features are reaching levels that let them create a mobile cloud computing network. Sometimes, however, they cannot create such a network and collaborate actively in the cloud, because it is difficult for them to easily build a spontaneous network and configure its parameters. For this reason, in this paper we present the design and deployment of a spontaneous ad hoc mobile cloud computing network. To realize it, we have developed a trusted algorithm that manages the activity of the nodes as they join and leave the network. The paper describes the network procedures and classes that have been designed. Our simulation results using Castalia show that our proposal offers good efficiency and network performance even with a high number of nodes. PMID:25202715
Cloud and Aerosol Measurements from the GLAS Polar Orbiting Lidar: First Year Results
NASA Technical Reports Server (NTRS)
Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.; Welton, E. J.
2004-01-01
The Geoscience Laser Altimeter System (GLAS), launched in 2003, is the first polar-orbiting satellite lidar. The instrument was designed for high-performance observations of the distribution and optical scattering cross sections of clouds and aerosols. GLAS is approaching six months of on-orbit data operation. Data from thousands of orbits illustrate the ability of space lidar to measure the height distribution of global cloud and aerosol accurately and to an unprecedented degree. There were many intended science applications of the GLAS data, and significant results have already been realized. One application is the accurate height distribution and coverage of global cloud cover, with one goal being to define the limitations and inaccuracies of passive retrievals. Comparison to MODIS cloud retrievals shows notable discrepancies. Initial comparisons to NOAA 14 and 15 satellite cloud retrievals show basic similarity in overall cloud coverage but important differences in height distribution. Because of the especially poor performance of passive cloud retrievals in polar regions, and partly because of high orbit-track densities, the GLAS measurements are by far the most accurate measurements of Arctic and Antarctic cloud cover from space to date. Global aerosol height profiling is a fundamentally new measurement from space with multiple applications. A most important aerosol application is providing input to global aerosol generation and transport models. Another is improved measurement of aerosol optical depth. Derivation of oceanic surface energy flux from PBL and LCL height measurements is a further application of GLAS data being pursued. A special area of work for GLAS data is the correction and application of multiple-scattering effects. Stretching of surface return pulses by more than 40 m from cloud propagation effects, and other interesting multiple-scattering phenomena, have been observed.
As an EOS project instrument, GLAS data products are openly available to the science community. First year results from GLAS are summarized.
MaMR: High-performance MapReduce programming model for material cloud applications
NASA Astrophysics Data System (ADS)
Jing, Weipeng; Tong, Danyu; Wang, Yangang; Wang, Jingyuan; Liu, Yaqiu; Zhao, Peng
2017-02-01
With increasing data sizes in materials science, existing programming models no longer satisfy application requirements. MapReduce is a programming model that enables easy development of scalable parallel applications to process big data on cloud computing systems. However, this model does not directly support the processing of multiple related datasets, and its processing performance does not reflect the advantages of cloud computing. To enhance the capability of workflow applications in material data processing, we defined MaMR, a programming model for material cloud applications that supports multiple different Map and Reduce functions running concurrently, based on a hybrid shared-memory BSP model. We also designed an optimized data-sharing strategy to supply shared data to the different Map and Reduce stages. In addition, we added a new merge phase to MapReduce that can efficiently merge data from the map and reduce modules. Experiments showed that the model and framework deliver effective performance improvements over previous work.
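The map → shuffle → reduce → merge flow described above can be illustrated with a minimal in-memory sketch. The function names and the word-count example are our own, not MaMR's API; the merge step plays the role of the extra phase that combines the outputs of independent MapReduce jobs.

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """Minimal in-memory MapReduce: map each record to (key, value) pairs,
    group values by key (shuffle), then reduce each group."""
    groups = defaultdict(list)
    for rec in records:
        for key, value in mapper(rec):
            groups[key].append(value)
    return {key: reducer(key, values) for key, values in groups.items()}

def merge(*partials):
    """Extra merge phase: combine the outputs of several independent
    MapReduce jobs into one result, summing values for shared keys."""
    merged = defaultdict(int)
    for partial in partials:
        for key, value in partial.items():
            merged[key] += value
    return dict(merged)

# Two related datasets processed by two concurrent MapReduce jobs:
word_mapper = lambda line: [(w, 1) for w in line.split()]
count_reducer = lambda key, values: sum(values)
a = map_reduce(["fcc iron", "bcc iron"], word_mapper, count_reducer)
b = map_reduce(["bcc iron alloy"], word_mapper, count_reducer)
print(merge(a, b))  # {'fcc': 1, 'iron': 3, 'bcc': 2, 'alloy': 1}
```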
NASA Astrophysics Data System (ADS)
Stillinger, T.; Dozier, J.; Phares, N.; Rittger, K.
2015-12-01
Discrimination between snow and clouds poses a serious but tractable challenge to the consistent delivery of high-quality information on mountain snow from remote sensing. Clouds obstruct the surface from the sensor's view, and the similar optical properties of clouds and snow make accurate discrimination difficult. We assess the performance of the current Landsat 8 operational snow and cloud mask products (LDCM CCA and CFmask), along with a new method, using over one million manually identified snow and cloud pixels in Landsat 8 scenes. The new method uses physically based scattering models to generate spectra in each Landsat 8 band, at each scene's solar illumination, for snow and cloud particle sizes covering the plausible range of each. The modeled spectra are compared to the pixels' spectra in several independent ways to identify snow and clouds, and the results are synthesized to create a final snow/cloud mask; the method can be applied to any multispectral imager with bands covering the visible, near-infrared, and shortwave-infrared regions. Each algorithm we tested misidentifies snow and clouds in both directions, to varying degrees. We assess performance with measures of Precision, Recall, and the F statistic, which are based on counts of true and false positives and negatives. Tests for significance of differences between measured and modeled spectra among incorrectly identified pixels help ascertain the reasons for misidentification. A cloud mask specifically designed to separate snow from clouds is a valuable tool for those interested in remotely sensing snow cover. Given freely available remote sensing datasets and computational tools that make it feasible to process entire mission histories for an area of interest, enabling researchers to reliably identify and separate snow and clouds increases the usability of the data for hydrological and climatological studies.
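The Precision, Recall, and F measures used in the assessment are standard and can be computed directly from confusion counts. The counts below are hypothetical, not results from the study:

```python
def precision_recall_f(tp, fp, fn, beta=1.0):
    """Precision, recall, and the F_beta statistic from counts of
    true positives, false positives, and false negatives."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    if precision + recall == 0.0:
        return precision, recall, 0.0
    b2 = beta * beta
    f = (1 + b2) * precision * recall / (b2 * precision + recall)
    return precision, recall, f

# Hypothetical confusion counts for a cloud mask scored against
# manually labelled pixels (not figures from the study):
p, r, f1 = precision_recall_f(tp=900, fp=100, fn=300)
print(round(p, 3), round(r, 3), round(f1, 3))  # 0.9 0.75 0.818
```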
NASA Technical Reports Server (NTRS)
Barrett, E. C.; Grant, C. K. (Principal Investigator)
1977-01-01
The author has identified the following significant results. It was demonstrated that satellites with sufficiently high resolution in the visible region of the electromagnetic spectrum could be used to check the accuracy of estimates of total cloud amount assessed subjectively from the ground, and to reveal areas of performance in which corrections should be made. It was also demonstrated that, in middle latitudes in summer, cloud shadow may obscure at least half as much again of the land surface covered by an individual LANDSAT frame as the cloud itself; that proportion would increase with latitude and/or with time of year toward the winter solstice. Analyses of sample multispectral images for six different categories of cloud in summer revealed marked differences between the reflectance characteristics of cloud fields in the visible/near-infrared region of the spectrum.
Grids and clouds in the Czech NGI
NASA Astrophysics Data System (ADS)
Kundrát, Jan; Adam, Martin; Adamová, Dagmar; Chudoba, Jiří; Kouba, Tomáš; Lokajíček, Miloš; Mikula, Alexandr; Říkal, Václav; Švec, Jan; Vohnout, Rudolf
2016-09-01
There are several infrastructure operators within the Czech Republic NGI (National Grid Initiative) that provide users with access to high-performance computing facilities over grid and cloud interfaces. This article focuses on those with which the primary author has personal first-hand experience. We cover some operational issues as well as the history of these facilities.
Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.
2015-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.
NASA Astrophysics Data System (ADS)
Snyder, P. L.; Brown, V. W.
2017-12-01
IBM has created a general-purpose, data-agnostic solution that provides high performance, low data latency, high availability, scalability, and persistent access to the captured data, regardless of source or type. This capability is hosted on commercially available cloud environments and uses much faster, more efficient, reliable, and secure data transfer protocols than the more typically used FTP. The design incorporates completely redundant data paths at every level, including at the cloud data center level, in order to provide the highest assurance of data availability to data consumers. IBM has successfully built and tested a Proof of Concept instance on our IBM Cloud platform to receive and disseminate actual GOES-16 data as it is being downlinked. This solution leverages the inherent benefits of a cloud infrastructure configured and tuned for continuous, stable, high-speed data dissemination to data consumers worldwide at the downlink rate. It is also designed to ingest data from multiple simultaneous sources and disseminate data to multiple consumers. Nearly linear scalability is achieved by adding servers and storage. The IBM Proof of Concept system has been tested with our partners to achieve in excess of 5 Gigabits/second over public internet infrastructure. In tests with live GOES-16 data, the system routinely achieved 2.5 Gigabits/second pass-through to The Weather Company from the University of Wisconsin-Madison SSEC. Simulated data was also transferred from the Cooperative Institute for Climate and Satellites - North Carolina to The Weather Company. The storage node allocated to our Proof of Concept system as tested was sized at 480 Terabytes of RAID-protected disk, a worst-case sizing to accommodate the data from four GOES-16-class satellites for 30 days in a circular buffer. This shows that an abundance of performance and capacity headroom exists in the IBM design that can be applied to additional missions.
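The quoted 480 TB circular buffer can be sanity-checked with simple arithmetic. In the sketch below, the per-satellite sustained rate is an assumption chosen to make the sizing come out; it is not a published GOES-16 figure:

```python
def circular_buffer_bytes(rate_gbps, days, n_sources):
    """Storage needed to hold `days` of data from `n_sources` downlinks,
    each arriving at `rate_gbps` gigabits per second sustained."""
    bytes_per_source = rate_gbps * 1e9 / 8 * 86400 * days  # bits -> bytes, s -> days
    return bytes_per_source * n_sources

# Hypothetical check of the 480 TB figure: four GOES-16-class feeds for
# 30 days. A sustained rate of ~0.37 Gbps per satellite would fill it:
tb = circular_buffer_bytes(rate_gbps=0.37, days=30, n_sources=4) / 1e12
print(round(tb, 1))  # ~480 TB
```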
NASA Astrophysics Data System (ADS)
Yang, Hui; Zhang, Jie; Ji, Yuefeng; He, Yongqi; Lee, Young
2016-07-01
The cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing in the 5G era. However, the radio network, the optical network, and the processing-unit cloud have been decoupled from each other, so their resources are controlled independently. With the growing number of mobile internet users, traditional architectures cannot optimize and schedule resources to guarantee high-level services, because of the communication obstacles among these domains. In this paper, we report a study on multi-dimensional resources integration (MDRI) for service provisioning in a cloud radio over fiber network (C-RoFN). A resources integrated provisioning (RIP) scheme using an auxiliary graph is introduced based on the proposed architecture. MDRI enhances responsiveness to dynamic end-to-end user demands and globally optimizes radio-frequency, optical-network, and processing resources to maximize radio coverage. The feasibility of the proposed architecture is experimentally verified on an OpenFlow-based enhanced SDN testbed. The performance of the RIP scheme under a heavy-traffic-load scenario is also quantitatively evaluated, demonstrating the efficiency of the proposal based on the MDRI architecture in terms of resource utilization, path blocking probability, network cost, and path provisioning latency, compared with other provisioning schemes. PMID:27465296
NASA Astrophysics Data System (ADS)
Schmidt, Thomas; Kalisch, John; Lorenz, Elke; Heinemann, Detlev
2016-03-01
Clouds are the dominant source of small-scale variability in surface solar radiation and of uncertainty in its prediction. At the same time, the increasing share of solar energy in the worldwide electric power supply increases the need for accurate solar radiation forecasts. In this work, we present results of a very short-term global horizontal irradiance (GHI) forecast experiment based on hemispheric sky images. A 2-month data set with images from one sky imager and high-resolution GHI measurements from 99 pyranometers distributed over a 10 km by 12 km area is used for validation. We developed a multi-step model and processed GHI forecasts up to 25 min ahead with an update interval of 15 s. A cloud-type classification is used to separate the time series into different cloud scenarios. Overall, the sky-imager-based forecasts do not outperform the reference persistence forecasts. Nevertheless, we find that analysis and forecast performance depend strongly on the predominant cloud conditions. Convective-type clouds in particular lead to high temporal and spatial GHI variability. For cumulus cloud conditions, the analysis error is found to be lower than that introduced by using a single pyranometer as representative of the whole area, at distances from the camera larger than 1-2 km. Moreover, forecast skill is much higher under these conditions than in overcast or clear-sky situations, which cause low GHI variability that is easier to predict by persistence. In order to generalize the cloud-induced forecast error, we identify a variability threshold indicating conditions with positive forecast skill.
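Forecast skill relative to persistence is conventionally a skill score of the form 1 - RMSE_forecast / RMSE_reference, positive when the forecast beats the reference. The toy GHI series below are illustrative, not data from the experiment:

```python
import math

def rmse(pred, obs):
    """Root-mean-square error between a prediction and observations."""
    return math.sqrt(sum((p - o) ** 2 for p, o in zip(pred, obs)) / len(obs))

def forecast_skill(pred, obs, reference):
    """Skill score relative to a reference forecast (e.g. persistence):
    positive when the forecast's RMSE beats the reference's."""
    return 1.0 - rmse(pred, obs) / rmse(reference, obs)

# Toy GHI series (W/m^2); persistence repeats the last observed value.
obs = [620, 410, 650, 380, 600]
persistence = [600, 620, 410, 650, 380]
forecast = [600, 450, 620, 420, 590]
print(round(forecast_skill(forecast, obs, persistence), 3))
```

Under strongly variable (convective) conditions persistence has a large RMSE, so a sky-imager forecast can attain positive skill; in near-constant clear or overcast conditions persistence is hard to beat and the score drops toward or below zero.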
A High Performance Cloud-Based Protein-Ligand Docking Prediction Algorithm
Chen, Jui-Le; Yang, Chu-Sing
2013-01-01
The potential of predicting druggability for a particular disease by integrating biological and computer science technologies has witnessed success in recent years. Although computer science technologies can be used to reduce the costs of pharmaceutical research, the computation time of structure-based protein-ligand docking prediction remains unsatisfactory. Hence, in this paper, a novel docking prediction algorithm, named the fast cloud-based protein-ligand docking prediction algorithm (FCPLDPA), is presented to accelerate docking prediction. The proposed algorithm leverages two high-performance operators: (1) a novel migration (information-exchange) operator designed specially for cloud-based environments to reduce computation time; and (2) an efficient operator aimed at filtering out the worst search directions. Our simulation results illustrate that the proposed method outperforms the other docking algorithms compared in this paper in terms of both computation time and the quality of the end result. PMID:23762864
Zhu, Hai-Zhen; Liu, Wei; Mao, Jian-Wei; Yang, Ming-Min
2008-04-28
4-Amino-4'-nitrobiphenyl, which is formed through the catalytic effect of trichlorfon on the oxidation of benzidine by sodium perborate, is extracted by a cloud point extraction method and then detected by high-performance liquid chromatography with ultraviolet detection (HPLC-UV). Under the optimum experimental conditions, there was a linear relationship between trichlorfon concentration in the range 0.01-0.2 mg L(-1) and the peak area of 4-amino-4'-nitrobiphenyl (r = 0.996). The limit of detection was 2.0 microg L(-1), and recoveries from spiked water and cabbage samples ranged between 95.4-103% and 85.2-91.2%, respectively. The cloud point extraction (CPE) method proved simpler, cheaper, and more environmentally friendly than extraction with organic solvents, and gave a more effective extraction yield.
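The calibration and detection-limit figures follow from an ordinary least-squares fit and the common 3-sigma rule (LOD = 3 s_blank / slope). The calibration data and blank noise below are hypothetical, chosen only to yield figures of the same order as those reported:

```python
def linear_fit(x, y):
    """Ordinary least-squares slope, intercept, and correlation r."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    syy = sum((yi - my) ** 2 for yi in y)
    slope = sxy / sxx
    return slope, my - slope * mx, sxy / (sxx * syy) ** 0.5

def lod_3sigma(sigma_blank, slope):
    """Common 3-sigma detection-limit estimate: LOD = 3 * s_blank / slope."""
    return 3.0 * sigma_blank / slope

# Hypothetical calibration points (mg/L vs. peak area), not the paper's data:
conc = [0.01, 0.05, 0.10, 0.15, 0.20]
area = [1.3, 4.8, 10.6, 14.2, 19.9]
slope, intercept, r = linear_fit(conc, area)
lod_ug_per_L = lod_3sigma(0.066, slope) * 1000  # assumed blank noise of 0.066
print(round(r, 3), round(lod_ug_per_L, 1))      # r ~ 0.998, LOD ~ 2.0 ug/L
```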
NASA Astrophysics Data System (ADS)
Huang, Qian
2014-09-01
Scientific computing often requires the availability of a massive number of computers for performing large-scale simulations, and computing in mineral physics is no exception. In order to investigate the physical properties of minerals at extreme conditions, computational mineral physics uses parallel computing technology to speed up performance, utilizing multiple computer resources to process a computational task simultaneously and thereby greatly reducing computation time. Traditionally, parallel computing has been addressed with High Performance Computing (HPC) solutions and installed facilities such as clusters and supercomputers. Today, cloud computing is growing tremendously. Infrastructure as a Service (IaaS), with its on-demand, pay-as-you-go model, creates a flexible and cost-effective means of accessing computing resources. In this paper, a feasibility report of HPC on a cloud infrastructure is presented. It is found that current cloud services at the IaaS layer still need performance improvements to be useful to research projects. On the other hand, Software as a Service (SaaS), another type of cloud computing, is introduced into an HPC system for computing in mineral physics, and an application based on it is developed. In this paper, an overall description of this SaaS application is presented. This contribution can promote cloud application development in computational mineral physics and cross-disciplinary studies.
Bao, Shunxing; Damon, Stephen M; Landman, Bennett A; Gokhale, Aniruddha
2016-02-27
Adopting high-performance cloud computing for medical image processing is a popular trend given the pressing needs of large studies. Amazon Web Services (AWS) provides reliable, on-demand, and inexpensive cloud computing services. Our research objective is to implement an affordable, scalable, and easy-to-use AWS framework for the Java Image Science Toolkit (JIST). JIST is a plugin for Medical Image Processing, Analysis, and Visualization (MIPAV) that provides a graphical pipeline implementation allowing users to quickly test and develop pipelines. JIST is DRMAA-compliant, allowing it to run on portable batch system grids. However, as new processing methods are implemented and developed, memory may often be a bottleneck, not only for lab computers but possibly also for some local grids. Integrating JIST with the AWS cloud alleviates these possible restrictions and does not require users to have deep knowledge of programming in Java. Workflow definition/management and cloud configuration are two key challenges in this research. Using a simple unified control panel, users can set the number of nodes and select from a variety of pre-configured AWS EC2 nodes with different numbers of processors and amounts of memory. We configured Amazon S3 storage to be mounted by pay-for-use Amazon EC2 instances, so that S3 storage is recognized as a shared cloud resource. The Amazon EC2 instances provide pre-installs of all packages necessary to run JIST. This work presents an implementation that facilitates the integration of JIST with AWS. We describe theoretical cost/benefit formulae for deciding between local serial execution and cloud computing, and apply this analysis to an empirical diffusion tensor imaging pipeline. PMID:27127335
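The local-versus-cloud decision described can be sketched as a simple cost/benefit comparison. This is a toy formulation of our own, not the paper's formulae; the job counts, per-job times, node count, and hourly price below are placeholders:

```python
import math

def cloud_vs_local(n_jobs, t_local_hr, t_cloud_hr, n_nodes, node_cost_per_hr):
    """Toy cost/benefit comparison of local serial execution versus a
    rented cloud cluster. Jobs are spread across n_nodes; each job takes
    t_cloud_hr on one node. Returns (cloud_hours, cloud_dollars, local_hours)."""
    local_hours = n_jobs * t_local_hr
    cloud_hours = math.ceil(n_jobs / n_nodes) * t_cloud_hr
    cloud_dollars = n_nodes * cloud_hours * node_cost_per_hr
    return cloud_hours, cloud_dollars, local_hours

# Hypothetical pipeline: 100 subjects, 2 h each, locally or on one cloud node.
ch, cd, lh = cloud_vs_local(100, 2.0, 2.0, n_nodes=25, node_cost_per_hr=0.20)
print(ch, cd, lh)  # 8 wall-clock hours in the cloud, ~$40, vs. 200 serial hours
```

The break-even point shifts with memory-driven instance choice and data-transfer time, which is why the paper frames it as an explicit formula rather than a rule of thumb.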
Cloud computing applications for biomedical science: A perspective.
Navale, Vivek; Bourne, Philip E
2018-06-01
Biomedical research has become a digital data-intensive endeavor, relying on secure and scalable computing, storage, and network infrastructure, which has traditionally been purchased, supported, and maintained locally. For certain types of biomedical applications, cloud computing has emerged as an alternative to locally maintained traditional computing approaches. Cloud computing offers users pay-as-you-go access to services such as hardware infrastructure, platforms, and software for solving common biomedical computational problems. Cloud computing services offer secure on-demand storage and analysis and are differentiated from traditional high-performance computing by their rapid availability and scalability of services. As such, cloud services are engineered to address big data problems and enhance the likelihood of data and analytics sharing, reproducibility, and reuse. Here, we provide an introductory perspective on cloud computing to help the reader determine its value to their own research. PMID:29902176
NASA Astrophysics Data System (ADS)
di Girolamo, P.; Summa, D.; Lin, R.-F.; Maestri, T.; Rizzi, R.; Masiello, G.
2009-11-01
Raman lidar measurements performed in Potenza by the Raman lidar system BASIL in the presence of cirrus clouds are discussed. Measurements were performed on 6 September 2004 in the frame of the Italian phase of the EAQUATE Experiment. The major feature of BASIL is its capability to perform high-resolution and accurate measurements of atmospheric temperature and water vapour, and consequently relative humidity, both in daytime and night-time, based on the application of the rotational and vibrational Raman lidar techniques in the UV. BASIL is also capable of providing measurements of the particle backscatter and extinction coefficient, and consequently the lidar ratio (at the time of these measurements, only at one wavelength), which are fundamental for inferring the geometrical and microphysical properties of clouds. A case study is discussed in order to assess the capability of Raman lidars to measure humidity in the presence of cirrus clouds, both below and inside the cloud. While air inside the cloud layers is observed to be always under-saturated with respect to water, both ice super-saturation and under-saturation conditions are found inside these clouds. Upper tropospheric moistening is observed below the lower cloud layer. The synergistic use of data derived from the ground-based Raman lidar and of spectral radiances measured by the NAST-I Airborne Spectrometer allows the determination of the temporal evolution of the atmospheric cooling/heating rates due to the presence of the cirrus cloud. Lidar measurements beneath the cirrus cloud layer have been interpreted using a 1-D cirrus cloud model with explicit microphysics. The 1-D simulations indicate that sedimentation-moistening has contributed significantly to the moist anomaly, but other mechanisms are also contributing. This result supports the hypothesis that the observed mid-tropospheric humidification is a real feature which is strongly influenced by the sublimation of precipitating ice crystals.
Results illustrated in this study demonstrate that Raman lidars, like the one used in this study, can resolve the spatial and temporal scales required for the study of cirrus cloud microphysical processes and appear sensitive enough to reveal and quantify upper tropospheric humidification associated with cirrus cloud sublimation.
NASA Astrophysics Data System (ADS)
di Girolamo, P.; Summa, D.; Lin, R.-F.; Maestri, T.; Rizzi, R.; Masiello, G.
2009-07-01
Raman lidar measurements performed in Potenza by the Raman lidar system BASIL in the presence of cirrus clouds are discussed. Measurements were performed on 6 September 2004 in the frame of the Italian phase of the EAQUATE Experiment. The major feature of BASIL is its capability to perform high-resolution and accurate measurements of atmospheric temperature and water vapour, and consequently relative humidity, both in daytime and night-time, based on the application of the rotational and vibrational Raman lidar techniques in the UV. BASIL is also capable of providing measurements of the particle backscatter and extinction coefficient, and consequently the lidar ratio (at the time of these measurements only at one wavelength), which are fundamental for inferring the geometrical and microphysical properties of clouds. A case study is discussed in order to assess the capability of Raman lidars to measure humidity in the presence of cirrus clouds, both below and inside the cloud. While air inside the cloud layers is observed to be always under-saturated with respect to water, both ice super-saturation and under-saturation conditions are found inside these clouds. Upper tropospheric moistening is observed below the lower cloud layer. The synergistic use of data derived from the ground-based Raman lidar and of spectral radiances measured by the NAST-I Airborne Spectrometer allows the determination of the temporal evolution of the atmospheric cooling/heating rates due to the presence of the cirrus cloud anvil. Lidar measurements beneath the cirrus cloud layer have been interpreted using a 1-D cirrus cloud model with explicit microphysics. The 1-D simulations indicate that sedimentation-moistening has contributed significantly to the moist anomaly, but other mechanisms are also contributing. This result supports the hypothesis that the observed mid-tropospheric humidification is a real feature which is strongly influenced by the sublimation of precipitating ice crystals.
Results illustrated in this study demonstrate that Raman lidars, like the one used in this study, can resolve the spatial and temporal scales required for the study of cirrus cloud microphysical processes and appear sensitive enough to reveal and quantify upper tropospheric humidification associated with cirrus cloud sublimation.
NASA Astrophysics Data System (ADS)
Qiu, Yanmei; Zhao, Chuanfeng; Guo, Jianping; Li, Jiming
2017-09-01
Previous studies have shown negative or positive relationships between cloud droplet effective radius (re) and aerosol amount based on limited observations, indicative of the uncertainty in this relationship caused by many factors. Using 8 years of ground-based cloud and aerosol observations at the Southern Great Plains (SGP) site in Oklahoma, US, we analyze the seasonal variation of the aerosol effect on low liquid cloud re. The AOD-re relationship is positive rather than negative in all seasons except summer. The potential contribution of precipitable water vapor (PWV) to the AOD-re relationship has been analyzed. Results show that the AOD-re relationship is indeed negative under low-PWV conditions regardless of season, but it turns positive under high-PWV conditions for all seasons other than summer. The most likely explanation for the positive AOD-re relationship under high-PWV conditions in spring, fall, and winter is that high PWV promotes the growth of cloud droplets by providing sufficient water vapor. The different behavior of the AOD-re relationship in summer could be related to the much heavier aerosol loading, which makes the PWV insufficient, so that the droplets compete with each other for water. By limiting the variation of other meteorological conditions such as lower-tropospheric stability and wind speed near cloud base, further analysis shows that higher PWV not only helps change the AOD-re relationship from negative to positive, but also increases cloud depth and cloud-top height.
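The PWV-stratified sign test at the heart of the abstract above can be sketched with a plain Pearson correlation. The sample values below are synthetic stand-ins chosen to mimic the reported behaviour (droplets shrink with AOD when PWV is low, grow when PWV is high); they are not SGP retrievals.

```python
# Illustrative sketch: stratify an AOD-vs-effective-radius relationship by
# precipitable water vapor (PWV). All numbers are invented for illustration.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Synthetic (AOD, re [um], PWV [cm]) samples:
samples = [
    (0.05, 9.0, 0.8), (0.10, 8.5, 0.9), (0.20, 7.8, 0.7), (0.30, 7.0, 0.8),
    (0.05, 8.0, 3.0), (0.10, 8.6, 3.2), (0.20, 9.4, 2.9), (0.30, 10.1, 3.1),
]

low = [(a, r) for a, r, w in samples if w < 2.0]    # low-PWV subset
high = [(a, r) for a, r, w in samples if w >= 2.0]  # high-PWV subset

r_low = pearson([a for a, _ in low], [r for _, r in low])
r_high = pearson([a for a, _ in high], [r for _, r in high])
print(r_low < 0 < r_high)  # negative AOD-re slope at low PWV, positive at high
```

The same stratification, applied season by season, is what distinguishes the summer case from the rest of the year in the study.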
Modeled Impact of Cirrus Cloud Increases Along Aircraft Flight Paths
NASA Technical Reports Server (NTRS)
Rind, David; Lonergan, P.; Shah, K.
1999-01-01
The potential impact of contrails and alterations in the lifetime of background cirrus due to subsonic airplane water and aerosol emissions has been investigated in a set of experiments using the GISS GCM connected to a q-flux ocean. Cirrus clouds at a height of 12-15 km, with an optical thickness of 0.33, were input to the model on "x" percent of clear-sky occasions along subsonic aircraft flight paths, where x is varied from 0.05% to 6%. Two types of experiments were performed: one with the percentage cirrus cloud increase independent of flight density, as long as a certain minimum density was exceeded; the other with the percentage related to the density of fuel expenditure. The overall climate impact was similar with the two approaches, due to the feedbacks of the climate system. Fifty years were run for eight such experiments, with the following conclusions based on the stable results from years 30-50 for each. The experiments show that adding cirrus to the upper troposphere results in a stabilization of the atmosphere, which leads to some decrease in cloud cover at levels below the insertion altitude. Considering then the total effect on upper level cloud cover (above 5 km altitude), the equilibrium global mean temperature response shows that altering high level clouds by 1% changes the global mean temperature by 0.43C. The response is highly linear (linear correlation coefficient of 0.996) for high cloud cover changes between 0.1% and 5%. The effect is amplified in the Northern Hemisphere, more so with greater cloud cover change. The temperature effect maximizes around 10 km (at greater than 4.0C warming with a 4.8% increase in upper level clouds), again more so with greater warming. The high cloud cover change shows the flight path influence most clearly at the smallest warming magnitudes; with greater warming, the model feedbacks introduce a strong tropical response.
Similarly, the surface temperature response is dominated by the feedbacks, and shows little geographical relationship to the high cloud input. Considering whether these effects would be observable, changing upper level cloud cover by as little as 0.4% produces warming greater than 2 standard deviations in the Microwave Sounding Unit (MSU) channels 4, 2 and 2r, in flight path regions and in the subtropics. Despite the simplified nature of these experiments, the results emphasize the sensitivity of the modeled climate to high level cloud cover changes, and thus the potential ability of aircraft to influence climate by altering clouds in the upper troposphere.
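The linearity quoted above (0.43 C of global-mean warming per 1% change in high cloud, correlation 0.996) can be illustrated with an ordinary least-squares fit. The (cloud change, warming) pairs below are synthetic stand-ins consistent with the stated slope, not the actual GCM output:

```python
# Least-squares sketch of the linear high-cloud-cover vs. warming response.
# Data points are invented to match the ~0.43 C per 1% slope reported above.

def least_squares(xs, ys):
    """Return (slope, intercept) of the ordinary least-squares line."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

cloud_change = [0.1, 0.4, 1.0, 2.0, 5.0]       # % change in high cloud cover
warming = [0.04, 0.17, 0.43, 0.86, 2.15]       # C, synthetic equilibrium dT

slope, intercept = least_squares(cloud_change, warming)
print(round(slope, 2))  # ~0.43 C per 1% high-cloud change
```

A fit like this over the eight experiments is what supports the paper's claim that the response is linear over a fifty-fold range of cloud cover change.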
Low cost, high performance processing of single particle cryo-electron microscopy data in the cloud
Cianfrocco, Michael A; Leschziner, Andres E
2015-01-01
The advent of a new generation of electron microscopes and direct electron detectors has realized the potential of single particle cryo-electron microscopy (cryo-EM) as a technique to generate high-resolution structures. Calculating these structures requires high performance computing clusters, a resource that may be limiting to many likely cryo-EM users. To address this limitation and facilitate the spread of cryo-EM, we developed a publicly available ‘off-the-shelf’ computing environment on Amazon's elastic cloud computing infrastructure. This environment provides users with single particle cryo-EM software packages and the ability to create computing clusters with 16–480+ CPUs. We tested our computing environment using a publicly available 80S yeast ribosome dataset and estimate that laboratories could determine high-resolution cryo-EM structures for $50 to $1500 per structure within a timeframe comparable to local clusters. Our analysis shows that Amazon's cloud may offer a viable computing platform for cryo-EM. DOI: http://dx.doi.org/10.7554/eLife.06664.001 PMID:25955969
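The per-structure cost estimate above follows from simple on-demand arithmetic: instances times hourly rate times runtime. A minimal sketch, with a hypothetical hourly rate and runtime rather than the paper's measured values:

```python
# Back-of-envelope cost model for a cloud cryo-EM reconstruction, in the
# spirit of the $50-$1500-per-structure estimate above. The rate and hours
# below are hypothetical placeholders, not figures from the paper.

def cluster_cost(n_instances, hourly_rate_usd, hours):
    """Total on-demand cost (USD) of a homogeneous instance cluster."""
    return n_instances * hourly_rate_usd * hours

# e.g. 10 compute instances at an assumed $1.50/hr for a 48-hour refinement:
cost = cluster_cost(n_instances=10, hourly_rate_usd=1.50, hours=48)
print(cost)  # 720.0 USD, inside the quoted $50-$1500 range
```

The trade-off the paper highlights is exactly this product: more instances shorten the run but raise the hourly burn, so total cost stays roughly constant while wall-clock time approaches that of a local cluster.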
Initial Performance Assessment of CALIOP
NASA Technical Reports Server (NTRS)
Winker, David; Hunt, Bill; McGill, Matthew
2007-01-01
The Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP, pronounced the same as "calliope") is a spaceborne two-wavelength polarization lidar that has been acquiring global data since June 2006. CALIOP provides high resolution vertical profiles of clouds and aerosols, and has been designed with a very large linear dynamic range to encompass the full range of signal returns from aerosols and clouds. CALIOP is the primary instrument carried by the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) satellite, which was launched on April 28, 2006. CALIPSO was developed within the framework of a collaboration between NASA and the French space agency, CNES. Initial data analysis and validation intercomparisons indicate that the quality of data from CALIOP meets or exceeds expectations. This paper presents a description of the CALIPSO mission, the CALIOP instrument, and an initial assessment of on-orbit measurement performance.
Development of a New Model for Accurate Prediction of Cloud Water Deposition on Vegetation
NASA Astrophysics Data System (ADS)
Katata, G.; Nagai, H.; Wrzesinsky, T.; Klemm, O.; Eugster, W.; Burkard, R.
2006-12-01
Scarcity of water resources in arid and semi-arid areas is of great concern in the light of population growth and food shortages. Several experiments focusing on cloud (fog) water deposition on the land surface suggest that cloud water plays an important role in the water resources of such regions. A one-dimensional vegetation model including the process of cloud water deposition on vegetation has been developed to better predict cloud water deposition. New schemes to calculate the capture efficiency of leaves, the cloud droplet size distribution, and the gravitational flux of cloud water were incorporated in the model. Model calculations were compared with data acquired at the Norway spruce forest at the Waldstein site, Germany. High performance of the model was confirmed by comparisons of calculated net radiation, sensible and latent heat, and cloud water fluxes over the forest with measurements. The present model provided a better prediction of measured turbulent and gravitational fluxes of cloud water over the canopy than the Lovett model, which is a commonly used cloud water deposition model. Detailed calculation of evapotranspiration and of the turbulent exchange of heat and water vapor within the canopy proved necessary for accurate prediction of cloud water deposition. Numerical experiments examining the dependence of cloud water deposition on vegetation species (coniferous and broad-leaved trees, flat and cylindrical grasses) and structure (Leaf Area Index (LAI) and canopy height) were performed using the presented model. The results indicate that differences in leaf shape and size have a large impact on cloud water deposition. Cloud water deposition also varies with the growth of vegetation and the seasonal change of LAI.
We found that coniferous trees with a height of 24 m and an LAI of 2.0 m2 m-2 produce the largest amount of cloud water deposition among all combinations of vegetation species and structures in the experiments.
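One ingredient of such a deposition model, the gravitational flux of cloud water, can be sketched as Stokes terminal velocity times liquid water content. This is a generic textbook approximation, not the model's actual scheme, and the droplet radius and liquid water content below are illustrative:

```python
# Hedged sketch of gravitational cloud-water flux: Stokes settling speed
# multiplied by liquid water content (LWC). Constants are standard values;
# the inputs are illustrative, not taken from the Waldstein measurements.

def stokes_terminal_velocity(radius_m, rho_w=1000.0, rho_a=1.2,
                             mu=1.8e-5, g=9.81):
    """Terminal fall speed (m/s) of a small droplet in the Stokes regime."""
    return 2.0 * radius_m ** 2 * g * (rho_w - rho_a) / (9.0 * mu)

def gravitational_flux(radius_m, lwc_kg_m3):
    """Downward cloud-water flux (kg m-2 s-1) = settling speed x LWC."""
    return stokes_terminal_velocity(radius_m) * lwc_kg_m3

# A 10-micron fog droplet in air with 0.2 g/m3 liquid water content:
v = stokes_terminal_velocity(10e-6)
flux = gravitational_flux(10e-6, 0.2e-3)
print(round(v * 100, 2), "cm/s")  # roughly 1.2 cm/s fall speed
```

Because the settling speed scales with the square of droplet radius, the droplet size distribution scheme mentioned in the abstract matters strongly for the gravitational term.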
An origin of arc structures deeply embedded in dense molecular cloud cores
NASA Astrophysics Data System (ADS)
Matsumoto, Tomoaki; Onishi, Toshikazu; Tokuda, Kazuki; Inutsuka, Shu-ichiro
2015-04-01
We investigated the formation of arc-like structures in the infalling envelope around protostars, motivated by the recent Atacama Large Millimeter/Submillimeter Array (ALMA) observations of the high-density molecular cloud core MC27/L1521F. We performed self-gravitational hydrodynamical numerical simulations with an adaptive mesh refinement code. A filamentary cloud with a 0.1 pc width fragments into cloud cores because of perturbations due to weak turbulence. The cloud core undergoes gravitational collapse to form multiple protostars, and gravitational torque from the orbiting protostars produces arc structures extending up to a 1000 au scale. Like the spatial extent, the velocity range of the arc structures, ~0.5 km s-1, is in agreement with the ALMA observations. We also found that circumstellar discs are often misaligned in the triple system. The misalignment is caused by the tidal interaction between the protostars when they undergo close encounters on the highly eccentric orbit of the tight binary pair.
The Confluence of GIS, Cloud and Open Source, Enabling Big Raster Data Applications
NASA Astrophysics Data System (ADS)
Plesea, L.; Emmart, C. B.; Boller, R. A.; Becker, P.; Baynes, K.
2016-12-01
The rapid evolution of available cloud services is profoundly changing the way applications are being developed and used. Massive object stores, service scalability, and continuous integration are some of the most important cloud technology advances that directly influence science applications and GIS. At the same time, more and more scientists are using GIS platforms in their day to day research. Yet with new opportunities there are always some challenges. Given the large amount of data commonly required in science applications, usually large raster datasets, connectivity is one of the biggest problems. Connectivity has two aspects: one is the limited bandwidth and latency of the communication link due to the geographical location of the resources; the other is the interoperability and intrinsic efficiency of the interface protocol used to connect. NASA and Esri are actively helping each other and collaborating on a few open source projects, aiming to provide some of the core technology components to directly address the GIS enabled data connectivity problems. Last year Esri contributed LERC, a very fast and efficient compression algorithm, to the GDAL/MRF format, which is itself a NASA/Esri collaboration project. The MRF raster format has cloud-aware features that make it possible to build high performance web services on cloud platforms, as some of the Esri projects demonstrate. Currently, another NASA open source project, the high performance OnEarth WMTS server, is being refactored and enhanced to better integrate with MRF, GDAL, and Esri software. Taken together, GDAL, MRF, and OnEarth form the core of an open source CloudGIS toolkit that is already showing results. Since it is well integrated with GDAL, which is the most common interoperability component of GIS applications, this approach should improve the connectivity and performance of many science and GIS applications in the cloud.
Galaxy CloudMan: delivering cloud compute clusters.
Afgan, Enis; Baker, Dannon; Coraor, Nate; Chapman, Brad; Nekrutenko, Anton; Taylor, James
2010-12-21
Widespread adoption of high-throughput sequencing has greatly increased the scale and sophistication of computational infrastructure needed to perform genomic research. An alternative to building and maintaining local infrastructure is "cloud computing", which, in principle, offers on-demand access to flexible computational infrastructure. However, cloud computing resources are not yet suitable for immediate "as is" use by experimental biologists. We present a cloud resource management system that makes it possible for individual researchers to compose and control an arbitrarily sized compute cluster on Amazon's EC2 cloud infrastructure without any informatics requirements. Within this system, an entire suite of biological tools packaged by the NERC Bio-Linux team (http://nebc.nerc.ac.uk/tools/bio-linux) is available for immediate consumption. The provided solution makes it possible, using only a web browser, to create a completely configured compute cluster ready to perform analysis in less than five minutes. Moreover, we provide an automated method for building custom deployments of cloud resources. This approach promotes reproducibility of results and, if desired, allows individuals and labs to add or customize an otherwise available cloud system to better meet their needs. The knowledge and effort required to deploy a compute cluster in the Amazon EC2 cloud are not trivial. The solution presented in this paper eliminates these barriers, making it possible for researchers to deploy exactly the amount of computing power they need, combined with a wealth of existing analysis software, to handle the ongoing data deluge.
Cloud Impacts on Pavement Temperature in Energy Balance Models
NASA Astrophysics Data System (ADS)
Walker, C. L.
2013-12-01
Forecast systems provide decision support for end-users ranging from the solar energy industry to municipalities concerned with road safety. Pavement temperature is an important variable when considering vehicle response to various weather conditions. A complex, yet direct relationship exists between tire and pavement temperatures. Literature has shown that as tire temperature increases, friction decreases, which affects vehicle performance. Many forecast systems suffer from inaccurate radiation forecasts, resulting in part from the inability to model different types of clouds and their influence on radiation. This research focused on forecast improvement by determining how cloud type impacts the amount of shortwave radiation reaching the surface and subsequent pavement temperatures. The study region was the Great Plains, where surface solar radiation data were obtained from the High Plains Regional Climate Center's Automated Weather Data Network stations. Road pavement temperature data were obtained from the Meteorological Assimilation Data Ingest System. Cloud properties and radiative transfer quantities were obtained from the Clouds and Earth's Radiant Energy System mission via Aqua and Terra Moderate Resolution Imaging Spectroradiometer satellite products. An additional cloud data set was incorporated from the Naval Research Laboratory Cloud Classification algorithm. Statistical analyses using a modified nearest neighbor approach were first performed relating shortwave radiation variability with road pavement temperature fluctuations. Then statistical associations were determined between the shortwave radiation and cloud property data sets. Preliminary results suggest that substantial pavement forecasting improvement is possible with the inclusion of cloud-specific information. Future model sensitivity testing seeks to quantify the magnitude of forecast improvement.
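The link between cloud cover, shortwave radiation, and pavement temperature can be illustrated with a deliberately minimal steady-state energy balance: absorbed shortwave plus downwelling longwave equals emitted longwave. This radiation-only toy ignores conduction and turbulent fluxes, and all input values are illustrative:

```python
# Toy steady-state energy balance for pavement temperature, showing how
# cloud attenuation of shortwave lowers the surface temperature. This is a
# simplified sketch, not the forecast system described in the abstract.

SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m-2 K-4

def pavement_temp(sw_clear, cloud_transmittance, lw_down,
                  albedo=0.1, emissivity=0.95):
    """Surface temperature (K) balancing absorbed vs. emitted radiation."""
    absorbed = (1 - albedo) * cloud_transmittance * sw_clear \
               + emissivity * lw_down
    # Emitted = emissivity * SIGMA * T^4; solve for T directly:
    return (absorbed / (emissivity * SIGMA)) ** 0.25

# Clear sky vs. an optically thick cloud deck (40% shortwave transmitted,
# slightly enhanced downwelling longwave):
t_sunny = pavement_temp(sw_clear=800.0, cloud_transmittance=1.0, lw_down=300.0)
t_cloudy = pavement_temp(sw_clear=800.0, cloud_transmittance=0.4, lw_down=330.0)
print(round(t_sunny - t_cloudy, 1), "K warmer under clear sky")
```

Even this crude balance shows why a cloud-type-specific transmittance, rather than a single bulk cloud fraction, matters for pavement temperature forecasts.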
Information content of OCO-2 oxygen A-band channels for retrieving marine liquid cloud properties
NASA Astrophysics Data System (ADS)
Richardson, Mark; Stephens, Graeme L.
2018-03-01
Information content analysis is used to select channels for a marine liquid cloud retrieval using the high-spectral-resolution oxygen A-band instrument on NASA's Orbiting Carbon Observatory-2 (OCO-2). Desired retrieval properties are cloud optical depth, cloud-top pressure and cloud pressure thickness, which is the geometric thickness expressed in hectopascals. Based on information content criteria we select a micro-window of 75 of the 853 functioning OCO-2 channels spanning 763.5-764.6 nm and perform a series of synthetic retrievals with perturbed initial conditions. We estimate posterior errors from the sample standard deviations and obtain ±0.75 in optical depth and ±12.9 hPa in both cloud-top pressure and cloud pressure thickness, although removing the 10% of samples with the highest χ2 reduces the posterior error in cloud-top pressure to ±2.9 hPa and in cloud pressure thickness to ±2.5 hPa. The application of this retrieval to real OCO-2 measurements is briefly discussed, along with its limitations; the greatest caution is urged regarding the assumption of a single homogeneous cloud layer, which is often, but not always, a reasonable approximation for marine boundary layer clouds.
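Channel selection by information content can be sketched in the scalar case: for a single retrieved quantity with prior variance s_a2, a channel with Jacobian k and noise variance s_e2 contributes H = 0.5 ln(1 + k^2 s_a2 / s_e2) nats. The Jacobians and noise values below are invented for illustration, not OCO-2 values, and this single-parameter form is a simplification of the full multivariate analysis:

```python
# Hedged sketch of information-content-based channel ranking (scalar case).
# H = 0.5 * ln(1 + k^2 * s_a2 / s_e2), with invented Jacobians and variances.
import math

def channel_information(k, s_a2, s_e2):
    """Shannon information content (nats) of one channel, scalar case."""
    return 0.5 * math.log(1.0 + k * k * s_a2 / s_e2)

# Hypothetical Jacobians d(radiance)/d(optical depth) for five channels:
jacobians = {"ch1": 0.02, "ch2": 0.50, "ch3": 1.10, "ch4": 0.75, "ch5": 0.05}
s_a2, s_e2 = 4.0, 0.25  # prior variance, measurement-noise variance

ranked = sorted(jacobians,
                key=lambda c: channel_information(jacobians[c], s_a2, s_e2),
                reverse=True)
print(ranked[:3])  # the three most informative channels
```

Iterating a multivariate version of this ranking, updating the posterior after each selected channel, is how a micro-window like the 75-channel set above would be assembled.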
Nocturnal low-level clouds over southern West Africa analysed using high-resolution simulations
NASA Astrophysics Data System (ADS)
Adler, Bianca; Kalthoff, Norbert; Gantner, Leonhard
2017-01-01
We performed a high-resolution numerical simulation to study the development of extensive low-level clouds that frequently form over southern West Africa during the monsoon season. This study was made in preparation for a field campaign in 2016 within the Dynamics-aerosol-chemistry-cloud interactions in West Africa (DACCIWA) project and focuses on an area around the city of Savè in southern Benin. Nocturnal low-level clouds evolve a few hundred metres above the ground around the same level as a distinct low-level jet. Several processes are found to determine the spatio-temporal evolution of these clouds including (i) significant cooling of the nocturnal atmosphere caused by horizontal advection with the south-westerly monsoon flow during the first half of the night, (ii) vertical cold air advection due to gravity waves leading to clouds in the wave crests and (iii) enhanced convergence and upward motion upstream of existing clouds that trigger new clouds. The latter is caused by an upward shift of the low-level jet in cloudy areas leading to horizontal convergence in the lower part and to horizontal divergence in the upper part of the cloud layer. Although this single case study hardly allows for a generalisation of the processes found, the results added to the optimisation of the measurements strategy for the field campaign and the observations will be used to test the hypotheses for cloud formation resulting from this study.
Evaluation of Decision Trees for Cloud Detection from AVHRR Data
NASA Technical Reports Server (NTRS)
Shiffman, Smadar; Nemani, Ramakrishna
2005-01-01
Automated cloud detection and tracking is an important step in assessing changes in radiation budgets associated with global climate change via remote sensing. Data products based on satellite imagery are available to the scientific community for studying trends in the Earth's atmosphere. The data products include pixel-based cloud masks that assign cloud-cover classifications to pixels. Many cloud-mask algorithms have the form of decision trees. The decision trees employ sequential tests that scientists designed based on empirical studies and simulations. Limitations of existing cloud masks restrict our ability to accurately track changes in cloud patterns over time. In a previous study we compared automatically learned decision trees to cloud masks included in Advanced Very High Resolution Radiometer (AVHRR) data products from the year 2000. In this paper we report the replication of that study for five years of data, and for a gold standard based on surface observations performed by scientists at weather stations in the British Isles. For our sample data, the accuracy of the automatically learned decision trees was greater than that of the cloud masks (p < 0.001).
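A decision-tree cloud mask of the kind described is, at its core, a cascade of threshold tests on per-pixel features. The feature names and thresholds below are invented for illustration; they are not the AVHRR operational tests:

```python
# Sketch of a decision-tree style cloud mask: sequential threshold tests on
# per-pixel features. Thresholds and features are hypothetical examples.

def cloud_mask(pixel):
    """Classify one pixel dict as 'cloudy', 'clear', or 'uncertain'."""
    # Test 1: very cold 11-um brightness temperature -> high/cold cloud
    if pixel["bt11"] < 250.0:
        return "cloudy"
    # Test 2: high visible reflectance over a dark surface -> bright cloud
    if pixel["vis_reflectance"] > 0.4 and not pixel["over_desert"]:
        return "cloudy"
    # Test 3: large split-window difference may indicate thin cirrus
    if pixel["bt11"] - pixel["bt12"] > 2.5:
        return "uncertain"
    return "clear"

pixels = [
    {"bt11": 240.0, "bt12": 238.0, "vis_reflectance": 0.6, "over_desert": False},
    {"bt11": 285.0, "bt12": 284.5, "vis_reflectance": 0.1, "over_desert": False},
    {"bt11": 280.0, "bt12": 276.0, "vis_reflectance": 0.2, "over_desert": True},
]
results = [cloud_mask(p) for p in pixels]
print(results)  # ['cloudy', 'clear', 'uncertain']
```

The paper's contribution is to learn such trees automatically from labeled data rather than hand-tuning the thresholds, and to compare their accuracy against the expert-designed masks.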
Cloud Computing and Its Applications in GIS
NASA Astrophysics Data System (ADS)
Kang, Cao
2011-12-01
Cloud computing is a novel computing paradigm that offers highly scalable and highly available distributed computing services. The objectives of this research are to: 1. analyze and understand cloud computing and its potential for GIS; 2. assess the feasibility of migrating truly spatial GIS algorithms to distributed computing infrastructures; 3. explore a solution to host and serve large volumes of raster GIS data efficiently and quickly. These objectives thus form the basis for three professional articles. The first article is entitled "Cloud Computing and Its Applications in GIS". This paper introduces the concept, structure, and features of cloud computing. Features of cloud computing such as scalability, parallelization, and high availability make it a very capable computing paradigm. Unlike High Performance Computing (HPC), cloud computing uses inexpensive commodity computers. The uniform administration systems in cloud computing make it easier to use than GRID computing. Potential advantages of cloud-based GIS systems such as a lower barrier to entry are consequently presented. Three cloud-based GIS system architectures are proposed: public cloud-based GIS systems, private cloud-based GIS systems and hybrid cloud-based GIS systems. Public cloud-based GIS systems provide the lowest entry barriers for users among these three architectures, but their advantages are offset by data security and privacy related issues. Private cloud-based GIS systems provide the best data protection, though they have the highest entry barriers. Hybrid cloud-based GIS systems provide a compromise between these extremes. The second article is entitled "A cloud computing algorithm for the calculation of Euclidian distance for raster GIS". Euclidean distance is a truly spatial GIS algorithm.
Classical algorithms such as the pushbroom and growth ring techniques require computational propagation through the entire raster image, which makes it incompatible with the distributed nature of cloud computing. This paper presents a parallel Euclidean distance algorithm that works seamlessly with the distributed nature of cloud computing infrastructures. The mechanism of this algorithm is to subdivide a raster image into sub-images and wrap them with a one pixel deep edge layer of individually computed distance information. Each sub-image is then processed by a separate node, after which the resulting sub-images are reassembled into the final output. It is shown that while any rectangular sub-image shape can be used, those approximating squares are computationally optimal. This study also serves as a demonstration of this subdivide and layer-wrap strategy, which would enable the migration of many truly spatial GIS algorithms to cloud computing infrastructures. However, this research also indicates that certain spatial GIS algorithms such as cost distance cannot be migrated by adopting this mechanism, which presents significant challenges for the development of cloud-based GIS systems. The third article is entitled "A Distributed Storage Schema for Cloud Computing based Raster GIS Systems". This paper proposes a NoSQL Database Management System (NDDBMS) based raster GIS data storage schema. NDDBMS has good scalability and is able to use distributed commodity computers, which make it superior to Relational Database Management Systems (RDBMS) in a cloud computing environment. In order to provide optimized data service performance, the proposed storage schema analyzes the nature of commonly used raster GIS data sets. It discriminates two categories of commonly used data sets, and then designs corresponding data storage models for both categories. 
As a result, the proposed storage schema is capable of hosting and serving enormous volumes of raster GIS data speedily and efficiently on cloud computing infrastructures. In addition, the scheme also takes advantage of the data compression characteristics of Quadtrees, thus promoting efficient data storage. Through this assessment of cloud computing technology, the exploration of the challenges and solutions to the migration of GIS algorithms to cloud computing infrastructures, and the examination of strategies for serving large amounts of GIS data in a cloud computing infrastructure, this dissertation lends support to the feasibility of building a cloud-based GIS system. However, there are still challenges that need to be addressed before a full-scale functional cloud-based GIS system can be successfully implemented. (Abstract shortened by UMI.)
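The quadtree compression the storage schema exploits can be shown in miniature: uniform quadrants collapse into single leaves, so homogeneous raster regions store compactly. A minimal pure-Python sketch on a binary grid (illustrative only, not the dissertation's schema):

```python
# Minimal quadtree for a binary raster: uniform size x size blocks become
# single leaves, so homogeneous regions compress well.

def quadtree(grid, x, y, size):
    """Return a nested-tuple quadtree for the size x size square at (x, y)."""
    first = grid[y][x]
    if all(grid[y + j][x + i] == first
           for j in range(size) for i in range(size)):
        return first                          # uniform block -> single leaf
    h = size // 2
    return (quadtree(grid, x, y, h),          # NW quadrant
            quadtree(grid, x + h, y, h),      # NE quadrant
            quadtree(grid, x, y + h, h),      # SW quadrant
            quadtree(grid, x + h, y + h, h))  # SE quadrant

def count_leaves(node):
    """Number of leaves in the tree (stored values after compression)."""
    if not isinstance(node, tuple):
        return 1
    return sum(count_leaves(child) for child in node)

# A 4x4 raster: one uniform "cloudy" quadrant in an otherwise clear scene.
raster = [
    [1, 1, 0, 0],
    [1, 1, 0, 0],
    [0, 0, 0, 0],
    [0, 0, 0, 0],
]
tree = quadtree(raster, 0, 0, 4)
print(count_leaves(tree))  # 4 leaves instead of 16 pixels
```

Because real raster GIS layers often contain large homogeneous regions (ocean, nodata, uniform land cover), this kind of hierarchical collapse is what makes the quadtree-aware storage schema space-efficient.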
Impact of different cloud deployments on real-time video applications for mobile video cloud users
NASA Astrophysics Data System (ADS)
Khan, Kashif A.; Wang, Qi; Luo, Chunbo; Wang, Xinheng; Grecos, Christos
2015-02-01
The trend of accessing mobile cloud services through wireless network connectivity has grown rapidly worldwide among both entrepreneurs and home end users. Although existing public cloud service vendors such as Google and Microsoft Azure provide on-demand cloud services at affordable cost for mobile users, a number of challenges remain in achieving high-quality mobile cloud-based video applications, especially due to the bandwidth-constrained and error-prone mobile network connectivity, which is the communication bottleneck for end-to-end video delivery. In addition, existing accessible cloud networking architectures differ in their implementation, services, resources, storage, pricing, support and so on, and these differences have varied impacts on the performance of cloud-based real-time video applications. Nevertheless, these challenges and impacts have not been thoroughly investigated in the literature. In our previous work, we implemented a mobile cloud network model that integrates localized and decentralized cloudlets (mini-clouds) and wireless mesh networks. In this paper, we deploy a real-time framework consisting of various existing Internet cloud networking architectures (Google Cloud, Microsoft Azure and Eucalyptus Cloud) and a cloudlet based on Ubuntu Enterprise Cloud over wireless mesh networking technology for mobile cloud end users. Notably, accessing real-time video streaming over HTTP/HTTPS is gaining popularity among both research and industrial communities because it leverages existing web services and HTTP infrastructure in the Internet. To study the performance of deployments using different public and private cloud service providers, we employ real-time video streaming over the HTTP/HTTPS standard, and conduct an experimental evaluation and in-depth comparative analysis of the impact of different deployments on the quality of service for mobile video cloud users.
Empirical results are presented and discussed to quantify and explain the different impacts resulting from various cloud deployments, video applications, wireless/mobile network settings, and user mobility. Additionally, this paper analyses the advantages, disadvantages, limitations and optimization techniques of the various cloud networking deployments, in particular the cloudlet approach compared with the Internet cloud approach, and highlights recommended optimized deployments. Finally, federated clouds and inter-cloud collaboration challenges and opportunities are discussed in the context of supporting real-time video applications for mobile users.
NASA Astrophysics Data System (ADS)
Hsu, Kuo-Hsien
2012-11-01
Formosat-2 imagery is high-spatial-resolution (2 m GSD) remote sensing satellite data comprising one panchromatic band and four multispectral bands (blue, green, red, near-infrared). An essential step in the daily processing of received Formosat-2 images is estimating the cloud statistics of each image using an Automatic Cloud Coverage Assessment (ACCA) algorithm; the cloud statistics are subsequently recorded as important metadata for the image product catalog. In this paper, we propose an ACCA method with two consecutive stages: pre-processing and post-processing analysis. For pre-processing analysis, unsupervised K-means classification, Sobel's method, thresholding, non-cloudy pixel re-examination, and a cross-band filter method are applied in sequence to determine the cloud statistics. For post-processing analysis, the box-counting fractal method is applied. In other words, the cloud statistics are first determined via pre-processing analysis, and their correctness across the different spectral bands is then cross-examined qualitatively and quantitatively via post-processing analysis. The selection of an appropriate thresholding method is critical to the result of the ACCA method. Therefore, in this work we first conduct a series of experiments on clustering-based and spatial thresholding methods, including Otsu's, Local Entropy (LE), Joint Entropy (JE), Global Entropy (GE), and Global Relative Entropy (GRE) methods, for performance comparison. The results show that Otsu's and GE methods both outperform the others for Formosat-2 imagery. Our proposed ACCA method, using Otsu's method as the thresholding method, successfully extracts the cloudy pixels of Formosat-2 images for accurate cloud statistics estimation.
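Otsu's method, the clustering-based threshold selection the abstract singles out, picks the gray level that maximizes the between-class variance of the histogram. The sketch below is the textbook formulation, not the authors' Formosat-2 pipeline; the 0-255 range and bin count are illustrative assumptions.

```python
import numpy as np

def otsu_threshold(image, nbins=256):
    """Return the Otsu threshold for `image`.

    Maximizes the between-class variance sigma_b^2 = w0*w1*(mu0 - mu1)^2,
    where w0, w1 are the class probabilities and mu0, mu1 the class means
    for each candidate split of the gray-level histogram.
    """
    hist, edges = np.histogram(image, bins=nbins, range=(0, 255))
    p = hist.astype(float) / hist.sum()
    centers = 0.5 * (edges[:-1] + edges[1:])
    w0 = np.cumsum(p)                 # weight of the "dark" class per split
    w1 = 1.0 - w0
    mu = np.cumsum(p * centers)       # cumulative first moment
    with np.errstate(divide="ignore", invalid="ignore"):
        mu0 = mu / w0
        mu1 = (mu[-1] - mu) / w1
        sigma_b2 = w0 * w1 * (mu0 - mu1) ** 2
    sigma_b2 = np.nan_to_num(sigma_b2)  # empty classes contribute nothing
    return centers[np.argmax(sigma_b2)]
```

On a bimodal cloud/non-cloud brightness histogram, the returned threshold falls in the valley between the two modes, which is why the method suits cloud-pixel separation.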
Assessment of different models for computing the probability of a clear line of sight
NASA Astrophysics Data System (ADS)
Bojin, Sorin; Paulescu, Marius; Badescu, Viorel
2017-12-01
This paper focuses on modeling the morphological properties of cloud fields in terms of the probability of a clear line of sight (PCLOS). PCLOS is defined as the probability that a line of sight between an observer and a given point of the celestial vault passes freely without intersecting a cloud. A variety of PCLOS models assuming hemispherical, semi-ellipsoidal, and ellipsoidal cloud shapes are tested. The effective parameters (cloud aspect ratio and absolute cloud fraction) are extracted from high-resolution series of sunshine number measurements. The performance of the PCLOS models is evaluated from the perspective of their ability to retrieve the point cloudiness. The advantages and disadvantages of the tested models are discussed, aiming at a simplified parameterization of PCLOS models.
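The PCLOS definition can be made concrete with a Monte Carlo experiment for the simplest cloud shape the paper tests, the hemisphere. The sketch below is purely illustrative geometry under stated assumptions (equal-radius hemispherical clouds scattered as a Poisson field on a plane, overlap ignored when converting cloud fraction to number density); it is not one of the authors' parameterized models.

```python
import numpy as np

def pclos_hemispheres(theta, cloud_fraction, radius=1.0, trials=4000,
                      domain=10.0, seed=0):
    """Monte Carlo estimate of the probability of a clear line of sight
    through a random field of hemispherical clouds.

    Cloud centers are dropped on the z=0 plane with density chosen so the
    expected (overlap-ignored) cover equals `cloud_fraction`. The observer
    is at the origin; the ray leaves at zenith angle `theta` (radians).
    """
    rng = np.random.default_rng(seed)
    lam = cloud_fraction / (np.pi * radius**2)   # cloud centers per unit area
    area = (2 * domain) ** 2
    d = np.array([np.sin(theta), 0.0, np.cos(theta)])  # unit ray direction
    clear = 0
    for _ in range(trials):
        n = rng.poisson(lam * area)
        c = np.stack([rng.uniform(-domain, domain, n),
                      rng.uniform(-domain, domain, n),
                      np.zeros(n)], axis=1)
        t = c @ d                                 # closest-approach parameter
        d2 = np.einsum("ij,ij->i", c, c) - t**2   # squared ray-center distance
        # forward (or enclosing) sphere hits block the line of sight
        blocked = (t > -radius) & (d2 < radius**2)
        clear += not blocked.any()
    return clear / trials
```

At the zenith the estimate should approach the Poisson result exp(-N) for absolute cloud fraction N, and PCLOS decreases as the zenith angle grows, matching the qualitative behavior such models are built to capture.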
Status of High Latitude Precipitation Estimates from Observations and Reanalyses
NASA Technical Reports Server (NTRS)
Behrangi, Ali; Christensen, Matthew; Richardson, Mark; Lebsock, Matthew; Stephens, Graeme; Huffman, George J.; Bolvin, David T.; Adler, Robert F.; Gardner, Alex; Lambrigtsen, Bjorn H.;
2016-01-01
An intercomparison of high-latitude precipitation characteristics from observation-based and reanalysis products is performed. In particular, the precipitation products from CloudSat provide an independent assessment of other widely used products, these being the observationally based Global Precipitation Climatology Project (GPCP), Global Precipitation Climatology Centre, and Climate Prediction Center Merged Analysis of Precipitation (CMAP) products and the ERA-Interim, Modern-Era Retrospective Analysis for Research and Applications (MERRA), and National Centers for Environmental Prediction-Department of Energy Reanalysis 2 (NCEP-DOE R2) reanalyses. Seasonal and annual total precipitation in both hemispheres poleward of 55° latitude is considered in all products, and CloudSat is used to assess intensity and frequency of precipitation occurrence by phase, defined as rain, snow, or mixed phase. Furthermore, an independent estimate of snow accumulation during the cold season was calculated from the Gravity Recovery and Climate Experiment. The intercomparison is performed for the 2007-2010 period when CloudSat was fully operational. It is found that ERA-Interim and MERRA are broadly similar, agreeing more closely with CloudSat over oceans. ERA-Interim also agrees well with CloudSat estimates of snowfall over Antarctica, where total snowfall from GPCP and CloudSat is almost identical. A number of disagreements on regional or seasonal scales are identified: CMAP reports much lower ocean precipitation relative to other products, NCEP-DOE R2 reports much higher summer precipitation over Northern Hemisphere land, GPCP reports much higher snowfall over Eurasia, and CloudSat overestimates precipitation over Greenland, likely due to mischaracterization of rain and mixed-phase precipitation. These outliers are likely unrealistic for these specific regions and time periods.
These estimates from observations and reanalyses provide useful insights for diagnostic assessment of precipitation products in high latitudes, quantifying the current uncertainties, improving the products, and establishing a benchmark for assessment of climate models.
Cloud Computing with iPlant Atmosphere.
McKay, Sheldon J; Skidmore, Edwin J; LaRose, Christopher J; Mercer, Andre W; Noutsos, Christos
2013-10-15
Cloud Computing refers to distributed computing platforms that use virtualization software to provide easy access to physical computing infrastructure and data storage, typically administered through a Web interface. Cloud-based computing provides access to powerful servers, with specific software and virtual hardware configurations, while eliminating the initial capital cost of expensive computers and reducing the ongoing operating costs of system administration, maintenance contracts, power consumption, and cooling. This eliminates a significant barrier to entry into bioinformatics and high-performance computing for many researchers. This is especially true of free or modestly priced cloud computing services. The iPlant Collaborative offers a free cloud computing service, Atmosphere, which allows users to easily create and use instances on virtual servers preconfigured for their analytical needs. Atmosphere is a self-service, on-demand platform for scientific computing. This unit demonstrates how to set up, access and use cloud computing in Atmosphere. Copyright © 2013 John Wiley & Sons, Inc.
Electron Cloud Measurements in Fermilab Main Injector and Recycler
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eldred, Jeffrey Scott; Backfish, M.; Tan, C. Y.
This conference paper presents a series of electron cloud measurements in the Fermilab Main Injector and Recycler. A new instability was observed in the Recycler in July 2014 that generates a fast transverse excitation in the first high-intensity batch to be injected. Microwave measurements of electron cloud in the Recycler show a corresponding dependence on the batch injection pattern. These electron cloud measurements are compared to those made with a retarding field analyzer (RFA) installed in a field-free region of the Recycler in November. RFAs are also used in the Main Injector to evaluate the performance of beampipe coatings for the mitigation of electron cloud. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. The diamond-like carbon coating, in contrast, reduced the electron cloud signal to 1% of that measured in uncoated stainless steel beampipe.
Sector and Sphere: the design and implementation of a high-performance data cloud
Gu, Yunhong; Grossman, Robert L.
2009-01-01
Cloud computing has demonstrated that processing very large datasets over commodity clusters can be done simply, given the right programming model and infrastructure. In this paper, we describe the design and implementation of the Sector storage cloud and the Sphere compute cloud. By contrast with the existing storage and compute clouds, Sector can manage data not only within a data centre, but also across geographically distributed data centres. Similarly, the Sphere compute cloud supports user-defined functions (UDFs) over data both within and across data centres. As a special case, MapReduce-style programming can be implemented in Sphere by using a Map UDF followed by a Reduce UDF. We describe some experimental studies comparing Sector/Sphere and Hadoop using the Terasort benchmark. In these studies, Sector is approximately twice as fast as Hadoop. Sector/Sphere is open source. PMID:19451100
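The Map-UDF-followed-by-Reduce-UDF composition the abstract describes can be illustrated in miniature. This single-process Python sketch is not Sector/Sphere's actual C++ API; it only mirrors the shape of the computation, with each "segment" standing in for data held on a different storage node.

```python
from collections import defaultdict

def run_sphere_style(segments, map_udf, reduce_udf):
    """Apply a Map UDF to every record of every segment, group the
    emitted (key, value) pairs, then apply a Reduce UDF per key -- the
    MapReduce special case of UDF processing described in the abstract."""
    groups = defaultdict(list)
    for segment in segments:          # each segment would live on a separate node
        for record in segment:
            for key, value in map_udf(record):
                groups[key].append(value)
    return {key: reduce_udf(key, values) for key, values in groups.items()}

# Word count as the canonical example.
segments = [["the quick fox", "the lazy dog"], ["the end"]]
counts = run_sphere_style(
    segments,
    map_udf=lambda line: [(w, 1) for w in line.split()],
    reduce_udf=lambda key, values: sum(values),
)
```

In Sphere the general case is broader than this: a UDF need not emit key-value pairs at all, which is what distinguishes its model from strict MapReduce.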
NASA Astrophysics Data System (ADS)
Nomokonova, Tatiana; Ebell, Kerstin; Löhnert, Ulrich; Maturilli, Marion
2017-04-01
Clouds are a crucial component of the hydrological and energy cycles and thus affect the global climate. Their special importance in Arctic regions stems from their influence on the radiation budget. Arctic clouds usually occur at low altitudes and often contain high concentrations of small liquid droplets. During winter, spring, and autumn such clouds tend to trap long-wave radiation in the atmosphere and thus warm the Arctic climate. In summer, though, clouds efficiently scatter solar radiation back to space and therefore induce a cooling effect. An accurate characterization of the net effect of clouds on the Arctic climate requires long-term and precise observations. However, only a few measurement sites perform continuous, vertically resolved observations of clouds in the Arctic, e.g. in Alaska, Canada, and Greenland. These sites typically use a combination of ground-based remote sensing instruments, e.g. cloud radar, ceilometer and microwave radiometer, to characterize clouds. Within the Transregional Collaborative Research Center (TR 172) "Arctic Amplification: Climate Relevant Atmospheric and Surface Processes, and Feedback Mechanisms (AC)3", comprehensive observations of the atmospheric column are performed at the German-French Research Station AWIPEV at Ny-Ålesund, Svalbard. Ny-Ålesund is located in the warmest part of the Arctic, where the climate is significantly influenced by adiabatic heating from the warm ocean. Measurements at Ny-Ålesund will therefore complement our understanding of cloud formation and development in the Arctic. This particular study is devoted to characterizing the macro- and microphysical properties of clouds at Ny-Ålesund and the atmospheric conditions under which these clouds form and develop.
To this end, the information from the various instruments at the AWIPEV observatory is synergistically analysed: information about the thermodynamic structure of the atmosphere is obtained from long-term radiosonde launches. In addition, continuous vertical profiles of temperature and humidity are provided by the microwave radiometer HATPRO. A set of active remote sensing instruments performs cloud observations at Ny-Ålesund: a ceilometer and a Doppler lidar, operating since 2011 and 2013, respectively, are now complemented by a novel 94 GHz FMCW cloud radar. As a first step, the CLOUDNET algorithms, including target categorization and classification, are applied to the observations. In this study, we will present a first analysis of cloud properties at Ny-Ålesund, including for example cloud occurrence, cloud geometry (cloud base, cloud top, and thickness) and cloud type (liquid, ice, mixed-phase). The different types of clouds are set into the context of environmental conditions such as temperature, amount of water vapour, and liquid water. We also expect the cloud properties to depend strongly on wind direction. The first results of this analysis will also be shown.
NASA Astrophysics Data System (ADS)
Hoose, C.; Lohmann, U.; Stier, P.; Verheggen, B.; Weingartner, E.; Herich, H.
2007-12-01
The global aerosol-climate model ECHAM5-HAM (Stier et al., 2005) has been extended by an explicit treatment of cloud-borne particles. Two additional modes for in-droplet and in-crystal particles are introduced, which are coupled to the cloud droplet and ice crystal number concentrations simulated by the ECHAM5 double-moment cloud microphysics scheme (Lohmann et al., 2007). Transfer, production and removal of cloud-borne aerosol number and mass by cloud droplet activation, collision scavenging, aqueous-phase sulfate production, freezing, melting, evaporation, sublimation and precipitation formation are taken into account. The model performance is demonstrated and validated against observations of the evolution of total and interstitial aerosol concentrations and size distributions during three different mixed-phase cloud events at the alpine high-altitude research station Jungfraujoch (Switzerland) (Verheggen et al., 2007). Although the single-column simulations cannot be compared one-to-one with the observations, the governing processes in the evolution of the cloud and aerosol parameters are captured qualitatively well. High scavenged fractions are found during the presence of liquid water, while the release of particles during the Bergeron-Findeisen process results in low scavenged fractions after cloud glaciation. The observed coexistence of liquid and ice, which might be related to cloud heterogeneity at subgrid scales, can only be simulated in the model when non-equilibrium conditions are forced. References: U. Lohmann et al., Cloud microphysics and aerosol indirect effects in the global climate model ECHAM5-HAM, Atmos. Chem. Phys. 7, 3425-3446 (2007); P. Stier et al., The aerosol-climate model ECHAM5-HAM, Atmos. Chem. Phys. 5, 1125-1156 (2005); B. Verheggen et al., Aerosol partitioning between the interstitial and the condensed phase in mixed-phase clouds, accepted for publication in J. Geophys. Res. (2007).
NASA Technical Reports Server (NTRS)
da Silva, Arlindo M.; Norris, Peter M.
2013-01-01
Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and for other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.
Cirrus Cloud Retrieval Using Infrared Sounding Data: Multilevel Cloud Errors.
NASA Astrophysics Data System (ADS)
Baum, Bryan A.; Wielicki, Bruce A.
1994-01-01
In this study we perform an error analysis for cloud-top pressure retrieval using the High-Resolution Infrared Radiometric Sounder (HIRS/2) 15-µm CO2 channels for the two-layer case of transmissive cirrus overlying an overcast, opaque stratiform cloud. This analysis includes standard deviation and bias errors due to instrument noise and the presence of two cloud layers, the lower of which is opaque. Instantaneous cloud pressure retrieval errors are determined for a range of cloud amounts (0.1-1.0) and cloud-top pressures (850-250 mb). Large cloud-top pressure retrieval errors are found to occur when a lower opaque layer is present underneath an upper transmissive cloud layer in the satellite field of view (FOV). Errors tend to increase with decreasing upper-cloud effective cloud amount and with decreasing cloud height (increasing pressure). Errors in retrieved upper-cloud pressure result in corresponding errors in derived effective cloud amount. For the case in which a HIRS FOV contains two distinct cloud layers, the difference between the retrieved and actual cloud-top pressure is positive in all cases, meaning that the retrieved upper-cloud height is lower than the actual upper-cloud height. In addition, errors in retrieved cloud pressure are found to depend upon the lapse rate between the low-level cloud top and the surface. We examined which sounder channel combinations would minimize the total errors in derived cirrus cloud height caused by instrument noise and by the presence of a lower-level cloud. We find that while the sounding channels that peak between 700 and 1000 mb minimize random errors, the sounding channels that peak at 300-500 mb minimize bias errors. For a cloud climatology, the bias errors are most critical.
Ground truth spectrometry and imagery of eruption clouds to maximize utility of satellite imagery
NASA Technical Reports Server (NTRS)
Rose, William I.
1993-01-01
Field experiments with thermal imaging infrared radiometers were performed and a laboratory system was designed for the controlled study of simulated ash clouds. Using AVHRR (Advanced Very High Resolution Radiometer) thermal infrared bands 4 and 5, a radiative transfer method was developed to retrieve particle sizes, optical depth and particle mass in volcanic clouds. A model was developed for measuring the same parameters using TIMS (Thermal Infrared Multispectral Scanner), MODIS (Moderate Resolution Imaging Spectrometer), and ASTER (Advanced Spaceborne Thermal Emission and Reflection Radiometer). Related publications are attached.
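Two-band (11/12 µm) ash work of this kind typically starts from the reverse-absorption signal: silicate ash absorbs more strongly at 11 µm (AVHRR band 4) than at 12 µm (band 5), so ash pixels tend toward a negative brightness-temperature difference, the opposite of water and ice clouds. The sketch below shows only that screening step, under the assumption that a negative BTD threshold is used; the report's full radiative transfer retrieval of size, optical depth and mass is not reproduced here.

```python
import numpy as np

def ash_flag(bt4, bt5, threshold=0.0):
    """Flag likely ash pixels by the band 4 minus band 5
    brightness-temperature difference (BTD).

    BTD < threshold (typically 0 K) marks the negative reverse-absorption
    signature of silicate ash; meteorological water/ice clouds usually
    show BTD >= 0.
    """
    btd = np.asarray(bt4, dtype=float) - np.asarray(bt5, dtype=float)
    return btd < threshold
```

In practice the threshold is tuned per scene, since water vapor loading and viewing geometry can shift the BTD of ash-free pixels.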
An Automatic Cloud Mask Algorithm Based on Time Series of MODIS Measurements
NASA Technical Reports Server (NTRS)
Lyapustin, Alexei; Wang, Yujie; Frey, R.
2008-01-01
Quality of aerosol retrievals and atmospheric correction depends strongly on the accuracy of the cloud mask (CM) algorithm. The heritage CM algorithms developed for AVHRR and MODIS use the latest sensor measurements of spectral reflectance and brightness temperature and perform processing at the pixel level. The algorithms are threshold-based and empirically tuned. They do not explicitly address the classical problem of cloud search, wherein a baseline clear-skies scene is defined for comparison. Here, we report on a new CM algorithm which explicitly builds and maintains a reference clear-skies image of the surface (refcm) using a time series of MODIS measurements. The new algorithm, developed as part of the Multi-Angle Implementation of Atmospheric Correction (MAIAC) algorithm for MODIS, relies on the fact that clear-skies images of the same surface area share a common textural pattern, defined by the surface topography, boundaries of rivers and lakes, distribution of soils and vegetation, etc. This pattern changes slowly at the daily rate of global Earth observations, whereas clouds introduce high-frequency random disturbances. Under clear skies, consecutive gridded images of the same surface area have a high covariance, whereas in the presence of clouds the covariance is usually low. This idea is central to the initialization of refcm, which is used to derive the cloud mask in combination with spectral and brightness temperature tests. The refcm is continuously updated with the latest clear-skies MODIS measurements, thus adapting to seasonal and rapid surface changes. The algorithm is enhanced by an internal dynamic land-water-snow classification coupled with a surface change mask. An initial comparison shows that the new algorithm offers the potential to perform better than the MODIS MOD35 cloud mask in situations where the land surface is changing rapidly, and over Earth regions covered by snow and ice.
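The covariance idea at the heart of the refcm approach can be sketched in a few lines. This toy version, with illustrative threshold and update-weight values not taken from MAIAC, labels a gridded observation clear when its spatial correlation with the reference clear-skies image is high, and rolls clear observations back into the reference so it tracks surface change.

```python
import numpy as np

def classify_and_update(refcm, obs, corr_min=0.8, alpha=0.1):
    """Clear/cloudy decision from correlation with a clear-skies reference.

    Clear scenes share the surface's textural pattern with refcm (high
    covariance/correlation); clouds disrupt that pattern (low correlation).
    On a clear hit the reference is updated with an exponential moving
    average so it adapts to seasonal and rapid surface changes.
    """
    r = np.corrcoef(refcm.ravel(), obs.ravel())[0, 1]
    clear = bool(r >= corr_min)
    if clear:
        refcm = (1 - alpha) * refcm + alpha * obs
    return clear, refcm
```

The real algorithm applies this per grid cell alongside spectral and brightness-temperature tests; the correlation test alone is only the "texture agreement" ingredient.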
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martin, Shawn
This code consists of Matlab routines which enable the user to perform non-manifold surface reconstruction via triangulation from high-dimensional point cloud data. The code is based on an algorithm originally developed in [Freedman (2007), An Incremental Algorithm for Reconstruction of Surfaces of Arbitrary Codimension, Computational Geometry: Theory and Applications, 36(2):106-116]. This algorithm has been modified to accommodate non-manifold surfaces according to the work described in [S. Martin and J.-P. Watson (2009), Non-Manifold Surface Reconstruction from High Dimensional Point Cloud Data, SAND #5272610]. The motivation for developing the code was a point cloud describing the molecular conformation space of cyclooctane (C8H16). Cyclooctane conformation space was represented using points in 72 dimensions (3 coordinates for each atom). The code was used to triangulate the point cloud and thereby study the geometry and topology of cyclooctane. Future applications are envisioned for peptides and proteins.
Design and deployment of an elastic network test-bed in IHEP data center based on SDN
NASA Astrophysics Data System (ADS)
Zeng, Shan; Qi, Fazhi; Chen, Gang
2017-10-01
High energy physics experiments produce huge amounts of raw data, but because of the shared nature of network resources, there is no guarantee of available bandwidth for each experiment, which may cause link congestion problems. On the other hand, with the development of cloud computing technologies, IHEP has established a cloud platform based on OpenStack which ensures the flexibility of computing and storage resources, and more and more computing applications have been deployed on virtual machines created by OpenStack. However, under the traditional network architecture, network capacity cannot be provisioned elastically, which becomes a bottleneck restricting the flexible application of cloud computing. To solve these problems, we propose an elastic cloud data center network architecture based on SDN, and we design a high-performance controller cluster based on OpenDaylight. Finally, we present our current test results.
Optical Depth Sensor (ODS) for the measurement of dust and clouds properties in the Mars atmosphere
NASA Astrophysics Data System (ADS)
Toledo, D.; Rannou, P.; Pommereau, J.-P.; Sarkissian, A.; Foujols, T.
2014-04-01
A small and sophisticated optical depth sensor (ODS) has been designed to work in both Martian and Earth environments. The principal goal of ODS is to measure the opacity due to Martian dust and to characterize high-altitude clouds at twilight, crucial parameters for understanding Martian meteorology. The instrument was initially designed for the failed MARS96 Russian mission and was also included in the payload of several other missions [1]. Until recently, it was selected (NASA/ESA AO) for the payload of the atmospheric package DREAMS onboard the MARS 2016 mission, but following a decision by CNES it is no longer included in the payload. In order to study the performance of ODS under a wide range of conditions, as well as its ability to provide daily measurements of both dust optical thickness and high-altitude cloud properties, the instrument has participated in different terrestrial campaigns. Good performance of the ODS prototype (Figure 1) in cirrus cloud detection and dust opacity estimation was previously achieved in Africa during 2004-2005 and in Brazil from 2012 to the present. Moreover, a campaign in the Arctic is expected before 2016, in which fifteen ODSs will be part of an integrated observing system over the Arctic Ocean, allowing the ODS performance to be tested in extreme conditions. In this presentation we describe the main principle of the retrieval, the instrumental concept, the results of the tests performed, and the principal objectives of ODS at Mars.
Beam tests of beampipe coatings for electron cloud mitigation in Fermilab Main Injector
Backfish, Michael; Eldred, Jeffrey; Tan, Cheng Yang; ...
2015-10-26
Electron cloud beam instabilities are an important consideration in virtually all high-energy particle accelerators and could pose a formidable challenge to forthcoming high-intensity accelerator upgrades. Dedicated tests have shown beampipe coatings dramatically reduce the density of electron cloud in particle accelerators. In this work, we evaluate the performance of titanium nitride, amorphous carbon, and diamond-like carbon as beampipe coatings for the mitigation of electron cloud in the Fermilab Main Injector. Altogether our tests represent 2700 ampere-hours of proton operation spanning five years. Three electron cloud detectors, retarding field analyzers, are installed in a straight section and allow a direct comparison between the electron flux in the coated and uncoated stainless steel beampipe. We characterize the electron flux as a function of intensity up to a maximum of 50 trillion protons per cycle. Each beampipe material conditions in response to electron bombardment from the electron cloud, and we track the changes in these materials as a function of time and the number of absorbed electrons. Contamination from an unexpected vacuum leak revealed a potential vulnerability in the amorphous carbon beampipe coating. We measure the energy spectrum of electrons incident on the stainless steel, titanium nitride and amorphous carbon beampipes. We find the electron cloud signal is highly sensitive to stray magnetic fields and bunch length over the Main Injector ramp cycle. In conclusion, we conduct a complete survey of the stray magnetic fields at the test station and compare the electron cloud signal to that in a field-free region.
Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian
2011-08-30
Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines that distribute pre-packaged, pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.
Yang, Hui; He, Yongqi; Zhang, Jie; Ji, Yuefeng; Bai, Wei; Lee, Young
2016-04-18
Cloud radio access network (C-RAN) has become a promising scenario for accommodating high-performance services with ubiquitous user coverage and real-time cloud computing using cloud BBUs. In our previous work, we implemented cross-stratum optimization of optical network and application stratum resources, allowing services to be accommodated in optical networks. This study extends that work to consider the optimization of multiple resource dimensions, namely radio, optical and BBU processing, in the 5G age. We propose a novel multi-stratum resources optimization (MSRO) architecture with network functions virtualization for cloud-based radio over optical fiber networks (C-RoFN) using software-defined control. A global evaluation scheme (GES) for MSRO in C-RoFN is introduced based on the proposed architecture. The MSRO can enhance responsiveness to dynamic end-to-end user demands and globally optimize radio frequency, optical and BBU resources effectively to maximize radio coverage. The efficiency and feasibility of the proposed architecture are experimentally demonstrated on an OpenFlow-based enhanced SDN testbed. The performance of GES under a heavy traffic load scenario is also quantitatively evaluated on the MSRO architecture in terms of resource occupation rate and path provisioning latency, compared with another provisioning scheme.
PLAStiCC: Predictive Look-Ahead Scheduling for Continuous dataflows on Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kumbhare, Alok; Simmhan, Yogesh; Prasanna, Viktor K.
2014-05-27
Scalable stream processing and continuous dataflow systems are gaining traction with the rise of big data due to the need for processing high-velocity data in near real time. Unlike batch processing systems such as MapReduce and workflows, static scheduling strategies fall short for continuous dataflows due to variations in the input data rates and the need for sustained throughput. The elastic resource provisioning of cloud infrastructure is valuable for meeting the changing resource needs of such continuous applications. However, multi-tenant cloud resources introduce yet another dimension of performance variability that impacts the application's throughput. In this paper we propose PLAStiCC, an adaptive scheduling algorithm that balances resource cost and application throughput using a prediction-based look-ahead approach. It addresses not only variations in the input data rates but also performance variability in the underlying cloud infrastructure. In addition, we propose several simpler static scheduling heuristics that operate in the absence of an accurate performance prediction model. These static and adaptive heuristics are evaluated through extensive simulations using performance traces obtained from public and private IaaS clouds. Our results show an improvement of up to 20% in the overall profit compared with the reactive adaptation algorithm.
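The look-ahead idea can be illustrated with a minimal sketch (the function, parameter names and cost figures here are hypothetical, not the authors' PLAStiCC implementation): size the cluster for the peak input rate predicted over a short horizon, so that allocations change less often than under a purely reactive policy that sizes for the current rate alone.

```python
def lookahead_allocation(predicted_rates, rate_per_vm, vm_cost, horizon=3):
    """At each step, allocate the fewest VMs that can sustain the peak
    input rate predicted over the next `horizon` steps.

    Returns (allocations, total_cost). A reactive policy would size for
    predicted_rates[t] only, re-provisioning on every transient spike."""
    allocations, cost = [], 0.0
    for t in range(len(predicted_rates)):
        peak = max(predicted_rates[t:t + horizon])
        vms = -(-peak // rate_per_vm)  # ceiling division
        allocations.append(int(vms))
        cost += vms * vm_cost
    return allocations, cost
```

With rates [100, 400, 380, 120, 90] and 200 messages/s per VM, the allocation stays at 2 VMs through the spike instead of oscillating, at the price of briefly over-provisioning before it.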
NASA Astrophysics Data System (ADS)
Ma, Zhanshan; Liu, Qijun; Zhao, Chuanfeng; Shen, Xueshun; Wang, Yuan; Jiang, Jonathan H.; Li, Zhe; Yung, Yuk
2018-03-01
An explicit prognostic cloud-cover scheme (PROGCS) is implemented into the Global/Regional Assimilation and Prediction System (GRAPES) global medium-range numerical weather prediction system (GRAPES_GFS) to improve the model performance in simulating cloud cover and radiation. Unlike the previous diagnostic cloud-cover scheme (DIAGCS), PROGCS considers the formation and dissipation of cloud cover by physically connecting it to the cumulus convection and large-scale stratiform condensation processes. Our simulation results show that clouds in mid-high latitudes arise mainly from large-scale stratiform condensation processes, while cumulus convection and large-scale condensation processes jointly determine cloud cover in low latitudes. Compared with DIAGCS, PROGCS captures vertical distributions of cloud cover more consistent with the observations from the Atmospheric Radiation Measurement (ARM) program at the Southern Great Plains (SGP) site and simulates a more realistic diurnal cycle of marine stratocumulus relative to the ERA-Interim reanalysis data. The low, high, and total cloud covers determined via PROGCS appear to be more realistic than those simulated via DIAGCS when both are compared with satellite retrievals, though the former maintains slight negative biases. In addition, the simulations of outgoing longwave radiation (OLR) at the top of the atmosphere (TOA) from PROGCS runs are considerably improved as well, resulting in smaller biases in the radiative heating rates of GRAPES_GFS at heights below 850 hPa and above 400 hPa. Our results indicate that a prognostic method of cloud-cover calculation has a significant advantage over the conventional diagnostic one, and should be adopted in both weather and climate simulation and forecasting.
Holtzapple, R. L.; Billing, M. G.; Campbell, R. C.; ...
2016-04-11
Electron cloud related emittance dilution and instabilities of bunch trains limit the performance of high-intensity circular colliders. One of the key goals of the Cornell electron-positron storage ring Test Accelerator (CesrTA) research program is to improve our understanding of how the electron cloud alters the dynamics of bunches within the train. Single-bunch beam diagnostics have been developed to measure the beam spectra and vertical beam size, which capture two important dynamical effects of beams interacting with the electron cloud, for bunch trains on a turn-by-turn basis. Experiments have been performed at CesrTA to probe the interaction of the electron cloud with stored positron bunch trains. The purpose of these experiments was to characterize the dependence of beam-electron cloud interactions on machine parameters such as bunch spacing, vertical chromaticity, and bunch current. The beam dynamics of the stored beam, in the presence of the electron cloud, was quantified using: 1) a gated beam position monitor (BPM) and spectrum analyzer to measure the bunch-by-bunch frequency spectrum of the bunch trains; 2) an x-ray beam size monitor to record the bunch-by-bunch, turn-by-turn vertical size of each bunch within the trains. In this study we report on the observations from these experiments and analyze the effects of the electron cloud on the stability of bunches in a train under many different operational conditions.
Active Raman sounding of the earth's water vapor field.
Tratt, David M; Whiteman, David N; Demoz, Belay B; Farley, Robert W; Wessel, John E
2005-08-01
The typically weak cross-sections characteristic of Raman processes have historically limited their use in atmospheric remote sensing to nighttime applications. However, with advances in instrumentation and techniques, it is now possible to apply Raman lidar to the monitoring of atmospheric water vapor, aerosols and clouds throughout the diurnal cycle. Upper-tropospheric and lower-stratospheric measurements of water vapor using Raman lidar are also possible but are limited to nighttime and require long integration times. However, boundary layer studies of water vapor variability can now be performed with high temporal and spatial resolution. This paper reviews the current state of the art of Raman lidar for high-resolution measurements of the atmospheric water vapor, aerosol and cloud fields. In particular, we describe the use of Raman lidar for mapping the vertical distribution and variability of atmospheric water vapor, aerosols and clouds throughout the evolution of dynamic meteorological events. The ability of Raman lidar to detect and characterize water in the region of the tropopause and the importance of high-altitude water vapor for climate-related studies and meteorological satellite performance are discussed.
NASA Astrophysics Data System (ADS)
McGibbon, J.; Bretherton, C. S.
2017-06-01
During the Marine ARM GPCI Investigation of Clouds (MAGIC) from October 2011 to September 2012, a container ship making periodic cruises between Los Angeles, CA, and Honolulu, HI, was instrumented with surface meteorological, aerosol and radiation instruments, a cloud radar and ceilometer, and radiosondes. Here large-eddy simulation (LES) is performed in a ship-following frame of reference for 13 four-day transects from the MAGIC field campaign. The goal is to assess whether LES can skillfully simulate the broad range of observed cloud characteristics and boundary layer structure across the subtropical stratocumulus-to-cumulus transition region sampled during different seasons and meteorological conditions. Results from Leg 15A, which sampled a particularly well-defined stratocumulus-to-cumulus transition, demonstrate the approach. The LES reproduces the observed timing of decoupling and transition from stratocumulus to cumulus and matches the observed evolution of boundary layer structure, cloud fraction, liquid water path, and precipitation statistics remarkably well. Considering the simulations of all 13 cruises, the LES skillfully simulates the mean diurnal variation of key measured quantities, including liquid water path (LWP), cloud fraction, measures of decoupling, and cloud radar-derived precipitation. The daily mean quantities are well represented, and daily mean LWP and cloud fraction show the expected correlation with estimated inversion strength. There is a -0.6 K low bias in LES near-surface air temperature that results in a high bias of 5.6 W m-2 in sensible heat flux (SHF). Overall, these results build confidence in the ability of LES to represent the northeast Pacific stratocumulus-to-trade-cumulus transition region.
Task 28: Web Accessible APIs in the Cloud Trade Study
NASA Technical Reports Server (NTRS)
Gallagher, James; Habermann, Ted; Jelenak, Aleksandar; Lee, Joe; Potter, Nathan; Yang, Muqun
2017-01-01
This study explored three candidate architectures for serving NASA Earth Science Hierarchical Data Format Version 5 (HDF5) data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the project were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and Network Common Data Format Version 4 (netCDF4) data in a cloud (web object store) environment, the target environment being the AWS Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input to a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades.
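The granulation/aggregation cost trade in objective (3) can be sketched with a toy model (the prices and function below are illustrative placeholders, not the study's benchmarked figures): storage cost scales with total volume regardless of granule size, while request cost grows as granules shrink, because each subset read then touches more objects.

```python
import math

def monthly_cost(total_gb, granule_mb, reads_per_month, read_mb,
                 storage_per_gb=0.023, per_1k_get=0.0004):
    """Toy S3-style cost model: each read of read_mb touches
    ceil(read_mb / granule_mb) objects, so finer granulation trades
    fewer wasted bytes per read for more GET-request charges."""
    storage = total_gb * storage_per_gb
    gets = reads_per_month * math.ceil(read_mb / granule_mb)
    return storage + gets / 1000 * per_1k_get
```

For a fixed read pattern, comparing a 10 MB granulation against a 100 MB one shows the request-cost component growing tenfold while the storage component is unchanged, which is the trade the cost model has to capture.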
NASA Astrophysics Data System (ADS)
Groß, Silke; Wirth, Martin; Gutleben, Manuel; Ewald, Florian; Kiemle, Christoph; Kölling, Tobias; Mayer, Bernhard
2017-04-01
Clouds and aerosols have a large impact on the Earth's radiation budget through scattering and absorption of solar and terrestrial radiation. Furthermore, aerosols can modify cloud properties and distribution. To date, no sufficient understanding of aerosol-cloud interaction or of the climate feedback of clouds has been achieved. In particular, shallow marine convection in the trade wind regions shows large uncertainties in climate feedback. A better understanding of these shallow marine convective clouds, and of how aerosols affect them, e.g. by changing cloud properties and distribution, is therefore highly desirable. During NARVAL-I (Next-generation airborne remote-sensing for validation studies) and NARVAL-II, a set of active and passive remote sensing instruments, i.e. a cloud radar, an aerosol and water vapor lidar system, a microwave radiometer, a hyperspectral imager (NARVAL-II only) and radiation measurements, was installed on the German research aircraft HALO. Measurements were performed out of Barbados over the tropical North Atlantic region in December 2013 and August 2016 to study shallow trade wind convection, as well as its environment, in the dry and wet seasons. While no or only few aerosol layers were observed above the marine boundary layer during the dry season in December 2013, part of the measurement area was influenced by a high aerosol load caused by long-range transport of Saharan dust during the NARVAL-II measurements in August 2016. Measurement flights during NARVAL-II were planned such that we could probe aerosol-influenced regions as well as areas with low aerosol load. The measurements during both campaigns thus provide the opportunity to investigate whether and how the transported aerosol layers change the distribution and formation of shallow marine convection by altering its properties and environment. In our presentation we focus on the lidar measurements performed during NARVAL-I and NARVAL-II.
We give an overview of the measurements and of the general aerosol and cloud situation, and show first results on how the cloud properties and distribution of shallow marine convection change in the presence of lofted aerosol layers. In particular, we determine whether aerosols modify the horizontal cloud distribution and cloud top height distribution by examining the correlations between aerosol load and cloud distribution, and we investigate whether and how the presence of a lofted aerosol layer changes the properties of the clouds, e.g. by acting as ice nuclei.
From large-eddy simulation to multi-UAVs sampling of shallow cumulus clouds
NASA Astrophysics Data System (ADS)
Lamraoui, Fayçal; Roberts, Greg; Burnet, Frédéric
2016-04-01
In-situ sampling of clouds that can provide simultaneous measurements at spatio-temporal resolutions sufficient to capture small-scale 3D physical processes continues to present challenges. This project (SKYSCANNER) aims to bring together cloud sampling strategies using a swarm of unmanned aerial vehicles (UAVs) based on large-eddy simulation (LES). Multi-UAV field campaigns with a tailored sampling strategy for individual clouds and cloud fields will significantly improve the understanding of unresolved cloud physical processes. An extensive set of LES experiments for case studies from the ARM SGP site has been performed using the MesoNH model at high resolutions down to 10 m. These simulations led to a macroscopic model that quantifies the interrelationship between micro- and macrophysical properties of shallow convective clouds. Both the geometry and evolution of individual clouds are critical to multi-UAV cloud sampling and path planning. The preliminary findings of the project reveal several linear relationships linking cloud geometric parameters to cloud-related meteorological variables. In addition, the horizontal wind speed has a proportional impact on cloud number concentration, as well as triggering and prolonging the occurrence of cumulus clouds. In the framework of a joint collaboration involving a multidisciplinary team (including institutes specializing in aviation, robotics and atmospheric science), this model will be a reference point for multi-UAV sampling strategies and path planning.
Capabilities and Advantages of Cloud Computing in the Implementation of Electronic Health Record.
Ahmadi, Maryam; Aslani, Nasim
2018-01-01
With regard to the high cost of the Electronic Health Record (EHR), the use of new technologies, in particular cloud computing, has increased in recent years. The purpose of this study was to systematically review the studies conducted in the field of cloud computing. The present study was a systematic review conducted in 2017. Searches were performed in the Scopus, Web of Science, IEEE, PubMed and Google Scholar databases using combined keywords. Of the 431 articles initially retrieved, 27 articles were selected for review after applying the inclusion and exclusion criteria. Data gathering was done with a self-designed checklist and was analyzed by the content analysis method. The findings of this study showed that cloud computing is a very widespread technology. It spans domains such as cost, security and privacy, scalability, mutual performance and interoperability, implementation platform and independence of cloud computing, search and exploration capability, error reduction and quality improvement, structure, flexibility and sharing ability, and it can be effective for the electronic health record. According to the findings of the present study, the capabilities of cloud computing are useful in implementing EHR in a variety of contexts. It also provides wide opportunities for managers, analysts and providers of health information systems. Considering the advantages and domains of cloud computing in the establishment of EHR, the use of this technology is recommended.
Identifying the impact of G-quadruplexes on Affymetrix 3' arrays using cloud computing.
Memon, Farhat N; Owen, Anne M; Sanchez-Graillet, Olivia; Upton, Graham J G; Harrison, Andrew P
2010-01-15
A tetramer quadruplex structure is formed by four parallel strands of DNA/RNA containing runs of guanine. These quadruplexes are able to form because guanine can form Hoogsteen hydrogen bonds with other guanines, and a tetrad of guanines can form a stable arrangement. Recently we have discovered that probes on Affymetrix GeneChips that contain runs of guanine do not measure gene expression reliably. We associate this finding with the likelihood that quadruplexes are forming on the surface of GeneChips. In order to cope with the rapidly expanding size of GeneChip array datasets in the public domain, we are exploring the use of cloud computing to replicate our experiments on 3' arrays to look at the effect of the location of G-spots (runs of guanines). Cloud computing is a recently introduced high-performance solution that takes advantage of the computational infrastructure of large organisations such as Amazon and Google. We expect that cloud computing will become widely adopted because it enables bioinformaticians to avoid capital expenditure on expensive computing resources and to pay a cloud computing provider only for what is used. Moreover, as well as being financially efficient, cloud computing is an ecologically friendly technology; it enables efficient data sharing and we expect it to be faster for development purposes. Here we propose the advantageous use of cloud computing to perform a large data-mining analysis of public domain 3' arrays.
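The G-spot screening described above amounts to scanning each probe sequence for runs of guanine; a minimal sketch of such a scan (the helper name is hypothetical, not part of the authors' pipeline):

```python
import re

def g_spots(probe, min_run=4):
    """Return (start, length) for each run of at least `min_run`
    consecutive guanines in a probe sequence; probes containing such
    runs are the ones flagged as unreliable expression reporters."""
    return [(m.start(), len(m.group()))
            for m in re.finditer(f"G{{{min_run},}}", probe.upper())]
```

For a 25-mer such as "ACGTGGGGTACGTACGTACGGGGGA", the scan reports a 4-guanine run at position 4 and a 5-guanine run at position 19, so both the presence and the location of G-spots are available for the downstream analysis.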
Cosmic ray processing of N2-containing interstellar ice analogues at dark cloud conditions
NASA Astrophysics Data System (ADS)
Fedoseev, G.; Scirè, C.; Baratta, G. A.; Palumbo, M. E.
2018-04-01
N2 is believed to lock up a considerable part of the elemental nitrogen budget and, therefore, to be one of the most abundant ice constituents in cold dark clouds. This laboratory-based research utilizes energetic processing of N2-containing interstellar ice analogues with 200 keV H+ and He+ ions to mimic cosmic ray processing of interstellar icy grains. It aims to investigate the formation of (iso)cyanates and cyanides in the ice mantles under conditions typical of cold dark clouds and prestellar cores. The choice of cosmic ray processing as a chemical trigger mechanism is explained by the high stability of N2 molecules, which are chemically inert in most atom- and radical-addition reactions and cannot be efficiently dissociated by the cosmic-ray-induced UV field. Two sets of experiments are performed to more closely address the solid-state chemistry occurring in two distinct layers of the ice formed at different stages of dark cloud evolution, i.e. `H2O-rich' and `CO-rich' ice layers. The formation of HNCO and OCN- is discussed for all of the performed experiments, and the corresponding kinetic curves for HNCO and OCN- are obtained. Furthermore, a feature around 2092 cm-1 assigned to the contributions of 13CO, CN-, and HCN is analysed, and kinetic curves for the combined HCN/CN- abundance are derived. In turn, normalized formation yields are evaluated by interpolation of the obtained results to the low irradiation doses relevant to the dark cloud stage. The obtained values can be used to interpret future observations towards cold dark clouds using the James Webb Space Telescope.
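As a rough illustration of how a normalized formation yield can be read off a kinetic curve in the low-dose regime (a generic linear fit through the lowest measured points; the paper's actual interpolation procedure may differ):

```python
def low_dose_yield(doses, abundances):
    """Slope (abundance per unit dose) through the two lowest-dose
    points of a kinetic curve, approximating the formation yield in
    the low-dose regime relevant to dark clouds, where product
    abundance still grows roughly linearly with dose."""
    (d0, a0), (d1, a1) = sorted(zip(doses, abundances))[:2]
    return (a1 - a0) / (d1 - d0)
```

A curve that saturates at high dose, such as abundances [0.0, 0.8, 1.4, 2.0] at doses [0, 1, 2, 4], yields an initial slope of 0.8 per dose unit, whereas a fit through all points would underestimate the low-dose yield.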
An Airborne A-Band Spectrometer for Remote Sensing Of Aerosol and Cloud Optical Properties
NASA Technical Reports Server (NTRS)
Pitts, Michael; Hostetler, Chris; Poole, Lamont; Holden, Carl; Rault, Didier
2000-01-01
Atmospheric remote sensing with the O2 A-band has a relatively long history, but most of these studies were attempting to estimate surface pressure or cloud-top pressure. Recent conceptual studies have demonstrated the potential of spaceborne high spectral resolution O2 A-band spectrometers for retrieval of aerosol and cloud optical properties. The physical rationale of this new approach is that information on the scattering properties of the atmosphere is embedded in the detailed line structure of the O2 A-band reflected radiance spectrum. The key to extracting this information is to measure the radiance spectrum at very high spectral resolution. Instrument performance requirement studies indicate that, in addition to high spectral resolution, the successful retrieval of aerosol and cloud properties from A-band radiance spectra will also require high radiometric accuracy, instrument stability, and high signal-to-noise measurements. To experimentally assess the capabilities of this promising new remote sensing application, the NASA Langley Research Center is developing an airborne high spectral resolution A-band spectrometer. The spectrometer uses a plane holographic grating with a folded Littrow geometry to achieve high spectral resolution (0.5 cm-1) and low stray light in a compact package. This instrument will be flown in a series of field campaigns beginning in 2001 to evaluate the overall feasibility of this new technique. Results from these campaigns should be particularly valuable for future spaceborne applications of A-band spectrometers for aerosol and cloud retrievals.
NASA Astrophysics Data System (ADS)
Planche, C.; Flossmann, A. I.; Wobrock, W.
2009-04-01
A 3D cloud model with detailed microphysics for ice, water and aerosol particles (AP) is used to study the role of AP in the evolution of summertime convective mixed-phase clouds and the subsequent precipitation. The model couples the dynamics of the NCAR Clark-Hall cloud scale model (Clark et al., 1996) with the detailed scavenging model (DESCAM) of Flossmann and Pruppacher (1988) and the ice phase module of Leroy et al. (2007). The microphysics follows the evolution of AP, drop, and ice crystal spectra, each with 39 bins. Aerosol mass in drops and ice crystals is also predicted by two distribution functions to close the aerosol budget. The simulated cases are compared with radar observations over the northern Vosges mountains and the Rhine valley performed on 12 and 13 August 2007 during the COPS field campaign. Using a 3D grid resolution of 250 m, our model, called DESCAM-3D, is able to simulate very well the dynamical, cloud and precipitation features observed for the two different cloud systems. The high horizontal grid resolution provides new elements for the understanding of the formation of orographic convection. In addition, the fine numerical scale compares well with the highly resolved radar observations given by the LaMP X-band radar and Poldirad. The prediction of the liquid and ice hydrometeor spectra allows a detailed calculation of the cloud radar reflectivity. Sensitivity studies performed using different mass-diameter relationships for ice crystals demonstrate the role of the crystal habits in the simulated reflectivities. In order to better understand the role of AP in cloud evolution and precipitation formation, several sensitivity studies were performed by modifying not only the aerosol number concentration but also the particles' physico-chemical properties.
The numerical results show a strong influence of the aerosol number concentration on the precipitation intensity, but no effect of aerosol particle solubility on rain formation was found.
NASA Technical Reports Server (NTRS)
Toth, L. V.; Mattila, K.; Haikala, L.; Balazs, L. G.
1992-01-01
The spectra of the 21 cm HI radiation from the direction of L1780, a small high-galactic-latitude dark/molecular cloud, were analyzed by multivariate methods. Factor analysis was performed on the HI (21 cm) spectra in order to separate the different components responsible for the spectral features. The rotated, orthogonal factors explain the spectra as a sum of radiation from the background (an extended HI emission layer) and from the L1780 dark cloud. The coefficients of the cloud-indicator factors were used to locate the HI 'halo' of the molecular cloud. Our statistically derived 'background' and 'cloud' spectral profiles, as well as the spatial distribution of the HI halo emission, were compared to the results of a previous study that used conventional methods to analyze nearly the same data set.
NASA Astrophysics Data System (ADS)
Wang, Zhe; Wang, Zhenhui; Cao, Xiaozhong; Tao, Fa
2018-01-01
Clouds are currently observed by both ground-based and satellite remote sensing techniques. Each technique has its own strengths and weaknesses depending on the observation method, instrument performance and the methods used for retrieval. It is important to study synergistic cloud measurements to improve the reliability of the observations and to verify the different techniques. The FY-2 geostationary orbiting meteorological satellites continuously observe the sky over China. Their cloud top temperature product can be processed to retrieve the cloud top height (CTH). The ground-based millimeter wavelength cloud radar can acquire information about the vertical structure of clouds, such as the cloud base height (CBH), CTH and cloud thickness, and can continuously monitor changes in the vertical profiles of clouds. The CTHs were retrieved using both cloud top temperature data from the FY-2 satellites and the cloud radar reflectivity data for the same time period (June 2015 to May 2016), and the resulting datasets were compared in order to evaluate the accuracy of CTH retrievals from the FY-2 satellites. The results show that the concordance rate of cloud detection between the two datasets was 78.1%. Higher consistency was obtained for thicker clouds with larger echo intensity and for more continuous clouds. The average difference in CTH between the two techniques was 1.46 km. The difference in CTH for low- and mid-level clouds was less than that for high-level clouds. The attenuation threshold of the cloud radar for rainfall was 0.2 mm/min; rainfall intensities below this threshold had no effect on the CTH. The satellite CTH can be used to compensate for the attenuation error in the cloud radar data.
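The two comparison statistics reported above, the concordance rate of cloud detection and the mean CTH difference, can be computed as in this sketch (a hypothetical data layout with one CTH value per matched time sample and `None` where no cloud is detected; not the study's actual matching procedure):

```python
def compare_cth(sat_cth, radar_cth, miss=None):
    """Concordance rate = fraction of samples where both instruments
    agree on cloud presence (both detect a cloud, or both detect none).
    Mean difference = average |CTH_sat - CTH_radar| over samples where
    both instruments detected a cloud."""
    n = len(sat_cth)
    agree = sum((s is miss) == (r is miss)
                for s, r in zip(sat_cth, radar_cth))
    pairs = [(s, r) for s, r in zip(sat_cth, radar_cth)
             if s is not miss and r is not miss]
    mean_diff = sum(abs(s - r) for s, r in pairs) / len(pairs)
    return agree / n, mean_diff
```

With four matched samples where one instrument misses a cloud the other sees, the concordance rate drops below 1 while the mean difference is computed only over the jointly detected clouds.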
Study of Aerosol-Cloud Interaction from ground-based long-term statistical analysis at SGP
NASA Astrophysics Data System (ADS)
Zhao, C.; Qiu, Y.
2017-12-01
Previous studies have shown various relationships between cloud droplet effective radius (re) and aerosol amount based on limited observations, indicative of the uncertainties in this relationship caused by many factors. Using 8 years of ground-based cloud and aerosol observations at the Southern Great Plains (SGP) site in Oklahoma, US, we analyze the seasonal variation of the aerosol effect on low liquid clouds. The analysis shows a positive rather than negative AOD-re relationship in all seasons except summer. The potential contribution of precipitable water vapor (PWV) to the AOD-re relationship has been analyzed. Results show that the AOD-re relationship is indeed negative under low-PWV conditions regardless of season, but it turns positive under high-PWV conditions in all seasons other than summer. The most likely explanation for the positive AOD-re relationship under high-PWV conditions in spring, fall and winter is that high PWV can promote the growth of cloud droplets by providing sufficient water vapor. The different behavior of the AOD-re relationship in summer could be related to the much heavier aerosol loading, which renders the PWV insufficient, so that the droplets compete with each other for water. By limiting the variation of other meteorological conditions, such as lower-tropospheric stability and wind speed near cloud base, further analysis also indicates that higher PWV helps change the AOD-re relationship from negative to positive.
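Stratifying the AOD-re relationship by PWV, as done above, reduces to computing the correlation separately in low- and high-PWV subsamples; a minimal sketch with synthetic numbers (the function names and data are illustrative, not the SGP observations):

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

def aod_re_by_pwv(aod, re, pwv, threshold):
    """Correlate AOD with re separately below and above a PWV threshold,
    mirroring the low-/high-PWV stratification of the analysis."""
    low = [(a, r) for a, r, w in zip(aod, re, pwv) if w < threshold]
    high = [(a, r) for a, r, w in zip(aod, re, pwv) if w >= threshold]
    return pearson(*zip(*low)), pearson(*zip(*high))
```

On synthetic data where re shrinks with AOD at low PWV but grows with AOD at high PWV, the two strata return correlations of opposite sign, the signature the study looks for season by season.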
NASA Technical Reports Server (NTRS)
Hlavka, Dennis L.; Palm, S. P.; Welton, E. J.; Hart, W. D.; Spinhirne, J. D.; McGill, M.; Mahesh, A.; Starr, David OC. (Technical Monitor)
2001-01-01
The Geoscience Laser Altimeter System (GLAS) is scheduled for launch on the ICESat satellite as part of the NASA EOS mission in 2002. GLAS will be used to perform high resolution surface altimetry and will also provide a continuously operating atmospheric lidar to profile clouds, aerosols, and the planetary boundary layer with horizontal and vertical resolution of 175 and 76.8 m, respectively. GLAS is the first active satellite atmospheric profiler to provide global coverage. Data products include direct measurements of the heights of aerosol and cloud layers, and the optical depth of transmissive layers. In this poster we provide an overview of the GLAS atmospheric data products, present a simulated GLAS data set, and show results from the simulated data set using the GLAS data processing algorithm. Optical results from the ER-2 Cloud Physics Lidar (CPL), which uses many of the same processing algorithms as GLAS, show algorithm performance with real atmospheric conditions during the Southern African Regional Science Initiative (SAFARI 2000).
Rosnell, Tomi; Honkavaara, Eija
2012-01-01
The objective of this investigation was to develop and investigate methods for point cloud generation by image matching using aerial image data collected by quadrocopter type micro unmanned aerial vehicle (UAV) imaging systems. Automatic generation of high-quality, dense point clouds from digital images by image matching is a recent, cutting-edge step forward in digital photogrammetric technology. The major components of the system for point cloud generation are a UAV imaging system, an image data collection process using high image overlaps, and post-processing with image orientation and point cloud generation. Two post-processing approaches were developed: one of the methods is based on BAE Systems' SOCET SET classical commercial photogrammetric software and another is built using Microsoft®'s Photosynth™ service available on the Internet. Empirical testing was carried out in two test areas. Photosynth processing showed that it is possible to orient the images and generate point clouds fully automatically without any a priori orientation information or interactive work. The photogrammetric processing line provided dense and accurate point clouds that followed the theoretical principles of photogrammetry, but also some artifacts were detected. The point clouds from the Photosynth processing were sparser and noisier, which is to a large extent due to the fact that the method is not optimized for dense point cloud generation. Careful photogrammetric processing with self-calibration is required to achieve the highest accuracy. Our results demonstrate the high performance potential of the approach and that with rigorous processing it is possible to reach results that are consistent with theory. We also point out several further research topics. Based on theoretical and empirical results, we give recommendations for properties of imaging sensor, data collection and processing of UAV image data to ensure accurate point cloud generation. PMID:22368479
Evaluating open-source cloud computing solutions for geosciences
NASA Astrophysics Data System (ADS)
Huang, Qunying; Yang, Chaowei; Liu, Kai; Xia, Jizhe; Xu, Chen; Li, Jing; Gui, Zhipeng; Sun, Min; Li, Zhenglong
2013-09-01
Many organizations are adopting cloud computing to better utilize computing resources, taking advantage of its scalability, cost savings, and ease of access. Many private or community cloud computing platforms are being built using open-source cloud solutions. However, little has been done to systematically compare and evaluate the features and performance of open-source solutions in supporting the geosciences. This paper provides a comprehensive study of three open-source cloud solutions: OpenNebula, Eucalyptus, and CloudStack. We compared a variety of features, capabilities, technologies and performance characteristics, including: (1) general features and supported services for cloud resource creation and management, (2) advanced capabilities for networking and security, and (3) the performance of the cloud solutions in provisioning and operating cloud resources, as well as the performance of virtual machines initiated and managed by the cloud solutions in supporting selected geoscience applications. Our study found that: (1) there are no significant performance differences in the central processing unit (CPU), memory or I/O of virtual machines created and managed by the different solutions; (2) OpenNebula has the fastest internal network, while both Eucalyptus and CloudStack have better virtual machine isolation and security strategies; (3) CloudStack has the fastest operations in handling virtual machines, images, snapshots, volumes and networking, followed by OpenNebula; and (4) the selected cloud computing solutions are capable of supporting concurrent intensive web applications, computing-intensive applications, and small-scale model simulations without intensive data communication.
Bao, Riyue; Hernandez, Kyle; Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations, with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation that allows for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results; it offers interactive exploration of quality control files, read alignments and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open-source and is also available as a public image on the Amazon cloud.
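The integration of variants detected by multiple methods can be illustrated with a minimal consensus-voting sketch. The caller names and variant calls below are hypothetical, and this is an illustration of the general technique, not ExScalibur's actual implementation:

```python
# Minimal sketch of multi-caller variant consensus. Each caller is assumed
# to yield a set of (chrom, pos, ref, alt) tuples; calls below are made up.
from collections import Counter

def consensus_variants(callsets, min_callers=2):
    """Keep variants reported by at least `min_callers` of the callsets."""
    counts = Counter(v for calls in callsets for v in set(calls))
    return {v for v, n in counts.items() if n >= min_callers}

# Hypothetical calls from three somatic callers on a tumor/normal pair
mutect  = {("chr1", 100, "A", "T"), ("chr2", 200, "G", "C")}
strelka = {("chr1", 100, "A", "T"), ("chr3", 300, "C", "G")}
varscan = {("chr1", 100, "A", "T"), ("chr2", 200, "G", "C")}

print(sorted(consensus_variants([mutect, strelka, varscan])))
```

Raising `min_callers` trades sensitivity for specificity, which is the basic tension such pipelines manage.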
A high-resolution oxygen A-band spectrometer (HABS) and its radiation closure
NASA Astrophysics Data System (ADS)
Min, Q.; Yin, B.; Li, S.; Berndt, J.; Harrison, L.; Joseph, E.; Duan, M.; Kiedron, P.
2014-02-01
The pressure dependence of oxygen A-band absorption enables the retrieval of vertical profiles of aerosol and cloud properties from oxygen A-band spectrometry. To improve understanding of oxygen A-band inversions and their utility, we developed a high-resolution oxygen A-band spectrometer (HABS) and deployed it at the Howard University Beltsville site during the NASA Discover Air-Quality Field Campaign in July 2011. The HABS automatically measures solar direct-beam and zenith diffuse radiation through a telescope. It exhibits excellent performance: a stable spectral response ratio, high signal-to-noise ratio (SNR), high spectral resolution (0.16 nm), and high out-of-band rejection (10^-5). To evaluate the spectral performance of HABS, a HABS simulator was developed by combining the discrete ordinates radiative transfer (DISORT) code with the High Resolution Transmission (HITRAN) database, HITRAN2008. The simulator uses a double-k approach to reduce the computational cost. The HABS-measured spectra are consistent with the corresponding simulated spectra. For direct-beam spectra, the 95% confidence intervals of the relative difference between measurements and simulation are (-0.06, 0.05) and (-0.08, 0.09) for solar zenith angles of 27° and 72°, respectively. The main differences between them occur at or near the strong oxygen absorption line centers. They are mainly caused by noise/spikes in the HABS-measured spectra, a result of the combined effects of weak signal, low SNR, and errors in wavelength registration and absorption line parameters. The high-resolution oxygen A-band measurements from HABS can constrain active radar retrievals to yield more accurate cloud optical properties, particularly for multi-layer and mixed-phase clouds.
An evaluation of atmospheric corrections to advanced very high resolution radiometer data
Meyer, David; Hood, Joy J.
1993-01-01
A data set compiled to analyze vegetation indices is used to evaluate the effect of atmospheric corrections on AVHRR measurements in the solar spectrum. Such corrections include cloud screening and "clear sky" corrections. We used the "clouds from AVHRR" (CLAVR) method for cloud detection and evaluated its performance over vegetated targets. Clear sky corrections, designed to reduce the effects of molecular scattering and absorption due to ozone, water vapor, carbon dioxide, and molecular oxygen, were applied to data values determined to be cloud free. Generally, we found that the screening and correction of the AVHRR data did not adversely affect the maximum-NDVI compositing process, while improving estimates of land-surface radiances over a compositing period.
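The maximum-NDVI compositing referred to above selects, for each pixel, the observation with the highest NDVI over the compositing period; a minimal sketch, with made-up array values and cloud-flagged pixels entered as NaN:

```python
import numpy as np

def max_ndvi_composite(ndvi_stack):
    """Per-pixel maximum-value composite over a stack of daily NDVI images
    (shape: days x rows x cols); cloud-screened pixels enter as NaN."""
    return np.nanmax(ndvi_stack, axis=0)

# Three hypothetical 2x2 daily NDVI scenes; NaN marks a cloud-flagged pixel
stack = np.array([[[0.2, np.nan], [0.5, 0.1]],
                  [[0.6, 0.3],    [0.4, np.nan]],
                  [[0.1, 0.7],    [np.nan, 0.2]]])
print(max_ndvi_composite(stack))
```

Because clouds depress NDVI, taking the per-pixel maximum tends to favor the clearest observation, which is why residual screening errors rarely survive the composite.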
NASA Astrophysics Data System (ADS)
Karlsson, Karl-Göran; Håkansson, Nina
2018-02-01
The sensitivity to thin clouds of the cloud screening method used in the CM SAF cloud, albedo and surface radiation data set from AVHRR data (CLARA-A2) cloud climate data record (CDR) has been evaluated using cloud information from the Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP) onboard the CALIPSO satellite. The sensitivity, including its global variation, has been studied based on collocations of Advanced Very High Resolution Radiometer (AVHRR) and CALIOP measurements over a 10-year period (2006-2015). The cloud detection sensitivity has been defined as the minimum cloud optical thickness for which 50 % of clouds could be detected, with the global average sensitivity estimated to be 0.225. After using this value to reduce the CALIOP cloud mask (i.e. clouds with optical thickness below this threshold were interpreted as cloud-free cases), cloudiness results were found to be essentially unbiased over most of the globe, except over the polar regions, where a considerable underestimation of cloudiness was seen during the polar winter. The overall probability of detecting clouds in the polar winter could be as low as 50 % over the highest and coldest parts of Greenland and Antarctica, showing that a large fraction of optically thick clouds also remains undetected there. The study included an in-depth analysis of the probability of detecting a cloud as a function of the vertically integrated cloud optical thickness as well as of the cloud's geographical position. Best results were achieved over oceanic surfaces at mid- to high latitudes, where at least 50 % of all clouds with an optical thickness down to 0.075 were detected. Corresponding cloud detection sensitivities over land surfaces outside of the polar regions were generally larger than 0.2, with maximum values of approximately 0.5 over the Sahara and the Arabian Peninsula.
For polar land surfaces the values were close to 1 or higher with maximum values of 4.5 for the parts with the highest altitudes over Greenland and Antarctica. It is suggested to quantify the detection performance of other CDRs in terms of a sensitivity threshold of cloud optical thickness, which can be estimated using active lidar observations. Validation results are proposed to be used in Cloud Feedback Model Intercomparison Project (CFMIP) Observation Simulation Package (COSP) simulators for cloud detection characterization of various cloud CDRs from passive imagery.
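The sensitivity threshold proposed above (the minimum cloud optical thickness at which half of the clouds are detected) can be estimated from collocated imager/lidar matchups by binning the detection outcomes in optical thickness and interpolating the 50 % crossing. The sketch below uses synthetic collocations with an invented toy detector, not CLARA-A2 data:

```python
import numpy as np

def detection_sensitivity(cot, detected, bins):
    """Estimate the cloud optical thickness at which the probability of
    detection reaches 50%, by binning collocations in COT and linearly
    interpolating the 50% crossing of the per-bin detection rate."""
    idx = np.digitize(cot, bins)
    centers, pod = [], []
    for i in range(1, len(bins)):
        mask = idx == i
        if mask.any():
            centers.append(0.5 * (bins[i - 1] + bins[i]))
            pod.append(detected[mask].mean())
    return float(np.interp(0.5, pod, centers))

# Synthetic collocations: a toy detector whose hit probability rises
# linearly with optical thickness and saturates at COT = 0.45
rng = np.random.default_rng(0)
cot = rng.uniform(0.0, 1.0, 20000)
detected = rng.random(20000) < np.clip(cot / 0.45, 0.0, 1.0)
bins = np.linspace(0.0, 1.0, 21)
print(detection_sensitivity(cot, detected, bins))
```

With this toy detector the true 50 % point is at COT = 0.225; the binned estimate recovers it to within sampling noise.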
NASA Technical Reports Server (NTRS)
1995-01-01
The theoretical bases for the Release 1 algorithms that will be used to process satellite data for investigation of the Clouds and Earth's Radiant Energy System (CERES) are described. The architecture for software implementation of the methodologies is outlined. Volume 3 details the advanced CERES methods for performing scene identification and inverting each CERES scanner radiance to a top-of-the-atmosphere (TOA) flux. CERES determines cloud fraction, height, phase, effective particle size, layering, and thickness from high-resolution, multispectral imager data. CERES derives cloud properties for each pixel of the Tropical Rainfall Measuring Mission (TRMM) visible and infrared scanner and the Earth Observing System (EOS) moderate-resolution imaging spectroradiometer. Cloud properties for each imager pixel are convolved with the CERES footprint point spread function to produce average cloud properties for each CERES scanner radiance. The mean cloud properties are used to determine an angular distribution model (ADM) to convert each CERES radiance to a TOA flux. The TOA fluxes are used in simple parameterizations to derive surface radiative fluxes. This state-of-the-art cloud-radiation product will be used to substantially improve our understanding of the complex relationship between clouds and the radiation budget of the Earth-atmosphere system.
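The footprint convolution described above amounts to a point-spread-function-weighted average of the imager-pixel cloud properties within each CERES footprint. A schematic sketch with invented weights and values (the actual CERES PSF and pixel geometry are not represented):

```python
import numpy as np

def psf_weighted_mean(pixel_values, psf_weights):
    """PSF-weighted average of imager-pixel cloud properties over one
    scanner footprint (weights need not be pre-normalized)."""
    v = np.asarray(pixel_values, dtype=float)
    w = np.asarray(psf_weights, dtype=float)
    return float(np.sum(w * v) / np.sum(w))

# Hypothetical cloud fractions of four imager pixels in one footprint,
# with higher PSF weight near the footprint center
print(psf_weighted_mean([1.0, 0.5, 0.0, 1.0], [0.1, 0.4, 0.4, 0.1]))
```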
THE LAUNCHING OF COLD CLOUDS BY GALAXY OUTFLOWS. II. THE ROLE OF THERMAL CONDUCTION
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brüggen, Marcus; Scannapieco, Evan
2016-05-01
We explore the impact of electron thermal conduction on the evolution of radiatively cooled cold clouds embedded in flows of hot and fast material, as occurs in outflowing galaxies. Performing a parameter study of three-dimensional adaptive mesh refinement hydrodynamical simulations, we show that electron thermal conduction causes cold clouds to evaporate, but it can also extend their lifetimes by compressing them into dense filaments. We distinguish between low column-density clouds, which are disrupted on very short timescales, and high column-density clouds with much longer disruption times that are set by a balance between impinging thermal energy and evaporation. We provide fits to the cloud lifetimes and velocities that can be used in galaxy-scale simulations of outflows in which the evolution of individual clouds cannot be modeled with the required resolution. Moreover, we show that the clouds are only accelerated to a small fraction of the ambient velocity because compression by evaporation causes the clouds to present a small cross-section to the ambient flow. This means that either magnetic fields must suppress thermal conduction, or the cold clouds observed in galaxy outflows are not formed of cold material carried out from the galaxy.
Cloud Processing of Secondary Organic Aerosol from Isoprene and Methacrolein Photooxidation.
Giorio, Chiara; Monod, Anne; Brégonzio-Rozier, Lola; DeWitt, Helen Langley; Cazaunau, Mathieu; Temime-Roussel, Brice; Gratien, Aline; Michoud, Vincent; Pangui, Edouard; Ravier, Sylvain; Zielinski, Arthur T; Tapparo, Andrea; Vermeylen, Reinhilde; Claeys, Magda; Voisin, Didier; Kalberer, Markus; Doussin, Jean-François
2017-10-12
Aerosol-cloud interaction contributes the largest uncertainties to the estimation and interpretation of the Earth's changing energy budget. The present study explores experimentally the impacts of water condensation-evaporation events, mimicking processes occurring in atmospheric clouds, on the molecular composition of secondary organic aerosol (SOA) from the photooxidation of methacrolein. A range of on- and off-line mass spectrometry techniques were used to obtain a detailed chemical characterization of SOA formed in control experiments in dry conditions, in triphasic experiments simulating gas-particle-cloud droplet interactions (starting from dry conditions and from 60% relative humidity (RH)), and in bulk aqueous-phase experiments. We observed that cloud events trigger fast SOA formation accompanied by evaporative losses. These evaporative losses decreased the SOA concentration in the simulation chamber by 25-32% upon RH increase, while aqueous SOA was found to be metastable and slowly evaporated after cloud dissipation. In the simulation chamber, SOA composition, measured with a high-resolution time-of-flight aerosol mass spectrometer, did not change during cloud events compared with high-RH conditions (RH > 80%). In all experiments, off-line mass spectrometry techniques emphasize the critical role of 2-methylglyceric acid as a major product of isoprene chemistry, as an important contributor to the total SOA mass (15-20%) and as a key building block of oligomers found in the particulate phase. Interestingly, the comparison between the series of oligomers obtained from experiments performed under different conditions shows markedly different reactivity. In particular, long reaction times at high RH seem to create the conditions for aqueous-phase processing to occur more efficiently than during two relatively short cloud events.
NASA Astrophysics Data System (ADS)
Meyer, Hanna; Kühnlein, Meike; Appelhans, Tim; Nauss, Thomas
2016-03-01
Machine learning (ML) algorithms have been demonstrated to be valuable tools in satellite-based rainfall retrievals, showing the practicability of ML when faced with high-dimensional and complex data. Moreover, recent developments in parallel computing for ML bring new possibilities for training and prediction speed, making their use in real-time systems feasible. This study compares four ML algorithms - random forests (RF), neural networks (NNET), averaged neural networks (AVNNET) and support vector machines (SVM) - for rainfall area detection and rainfall rate assignment using MSG SEVIRI data over Germany. Satellite-based proxies for cloud top height, cloud top temperature, cloud phase and cloud water path serve as predictor variables. The results indicate an overestimation of the delineated rainfall area regardless of the ML algorithm (averaged bias = 1.8) but a high probability of detection, ranging from 81% (SVM) to 85% (NNET). On a 24-hour basis, the rainfall rate assignment yielded R2 values between 0.39 (SVM) and 0.44 (AVNNET). Though the differences in the algorithms' performance were rather small, NNET and AVNNET were identified as the most suitable algorithms: on average, they showed the best performance in both rainfall area delineation and rainfall rate assignment. NNET's computational speed is an additional advantage when working with large datasets such as those in remote-sensing-based rainfall retrievals. However, since no single algorithm performed considerably better than the others, we conclude that further research into suitable predictors for rainfall is of greater necessity than optimization through the choice of ML algorithm.
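The verification scores quoted above (bias and probability of detection for rain-area delineation) follow from the standard 2 × 2 contingency table of hits, misses and false alarms. A brief sketch; the counts are made up, chosen only so that the scores match the quoted magnitudes:

```python
def rain_scores(hits, misses, false_alarms):
    """Standard categorical verification scores for rain/no-rain detection:
    bias > 1 indicates overestimation of the rain area; POD is the fraction
    of observed rain cases that were detected."""
    bias = (hits + false_alarms) / (hits + misses)
    pod = hits / (hits + misses)
    return bias, pod

# Hypothetical counts from a satellite-vs-reference comparison
bias, pod = rain_scores(hits=850, misses=150, false_alarms=950)
print(f"bias={bias:.1f}  POD={pod:.2f}")  # bias=1.8  POD=0.85
```

A bias of 1.8 means the retrieved rain area is nearly twice the observed one, even though 85% of observed rain cases are detected, which is exactly the combination the study reports.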
Cirrus cloud retrieval from MSG/SEVIRI during day and night using artificial neural networks
NASA Astrophysics Data System (ADS)
Strandgren, Johan; Bugliaro, Luca
2017-04-01
By covering a large part of the Earth, cirrus clouds play an important role in climate: they reflect incoming solar radiation and absorb outgoing thermal radiation. Nevertheless, cirrus clouds remain one of the largest uncertainties in atmospheric research; the physical processes that govern their life cycle are still poorly understood, as is their representation in climate models. To monitor and better understand the properties and physical processes of cirrus clouds, it is essential that these tenuous clouds can be observed from geostationary spaceborne imagers like SEVIRI (Spinning Enhanced Visible and InfraRed Imager), which possess high temporal resolution together with a large field of view and, alongside in situ observations, play an important role in the investigation of cirrus cloud processes. CiPS (Cirrus Properties from SEVIRI) is a new algorithm targeting thin cirrus clouds. CiPS is an artificial neural network trained with coincident SEVIRI and CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) observations to retrieve a cirrus cloud mask along with the cloud top height (CTH), ice optical thickness (IOT) and ice water path (IWP) from SEVIRI. By utilizing only the thermal/IR channels of SEVIRI, CiPS can be used during day and night, making it a powerful tool for cirrus life cycle analysis. Despite the great challenge of detecting thin cirrus clouds and retrieving their properties from a geostationary imager using only thermal/IR wavelengths, CiPS performs well. Among the cirrus clouds detected by CALIOP, CiPS detects 70 and 95 % of the clouds with an optical thickness of 0.1 and 1.0, respectively. Among the cirrus-free pixels, CiPS classifies 96 % correctly. For the CTH retrieval, CiPS has a mean absolute percentage error of 10 % or less with respect to CALIOP for cirrus clouds with a CTH greater than 8 km.
For the IOT retrieval, CiPS has a mean absolute percentage error of 100 % or less with respect to CALIOP for cirrus clouds with an optical thickness down to 0.07. For such thin cirrus clouds, an error of 100 % should be regarded as low for a geostationary imager like SEVIRI. The IWP retrieved by CiPS shows a similar performance, but has larger deviations for the thinner cirrus clouds.
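The mean absolute percentage error used above to characterize the CTH and IOT retrievals can be sketched as follows; the numbers are synthetic stand-ins, not CiPS or CALIOP output:

```python
import numpy as np

def mape(retrieved, reference):
    """Mean absolute percentage error of a retrieval with respect to
    reference values (here standing in for lidar 'truth')."""
    retrieved = np.asarray(retrieved, dtype=float)
    reference = np.asarray(reference, dtype=float)
    return 100.0 * np.mean(np.abs(retrieved - reference) / reference)

# Hypothetical cirrus cloud top heights in km: retrieval vs. lidar
print(round(mape([9.0, 12.1, 13.2], [10.0, 11.0, 12.0]), 2))  # 10.0
```

Because the error is normalized by the reference value, a fixed absolute error translates into a large MAPE for optically very thin clouds, which is why 100 % at IOT near 0.07 is a modest error in absolute terms.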
2011-01-01
Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier to entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105
An ARM data-oriented diagnostics package to evaluate the climate model simulation
NASA Astrophysics Data System (ADS)
Zhang, C.; Xie, S.
2016-12-01
A set of diagnostics that utilize long-term, high-frequency measurements from the DOE Atmospheric Radiation Measurement (ARM) program has been developed for evaluating the regional simulation of clouds, radiation and precipitation in climate models. The diagnostic results are computed and visualized automatically in a Python-based package that aims to serve as an easy entry point for evaluating climate simulations using ARM data, as well as the CMIP5 multi-model simulations. Basic performance metrics are computed to measure the accuracy of the mean state and variability of the simulated regional climate. The evaluated physical quantities include vertical profiles of clouds, temperature, relative humidity, cloud liquid water path, total column water vapor, precipitation, sensible and latent heat fluxes, radiative fluxes, aerosol, and cloud microphysical properties. Process-oriented diagnostics focusing on individual cloud- and precipitation-related phenomena are being developed for the evaluation and development of specific model physical parameterizations. Application of the ARM diagnostics package will be presented in the AGU session. This work is performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344; the IM release number is LLNL-ABS-698645.
Observations of high droplet number concentrations in Southern Ocean boundary layer clouds
NASA Astrophysics Data System (ADS)
Chubb, T.; Huang, Y.; Jensen, J.; Campos, T.; Siems, S.; Manton, M.
2015-09-01
Data from the standard cloud physics payload during the NSF/NCAR High-performance Instrumented Airborne Platform for Environmental Research (HIAPER) Pole-to-Pole Observations (HIPPO) campaigns provide a snapshot of unusual wintertime microphysical conditions in the boundary layer over the Southern Ocean. On 29 June 2011, the HIAPER sampled the boundary layer in a region of pre-frontal warm air advection between 58 and 48° S to the south of Tasmania. Cloud droplet number concentrations were consistent with climatological values in the northernmost profiles but were exceptionally high for wintertime in the Southern Ocean at 100-200 cm-3 in the southernmost profiles. Sub-micron (0.06
NASA Astrophysics Data System (ADS)
Wendisch, Manfred; Pöschl, Ulrich; Andreae, Meinrat O.; Machado, Luiz A. T.; Albrecht, Rachel; Schlager, Hans; Rosenfeld, Daniel; Krämer, Martina
2015-04-01
An extensive airborne/ground-based measurement campaign to study tropical convective clouds is introduced. It was performed in Brazil with a focus on the Amazon rainforest from 1 September to 4 October 2014. The project combined the joint German-Brazilian ACRIDICON (Aerosol, Cloud, Precipitation, and Radiation Interactions and Dynamics of Convective Cloud Systems) and CHUVA (Machado et al. 2014) projects. ACRIDICON aimed at the quantification of aerosol-cloud-precipitation interactions and their thermodynamic, dynamic and radiative effects in convective cloud systems by in-situ aircraft observations and indirect measurements (aircraft, satellite, and ground-based). The ACRIDICON-CHUVA campaign was conducted in cooperation with the second Intensive Operational Phase (IOP) of the GOAmazon (Green Ocean Amazon) program. The focus in this presentation is on the airborne observations within ACRIDICON-CHUVA. The German HALO (High Altitude and Long-Range Research Aircraft) was based in Manaus (Amazonas State); it carried out 14 research flights (96 flight hours in total). HALO was equipped with remote sensing and in-situ instrumentation for meteorological, trace gas, aerosol, cloud, and precipitation measurements. Five mission objectives were pursued: (1) cloud vertical evolution (cloud profiling), (2) aerosol processing (inflow and outflow), (3) satellite validation, (4) vertical transport and mixing (tracer experiment), and (5) clouds over forested and deforested areas. The five cloud missions collected data in clean atmospheric conditions and in contrasting polluted (urban and biomass burning) environments.
NASA Astrophysics Data System (ADS)
Letu, H.; Nagao, T. M.; Nakajima, T. Y.; Ishimoto, H.; Riedi, J.; Shang, H.
2017-12-01
Ice cloud property products from satellite measurements are applicable to climate change studies, numerical weather prediction, and atmospheric research. Ishimoto et al. (2010) and Letu et al. (2016) developed the single-scattering properties of a highly irregular ice particle model, called the Voronoi model, for developing the ice cloud product of the GCOM-C satellite program. The Voronoi model was found to perform well in retrievals of ice cloud properties when compared with other well-known scattering models. The cloud property algorithm (Nakajima et al., 1995; Ishida and Nakajima, 2009; Ishimoto et al., 2009; Letu et al., 2012, 2014, 2016) of the GCOM-C satellite program has been improved to produce the Himawari-8/AHI cloud products, accounting for the variation of the solar zenith angle. Himawari-8 is the new-generation geostationary meteorological satellite successfully launched by the Japan Meteorological Agency (JMA) on 7 October 2014. In this study, ice cloud optical and microphysical properties are simulated with the RSTAR radiative transfer code using various models. The scattering properties of the Voronoi model are investigated for developing the AHI ice cloud products. Furthermore, optical and microphysical properties of ice clouds are retrieved from Himawari-8/AHI satellite measurements. Finally, the retrieval results from Himawari-8/AHI are compared to the MODIS Collection 6 (MODIS-C6) cloud property products for validation of the AHI cloud products.
Star cluster formation in a turbulent molecular cloud self-regulated by photoionization feedback
NASA Astrophysics Data System (ADS)
Gavagnin, Elena; Bleuler, Andreas; Rosdahl, Joakim; Teyssier, Romain
2017-12-01
Most stars in the Galaxy are believed to be formed within star clusters from collapsing molecular clouds. However, the complete process of star formation, from the parent cloud to a gas-free star cluster, is still poorly understood. We perform radiation-hydrodynamical simulations of the collapse of a turbulent molecular cloud using the RAMSES-RT code. Stars are modelled using sink particles, from which we self-consistently follow the propagation of the ionizing radiation. We study how different feedback models affect the gas expulsion from the cloud and how they shape the final properties of the emerging star cluster. We find that the star formation efficiency is lower for stronger feedback models. Feedback also changes the high-mass end of the stellar mass function. Stronger feedback also allows the establishment of a lower density star cluster, which can maintain a virial or sub-virial state. In the absence of feedback, the star formation efficiency is very high, as well as the final stellar density. As a result, high-energy close encounters make the cluster evaporate quickly. Other indicators, such as mass segregation, statistics of multiple systems and escaping stars confirm this picture. Observations of young star clusters are in best agreement with our strong feedback simulation.
Analytics and Visualization Pipelines for Big Data on the NASA Earth Exchange (NEX) and OpenNEX
NASA Astrophysics Data System (ADS)
Chaudhary, A.; Votava, P.; Nemani, R. R.; Michaelis, A.; Kotfila, C.
2016-12-01
We are developing capabilities for an integrated petabyte-scale Earth science collaborative analysis and visualization environment. The ultimate goal is to deploy this environment within the NASA Earth Exchange (NEX) and OpenNEX in order to enhance existing science data production pipelines in both high-performance computing (HPC) and cloud environments. Bridging HPC and the cloud is a fairly new concept under active research, and this system significantly enhances the ability of the scientific community to accelerate analysis and visualization of Earth science data from NASA missions, model outputs and other sources. We have developed a web-based system that seamlessly interfaces with both HPC and cloud environments, providing tools that enable science teams to develop and deploy large-scale analysis, visualization and QA pipelines for both the production process and the data products, and to share results with the community. Our project is developed in several stages, each addressing a separate challenge: workflow integration, parallel execution in either cloud or HPC environments, and big-data analytics and visualization. This work benefits a number of existing and upcoming projects supported by NEX, such as the Web Enabled Landsat Data (WELD) project, for which we are developing a new QA pipeline for the 25 PB system.
NASA Astrophysics Data System (ADS)
Deguillaume, L.; Charbouillot, T.; Joly, M.; Vaïtilingom, M.; Parazols, M.; Marinoni, A.; Amato, P.; Delort, A.-M.; Vinatier, V.; Flossmann, A.; Chaumerliac, N.; Pichon, J. M.; Houdier, S.; Laj, P.; Sellegri, K.; Colomb, A.; Brigante, M.; Mailhot, G.
2014-02-01
Long-term monitoring of the chemical composition of clouds (73 cloud events representing 199 individual samples) sampled at the puy de Dôme (pdD) station (France) was performed between 2001 and 2011. Physicochemical parameters, as well as the concentrations of the major organic and inorganic constituents, were measured and analyzed by multicomponent statistical analysis. Along with the corresponding back-trajectory plots, this allowed for distinguishing four different categories of air masses reaching the summit of the pdD: polluted, continental, marine and highly marine. The statistical analysis led to the determination of criteria (concentrations of inorganic compounds, pH) that differentiate each category of air masses. Highly marine clouds exhibited high concentrations of Na+ and Cl-; the marine category presented lower ion concentrations but a higher pH. Finally, the two remaining clusters were classified as "continental" and "polluted"; these clusters had the second-highest and highest levels of NH4+, NO3- and SO42-, respectively. This unique data set of cloud chemical composition is then discussed as a function of this classification. Total organic carbon (TOC) is significantly higher in polluted air masses than in the other categories, which suggests additional anthropogenic sources. Concentrations of carboxylic acids and carbonyls represent around 10% of the organic matter in all categories of air masses and are studied for their relative importance. Iron concentrations are significantly higher for polluted air masses, and iron is mainly present in oxidation state +II in all categories of air masses. Finally, H2O2 concentrations are much more varied in marine and highly marine clouds than in polluted clouds, which are characterized by the lowest average concentration of H2O2. This data set provides concentration ranges of the main inorganic and organic compounds for modeling purposes in multiphase cloud chemistry.
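The air-mass classification described above can be illustrated with a toy rule-based classifier. This is a hypothetical sketch: the thresholds, tracer choices and sample values below are invented for illustration and are not the criteria derived in the study.

```python
# Hypothetical sketch: classifying cloud-water samples into air-mass
# categories using simple concentration/pH criteria, loosely mirroring
# the kind of rules a multicomponent statistical analysis might yield.
# All thresholds and sample values here are invented; units are arbitrary.

def classify_air_mass(na, nh4, ph):
    """Assign an air-mass category from Na+ (sea-salt tracer),
    NH4+ (anthropogenic tracer) and pH."""
    if na > 200:
        return "highly marine"
    if na > 50:
        return "marine"
    if nh4 > 100 or ph < 4.5:
        return "polluted"
    return "continental"

samples = [
    {"na": 350, "nh4": 10, "ph": 5.8},   # sea-salt dominated
    {"na": 80,  "nh4": 15, "ph": 6.2},   # marine, higher pH
    {"na": 10,  "nh4": 150, "ph": 4.2},  # acidic, NH4+-rich
    {"na": 12,  "nh4": 60,  "ph": 5.0},  # background continental
]
labels = [classify_air_mass(**s) for s in samples]
print(labels)  # → ['highly marine', 'marine', 'polluted', 'continental']
```

In the study itself the categories came out of clustering the full multicomponent data set; a rule set like this is only the kind of diagnostic criteria such an analysis produces.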
Speeding Up Geophysical Research Using Docker Containers Within Multi-Cloud Environment.
NASA Astrophysics Data System (ADS)
Synytsky, R.; Henadiy, S.; Lobzakov, V.; Kolesnikov, L.; Starovoit, Y. O.
2016-12-01
How useful are geophysical observations for minimizing losses from natural disasters today? Do they help decrease the number of human casualties during tsunamis and earthquakes? Unfortunately, such use is still at an early stage. Making these observations more useful by improving early-warning and prediction systems with the help of cloud computing is therefore a major goal. Cloud computing technologies have proven their ability to speed up application development in many areas over the past decade. The cloud unlocks new opportunities for geoscientists by providing access to modern data processing tools and algorithms, including real-time high-performance computing, big-data processing, artificial intelligence and others. Emerging lightweight cloud technologies, such as Docker containers, are gaining wide traction in IT because they enable faster and more efficient deployment of applications in a cloud environment. They make it possible to deploy and manage geophysical applications and systems in minutes across multiple clouds and data centers, which is of utmost importance for next-generation applications. In this session we demonstrate how Docker container technology in a multi-cloud setting can accelerate the development of applications specifically designed for geophysical research.
Aerosol and Cloud Observations and Data Products by the GLAS Polar Orbiting Lidar Instrument
NASA Technical Reports Server (NTRS)
Spinhirne, J. D.; Palm, S. P.; Hlavka, D. L.; Hart, W. D.; Mahesh, A.; Welton, E. J.
2005-01-01
The Geoscience Laser Altimeter System (GLAS), launched in 2003, is the first polar orbiting satellite lidar. The instrument was designed for high-performance observations of the distribution and optical scattering cross sections of clouds and aerosol. The backscatter lidar operates at two wavelengths, 532 and 1064 nm. Both receiver channels meet or exceed their design goals, and beginning with a two-month period spanning October and November 2003, an excellent global lidar data set now exists. The data products for atmospheric observations include the calibrated, attenuated backscatter cross section for cloud and aerosol; height detection for multiple cloud layers; planetary boundary layer height; cirrus and aerosol optical depth; and the height distribution of aerosol and cloud scattering cross section profiles. The data sets are now in open release through the NASA data distribution system. Initial results on global statistics for cloud and aerosol distribution have been produced and in some cases compared to other satellite observations. The sensitivity of the cloud measurements is such that the 70% global cloud coverage result should be the most accurate to date. Results on the global distribution of aerosol are the first to produce the true height distribution for model inter-comparison.
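The cloud-layer height detection product can be illustrated with a minimal thresholding sketch. This is not the GLAS algorithm; the profile values, molecular background and threshold factor are invented for illustration.

```python
# Illustrative sketch (not the GLAS retrieval): detecting cloud-layer
# boundaries in a single attenuated-backscatter profile by thresholding
# against an assumed molecular background. Profile values are synthetic.

def find_cloud_layers(profile, background, factor=3.0):
    """Return (base_index, top_index) pairs where backscatter exceeds
    `factor` times the molecular background."""
    layers, base = [], None
    for i, beta in enumerate(profile):
        cloudy = beta > factor * background
        if cloudy and base is None:
            base = i                        # layer base found
        elif not cloudy and base is not None:
            layers.append((base, i - 1))    # layer top just passed
            base = None
    if base is not None:
        layers.append((base, len(profile) - 1))
    return layers

profile = [1.0, 1.1, 9.0, 12.0, 8.5, 1.2, 1.0, 15.0, 14.0, 1.1]
print(find_cloud_layers(profile, background=1.0))  # → [(2, 4), (7, 8)]
```

Real products additionally account for attenuation, noise statistics and multiple scattering, which a fixed threshold ignores.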
Investigating the Use of Cloudbursts for High-Throughput Medical Image Registration
Kim, Hyunjoo; Parashar, Manish; Foran, David J.; Yang, Lin
2010-01-01
This paper investigates the use of clouds and autonomic cloudbursting to support medical image registration. The goal is to enable a virtual computational cloud that integrates local computational environments and public cloud services on-the-fly, and to support image registration requests from different distributed research groups with varied computational requirements and QoS constraints. The virtual cloud essentially implements shared and coordinated task-spaces, which coordinate the scheduling of jobs submitted by a dynamic set of research groups to their local job queues. A policy-driven scheduling agent uses the QoS constraints along with performance history and the state of the resources to determine the appropriate size and mix of public and private cloud resources that should be allocated to a specific request. The virtual computational cloud and the medical image registration service have been developed using the CometCloud engine and have been deployed on a combination of private clouds at Rutgers University and the Cancer Institute of New Jersey and Amazon EC2. An experimental evaluation is presented and demonstrates the effectiveness of autonomic cloudbursts and policy-based autonomic scheduling for this application. PMID:20640235
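The policy-driven sizing decision described above can be sketched as a toy allocation rule. This is a hedged illustration, not CometCloud's scheduler; the deadline policy, the one-job-per-node runtime model and all numbers are assumptions.

```python
# Hedged sketch of a policy-driven cloudburst decision: given a deadline
# (QoS constraint), a per-job runtime estimate from performance history,
# and free private-cloud nodes, decide how many public-cloud nodes to
# "burst" to. The policy and numbers are illustrative only.

import math

def plan_allocation(jobs, runtime_per_job, deadline, private_free):
    """Return (private_nodes, public_nodes) so all jobs finish in time,
    assuming each node runs jobs sequentially, one per runtime slot."""
    slots = max(1, int(deadline // runtime_per_job))  # jobs per node in time
    nodes_needed = math.ceil(jobs / slots)
    private = min(private_free, nodes_needed)         # prefer private nodes
    public = nodes_needed - private                   # burst the remainder
    return private, public

# 120 registration jobs, 10 min each, 30 min deadline, 20 private nodes free:
print(plan_allocation(120, 10, 30, 20))  # → (20, 20)
```

A real agent would also weigh cost, data transfer and the observed state of the queues, as the abstract notes.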
Can We Use Single-Column Models for Understanding the Boundary Layer Cloud-Climate Feedback?
NASA Astrophysics Data System (ADS)
Dal Gesso, S.; Neggers, R. A. J.
2018-02-01
This study explores how to drive Single-Column Models (SCMs) with existing data sets of General Circulation Model (GCM) outputs, with the aim of studying the boundary layer cloud response to climate change in the marine subtropical trade wind regime. The EC-EARTH SCM is driven with the large-scale tendencies and boundary conditions as derived from two different data sets, consisting of high-frequency outputs of GCM simulations. SCM simulations are performed near Barbados Cloud Observatory in the dry season (January-April), when fair-weather cumulus is the dominant low-cloud regime. This climate regime is characterized by a near equilibrium in the free troposphere between the long-wave radiative cooling and the large-scale advection of warm air. In the SCM, this equilibrium is ensured by scaling the monthly mean dynamical tendency of temperature and humidity such that it balances that of the model physics in the free troposphere. In this setup, the high-frequency variability in the forcing is maintained, and the boundary layer physics acts freely. This technique yields representative cloud amount and structure in the SCM for the current climate. Furthermore, the cloud response to a sea surface warming of 4 K as produced by the SCM is consistent with that of the forcing GCM.
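The forcing-balance technique described above can be sketched numerically. This is a hedged toy illustration with synthetic tendency values, not the EC-EARTH SCM implementation; the real procedure scales full vertical profiles of temperature and humidity tendencies.

```python
# Minimal sketch of the balancing step: scale the dynamical tendency so
# its monthly mean cancels the physics tendency in the free troposphere,
# while the high-frequency variability of the forcing is preserved.
# Values are synthetic single numbers, not vertical profiles.

def balance_forcing(dyn_tendency, physics_mean):
    """Scale a time series of dynamical tendencies so that its mean
    equals -physics_mean (net free-tropospheric tendency of zero)."""
    dyn_mean = sum(dyn_tendency) / len(dyn_tendency)
    scale = -physics_mean / dyn_mean
    return [scale * t for t in dyn_tendency]

# Warm advection (+ K/day) balancing long-wave cooling (-1.2 K/day):
dyn = [0.8, 1.0, 1.4, 0.8]          # high-frequency forcing, mean 1.0
scaled = balance_forcing(dyn, physics_mean=-1.2)
mean = sum(scaled) / len(scaled)
print(round(mean, 6))  # → 1.2, exactly offsetting the physics cooling
```

Note that the relative variability of `dyn` is untouched; only its mean is adjusted, which is the point of the technique.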
NASA Astrophysics Data System (ADS)
Champion, N.
2012-08-01
Unlike aerial images, satellite images are often affected by the presence of clouds. Identifying and removing these clouds is one of the primary steps when processing satellite images, as they may alter subsequent procedures such as atmospheric corrections, DSM production or land cover classification. The main goal of this paper is to present the cloud detection approach developed at the French Mapping Agency. Our approach relies on the availability of multi-temporal satellite images (i.e. time series that generally contain between 5 and 10 images) and uses a region-growing procedure. Seeds (corresponding to clouds) are first extracted through a pixel-to-pixel comparison between the images contained in the time series (the presence of a cloud is here assumed to be related to a high variation of reflectance between two images). Clouds are then delineated finely using a dedicated region-growing algorithm. The method, originally designed for panchromatic SPOT5-HRS images, is tested in this paper using time series with 9 multi-temporal satellite images. Our preliminary experiments show the good performance of our method. In the near future, the method will be applied to Pléiades images acquired during the in-flight commissioning phase of the satellite (launched at the end of 2011). In that context, a particular goal of this paper is to show to what extent and in what way our method can be adapted to this kind of imagery.
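The two-step detection (seed extraction, then region growing) can be sketched on a tiny synthetic image pair. The thresholds and 4-connected growth rule are illustrative assumptions, not the agency's production settings.

```python
# Illustrative sketch of the two-step detection: (1) seeds from a large
# reflectance increase between two co-registered images, (2) region
# growing around seeds with a looser threshold. Images are tiny synthetic
# grids; the real method uses full multi-temporal SPOT5-HRS time series.

def detect_clouds(ref, img, seed_jump=100, grow_jump=40):
    h, w = len(img), len(img[0])
    diff = [[img[r][c] - ref[r][c] for c in range(w)] for r in range(h)]
    mask = [[False] * w for _ in range(h)]
    stack = [(r, c) for r in range(h) for c in range(w)
             if diff[r][c] > seed_jump]                 # step 1: seeds
    while stack:                                        # step 2: grow
        r, c = stack.pop()
        if 0 <= r < h and 0 <= c < w and not mask[r][c] and diff[r][c] > grow_jump:
            mask[r][c] = True
            stack += [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
    return mask

ref = [[10, 10, 10, 10] for _ in range(3)]   # cloud-free reference date
img = [[10, 60, 180, 60],
       [10, 10, 160, 10],
       [10, 10, 10, 10]]                     # bright cloud on a later date
mask = detect_clouds(ref, img)
print(sum(v for row in mask for v in row))   # → 4 pixels flagged as cloud
```

The looser growth threshold recovers the dimmer cloud fringe around the bright seed pixels, which is the motivation for the two-stage design.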
Hayman, Matthew; Spuler, Scott
2017-11-27
We present a demonstration of a diode-laser-based high spectral resolution lidar. It is capable of performing calibrated retrievals of aerosol and cloud optical properties at a 150 m range resolution with less than 1 minute integration time over an approximate range of 12 km during day and night. This instrument operates at 780 nm, a wavelength that is well established for reliable semiconductor lasers and detectors, and was chosen because it corresponds to the D2 rubidium absorption line. A heated vapor reference cell of isotopic rubidium 87 is used as an effective and reliable aerosol signal blocking filter in the instrument. In principle, the diode-laser-based high spectral resolution lidar can be made cost competitive with elastic backscatter lidar systems, yet delivers a significant improvement in data quality through direct retrieval of quantitative optical properties of clouds and aerosols.
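The instrument's core retrieval principle, using the rubidium-filtered channel as a molecular-only reference, can be sketched as follows. This is a hedged illustration with synthetic signals; it assumes perfect aerosol blocking and cross-calibrated channels, and omits range and overlap corrections.

```python
# Hedged sketch of the core HSRL idea: the Rb-87 filter channel sees
# (ideally) molecular backscatter only, so the ratio of the combined to
# the molecular channel gives the backscatter ratio R, and the aerosol
# backscatter follows from a known molecular profile. Numbers synthetic.

def aerosol_backscatter(combined, molecular_channel, beta_mol):
    """Per-range-gate aerosol backscatter from the two HSRL channels,
    assuming the filter channel is purely molecular and both channels
    are cross-calibrated."""
    out = []
    for s_comb, s_mol, bm in zip(combined, molecular_channel, beta_mol):
        ratio = s_comb / s_mol          # backscatter ratio R
        out.append((ratio - 1.0) * bm)  # beta_aer = (R - 1) * beta_mol
    return out

combined = [2.0, 5.0, 1.0]              # aerosol layer in gates 0-1
mol_chan = [1.0, 1.0, 1.0]
beta_mol = [1e-6, 1e-6, 1e-6]           # molecular backscatter (assumed known)
print(aerosol_backscatter(combined, mol_chan, beta_mol))
# → [1e-06, 4e-06, 0.0]
```

This directness is why HSRL yields calibrated optical properties where an elastic backscatter lidar must assume a lidar ratio.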
Cloud BioLinux: pre-configured and on-demand bioinformatics computing for the genomics community.
Krampis, Konstantinos; Booth, Tim; Chapman, Brad; Tiwari, Bela; Bicak, Mesude; Field, Dawn; Nelson, Karen E
2012-03-19
A steep drop in the cost of next-generation sequencing during recent years has made the technology affordable to the majority of researchers, but downstream bioinformatic analysis still poses a resource bottleneck for smaller laboratories and institutes that do not have access to substantial computational resources. Sequencing instruments are typically bundled with only the minimal processing and storage capacity required for data capture during sequencing runs. Given the scale of sequence datasets, scientific value cannot be obtained from acquiring a sequencer unless it is accompanied by an equal investment in informatics infrastructure. Cloud BioLinux is a publicly accessible Virtual Machine (VM) that enables scientists to quickly provision on-demand infrastructures for high-performance bioinformatics computing using cloud platforms. Users have instant access to a range of pre-configured command line and graphical software applications, including a full-featured desktop interface, documentation and over 135 bioinformatics packages for applications including sequence alignment, clustering, assembly, display, editing, and phylogeny. Each tool's functionality is fully described in the documentation directly accessible from the graphical interface of the VM. Besides the Amazon EC2 cloud, we have started instances of Cloud BioLinux on a private Eucalyptus cloud installed at the J. Craig Venter Institute, and demonstrated access to the bioinformatic tools interface through a remote connection to EC2 instances from a local desktop computer. Documentation for using Cloud BioLinux on EC2 is available from our project website, while a Eucalyptus cloud image and VirtualBox Appliance is also publicly available for download and use by researchers with access to private clouds. Cloud BioLinux provides a platform for developing bioinformatics infrastructures on the cloud. 
An automated and configurable process builds Virtual Machines, allowing the development of highly customized versions from a shared code base. This shared community toolkit enables application specific analysis platforms on the cloud by minimizing the effort required to prepare and maintain them.
Norris, Peter M.; da Silva, Arlindo M.
2018-01-01
Part 1 of this series presented a Monte Carlo Bayesian method for constraining a complex statistical model of global circulation model (GCM) sub-gridcolumn moisture variability using high-resolution Moderate Resolution Imaging Spectroradiometer (MODIS) cloud data, thereby permitting parameter estimation and cloud data assimilation for large-scale models. This article performs some basic testing of this new approach, verifying that it does indeed reduce mean and standard deviation biases significantly with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud-top pressure and that it also improves the simulated rotational–Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the Ozone Monitoring Instrument (OMI). Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows non-gradient-based jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast, where the background state has a clear swath. This article also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in passive-radiometer-retrieved cloud observables on cloud vertical structure, beyond cloud-top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. 
It is found that a simple flow-dependent correlation modification from Riishojgaard provides some help in this respect, by better honouring inversion structures in the background state. PMID:29618848
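The distinction between linearized increments and finite Monte Carlo jumps can be illustrated with a toy 1-D sketch. The `cloud_fraction` diagnostic and all numbers are contrived; this is not the paper's assimilation scheme.

```python
# Toy illustration of the point above: from a subsaturated background,
# an infinitesimal (linearized) perturbation produces zero cloud
# response, while finite gradient-free Monte Carlo jumps can land in
# the region of non-zero cloud probability.

import random

def cloud_fraction(rh):
    """Contrived diagnostic: cloud appears only above 90% relative humidity."""
    return max(0.0, min(1.0, (rh - 0.9) / 0.1))

rh0 = 0.7                       # subsaturated ("clear") background state
eps = 1e-6
linear_response = (cloud_fraction(rh0 + eps) - cloud_fraction(rh0)) / eps
print(linear_response)          # → 0.0: no infinitesimal path to cloud

random.seed(1)
jumps = [rh0 + random.uniform(-0.3, 0.3) for _ in range(1000)]
cloudy = sum(1 for rh in jumps if cloud_fraction(rh) > 0.0)
print(cloudy > 0)               # → True: finite jumps reach the cloudy region
```

The flat gradient at the background state is exactly why traditional linearized assimilation cannot create cloud there, while jump-based sampling can.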
High-Performance Compute Infrastructure in Astronomy: 2020 Is Only Months Away
NASA Astrophysics Data System (ADS)
Berriman, B.; Deelman, E.; Juve, G.; Rynge, M.; Vöckler, J. S.
2012-09-01
By 2020, astronomy will be awash with as much as 60 PB of public data. Full scientific exploitation of such massive volumes of data will require high-performance computing on server farms co-located with the data. Development of this computing model will be a community-wide enterprise that has profound cultural and technical implications. Astronomers must be prepared to develop environment-agnostic applications that support parallel processing. The community must investigate the applicability and cost-benefit of emerging technologies such as cloud computing to astronomy, and must engage the Computer Science community to develop science-driven cyberinfrastructure such as workflow schedulers and optimizers. We report here the results of collaborations between a science center, IPAC, and a Computer Science research institute, ISI. These collaborations may be considered pathfinders in developing a high-performance compute infrastructure in astronomy. These collaborations investigated two exemplar large-scale science-driver workflow applications: 1) Calculation of an infrared atlas of the Galactic Plane at 18 different wavelengths by placing data from multiple surveys on a common plate scale and co-registering all the pixels; 2) Calculation of an atlas of periodicities present in the public Kepler data sets, which currently contain 380,000 light curves. These products have been generated with two workflow applications, written in C for performance and designed to support parallel processing on multiple environments and platforms, but with different compute resource needs: the Montage image mosaic engine is I/O-bound, and the NASA Star and Exoplanet Database periodogram code is CPU-bound. Our presentation will report cost and performance metrics and lessons-learned for continuing development. 
Applicability of Cloud Computing: Commercial cloud providers generally charge for all operations, including processing, transfer of input and output data, and storage of data, so the costs of running applications vary widely according to how they use resources. The cloud is well suited to processing CPU-bound (and memory-bound) workflows such as the periodogram code, given the relatively low cost of processing in comparison with I/O operations. I/O-bound applications such as Montage perform best on high-performance clusters with fast networks and parallel file systems. Science-driven Cyberinfrastructure: Montage has been widely used as a driver application to develop workflow management services, such as task scheduling in distributed environments, designing fault-tolerance techniques for job schedulers, and developing workflow orchestration techniques. Running Parallel Applications Across Distributed Cloud Environments: Data processing will eventually take place in parallel, distributed across cyberinfrastructure environments having different architectures. We have used the Pegasus Workflow Management System (WMS) to successfully run applications across three very different environments: TeraGrid, OSG (Open Science Grid), and FutureGrid. Provisioning resources across different grids and clouds (also referred to as Sky Computing) involves establishing a distributed environment where issues of, e.g., remote job submission, data management, and security need to be addressed. This environment also requires building virtual machine images that can run in different environments. Usually, each cloud provides basic images that can be customized with additional software and services. In most of our work, we provisioned compute resources using a custom application called Wrangler.
Pegasus WMS abstracts the architectures of the compute environments away from the end-user, and can be considered a first-generation tool suitable for scientists to run their applications on disparate environments.
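The CPU-bound periodogram workload can be illustrated with a small stand-in. This sketch uses phase-dispersion scoring on a synthetic light curve; it is not the NASA Star and Exoplanet Database periodogram code, and all names and numbers are illustrative.

```python
# Illustrative stand-in for the CPU-bound workload: a brute-force
# periodogram that scans trial periods of a synthetic light curve and
# scores each by the dispersion of the phase-folded curve. Each trial
# period is independent, which is what makes the real code parallelize
# well across cloud nodes.

import math

def periodogram(times, flux, trial_periods, nbins=10):
    """Score each trial period; lower phase-binned variance = stronger."""
    scores = []
    for period in trial_periods:
        bins = [[] for _ in range(nbins)]
        for t, f in zip(times, flux):
            phase = (t / period) % 1.0
            bins[min(nbins - 1, int(phase * nbins))].append(f)
        # total within-bin variance: small when the folding lines up
        var = sum(
            sum((f - sum(b) / len(b)) ** 2 for f in b)
            for b in bins if b
        )
        scores.append(var)
    return scores

times = [0.1 * i for i in range(200)]
flux = [math.sin(2 * math.pi * t / 3.0) for t in times]   # true period 3.0
trials = [1.7, 2.3, 3.0, 4.1]
scores = periodogram(times, flux, trials)
best = trials[scores.index(min(scores))]
print(best)  # → 3.0
```

Scanning 380,000 light curves over dense period grids with such independent trials is exactly the "many cheap cores, little I/O" shape that the abstract argues suits commercial clouds.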
Validity of Molecular Tagging Velocimetry in a Cavitating Flow for Turbopump Analysis
NASA Astrophysics Data System (ADS)
Kuzmich, Kayla; Bohl, Doug
2012-11-01
This research establishes the use of multi-phase molecular tagging velocimetry (MTV) and explores its limitations. The flow conditions and geometry in the inducer of an upper-stage liquid oxygen (LOX)/liquid hydrogen (LH2) engine frequently cause cavitation, which decreases turbopump performance. Complications arise in performing experiments in liquid hydrogen and oxygen due to high costs, high pressures, extremely low fluid temperatures, the presence of cavitation, and associated safety risks. Due to the complex geometry and hazardous nature of the fluids, a simplified throat geometry with water as a simulant fluid is used. Flow characteristics are measured using MTV, a noninvasive flow diagnostic technique. MTV is found to be an applicable tool in cases of low cavitation. Highly cavitating flows reflect and scatter most of the laser beam, preventing penetration into the cavitation cloud. However, data can be obtained in high-cavitation cases near the cloud boundary layer. Distribution A: Public Release, Public Affairs Clearance Number: 12654
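The MTV measurement principle, recovering velocity from the displacement of a tagged line between two exposures, can be sketched in 1-D. The profiles, the time separation `dt` and the pixel size are synthetic assumptions; real MTV correlates 2-D image pairs.

```python
# Simplified sketch of the MTV principle: a laser-tagged line's
# intensity profile is imaged at two times, and the velocity follows
# from the shift that best correlates the two profiles. The profiles,
# dt and pixel size below are synthetic assumptions.

def best_shift(profile_t0, profile_t1, max_shift):
    """Integer pixel shift maximizing the correlation of the profiles."""
    best, best_score = 0, float("-inf")
    n = len(profile_t0)
    for s in range(-max_shift, max_shift + 1):
        score = sum(
            profile_t0[i] * profile_t1[i + s]
            for i in range(n) if 0 <= i + s < n
        )
        if score > best_score:
            best, best_score = s, score
    return best

line0 = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0, 0, 0]     # tagged line at t0
line1 = [0, 0, 0, 0, 0, 1, 4, 9, 4, 1, 0, 0]     # same line at t0 + dt
shift = best_shift(line0, line1, max_shift=5)
dt, pixel_size = 1e-3, 1e-4                      # s and m, assumed values
velocity = shift * pixel_size / dt               # 3 px * 0.1 mm / 1 ms
print(shift)  # → 3
```

In a cavitating flow the difficulty noted above is that the second exposure is corrupted wherever the beam cannot penetrate the bubble cloud, so the correlation is only trustworthy near the cloud boundary.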
Exploiting Parallel R in the Cloud with SPRINT
Piotrowski, M.; McGilvary, G.A.; Sloan, T. M.; Mewissen, M.; Lloyd, A.D.; Forster, T.; Mitchell, L.; Ghazal, P.; Hill, J.
2012-01-01
Background Advances in DNA microarray devices and next-generation massively parallel DNA sequencing platforms have led to an exponential growth in data availability, but the arising opportunities require adequate computing resources. High Performance Computing (HPC) in the cloud offers an affordable way of meeting this need. Objectives Bioconductor, a popular tool for high-throughput genomic data analysis, is distributed as add-on modules for the R statistical programming language, but R has no native capabilities for exploiting multi-processor architectures. SPRINT is an R package that enables easy access to HPC for genomics researchers. This paper investigates: setting up and running SPRINT-enabled genomic analyses on Amazon's Elastic Compute Cloud (EC2), the advantages of submitting applications to EC2 from different parts of the world, and whether resource underutilization can improve application performance. Methods The SPRINT parallel implementations of correlation, permutation testing, partitioning around medoids and the multi-purpose papply have been benchmarked on data sets of various sizes on Amazon EC2. Jobs have been submitted from both the UK and Thailand to investigate monetary differences. Results It is possible to obtain good, scalable performance, but the level of improvement is dependent upon the nature of the algorithm. Resource underutilization can further improve the time to result. End-user location impacts costs due to factors such as local taxation. Conclusions Although not designed to satisfy HPC requirements, Amazon EC2 and cloud computing in general provide an interesting alternative and new possibilities for smaller organisations with limited funds. PMID:23223611
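The embarrassingly parallel structure that SPRINT exploits can be sketched in Python (SPRINT itself is an R package built on MPI). The data, the test statistic and the thread-pool worker count below are illustrative assumptions.

```python
# Hedged sketch of a SPRINT-style parallel permutation test, in Python
# rather than R: the permutation replicates are independent, so they
# split cleanly across workers (a thread pool here as a portable
# stand-in for SPRINT's MPI processes). Data are illustrative.

import random
from multiprocessing.dummy import Pool

group_a = [5.1, 5.4, 5.0, 5.3, 5.2]               # e.g. expression, condition A
group_b = [4.6, 4.8, 4.5, 4.9, 4.7]               # condition B
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)
pooled = group_a + group_b

def permuted_diff(seed):
    """One replicate: reshuffle labels, recompute the mean difference."""
    rng = random.Random(seed)
    shuffled = pooled[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return sum(shuffled[:half]) / half - sum(shuffled[half:]) / half

with Pool(4) as pool:                              # 4 workers, 2000 replicates
    diffs = pool.map(permuted_diff, range(2000))
p_value = sum(1 for d in diffs if d >= observed) / len(diffs)
print(p_value < 0.05)  # → True: the group difference exceeds chance
```

Because each replicate touches only a small in-memory vector, this workload is CPU-bound, matching the paper's finding that such analyses scale well on EC2.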
The Role of Aerosols on Precipitation Processes: Cloud Resolving Model Simulations
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Li, X.; Matsui, T.
2012-01-01
Cloud microphysics is inevitably affected by the smoke particle (CCN, cloud condensation nuclei) size distributions below the clouds. Therefore, size distributions parameterized as spectral bin microphysics are needed to explicitly study the effects of atmospheric aerosol concentration on cloud development, rainfall production, and rainfall rates for convective clouds. Recently, a detailed spectral-bin microphysical scheme was implemented into the Goddard Cumulus Ensemble (GCE) model. The formulation for the explicit spectral bin microphysical processes is based on solving stochastic kinetic equations for the size distribution functions of water droplets (i.e., cloud droplets and raindrops) and several types of ice particles [i.e., pristine ice crystals (columnar and plate-like), snow (dendrites and aggregates), graupel and frozen drops/hail]. Each type is described by a special size distribution function containing many categories (i.e., 33 bins). Atmospheric aerosols are also described using number density size-distribution functions. The model is tested by studying the evolution of deep cloud systems in the west Pacific warm pool region, the sub-tropics (Florida) and midlatitudes using identical thermodynamic conditions but with different concentrations of CCN: a low "clean" concentration and a high "dirty" concentration. Results indicate that the low CCN concentration case produces rainfall at the surface sooner than the high CCN case but has less cloud water mass aloft. Because the spectral-bin model explicitly calculates and allows for the examination of both the mass and number concentration of species in each size category, a detailed analysis of the instantaneous size spectrum can be obtained for these cases. It is shown that since the low CCN case produces fewer droplets, larger sizes develop due to greater condensational and collection growth, leading to a broader size spectrum in comparison to the high CCN case.
Sensitivity tests were performed to identify the impact of ice processes, radiation and large-scale influence on cloud-aerosol interactive processes, especially regarding surface rainfall amounts and characteristics (i.e., heavy or convective versus light or stratiform types). In addition, an inert tracer was included to follow the vertical redistribution of aerosols by cloud processes. We will also give a brief review from observational evidence on the role of aerosol on precipitation processes.
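The mass-doubling bin grid underlying a 33-bin spectral scheme like the one described above can be sketched as follows. This is a minimal illustration, not the GCE formulation itself; the starting radius and water density are illustrative assumptions.

```python
import numpy as np

def make_bin_grid(r0_cm=2e-4, n_bins=33, rho_w=1.0):
    """Return bin masses (g) and radii (cm) for a mass-doubling grid.

    Each bin holds twice the particle mass of the previous one, so 33 bins
    span roughly ten orders of magnitude in mass (cloud droplets to rain).
    r0_cm (smallest-bin radius) and rho_w (liquid water density, g/cm^3)
    are illustrative, not values taken from the GCE scheme.
    """
    m0 = (4.0 / 3.0) * np.pi * rho_w * r0_cm**3   # mass of smallest bin
    masses = m0 * 2.0 ** np.arange(n_bins)        # m_k = m_0 * 2^k
    radii = (3.0 * masses / (4.0 * np.pi * rho_w)) ** (1.0 / 3.0)
    return masses, radii

masses, radii = make_bin_grid()
```

Because mass doubles between bins, the radius grows by a factor of 2^(1/3) per bin, which is why such grids resolve both the narrow cloud-droplet spectrum and the much larger raindrops.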
NASA Technical Reports Server (NTRS)
Schwemmer, Geary K.; Miller, David O.
2005-01-01
Clouds have a powerful influence on atmospheric radiative transfer and hence are crucial to understanding and interpreting the exchange of radiation between the Earth's surface, the atmosphere, and space. Because clouds are highly variable in space, time and physical makeup, it is important to be able to observe them in three dimensions (3-D) with sufficient resolution that the data can be used to generate and validate parameterizations of cloud fields at the resolution scale of global climate models (GCMs). Simulations of photon transport in three-dimensionally inhomogeneous cloud fields show that spatial inhomogeneities tend to decrease cloud reflection and absorption and increase direct and diffuse transmission. Therefore it is an important task to characterize cloud spatial structures in three dimensions on the scale of GCM grid elements. In order to validate cloud parameterizations that represent the ensemble, or mean and variance, of cloud properties within a GCM grid element, measurements of the parameters must be obtained on a much finer scale so that the statistics on those measurements are truly representative. High spatial sampling resolution is required, on the order of 1 km or less. Since the radiation fields respond almost instantaneously to changes in the cloud field, and cloud changes occur on scales of seconds and less when viewed on scales of approximately 100 m, cloud properties should be measured and characterized on second time scales. GCM time steps are typically on the order of an hour, but in order to obtain sufficient statistical representations of cloud properties in the parameterizations that are used as model inputs, averaged values of cloud properties should be calculated on time scales on the order of 10-100 s.
The Holographic Airborne Rotating Lidar Instrument Experiment (HARLIE) provides exceptional temporal (100 ms) and spatial (30 m) resolution measurements of aerosol and cloud backscatter in three dimensions. HARLIE was used in a ground-based configuration in several recent field campaigns. Principal data products include aerosol backscatter profiles, boundary layer heights, entrainment zone thickness, cloud fraction as a function of altitude and horizontal wind vector profiles based on correlating the motions of clouds and aerosol structures across portions of the scan. Comparisons will be made between various cloud detecting instruments to develop a baseline performance metric.
NASA Astrophysics Data System (ADS)
Cesana, G.; Waliser, D. E.; Jiang, X.; Li, J. L. F.
2014-12-01
The ubiquitous presence of clouds within the troposphere helps modulate the radiative balance of the earth-atmosphere system. Depending on their phase, clouds may have different microphysical and macrophysical properties, and hence different radiative effects. In this study, we took advantage of climate runs from the GASS-YoTC and AMIP multi-model experiments to document the differences associated with the cloud phase parameterizations of 16 GCMs. Particular emphasis has been placed on the vertical structure of the transition between liquid and ice in clouds. One way to intercompare the models regardless of their cloud fraction is to study the ratio of ice mass to the total mass of condensed water. To address the challenge of evaluating the modeled cloud phase, we used the cloud phase climatology known as CALIPSO-GOCCP, which separates liquid clouds from ice clouds at global scale, with a high vertical resolution (480 m), over all surfaces. We also used reanalysis data and GPCP satellite observations to investigate the influence of temperature, relative humidity, vertical wind speed and precipitation on the cloud phase transition. In 12 (of 16) models, there is too little supercooled liquid in clouds compared to observations, mostly in the high troposphere. We present evidence of the link between the cloud phase transition and humidity, vertical wind speed and precipitation. Some cloud phase schemes are more affected by humidity and vertical velocity, and others by precipitation. Although a few models can reproduce the observed relation between cloud phase and temperature, humidity, vertical velocity or precipitation, none of them performs well for all of these parameters. An important result of this study is that temperature-dependent phase parameterizations cannot capture the complexity of the observed cloud phase transition.
Unfortunately, more complex microphysics schemes do not succeed in reproducing all the processes either. Finally, thanks to the combined use of CALIPSO-GOCCP and ECMWF water vapor pressure, we present an updated version of the Clausius-Clapeyron water vapor phase diagram. This diagram represents a new tool for improving the simulation of the cloud phase transition in climate models.
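The phase metric described above, the ratio of ice mass to total condensed water, can be computed per vertical level so that it is independent of cloud fraction. A minimal sketch, with purely illustrative profile values:

```python
import numpy as np

def phase_ratio(ice_mass, liquid_mass, eps=1e-12):
    """Ratio of ice mass to total condensed water, elementwise.

    Returns 0 where the column is all liquid and 1 where it is all ice;
    eps guards against division by zero in condensate-free levels.
    """
    ice = np.asarray(ice_mass, dtype=float)
    liq = np.asarray(liquid_mass, dtype=float)
    return ice / np.maximum(ice + liq, eps)

# Illustrative vertical profile (bottom to top, arbitrary mass units):
# liquid-dominated low levels transitioning to ice-dominated levels aloft.
ice = np.array([0.0, 0.01, 0.05, 0.08])
liq = np.array([0.10, 0.05, 0.01, 0.0])
r = phase_ratio(ice, liq)
```

Plotting such a ratio against temperature (or height) for each model and for CALIPSO-GOCCP is one way to visualize the liquid-to-ice transition being evaluated.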
The Earthcare Cloud Profiling Radar, its PFM development status (Conference Presentation)
NASA Astrophysics Data System (ADS)
Nakatsuka, Hirotaka; Tomita, Eichi; Aida, Yoshihisa; Seki, Yoshihiro; Okada, Kazuyuki; Maruyama, Kenta; Ishii, Yasuyuki; Tomiyama, Nobuhiro; Ohno, Yuichi; Horie, Hiroaki; Sato, Kenji
2016-10-01
The Earth Clouds, Aerosols and Radiation Explorer (EarthCARE) mission is a joint mission between Europe and Japan with a planned launch in 2018. The mission objective is to improve scientific understanding of cloud-aerosol-radiation interactions, one of the largest sources of uncertainty in numerical climate and weather predictions. The EarthCARE spacecraft carries four instruments, an ultraviolet lidar (ATLID), a cloud profiling radar (CPR), a broadband radiometer (BBR), and a multi-spectral imager (MSI), and performs fully synergistic observations of aerosols, clouds and their interactions simultaneously from orbit. The Japan Aerospace Exploration Agency (JAXA) is responsible for development of the CPR in the EarthCARE mission, and the CPR will be the first space-borne W-band Doppler radar. The CPR is specified with a minimum radar sensitivity of -35 dBZ (6 dB better than the current space-borne cloud radar, i.e., CloudSat, NASA), a radiometric accuracy of 2.7 dB, and a Doppler velocity measurement accuracy of better than 1.3 m/s. These specifications require a highly accurate pointing technique in orbit and a high-power source with a large antenna dish. JAXA and the National Institute of Information and Communications Technology (NICT) have jointly developed this CPR to meet these strict requirements, achieving developments such as a new CFRP flex-core structure, a long-life extended interaction klystron, and a low-loss quasi-optical feed technique. Through these successes, CPR development has progressed to the critical design phase. In addition, a new ground calibration technique is also being developed for the launch of EarthCARE/CPR. The unique feature of the EarthCARE CPR is its vertical Doppler velocity measurement capability. Vertical Doppler velocity measurement is a very attractive function from the science point of view, because the vertical motions of cloud particles are related to cloud microphysics and dynamics.
However, from an engineering point of view, Doppler measurement from a satellite is quite a challenging technology. In order to maintain and ensure the CPR performance, several types of calibration data will be obtained by the CPR. The overall performance of the CPR is checked by an Active Radar Calibrator (ARC) on the ground (CPR in External Calibration mode). The ARC is used to check the CPR transmitter performance (ARC in receiver mode) and receiver performance (ARC in transmitter mode) as well as overall performance (ARC in transponder mode, with a delay to avoid contamination by the ground echo). In Japan, the instrument industrial Critical Design Review of the CPR was completed in 2013, and it was complemented by an Interface and Mission aspects CPR CDR, involving ESA and the EarthCARE Prime, that was completed successfully in 2015. The CPR Proto-Flight Model, whose integration is almost complete, is currently being tested. After being handed over to ESA, planned for the beginning of 2017, the CPR will be installed onto the EarthCARE satellite with the other instruments. After that, it will be tested, transported to the Guiana Space Center in Kourou, French Guiana, and launched by a Soyuz launcher in 2018. This presentation will summarize the latest CPR design and the CPR PFM testing status.
The evolution of nocturnal boundary-layer clouds in southern West Africa - a case study from DACCIWA
NASA Astrophysics Data System (ADS)
Adler, Bianca; Kalthoff, Norbert; Babić, Karmen; Lohou, Fabienne; Dione, Cheikh; Lothon, Marie; Pedruzo-Bagazgoitia, Xabier
2017-04-01
During the monsoon season, the atmospheric boundary layer in southern West Africa is characterised by various kinds of low-level clouds that experience a distinct diurnal cycle. During the night, extensive low-level stratiform clouds frequently form, with a cloud base often less than a few hundred metres above ground. After sunrise the cloud base slowly starts rising, and eventually a transition to convective clouds occurs. While the existence of these clouds is documented in satellite images and synoptic observations, little is known about the mechanisms controlling their evolution. To provide observational evidence, a field campaign was conducted in southern West Africa in June and July 2016 within the framework of the Dynamics-aerosol-chemistry-cloud interactions in West Africa (DACCIWA) project. Comprehensive ground-based in situ and remote sensing measurements were performed at three supersites in Ghana, Benin and Nigeria. In this contribution, we present the diurnal cycle of boundary-layer clouds for a typical day using data from the supersite at Savè in Benin. Owing to the synergy of various instruments, we are able to obtain detailed information on the evolution of the clouds as well as on the boundary-layer structure with high temporal and vertical resolution. By combining ceilometer, cloud radar and microwave radiometer data we determined the cloud base, depth and density. The clouds form in the same layer as a nocturnal low-level jet (NLLJ), which we probe with sodar and UHF profiler. There is evidence for a strong link between the height and strength of the NLLJ and the density of the nocturnal clouds.
Cloud Size Distributions from Multi-sensor Observations of Shallow Cumulus Clouds
NASA Astrophysics Data System (ADS)
Kleiss, J.; Riley, E.; Kassianov, E.; Long, C. N.; Riihimaki, L.; Berg, L. K.
2017-12-01
Combined radar-lidar observations have been used for almost two decades to document temporal changes in shallow cumulus clouds at the U.S. Department of Energy (DOE) Atmospheric Radiation Measurement (ARM) Facility's Southern Great Plains (SGP) site in Oklahoma, USA. Since the ARM zenith-pointing radars and lidars have a narrow field-of-view (FOV), the documented cloud statistics, such as distributions of cloud chord length (or horizontal length scale), represent only a slice along the wind direction of a region surrounding the SGP site, and thus may not be representative of this region. To investigate this potential sampling bias, we compare cloud statistics obtained from wide-FOV sky images collected at the SGP site to those from the narrow-FOV active sensors. The main wide-FOV cloud statistics considered are cloud area distributions of shallow cumulus clouds, which are frequently required to evaluate model performance, such as the routine large eddy simulations (LES) currently being conducted by the ARM LASSO (LES ARM Symbiotic Simulation and Observation) project. We obtain complementary macrophysical properties of shallow cumulus clouds, such as cloud chord length, base height and thickness, from the combined radar-lidar observations. To better understand the broader observational context in which these narrow-FOV cloud statistics occur, we compare them to collocated and coincident cloud area distributions from wide-FOV sky images and high-resolution satellite images. We discuss the comparison results and illustrate the possibility of generating a long-term climatology of cloud size distributions from multi-sensor observations at the SGP site.
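As a rough illustration of how a cloud chord length is derived from a zenith-pointing sensor: a cloud advecting over the site at wind speed U that stays overhead for a duration dt has a chord of approximately U * dt (the frozen-advection, or Taylor, hypothesis). The function below is a hypothetical sketch, not the ARM processing code.

```python
def chord_lengths(cloud_mask, dt_s, wind_speed_ms):
    """Estimate cloud chord lengths (m) from a zenith-pointing time series.

    cloud_mask: sequence of 0/1 cloud flags, one per time step of dt_s
    seconds; wind_speed_ms: cloud-level wind speed in m/s (assumed
    constant). Each contiguous run of cloudy flags becomes one chord.
    """
    chords, run = [], 0
    for flag in cloud_mask:
        if flag:
            run += 1
        elif run:
            chords.append(run * dt_s * wind_speed_ms)
            run = 0
    if run:  # close a run that extends to the end of the record
        chords.append(run * dt_s * wind_speed_ms)
    return chords
```

For example, with 10 s time steps and a 5 m/s wind, `chord_lengths([0, 1, 1, 0, 1, 0], 10.0, 5.0)` yields chords of 100 m and 50 m. Wide-FOV imagers instead measure cloud areas directly, which is why the two kinds of statistics need the comparison described above.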
NASA Technical Reports Server (NTRS)
De Boer, G.; Shupe, M.D.; Caldwell, P.M.; Bauer, Susanne E.; Persson, O.; Boyle, J.S.; Kelley, M.; Klein, S.A.; Tjernstrom, M.
2014-01-01
Atmospheric measurements from the Arctic Summer Cloud Ocean Study (ASCOS) are used to evaluate the performance of three atmospheric reanalyses (the European Centre for Medium-Range Weather Forecasts (ECMWF) Interim reanalysis, the National Center for Environmental Prediction (NCEP)-National Center for Atmospheric Research (NCAR) reanalysis, and the NCEP-DOE (Department of Energy) reanalysis) and two global climate models (CAM5 (Community Atmosphere Model 5) and NASA GISS (Goddard Institute for Space Studies) ModelE2) in simulating the high Arctic environment. Quantities analyzed include near-surface meteorological variables such as temperature, pressure, humidity and winds, surface-based estimates of cloud and precipitation properties, the surface energy budget, and lower-atmospheric temperature structure. In general, the models perform well in simulating large-scale dynamical quantities such as pressure and winds. Near-surface temperature and lower-atmospheric stability, along with surface energy budget terms, are not as well represented, due largely to errors in the simulation of cloud occurrence, phase and altitude. Additionally, a development version of CAM5, which features improved handling of cloud macrophysics, has been demonstrated to improve the simulation of cloud properties and liquid water amount. The ASCOS period additionally provides an excellent example of the benefits gained by evaluating individual budget terms, rather than simply the net end product, with large compensating errors between individual surface energy budget terms that can nevertheless yield a good net energy budget.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin
The increasing volume of scientific data and the limited scalability and performance of storage systems are currently a significant limitation on the productivity of scientific workflows running on both high-performance computing (HPC) and cloud platforms. Better integration of storage systems and workflow engines is clearly needed to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. The experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
NASA Technical Reports Server (NTRS)
Yost, Christopher R.; Minnis, Patrick; Trepte, Qing Z.; Palikonda, Rabindra; Ayers, Jeffrey K.; Spangenberg, Doulas A.
2012-01-01
With geostationary satellite data it is possible to have a continuous record of the diurnal cycles of cloud properties for a large portion of the globe. Daytime cloud property retrieval algorithms are typically superior to nighttime algorithms because daytime methods utilize measurements of reflected solar radiation. However, reflected solar radiation is difficult to model accurately at high solar zenith angles, where the amount of incident radiation is small. Clear and cloudy scenes can exhibit very small differences in reflected radiation, and threshold-based cloud detection methods have more difficulty setting the proper thresholds for accurate cloud detection. Because top-of-atmosphere radiances are typically more accurately modeled outside the terminator region, information from previous scans can help guide cloud detection near the terminator. This paper presents an algorithm that uses cloud fraction and clear and cloudy infrared brightness temperatures from previous satellite scan times to improve the performance of a threshold-based cloud mask near the terminator. Comparisons of daytime, nighttime, and terminator cloud fraction derived from Geostationary Operational Environmental Satellite (GOES) radiance measurements show that the algorithm greatly reduces the number of false cloud detections and smooths the transition between the daytime and nighttime cloud detection algorithms. Comparisons with Geoscience Laser Altimeter System (GLAS) data show that using this algorithm decreases the number of false detections by approximately 20 percentage points.
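The core idea above, falling back on infrared brightness temperatures remembered from a previous scan to set a per-pixel detection threshold near the terminator, can be sketched as follows. This is a hedged toy illustration, not the GOES algorithm; the function name, the midpoint threshold, and the numbers are all assumptions.

```python
import numpy as np

def terminator_cloud_mask(bt_now, bt_clear_prev, bt_cloudy_prev):
    """Toy threshold cloud mask guided by a previous scan.

    For each pixel, set the threshold midway between the clear-sky and
    cloudy brightness temperatures (BT, in kelvin) remembered from an
    earlier scan; a pixel colder than that threshold is flagged cloudy.
    """
    threshold = 0.5 * (bt_clear_prev + bt_cloudy_prev)
    return bt_now < threshold  # colder than threshold -> cloudy

# Two pixels: one warm (clear-looking), one cold (cloud-top-looking).
bt_now = np.array([285.0, 250.0])
mask = terminator_cloud_mask(bt_now,
                             np.array([288.0, 288.0]),   # remembered clear BT
                             np.array([240.0, 240.0]))   # remembered cloudy BT
```

The real algorithm also carries forward cloud fraction and handles the blending of daytime and nighttime masks, but the prior-scan-guided threshold is the part that stabilizes detection where reflected solar radiation is unreliable.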
Millimeter- and Submillimeter-Wave Remote Sensing Using Small Satellites
NASA Technical Reports Server (NTRS)
Ehsan, N.; Esper, J.; Piepmeier, J.; Racette, P.; Wu, D.
2014-01-01
Cloud ice properties and processes play fundamental roles in atmospheric radiation and precipitation. Limited knowledge and poor representation of clouds in global climate models have led to large uncertainties about cloud feedback processes under climate change. Ice clouds have been used as a tuning parameter in the models to force agreement with observations of the radiation budget at the top of the atmosphere and precipitation at the bottom. The lack of ice cloud measurements has left cloud processes at intermediate altitudes unconstrained. Millimeter (mm) and submillimeter (submm)-wave radiometry is widely recognized for its potential to fill the cloud measurement gap in the middle and upper troposphere. Analyses have shown that channels from 183 to 900 GHz offer good sensitivity to ice cloud scattering and can provide ice water path (IWP) products to an accuracy of 25% by simultaneously retrieving ice particle size (Dme) and IWP. Therefore, it is highly desirable to develop a cost-effective, compact mm/submm-wave instrument for cloud observations that can be deployed on future small satellites. This paper presents a conceptual study for a mm/submm-wave instrument for multispectral measurements of ice clouds. It discusses previous work at these frequencies by NASA Goddard Space Flight Center (GSFC) and the current instrument study, as well as receiver architectures and their anticipated performance. Finally, it describes a microsatellite prototype intended for use with this mm/submm-wave instrument.
A scalable infrastructure for CMS data analysis based on OpenStack Cloud and Gluster file system
NASA Astrophysics Data System (ADS)
Toor, S.; Osmani, L.; Eerola, P.; Kraemer, O.; Lindén, T.; Tarkoma, S.; White, J.
2014-06-01
The challenge of providing a resilient and scalable computational and data management solution for massive scale research environments requires continuous exploration of new technologies and techniques. In this project the aim has been to design a scalable and resilient infrastructure for CERN HEP data analysis. The infrastructure is based on OpenStack components for structuring a private Cloud with the Gluster File System. We integrate the state-of-the-art Cloud technologies with the traditional Grid middleware infrastructure. Our test results show that the adopted approach provides a scalable and resilient solution for managing resources without compromising on performance and high availability.
Regime-based evaluation of cloudiness in CMIP5 models
NASA Astrophysics Data System (ADS)
Jin, Daeho; Oreopoulos, Lazaros; Lee, Dongmin
2017-01-01
The concept of cloud regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates in each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics, such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product [long-term average total cloud amount (TCA)], cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our results support previous findings that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is still not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer cloud observations evaluated against ISCCP in the same way as model output. Lastly, contrasting cloud simulation performance against each model's equilibrium climate sensitivity, in order to gain insight into whether good cloud simulation pairs with particular values of this parameter, yields no clear conclusions.
NASA Technical Reports Server (NTRS)
Habermann, Ted; Gallagher, James; Jelenak, Aleksandar; Potter, Nathan; Lee, Joe; Yang, Kent
2017-01-01
This study explored three candidate architectures with different types of objects and access paths for serving NASA Earth Science HDF5 data via Hyrax running on Amazon Web Services (AWS). We studied the cost and performance of each architecture using several representative use cases. The objectives of the study were to: (1) conduct a trade study to identify one or more high-performance integrated solutions for storing and retrieving NASA HDF5 and netCDF4 data in a cloud (web object store) environment, the target environment being Amazon Web Services (AWS) Simple Storage Service (S3); (2) conduct the level of software development needed to properly evaluate solutions in the trade study and to obtain the benchmarking metrics required as input for a government decision on potential follow-on prototyping; and (3) develop a cloud cost model for the preferred data storage solution (or solutions) that accounts for different granulation and aggregation schemes as well as cost and performance trades. We describe the three architectures and the use cases, along with performance results and recommendations for further work.
NASA Astrophysics Data System (ADS)
Fisher, Daniel; Poulsen, Caroline A.; Thomas, Gareth E.; Muller, Jan-Peter
2016-03-01
In this paper we evaluate the impact on the cloud parameter retrievals of the ORAC (Optimal Retrieval of Aerosol and Cloud) algorithm of including stereo-derived cloud top heights as a priori information. This is performed in a mathematically rigorous way using the ORAC optimal estimation retrieval framework, which includes the facility to use such independent a priori information. Key to the use of a priori information is a characterisation of its associated uncertainty. This paper demonstrates the improvements that are possible using this approach and also considers their impact on the retrieved microphysical cloud parameters. The Along-Track Scanning Radiometer (AATSR) instrument has two views and three thermal channels, so it is well placed to demonstrate the synergy of the two techniques. The stereo retrieval is able to improve the accuracy of the retrieved cloud top height when compared to collocated Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO), particularly in the presence of boundary layer inversions and high clouds. The impact of the stereo a priori information on the microphysical cloud properties of cloud optical thickness (COT) and effective radius (RE) was evaluated and generally found to be very small for single-layer cloud conditions over open water (mean RE differences of 2.2 (±5.9) microns and mean COT differences of 0.5 (±1.8) for single-layer ice clouds over open water at elevations above 9 km, which are most strongly affected by the inclusion of the a priori).
Cloud-cloud collision in the Galactic center 50 km s-1 molecular cloud
NASA Astrophysics Data System (ADS)
Tsuboi, Masato; Miyazaki, Atsushi; Uehara, Kenta
2015-12-01
We performed a search for star-forming sites influenced by external factors, such as SNRs, H II regions, and cloud-cloud collisions (CCCs), to understand the star-forming activity in the Galactic center region, using the NRO Galactic Center Survey in SiO v = 0, J = 2-1, H13CO+ J = 1-0, and CS J = 1-0 emission lines obtained with the Nobeyama 45 m telescope. We found a half-shell-like feature (HSF) with a high integrated line intensity ratio of ∫TB(SiO v = 0, J = 2-1)dv/∫TB(H13CO+ J = 1-0)dv ˜ 6-8 in the 50 km s-1 molecular cloud; the HSF is the most conspicuous molecular cloud in the region and harbors an active star-forming site where several compact H II regions can be seen. The high ratio in the HSF indicates that the cloud contains a large amount of shocked molecular gas. The HSF can also be seen as a half-shell feature in the position-velocity diagram. A hypothesis explaining the chemical and kinetic properties of the HSF is that the feature originates from a CCC. We analyzed the CS J = 1-0 emission line data obtained with the Nobeyama Millimeter Array to reveal the relation between the HSF and the molecular cloud cores in the cloud. We constructed a cumulative core mass function (CMF) of the molecular cloud cores within the HSF. The CMF in the CCC region is not truncated at least up to ˜2500 M⊙, although the CMF of the non-CCC region reaches its upper limit at ˜1500 M⊙. The most massive molecular cores, with Mgas > 750 M⊙, are located only around the ridge of the HSF and adjoin the compact H II regions. These may be signs of massive star formation induced by CCCs in the Galactic center region.
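A cumulative core mass function of the kind used above, N(>M), counts the number of cores more massive than each threshold M. A minimal sketch, with made-up core masses in solar masses:

```python
import numpy as np

def cumulative_cmf(core_masses):
    """Return (sorted masses, N(>=M)) for a set of core masses.

    Sorting the masses ascending, the count of cores at or above the k-th
    mass is n - k, so the cumulative function steps down from n to 1.
    """
    m = np.sort(np.asarray(core_masses, dtype=float))
    n_above = np.arange(len(m), 0, -1)  # N(>=M) evaluated at each core mass
    return m, n_above

# Illustrative core masses (solar masses), not values from the survey.
m, n = cumulative_cmf([120.0, 300.0, 75.0, 950.0, 2400.0])
```

Comparing where such a curve truncates in CCC versus non-CCC regions (here, ~2500 M⊙ versus ~1500 M⊙) is the diagnostic the abstract describes.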
Huang, Lei; Kang, Wenjun; Bartom, Elizabeth; Onel, Kenan; Volchenboum, Samuel; Andrade, Jorge
2015-01-01
Whole exome sequencing has facilitated the discovery of causal genetic variants associated with human diseases at deep coverage and low cost. In particular, the detection of somatic mutations from tumor/normal pairs has provided insights into the cancer genome. Although there is an abundance of publicly available software for the detection of germline and somatic variants, concordance is generally limited among variant callers and alignment algorithms. Successful integration of variants detected by multiple methods requires in-depth knowledge of the software, access to high-performance computing resources, and advanced programming techniques. We present ExScalibur, a set of fully automated, highly scalable and modular pipelines for whole exome data analysis. The suite integrates multiple alignment and variant calling algorithms for the accurate detection of germline and somatic mutations with close to 99% sensitivity and specificity. ExScalibur implements streamlined execution of analytical modules, real-time monitoring of pipeline progress, robust handling of errors and intuitive documentation, allowing for increased reproducibility and sharing of results and workflows. It runs on local computers, high-performance computing clusters and cloud environments. In addition, we provide a data analysis report utility to facilitate visualization of the results; it offers interactive exploration of quality control files, read alignments and variant calls, assisting downstream customization of potential disease-causing mutations. ExScalibur is open source and is also available as a public image on the Amazon cloud. PMID:26271043
Automated interpretation of 3D laserscanned point clouds for plant organ segmentation.
Wahabzada, Mirwaes; Paulus, Stefan; Kersting, Kristian; Mahlein, Anne-Katrin
2015-08-08
Plant organ segmentation from 3D point clouds is a relevant task for plant phenotyping and plant growth observation. Automated solutions are required to increase the efficiency of recent high-throughput plant phenotyping pipelines. However, plant geometrical properties vary with time, among observation scales and between plant types. The main objective of the present research is to develop a fully automated, fast and reliable data-driven approach for plant organ segmentation. The automated segmentation of plant organs using unsupervised clustering methods is crucial in cases where the goal is to gain fast insights into the data, or where labeled data are unavailable or costly to obtain. For this we propose and compare data-driven approaches that are easy to realize and make the use of standard algorithms possible. Since normalized histograms acquired from 3D point clouds can be seen as samples from a probability simplex, we propose to map the data from the simplex space into Euclidean space using Aitchison's log-ratio transformation, or onto the positive quadrant of the unit sphere using the square-root transformation. This, in turn, paves the way to a wide range of commonly used analysis techniques that are based on measuring similarities between data points using Euclidean distance. We investigate the performance of the resulting approaches in the practical context of grouping 3D point clouds and demonstrate empirically that they lead to clustering results with high accuracy for monocotyledonous and dicotyledonous plant species with diverse shoot architecture. An automated segmentation of 3D point clouds is demonstrated in the present work. Within seconds, first insights into plant data can be derived, even from non-labelled data. This approach is applicable to different plant species with high accuracy.
The analysis cascade can be implemented in future high-throughput phenotyping scenarios and will support the evaluation of the performance of different plant genotypes exposed to stress or in different environmental scenarios.
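The two simplex-to-Euclidean mappings mentioned above can be sketched directly. The version of the log-ratio transformation shown here is the centered log-ratio (clr), one common member of Aitchison's family; the small epsilon guarding empty histogram bins is an assumption of this sketch.

```python
import numpy as np

def sqrt_map(p):
    """Square-root map: sends a normalized histogram p (sum(p) == 1)
    onto the positive quadrant of the unit sphere, since sum(sqrt(p)**2) == 1."""
    return np.sqrt(p)

def clr(p, eps=1e-9):
    """Centered log-ratio transform (one member of Aitchison's log-ratio
    family): log of each component minus the mean log, so components sum
    to ~0 and Euclidean distances become meaningful."""
    logp = np.log(p + eps)  # eps guards zero-count bins
    return logp - logp.mean(axis=-1, keepdims=True)

p = np.array([0.5, 0.3, 0.2])  # a normalized 3-bin histogram
s = sqrt_map(p)                # point on the unit sphere
z = clr(p)                     # point in Euclidean space
```

After either mapping, standard Euclidean-distance tools (k-means and other clustering algorithms) can be applied to the transformed histograms, which is exactly the "standard algorithms" advantage claimed above.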
NASA Astrophysics Data System (ADS)
Merlin, G.; Riedi, J.; Labonnote, L. C.; Cornet, C.; Davis, A. B.; Dubuisson, P.; Desmons, M.; Ferlay, N.; Parol, F.
2015-12-01
The vertical distribution of cloud cover has a significant impact on a large number of meteorological and climatic processes. Cloud top altitude and cloud geometrical thickness are therefore essential parameters. Previous studies established the possibility of retrieving those parameters from multi-angular oxygen A-band measurements. Here we study and compare the expected performances of future instruments. The 3MI (Multi-angle, Multi-channel and Multi-polarization Imager) instrument developed by EUMETSAT, which is an extension of the POLDER/PARASOL instrument, and MSPI (Multi-angle Spectro-Polarimetric Imager), developed by NASA's Jet Propulsion Laboratory, will measure total and polarized light reflected by the Earth's atmosphere-surface system in several spectral bands (from UV to SWIR) and several viewing geometries. These instruments should provide opportunities to observe the links between cloud structures and the anisotropy of the solar radiation reflected into space. Specific algorithms will need to be developed in order to take advantage of the new capabilities of these instruments. However, prior to this effort, we need to understand, through a theoretical Shannon information content analysis, the limits and advantages of these new instruments for retrieving liquid and ice cloud properties, and especially, in this study, the amount of information coming from the A-band channel on the cloud top altitude (CTOP) and geometrical thickness (CGT). We compare the information content of the 3MI A-band in two configurations with that of MSPI. Quantitative information content estimates show that the retrieval of CTOP with high accuracy is possible in almost all cases investigated. The retrieval of CGT is less easy, but possible for optically thick clouds above a black surface, at least when CGT > 1-2 km.
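In optimal-estimation settings like this, a common form of the Shannon information content is H = (1/2) log2(|S_a| / |S|), the entropy reduction from the prior covariance S_a to the posterior covariance S: the number of bits the measurement adds about the state. The sketch below assumes this standard form; the covariance values are illustrative, not from the study.

```python
import numpy as np

def shannon_information(S_prior, S_post):
    """Shannon information content H = 0.5 * log2(|S_prior| / |S_post|).

    Uses slogdet for numerical stability; assumes both covariance
    matrices are symmetric positive definite.
    """
    _, logdet_a = np.linalg.slogdet(S_prior)
    _, logdet_p = np.linalg.slogdet(S_post)
    return 0.5 * (logdet_a - logdet_p) / np.log(2.0)

# Illustrative 2-state example (e.g. CTOP and CGT): the measurement
# shrinks the first variance from 4 to 1 and leaves the second unchanged,
# so H = 0.5 * log2(4) = 1 bit.
S_a = np.diag([4.0, 9.0])
S_hat = np.diag([1.0, 9.0])
H = shannon_information(S_a, S_hat)
```

Comparing H across channel configurations (3MI A-band variants versus MSPI) and across state variables is how such an analysis ranks what each instrument can actually constrain.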
Toward a Big Data Science: A challenge of "Science Cloud"
NASA Astrophysics Data System (ADS)
Murata, Ken T.; Watanabe, Hidenobu
2013-04-01
Over the past 50 years, along with the appearance and development of high-performance computers (and supercomputers), numerical simulation has come to be considered a third methodology for science, following the theoretical (first) and experimental and/or observational (second) approaches. The variety of data yielded by the second approach has kept increasing, owing to progress in experimental and observational technologies. The amount of data generated by the third methodology has likewise kept growing, owing to the tremendous development of supercomputers and of the techniques for programming them. Most of the data files created by experiments/observations and numerical simulations are saved in digital formats and analyzed on computers. Researchers (domain experts) are interested not only in how to carry out experiments and/or observations or perform numerical simulations, but also in what information (new findings) can be extracted from the data. However, data do not usually tell anything about the science by themselves; the science is implicitly hidden in the data, and researchers have to extract information from the data files to find new science. This is the basic concept of data-intensive (data-oriented) science for Big Data. As the scales of experiments and/or observations and numerical simulations grow, new techniques and facilities are required to extract information from large numbers of data files. This technique, called informatics, is a fourth methodology for new sciences. Every methodology must work on its own facility: in space science, for example, the space environment is observed via spacecraft and numerical simulations are performed on supercomputers. The facility for informatics, which deals with large-scale data, is a computational cloud system for science. This paper proposes such a cloud system for informatics, developed at NICT (National Institute of Information and Communications Technology), Japan.
The NICT science cloud, which we have named OneSpaceNet (OSN), is the first open cloud system for scientists who wish to carry out informatics for their own science. The science cloud is not for simple uses; many functions are expected of it, such as data standardization, data collection and crawling, large and distributed data storage, security and reliability, databases and meta-databases, data stewardship, long-term data preservation and rescue, data mining, parallel processing, data publication and provision, the semantic web, 3D and 4D visualization, outreach and inreach, and capacity building. Figure (not shown here) is a schematic picture of the NICT science cloud. Both types of data, from observation and from simulation, are stored in the storage system of the science cloud. It should be noted that observational data are of two kinds. One kind comes from archive sites outside the cloud and is downloaded to it through the Internet. The other comes from equipment directly connected to the science cloud, an arrangement often called a sensor cloud. In the present talk, we first introduce the NICT science cloud. We then demonstrate its efficiency, showing several scientific results achieved with this cloud system. Through these discussions and demonstrations, the potential performance of science clouds will be revealed for any research field.
First Results of AirMSPI Imaging Polarimetry at ORACLES 2016: Aerosol and Water Cloud Retrievals
NASA Astrophysics Data System (ADS)
van Harten, G.; Xu, F.; Diner, D. J.; Rheingans, B. E.; Tosca, M.; Seidel, F.; Bull, M. A.; Tkatcheva, I. N.; McDuffie, J. L.; Garay, M. J.; Jovanovic, V. M.; Cairns, B.; Alexandrov, M. D.; Hostetler, C. A.; Ferrare, R. A.; Burton, S. P.
2017-12-01
The Airborne Multiangle SpectroPolarimetric Imager (AirMSPI) is a remote sensing instrument for the characterization of atmospheric aerosols and clouds. We will report on the successful deployment and resulting data products of AirMSPI in the 2016 field campaign as part of NASA's ObseRvations of Aerosols above CLouds and their intEractionS (ORACLES). The goal of this five-year investigation is to study the impacts of African biomass burning aerosols on the radiative properties of the subtropical stratocumulus cloud deck over the southeast Atlantic Ocean. On board the NASA ER-2 high-altitude aircraft, AirMSPI collected over 4000 high-resolution images on 16 days. The observations are performed in two different modes: step-and-stare mode, in which a 10x10 km target is observed from 9 view angles at 10 m resolution, and sweep mode, in which an 80-100 km along-track by 10-25 km across-track target is observed with continuously changing view angle between ±67° at 25 m resolution. This Level 1B2 calibrated and georectified imagery is publicly available at the NASA Langley Atmospheric Science Data Center (ASDC)*. We will then describe the Level 2 water cloud products that will be made publicly available, viz. optical depth and droplet size distribution, which are retrieved using a polarimetric algorithm. Finally, we will present the results of a recently developed research algorithm for the simultaneous retrieval of these cloud properties and above-cloud aerosols, and validations using collocated High Spectral Resolution Lidar-2 (HSRL-2) and Research Scanning Polarimeter (RSP) products. * https://eosweb.larc.nasa.gov/project/airmspi/airmspi_table
Effects of Atmospheric Dynamics and Aerosols on the Fraction of Supercooled Water Clouds
NASA Astrophysics Data System (ADS)
Li, J.
2016-12-01
Based on 8 years (2007-2015) of cloud phase information from the GCM-Oriented Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) Cloud Product (GOCCP), aerosol products from CALIPSO, and meteorological parameters from the ERA-Interim products, this study investigates the effects of atmospheric dynamics on the supercooled liquid cloud fraction (SCF) under different aerosol loadings at a global scale, in order to better understand the conditions under which supercooled liquid water gradually transforms to the ice phase. Statistical results indicate that the aerosol effect on nucleation cannot fully explain all SCF changes, especially in those regions where it is not a first-order influence (e.g., due to a low frequency of ice-nucleating aerosols). By computing temporal and spatial correlations between SCFs and different meteorological factors, we find that the impacts of these factors on SCFs show obvious regional differences. In the tropics, clear positive correlations of SCFs with vertical velocity and relative humidity indicate that high vertical velocity and relative humidity suppress ice formation. The impacts of LTSS, skin temperature and horizontal wind on SCFs are more complex than those of vertical velocity and humidity; their effects are predominantly located in the middle and high latitudes, and their temporal correlations with SCFs depend on latitude or surface type. In addition, this study indicates that strong horizontal wind inhibits the glaciation of supercooled droplets in the middle and high latitudes. Our results verify the importance and regional dependence of dynamical factors in changes of the supercooled water cloud fraction, and thus have potential implications for further improving the parameterization of cloud phase and determining climate feedbacks.
NASA Technical Reports Server (NTRS)
Molthan, A. L.; Haynes, J. A.; Case, J. L.; Jedlovec, G. L.; Lapenta, W. M.
2008-01-01
As computational power increases, operational forecast models are performing simulations with higher spatial resolution, allowing for the transition from sub-grid-scale cloud parameterizations to an explicit forecast of cloud characteristics and precipitation through the use of single- or multi-moment bulk water microphysics schemes. Investments in space-borne and terrestrial remote sensing have produced the NASA CloudSat Cloud Profiling Radar and the NOAA National Weather Service NEXRAD system, each providing observations related to the bulk properties of clouds and precipitation through measurements of reflectivity. CloudSat and NEXRAD system radars observed light to moderate snowfall in association with a cold-season, midlatitude cyclone traversing the Central United States in February 2007. Such weather systems are responsible for widespread cloud cover and various types of precipitation, are of economic consequence, and pose a challenge to operational forecasters. This event is simulated with the Weather Research and Forecasting (WRF) model, utilizing the NASA Goddard Cumulus Ensemble microphysics scheme. Comparisons are made between WRF-simulated and observed reflectivity available from the CloudSat and NEXRAD systems. The application of CloudSat reflectivity is made possible through the QuickBeam radiative transfer model, applied cautiously in light of its single-scattering characteristics and spherical-target assumptions. Significant differences are noted between modeled and observed cloud profiles, based upon simulated reflectivity, and modifications to the single-moment scheme are tested through a supplemental WRF forecast that incorporates a temperature-dependent snow crystal size distribution.
Partitioning of ice nucleating particles: Which modes matter?
NASA Astrophysics Data System (ADS)
Hande, Luke; Hoose, Corinna
2017-04-01
Ice particles in clouds have a large impact on cloud lifetime, precipitation amount, and cloud radiative properties through the indirect aerosol effect. Thus, correctly modelling ice formation processes is important for simulations performed on all spatial and temporal scales. Ice forms on aerosol particles through several different mechanisms, namely deposition nucleation, immersion freezing, and contact freezing. However, there is conflicting evidence as to which mode dominates, and the relative importance of the three heterogeneous ice nucleation mechanisms, as well as of homogeneous nucleation, remains an open question. The environmental conditions, and hence the cloud type, play a large role in determining which nucleation mode dominates. In order to understand this, simulations were performed with the COSMO-LES model, utilising state-of-the-art parameterisations of the different nucleation mechanisms for several cloud types commonly occurring over central Europe. The cloud types investigated include a semi-idealised and an idealised convective cloud, an orographic cloud, and a stratiform cloud. Results show that immersion and contact freezing dominate at warmer temperatures and that, under most conditions, deposition nucleation plays only a minor role. In clouds where sufficiently high levels of water vapour are present at colder temperatures, deposition nucleation can play a role; in general, however, homogeneous nucleation dominates at colder temperatures. Since contact nucleation depends on the environmental relative humidity, enhancements in this nucleation mode can be seen in areas of dry air entrainment. The results indicate that ice microphysical processes are somewhat sensitive to the environmental conditions and therefore to the cloud type.
NASA Astrophysics Data System (ADS)
Wiacek, A.; Peter, T.; Lohmann, U.
2010-02-01
This modelling study explores the availability of mineral dust particles as ice nuclei for interactions with ice, mixed-phase and liquid water clouds, also tracking the particles' history of cloud-processing. We performed 61 320 one-week forward trajectory calculations originating near the surface of major dust emitting regions in Africa and Asia using high-resolution meteorological analysis fields for the year 2007. Without explicitly modelling dust emission and deposition processes, dust-bearing trajectories were assumed to be those coinciding with known dust emission seasons. We found that dust emissions from Asian deserts lead to a higher potential for interactions with high clouds, despite being the climatologically much smaller dust emission source. This is due to Asian regions experiencing significantly more ascent than African regions, with strongest ascent in the Asian Taklimakan desert at ~25%, ~40% and 10% of trajectories ascending to 300 hPa in spring, summer and fall, respectively. The specific humidity at each trajectory's starting point was transported in a Lagrangian manner and relative humidities with respect to water and ice were calculated in 6-h steps downstream, allowing us to estimate the formation of liquid, mixed-phase and ice clouds. Practically none of the simulated air parcels reached regions where homogeneous ice nucleation can take place (T≲-40 °C) along trajectories that have not experienced water saturation first. By far the largest fraction of cloud forming trajectories entered conditions of mixed-phase clouds, where mineral dust will potentially exert the biggest influence. The majority of trajectories also passed through regions supersaturated with respect to ice but subsaturated with respect to water, where "warm" (T≳-40 °C) ice clouds may form prior to supercooled water or mixed-phase clouds. 
The importance of "warm" ice clouds and the general influence of dust in the mixed-phase cloud region are highly uncertain due to considerable scatter in recent laboratory data from ice nucleation experiments, which we briefly review in this work. For "classical" cirrus-forming temperatures, our results show that only mineral dust IN that underwent mixed-phase cloud-processing previously are likely to be relevant, and, therefore, we recommend further systematic studies of immersion mode ice nucleation on mineral dust suspended in atmospherically relevant coatings.
A Hadoop-Based Algorithm for Generating DEM Grids from Point Cloud Data
NASA Astrophysics Data System (ADS)
Jian, X.; Xiao, X.; Chengfang, H.; Zhizhong, Z.; Zhaohui, W.; Dengzhong, Z.
2015-04-01
Airborne LiDAR technology has proven to be one of the most powerful tools for obtaining high-density, high-accuracy and significantly detailed surface information on terrain and surface objects within a short time, from which a high-quality Digital Elevation Model (DEM) can be extracted. Point cloud data generated from the pre-processed data must be classified by segmentation algorithms, so as to distinguish terrain points from other points, and the selected terrain points are then interpolated to turn the points into DEM data. Because of the high point density, the whole procedure takes a long time and huge computing resources, which has been the focus of a number of studies. Hadoop is a distributed system infrastructure developed by the Apache Foundation, which contains a highly fault-tolerant distributed file system (HDFS) with a high transmission rate and a parallel programming model (Map/Reduce). Such a framework is appropriate for improving the efficiency of DEM generation algorithms. Point cloud data of Dongting Lake acquired by a Riegl LMS-Q680i laser scanner was used as the original data to generate a DEM with a Hadoop-based algorithm implemented in Linux, followed by a traditional procedure programmed in C++ as the comparative experiment. The algorithms' efficiency, coding complexity, and performance-cost ratio were then discussed for the comparison. The results demonstrate that the algorithm's speed depends on the size of the point set and the density of the DEM grid; the non-Hadoop implementation can achieve high performance when memory is large enough, but the multi-node Hadoop implementation achieves a higher performance-cost ratio when the point set is very large.
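The map/reduce decomposition described above can be sketched in plain Python as a toy stand-in for the Hadoop job: each LiDAR ground point is mapped to a DEM cell key, and all elevations in a cell are reduced to one value. The 1 m cell size, the mean-based interpolation, and all names are illustrative assumptions, not the paper's implementation.

```python
from collections import defaultdict

CELL = 1.0  # hypothetical DEM grid spacing in metres


def map_point(point):
    """Map one classified terrain point (x, y, z) to its DEM grid-cell key."""
    x, y, z = point
    return (int(x // CELL), int(y // CELL)), z


def reduce_cell(key, elevations):
    """Reduce all elevations in one cell to a single DEM value
    (simple mean here; real pipelines use IDW or TIN interpolation)."""
    return key, sum(elevations) / len(elevations)


def dem_from_points(points):
    """Group mapped points by cell key, then reduce each group,
    mimicking the shuffle/reduce stages of a Map/Reduce job."""
    groups = defaultdict(list)
    for p in points:
        key, z = map_point(p)
        groups[key].append(z)
    return dict(reduce_cell(k, zs) for k, zs in groups.items())


# Tiny illustrative point set (x, y, elevation)
points = [(0.2, 0.3, 10.0), (0.8, 0.1, 12.0), (1.5, 0.4, 20.0)]
dem = dem_from_points(points)
```

In a real Hadoop job, `map_point` and `reduce_cell` would become Mapper and Reducer classes and the grouping would be done by the framework's shuffle phase.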
Neural network cloud top pressure and height for MODIS
NASA Astrophysics Data System (ADS)
Håkansson, Nina; Adok, Claudia; Thoss, Anke; Scheirer, Ronald; Hörnquist, Sara
2018-06-01
Cloud top height retrieval from imager instruments is important for nowcasting and for satellite climate data records. A neural network approach for cloud top height retrieval from the imager instrument MODIS (Moderate Resolution Imaging Spectroradiometer) is presented. The neural networks are trained using cloud top layer pressure data from the CALIOP (Cloud-Aerosol Lidar with Orthogonal Polarization) dataset. Results are compared with two operational reference algorithms for cloud top height: the MODIS Collection 6 Level 2 height product and the cloud top temperature and height algorithm in the 2014 version of the NWC SAF (EUMETSAT (European Organization for the Exploitation of Meteorological Satellites) Satellite Application Facility on Support to Nowcasting and Very Short Range Forecasting) PPS (Polar Platform System). All three techniques are evaluated using both CALIOP and CPR (Cloud Profiling Radar for CloudSat (CLOUD SATellite)) height. Instruments like AVHRR (Advanced Very High Resolution Radiometer) and VIIRS (Visible Infrared Imaging Radiometer Suite) contain fewer channels useful for cloud top height retrievals than MODIS; therefore, several different neural networks are investigated to test how infrared channel selection influences retrieval performance. A network restricted to the channels available on the AVHRR1 instrument is also trained and evaluated. To examine the contribution of different variables, networks with fewer variables are trained; it is shown that variables containing imager information for neighboring pixels are very important. The error distributions of the cloud top height algorithms involved are found to be non-Gaussian. Different descriptive statistical measures are presented, and it is shown by example that bias and SD (standard deviation) can be misleading for non-Gaussian distributions.
The median and mode are found to better describe the central tendency of the error distributions, and the IQR (interquartile range) and MAE (mean absolute error) are found to give the most useful information on the spread of the errors. For all descriptive statistics presented (MAE, IQR, RMSE (root mean square error), SD, mode, median, bias, and the percentage of absolute errors above 0.25, 0.5, 1 and 2 km), the neural networks perform better than the reference algorithms, whether validated with CALIOP or with CPR (CloudSat). The neural networks using the brightness temperatures at 11 and 12 µm show at least 32 % (or 623 m) lower MAE compared to the two operational reference algorithms when validated with CALIOP height. Validation with CPR (CloudSat) height gives at least a 25 % (or 430 m) reduction of MAE.
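The contrast between moment-based and robust statistics can be shown with a minimal sketch: a single outlier in a heavy-tailed error sample dominates the bias and SD while leaving the median and IQR nearly untouched. The crude index-based quartiles and the error sample below are invented for illustration and do not reproduce the paper's validation data.

```python
import statistics


def error_stats(errors):
    """Descriptive statistics for a height-error sample. For skewed,
    heavy-tailed errors, bias and SD are dominated by outliers;
    median, IQR and MAE describe the distribution more faithfully."""
    s = sorted(errors)
    n = len(s)
    q1 = s[n // 4]          # crude lower quartile (illustrative)
    q3 = s[(3 * n) // 4]    # crude upper quartile (illustrative)
    return {
        "bias": statistics.mean(errors),
        "sd": statistics.stdev(errors),
        "median": statistics.median(errors),
        "iqr": q3 - q1,
        "mae": statistics.mean(abs(e) for e in errors),
    }
```

With errors like `[-0.1, 0.0, 0.1, 0.2, 8.0]` (km), the single 8 km outlier pulls the bias far from the bulk of the distribution, while the median stays near zero.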
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kassianov, Evgueni I.; Flynn, Connor J.; Koontz, Annette S.
2013-09-11
Well-known cloud-screening algorithms, which are designed to remove cloud-contaminated aerosol optical depths (AOD) from AOD measurements, have shown great performance at many middle-to-low-latitude sites around the world. However, they may occasionally fail under challenging observational conditions, such as when the sun is low (near the horizon) or when optically thin clouds with small spatial inhomogeneity occur. Such conditions have been observed quite frequently at the high-latitude Atmospheric Radiation Measurement (ARM) North Slope of Alaska (NSA) sites. A slightly modified version of the standard cloud-screening algorithm is proposed here, with a focus on the ARM-supported Multifilter Rotating Shadowband Radiometer (MFRSR) and Normal Incidence Multifilter Radiometer (NIMFR) data. The modified version uses approximately the same techniques as the standard algorithm, but it additionally examines the magnitude of the slant-path line-of-sight transmittance and eliminates points where the observed magnitude is below a specified threshold. Substantial improvement of the multi-year (1999-2012) aerosol product (AOD and its Angstrom exponent) is shown for the NSA sites when the modified version is applied. Moreover, this version reproduces the AOD product at the ARM Southern Great Plains (SGP) site, which was originally generated by the standard cloud-screening algorithms. The proposed minor modification is easy to implement, and its application to existing and future cloud-screening algorithms can be particularly beneficial under challenging observational conditions.
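The extra transmittance test amounts to a simple filter on top of the standard screen. The sketch below uses a bare Beer-Lambert transmittance model, an invented 0.05 threshold, and an invented record layout; the actual MFRSR/NIMFR processing and the threshold used in the paper are more involved.

```python
import math

TRANSMITTANCE_MIN = 0.05  # illustrative threshold, not the ARM value


def slant_transmittance(aod, airmass):
    """Slant-path line-of-sight transmittance from optical depth and
    airmass via Beer-Lambert (a simplification of the full retrieval)."""
    return math.exp(-aod * airmass)


def screen(records):
    """Keep only records whose slant transmittance exceeds the threshold,
    mimicking the extra test added to the standard cloud screen.
    Low-sun cases (large airmass) are rejected even at moderate AOD."""
    return [r for r in records
            if slant_transmittance(r["aod"], r["airmass"]) >= TRANSMITTANCE_MIN]
```

A clear-sky point (AOD 0.1 at airmass 2) passes, while a low-sun, cloud-contaminated point (apparent AOD 2.0 at airmass 6) is eliminated.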
Space Situational Awareness Data Processing Scalability Utilizing Google Cloud Services
NASA Astrophysics Data System (ADS)
Greenly, D.; Duncan, M.; Wysack, J.; Flores, F.
Space Situational Awareness (SSA) is a fundamental and critical component of current space operations. The term SSA encompasses the awareness, understanding and predictability of all objects in space. As the population of orbital space objects and debris increases, the number of collision avoidance maneuvers grows and prompts the need for accurate and timely process measures. The SSA mission continually evolves toward near-real-time assessment and analysis, demanding higher processing capabilities. By conventional methods, meeting these demands requires the integration of new hardware to keep pace with the growing complexity of maneuver planning algorithms. SpaceNav has implemented a highly scalable architecture that tracks satellites and debris by utilizing powerful virtual machines on the Google Cloud Platform. SpaceNav algorithms for processing CDMs outpace conventional means. A robust processing environment for tracking data, collision avoidance maneuvers and various other aspects of SSA can be created and deleted on demand. We will discuss the migration of SpaceNav tools and algorithms into the Google Cloud Platform, including the trials and tribulations involved, and share how and why certain cloud products were used as well as the integration techniques that were implemented. Key items to be presented are: (1) scientific algorithms and SpaceNav tools integrated into a scalable architecture, including (a) maneuver planning, (b) parallel processing, (c) Monte Carlo simulations, (d) optimization algorithms, and (e) software application development and integration into the Google Cloud Platform; and (2) Compute Engine processing, including (a) App Engine automated processing, (b) performance testing and performance scalability, (c) Cloud MySQL databases and database scalability, (d) cloud data storage, and (e) redundancy and availability.
A cloud masking algorithm for EARLINET lidar systems
NASA Astrophysics Data System (ADS)
Binietoglou, Ioannis; Baars, Holger; D'Amico, Giuseppe; Nicolae, Doina
2015-04-01
Cloud masking is an important first step in any aerosol lidar processing chain, as most data processing algorithms can only be applied to cloud-free observations. Up to now, the selection of a cloud-free time interval for data processing has typically been performed manually, and this is one of the outstanding problems for automatic processing of lidar data in networks such as EARLINET. In this contribution we present initial developments of a cloud masking algorithm that permits the selection of appropriate time intervals for lidar data processing based on uncalibrated lidar signals. The algorithm is based on a signal normalization procedure using the range of observed values of lidar returns, designed to work with different lidar systems with minimal user input. This normalization procedure can be applied to measurement periods of only a few hours, even if no suitable cloud-free interval exists, and thus can be used even when only a short period of lidar measurements is available. Clouds are detected based on a combination of criteria, including the magnitude of the normalized lidar signal and time-space edge detection performed using the Sobel operator. In this way the algorithm avoids misclassifying strong aerosol layers as clouds. Cloud detection is performed at the highest available time and vertical resolution of the lidar signals, allowing the effective detection of low-level clouds (e.g. cumulus humilis). Special attention is given to suppressing false cloud detection due to signal noise, which can affect the algorithm's performance, especially during daytime. We present the details of the algorithm and the effect of lidar characteristics (space-time resolution, available wavelengths, signal-to-noise ratio) on detection performance, and we highlight the current strengths and limitations of the algorithm using lidar scenes from different lidar systems at different locations across Europe.
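The combination of range normalization with Sobel edge detection can be sketched on a small time-height grid in pure Python. The thresholds and the simple min-max normalization are illustrative assumptions, not the EARLINET algorithm's tuned criteria; the idea is only that a pixel is flagged when it is both strong and sharply bounded, so diffuse aerosol layers fail the edge test.

```python
def sobel_mag(grid):
    """Edge magnitude of a 2-D list of floats via the Sobel operator
    (edge-replicated borders; scipy.ndimage.sobel would do the same)."""
    kx = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
    ky = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]
    rows, cols = len(grid), len(grid[0])

    def at(i, j):  # clamp indices to replicate the border
        return grid[min(max(i, 0), rows - 1)][min(max(j, 0), cols - 1)]

    mag = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            gx = gy = 0.0
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    v = at(i + di, j + dj)
                    gx += kx[di + 1][dj + 1] * v
                    gy += ky[di + 1][dj + 1] * v
            mag[i][j] = (gx * gx + gy * gy) ** 0.5
    return mag


def cloud_mask(signal, mag_thresh=0.8, edge_thresh=1.0):
    """Flag time-height pixels whose range-normalized signal is both
    strong and sharply bounded (large Sobel edge magnitude)."""
    flat = [v for row in signal for v in row]
    lo, hi = min(flat), max(flat)
    norm = [[(v - lo) / (hi - lo) for v in row] for row in signal]
    edges = sobel_mag(norm)
    return [[norm[i][j] > mag_thresh and edges[i][j] > edge_thresh
             for j in range(len(signal[0]))]
            for i in range(len(signal))]
```

On a 6x6 background with one bright compact 2x2 feature, only the four feature pixels are flagged; a smooth gradient of the same magnitude would fail the edge criterion.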
Evaluation of wind field statistics near and inside clouds using a coherent Doppler lidar
NASA Astrophysics Data System (ADS)
Lottman, Brian Todd
1998-09-01
This work proposes advanced techniques for measuring spatial wind field statistics near and inside clouds using a vertically pointing solid-state coherent Doppler lidar on a fixed ground-based platform. The coherent Doppler lidar is an ideal instrument for velocity estimates with high spatial and temporal resolution. The basic parameters of lidar are discussed, including a complete statistical description of the Doppler lidar signal. This description is extended to cases with simple functional forms for aerosol backscatter and velocity. An estimate of the mean velocity over a sensing volume is produced by estimating the mean spectrum. There are many traditional spectral estimators, which are useful for conditions with slowly varying velocity and backscatter. A new class of estimators, termed novel estimators, is introduced that produces reliable velocity estimates for conditions with large variations in aerosol backscatter and velocity with range, such as cloud conditions. The performance of traditional and novel estimators is computed for a variety of deterministic atmospheric conditions using computer-simulated data. Wind field statistics are produced from actual data for a cloud deck and for multi-layer clouds. Unique results include the detection of possible spectral signatures for rain, estimates of the structure function inside a cloud deck, reliable velocity estimation techniques near and inside thin clouds, and estimates of simple wind field statistics between cloud layers.
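As a minimal illustration of spectral-mean velocity estimation (the textbook baseline, not the dissertation's traditional or novel estimators), the mean radial velocity over a sensing volume can be taken as the first moment of the noise-subtracted Doppler power spectrum; the function signature and noise handling below are assumptions for illustration only.

```python
def mean_velocity(spectrum, velocities, noise_floor=0.0):
    """Mean radial velocity as the power-weighted first moment of the
    noise-subtracted Doppler spectrum. `spectrum` holds power per
    velocity bin; `velocities` holds the bin-center velocities."""
    weights = [max(p - noise_floor, 0.0) for p in spectrum]
    total = sum(weights)
    if total == 0.0:
        raise ValueError("no signal above the noise floor")
    return sum(w * v for w, v in zip(weights, velocities)) / total
```

With signal power concentrated between the 0 and 1 m/s bins, the estimate falls between them, weighted toward the stronger bin.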
On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data
NASA Astrophysics Data System (ADS)
Hua, H.
2016-12-01
Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turnaround when processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the processing needs. Additionally, traditional means of procuring hardware on-premise are already limited by facility capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments. At large cloud scales, we need to deal with scaling and cost issues. We present our experiences deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We will explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings but provides an unpredictable computing environment driven by market forces.
Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation
NASA Technical Reports Server (NTRS)
Stocker, John C.; Golomb, Andrew M.
2011-01-01
Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is at a disadvantage compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
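A two-part model of this kind can be sketched with a few lines of discrete-event simulation: a probability distribution generates the demand, and a fixed pool of servers represents the resource constraint. Exponential distributions, the single request type, and all parameter names are illustrative assumptions; the paper's model distinguishes multiple request types and richer constraints.

```python
import heapq
import random


def simulate(n_requests, n_servers, mean_interarrival, mean_service, seed=0):
    """Minimal discrete-event model: requests with exponential
    interarrival times contend for a fixed pool of servers with
    exponential service times. Returns the mean waiting time."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_requests):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)

    free_at = [0.0] * n_servers  # earliest time each server is free
    heapq.heapify(free_at)
    waits = []
    for arrive in arrivals:
        # Assign each request to the earliest-free server.
        start = max(arrive, heapq.heappop(free_at))
        waits.append(start - arrive)
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_service))
    return sum(waits) / len(waits)
```

With ample servers no request ever waits, while a single overloaded server produces growing queues; sweeping `n_servers` against a demand profile is the provisioning question the simulation framework answers.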
Regime-Based Evaluation of Cloudiness in CMIP5 Models
NASA Technical Reports Server (NTRS)
Jin, Daeho; Oraiopoulos, Lazaros; Lee, Dong Min
2016-01-01
The concept of Cloud Regimes (CRs) is used to develop a framework for evaluating the cloudiness of 12 models from phase 5 of the Coupled Model Intercomparison Project (CMIP5). Reference CRs come from existing global International Satellite Cloud Climatology Project (ISCCP) weather states. The evaluation is made possible by the implementation in several CMIP5 models of the ISCCP simulator, which generates for each grid cell daily joint histograms of cloud optical thickness and cloud top pressure. Model performance is assessed with several metrics, such as CR global cloud fraction (CF), CR relative frequency of occurrence (RFO), their product (the long-term average total cloud amount [TCA]), cross-correlations of CR RFO maps, and a metric of resemblance between model and ISCCP CRs. In terms of CR global RFO, arguably the most fundamental metric, the models perform unsatisfactorily overall, except for CRs representing thick storm clouds. Because model CR CF is internally constrained by our method, RFO discrepancies also yield substantial TCA errors. Our findings support previous studies showing that CMIP5 models underestimate cloudiness. The multi-model mean performs well in matching observed RFO maps for many CRs, but is not the best for this or other metrics. When overall performance across all CRs is assessed, some models, despite their shortcomings, apparently outperform Moderate Resolution Imaging Spectroradiometer (MODIS) cloud observations evaluated against ISCCP as if they were another model output. Lastly, cloud simulation performance is contrasted with each model's equilibrium climate sensitivity (ECS) in order to gain insight into whether good cloud simulation pairs with particular values of this parameter.
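The product metric can be made concrete: if each regime contributes its within-regime cloud fraction weighted by how often it occurs, the long-term total cloud amount is the sum of per-regime RFO x CF terms. The regime names and numbers below are invented for illustration, not ISCCP weather-state values.

```python
def total_cloud_amount(regimes):
    """Long-term average total cloud amount as the sum over cloud
    regimes of RFO (relative frequency of occurrence) times CF
    (within-regime cloud fraction). RFOs must sum to one."""
    assert abs(sum(r["rfo"] for r in regimes) - 1.0) < 1e-9
    return sum(r["rfo"] * r["cf"] for r in regimes)


# Illustrative three-regime climatology (not ISCCP values)
regimes = [
    {"name": "thick storm", "rfo": 0.1, "cf": 0.9},
    {"name": "shallow cumulus", "rfo": 0.5, "cf": 0.3},
    {"name": "mostly clear", "rfo": 0.4, "cf": 0.1},
]
tca = total_cloud_amount(regimes)
```

This decomposition is why RFO discrepancies propagate directly into TCA errors when the per-regime CF is held fixed by construction.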
SIMPLEX: Cloud-Enabled Pipeline for the Comprehensive Analysis of Exome Sequencing Data
Fischer, Maria; Snajder, Rene; Pabinger, Stephan; Dander, Andreas; Schossig, Anna; Zschocke, Johannes; Trajanoski, Zlatko; Stocker, Gernot
2012-01-01
In recent studies, exome sequencing has proven to be a successful screening tool for the identification of candidate genes causing rare genetic diseases. Although underlying targeted sequencing methods are well established, necessary data handling and focused, structured analysis still remain demanding tasks. Here, we present a cloud-enabled autonomous analysis pipeline, which comprises the complete exome analysis workflow. The pipeline combines several in-house developed and published applications to perform the following steps: (a) initial quality control, (b) intelligent data filtering and pre-processing, (c) sequence alignment to a reference genome, (d) SNP and DIP detection, (e) functional annotation of variants using different approaches, and (f) detailed report generation during various stages of the workflow. The pipeline connects the selected analysis steps, exposes all available parameters for customized usage, performs required data handling, and distributes computationally expensive tasks either on a dedicated high-performance computing infrastructure or on the Amazon cloud environment (EC2). The presented application has already been used in several research projects including studies to elucidate the role of rare genetic diseases. The pipeline is continuously tested and is publicly available under the GPL as a VirtualBox or Cloud image at http://simplex.i-med.ac.at; additional supplementary data is provided at http://www.icbi.at/exome. PMID:22870267
An energy-efficient failure detector for vehicular cloud computing.
Liu, Jiaxi; Wu, Zhibo; Dong, Jian; Wu, Jin; Wen, Dongxin
2018-01-01
Failure detectors are one of the fundamental components for maintaining the high availability of vehicular cloud computing. In vehicular cloud computing, many roadside units (RSUs) are deployed along the road to improve connectivity. Many of them are equipped with solar batteries due to the unavailability or excessive expense of wired electrical power, so it is important to reduce the battery consumption of RSUs. However, existing failure detection algorithms are not designed to reduce the battery consumption of RSUs. To solve this problem, a new energy-efficient failure detector, 2E-FD, is proposed specifically for vehicular cloud computing. 2E-FD not only provides an acceptable failure detection service, but also reduces the battery consumption of RSUs. Comparative experiments show that our failure detector has better performance in terms of speed, accuracy, and battery consumption. PMID:29352282
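To make the trade-off concrete, here is a minimal timeout-based heartbeat failure detector, the baseline design 2E-FD improves on; this is an illustrative sketch, not the 2E-FD algorithm itself, and the timeout value is arbitrary (a longer timeout means fewer false suspicions but slower detection, while more frequent heartbeats cost more battery):

```python
class HeartbeatFailureDetector:
    """Suspect a node if it has been silent longer than a fixed timeout."""

    def __init__(self, timeout=3.0):
        self.timeout = timeout        # seconds of silence before suspecting
        self.last_heartbeat = {}      # node id -> timestamp of last heartbeat

    def heartbeat(self, node, now):
        self.last_heartbeat[node] = now

    def suspected(self, node, now):
        last = self.last_heartbeat.get(node)
        return last is None or (now - last) > self.timeout

fd = HeartbeatFailureDetector(timeout=3.0)
fd.heartbeat("rsu-1", now=10.0)
alive = not fd.suspected("rsu-1", now=12.0)   # 2.0 s silent, within timeout
failed = fd.suspected("rsu-1", now=14.5)      # 4.5 s silent, exceeds timeout
```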
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoth, Gregory W., E-mail: gregory.hoth@nist.gov; Pelle, Bruno; Riedl, Stefan
We demonstrate a two-axis gyroscope by the use of light-pulse atom interferometry with an expanding cloud of atoms, in the regime where the cloud has expanded by 1.1-5 times its initial size during the interrogation. Rotations are measured by analyzing spatial fringe patterns in the atom population obtained by imaging the final cloud. The fringes arise from a correlation between an atom's initial velocity and its final position. This correlation is naturally created by the expansion of the cloud, but it also depends on the initial atomic distribution. We show that the frequency and contrast of these spatial fringes depend on the details of the initial distribution and develop an analytical model to explain this dependence. We also discuss several challenges that must be overcome to realize a high-performance gyroscope with this technique.
NASA Astrophysics Data System (ADS)
Hong, Yang
Precipitation estimation from satellite information (visible, IR, or microwave) is becoming increasingly imperative because of its high spatial/temporal resolution and broad coverage unparalleled by ground-based data. After decades of effort on rainfall estimation using IR imagery, the limitations and uncertainties of existing techniques have been identified as: (1) pixel-based, local-scale feature extraction; (2) an IR temperature threshold to define rain/no-rain clouds; (3) an indirect relationship between rain rate and cloud-top temperature; (4) lumped techniques that cannot model the high variability of cloud-precipitation processes; (5) coarse scales of rainfall products. As a continuation of these studies, a new version of Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks (PERSIANN), called the Cloud Classification System (CCS), has been developed in this dissertation to cope with these limitations. CCS includes three consecutive components: (1) a hybrid segmentation algorithm, Hierarchically Topographical Thresholding and Stepwise Seeded Region Growing (HTH-SSRG), to segment satellite IR images into separate cloud patches; (2) a 3D feature extraction procedure to retrieve both pixel-based local-scale and patch-based large-scale features of cloud patches at various heights; (3) an ANN model, the Self-Organizing Nonlinear Output (SONO) network, to classify cloud patches into similarity-based clusters using a Self-Organizing Feature Map (SOFM), and then calibrate hundreds of multi-parameter nonlinear functions to identify the relationship between each cloud type and its underlying precipitation characteristics using the Probability Matching Method and Multi-Start Downhill Simplex optimization techniques.
The model was calibrated over the southwestern United States (100°-130°W, 25°-45°N) first and then adaptively adjusted to the study region of the North American Monsoon Experiment (65°-135°W, 10°-50°N) using observations from Geostationary Operational Environmental Satellite (GOES) IR imagery, the Next Generation Radar (NEXRAD) rainfall network, and Tropical Rainfall Measuring Mission (TRMM) microwave rain-rate estimates. CCS functions as a distributed model that first identifies cloud patches and then dispatches the best-matching cloud-precipitation function to each patch to estimate instantaneous rain rate at high spatial resolution (4 km) and the full temporal resolution of GOES IR images (every 30 minutes). Evaluated over a range of spatial and temporal scales, the performance of CCS consistently compared favorably with the GOES Precipitation Index (GPI), Universal Adjusted GPI (UAGPI), PERSIANN, and Auto-Estimator (AE) algorithms. In particular, the large number of nonlinear functions and optimum IR-rain rate thresholds of the CCS model are highly variable, reflecting the complexity of the dominant cloud-precipitation processes from patch to patch over various regions. As a result, CCS can capture variability in rain rate at small scales more successfully than existing algorithms and can potentially provide a rainfall product from GOES IR, NEXRAD, and TRMM TMI (SSM/I) at 0.12° x 0.12° and 3-hour resolution with relatively low standard error (~3.0 mm/hr) and high correlation coefficient (~0.65).
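The patch-clustering stage above groups cloud patches by feature similarity. CCS uses a Self-Organizing Feature Map (SOFM); as a stand-in, the following sketch uses plain k-means on synthetic three-feature "patch" vectors to illustrate the same grouping idea (all data and the seeded initialization are ours, not from the dissertation):

```python
import numpy as np

def kmeans(X, init_idx, iters=10):
    """Plain k-means: assign each patch to its nearest center, then update centers."""
    centers = X[init_idx].astype(float).copy()
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1), axis=1)
        for j in range(len(centers)):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return centers, labels

rng = np.random.default_rng(3)
cold_thick = rng.normal(0.2, 0.02, (30, 3))   # e.g. deep convective patches
warm_thin  = rng.normal(0.8, 0.02, (30, 3))   # e.g. thin cirrus patches
X = np.vstack([cold_thick, warm_thin])

# seed one center from each cluster for a deterministic outcome
centers, labels = kmeans(X, init_idx=[0, 59])
```

In CCS each resulting cluster would then get its own calibrated rain-rate function, rather than one lumped relationship for all clouds.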
Performance, Agility and Cost of Cloud Computing Services for NASA GES DISC Giovanni Application
NASA Astrophysics Data System (ADS)
Pham, L.; Chen, A.; Wharton, S.; Winter, E. L.; Lynnes, C.
2013-12-01
The NASA Goddard Earth Science Data and Information Services Center (GES DISC) is investigating the performance, agility, and cost of cloud computing for GES DISC applications. Giovanni (Geospatial Interactive Online Visualization ANd aNalysis Infrastructure), one of the core applications at the GES DISC for online climate-related Earth science data access, subsetting, analysis, visualization, and downloading, was used to evaluate the feasibility and effort of porting an application to the Amazon Cloud Services platform. The performance and cost of running Giovanni on the Amazon cloud were compared to those of the GES DISC's local operational system. A Giovanni time-series analysis of aerosol absorption optical depth (388 nm) from OMI (Ozone Monitoring Instrument)/Aura was selected for these comparisons. All required data were pre-cached in both the cloud and the local system to avoid data transfer delays. Analyses over 3-, 6-, 12-, and 24-month periods were run on the cloud and the local system, and the processing times were used to evaluate system performance. To investigate application agility, Giovanni was installed and tested on multiple cloud platforms. The cost of using a cloud computing platform mainly consists of computing, storage, data requests, and data transfer in/out. Computing cost is based on an hourly rate, and storage cost on a rate per gigabyte per month. Incoming data transfer is free; outgoing data transfer is charged per gigabyte. The costs for a local server system consist of hardware/software purchases, system maintenance/updates, and operating costs. The results showed that the cloud platform had 38% better performance and cost 36% less than the local system. This investigation shows the potential of cloud computing to increase system performance and lower the overall cost of system management.
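The pricing components listed above (hourly compute, gigabyte-month storage, free ingress, per-gigabyte egress) combine into a straightforward monthly estimate; the rates below are hypothetical round numbers, not GES DISC's actual figures:

```python
def monthly_cloud_cost(hours, hourly_rate, storage_gb, gb_month_rate,
                       egress_gb, egress_rate):
    """Back-of-envelope monthly cost from the three billed components."""
    compute = hours * hourly_rate          # instance-hours x hourly rate
    storage = storage_gb * gb_month_rate   # stored GB x per-GB-month rate
    egress = egress_gb * egress_rate       # outgoing GB only; ingress is free
    return compute + storage + egress

# one instance running a full month, 500 GB stored, 100 GB served out
cost = monthly_cloud_cost(hours=720, hourly_rate=0.10,
                          storage_gb=500, gb_month_rate=0.03,
                          egress_gb=100, egress_rate=0.09)
# 720*0.10 + 500*0.03 + 100*0.09 = 72 + 15 + 9 = 96.0
```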
Multi-scale Modeling of Arctic Clouds
NASA Astrophysics Data System (ADS)
Hillman, B. R.; Roesler, E. L.; Dexheimer, D.
2017-12-01
The presence and properties of clouds are critically important to the radiative budget in the Arctic, but clouds are notoriously difficult to represent in global climate models (GCMs). The challenge stems partly from a disconnect between the scales at which these models are formulated and the scales of the physical processes important to the formation of clouds (e.g., convection and turbulence). Because of this, these processes are parameterized in large-scale models. Over the past decades, new approaches have been explored in which a cloud-system-resolving model (CSRM), or in the extreme a large eddy simulation (LES), is embedded in each gridcell of a traditional GCM to replace the cloud and convective parameterizations and explicitly simulate more of these important processes. This approach is attractive in that it allows more explicit simulation of small-scale processes while also allowing for interaction between the small and large scales. The goal of this study is to quantify the performance of this framework in simulating Arctic clouds relative to a traditional global model, and to explore its limitations using coordinated high-resolution (eddy-resolving) simulations. Simulations from the global model are compared with satellite retrievals of cloud fraction partitioned by cloud phase from CALIPSO, and limited-area LES simulations are compared with ground-based and tethered-balloon measurements from the ARM Barrow and Oliktok Point measurement facilities.
Cloud Statistics and Discrimination in the Polar Regions
NASA Astrophysics Data System (ADS)
Chan, M.; Comiso, J. C.
2012-12-01
Despite their important role in the climate system, cloud cover and its statistics are poorly known, especially in the polar regions, where clouds are difficult to discriminate from snow-covered surfaces. The advent of the A-train, which includes the Aqua/MODIS, CALIPSO/CALIOP, and CloudSat/CPR sensors, has provided an opportunity to improve our ability to accurately characterize the cloud cover. MODIS provides global coverage at relatively good temporal and spatial resolution, while CALIOP and CPR provide limited nadir sampling but accurate characterization of the vertical structure and phase of the cloud cover. Over the polar regions, cloud detection from a passive sensor like MODIS is challenging because of the presence of cold and highly reflective surfaces such as snow, sea ice, glaciers, and ice sheets, which have surface signatures similar to those of clouds. On the other hand, active sensors such as CALIOP and CPR are not only very sensitive to the presence of clouds but can also provide information about their microphysical characteristics. However, these nadir-looking sensors have sparse spatial coverage, and their global data can have spatial gaps of up to 100 km. We developed a polar cloud detection system for MODIS that is trained using collocated data from CALIOP and CPR. In particular, we employ a machine learning system that reads the radiative profile observed by MODIS and determines whether the field of view is cloudy or clear. Results show that the improved cloud detection scheme performs better than typical cloud mask algorithms on a validation data set not used for training. A one-year data set was generated, and results indicate that daytime cloud detection accuracies improved from 80.1% to 92.6% (over sea ice) and from 71.2% to 87.4% (over ice sheets), with CALIOP data used as the baseline. Significant improvements are also observed at nighttime, where cloud detection accuracies increase by 19.8% (over sea ice) and 11.6% (over ice sheets).
The immediate impact of the new algorithm is that it can minimize large biases in MODIS-derived cloud amount over the polar regions and thus yield more realistic, higher-quality global cloud statistics. In particular, our results show that cloud fraction in the Arctic is typically 81.2% during daytime and 84.0% during nighttime. This is significantly higher than the 71.8% and 58.5%, respectively, derived from the standard MODIS cloud product.
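The supervised setup described above (passive radiances as features, active-sensor cloud flags as labels) can be sketched with a tiny classifier; here a hand-rolled logistic regression on synthetic two-band data stands in for the actual machine learning system, and all feature values are invented:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 400
# synthetic "radiative profiles": two bands, cloudy pixels offset from clear
X_clear = rng.normal([0.2, 0.8], 0.05, (n, 2))
X_cloud = rng.normal([0.6, 0.4], 0.05, (n, 2))
X = np.vstack([X_clear, X_cloud])
y = np.r_[np.zeros(n), np.ones(n)]   # labels: 0 = clear, 1 = cloudy (CALIOP/CPR role)

# logistic regression fit by plain gradient descent
w, b = np.zeros(2), 0.0
for _ in range(300):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # predicted cloud probability
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * (p - y).mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
accuracy = (pred == y).mean()
```

In the real system the features are the full MODIS radiative profile and the labels come from collocated CALIOP/CPR observations, with accuracy evaluated on held-out data as the abstract describes.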
The iPlant collaborative: cyberinfrastructure for enabling data to discovery for the life sciences
USDA-ARS?s Scientific Manuscript database
The iPlant Collaborative provides life science research communities access to comprehensive, scalable, and cohesive computational infrastructure for data management; identity management; collaboration tools; and cloud, high-performance, high-throughput computing. iPlant provides training, learning m...
Cloud Computing for Complex Performance Codes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, Gordon John; Hadgu, Teklu; Klein, Brandon Thorin
This report describes the use of cloud computing services for running complex public-domain performance assessment problems. The work consisted of two phases: Phase 1 demonstrated that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.
Molecular Line Studies of Ballistic Stellar Interlopers Burrowing through Dense Interstellar Clouds
NASA Astrophysics Data System (ADS)
Rosen, Anna; Sahai, R.; Claussen, M.; Morris, M.
2010-01-01
When an intermediate-mass star speeds through a dense interstellar cloud at a high velocity, it can produce a cometary or bow shock structure due to the cloud being impacted by the intense stellar wind. This class of objects, recently discovered in an HST imaging survey, has been dubbed "ballistic stellar interlopers" (Sahai et al. 2009). Using the ARO's 12m and SMT 10m millimeter-wave dishes, we have obtained molecular line emission data towards 10 stellar interloper sources, in order to identify and characterize the dense clouds with which the interlopers are interacting. We have made small "on-the-fly" maps in the 12CO (J=2-1) and 13CO (J=2-1) lines for each cloud, and obtained spectra of high-density tracers such as N2H+ (J=3-2), HCO+ (J=3-2), CN(N=2-1), and SO(J=5-4), which probe a range of physical conditions in the interstellar clouds being impacted by the interlopers. The data have been reduced and analyzed, and preliminary estimates of the cloud temperatures (9-22 K) and 13CO optical depths (0.18-0.37) have been made. The maps, which show the emission as a function of radial velocity and spatial offset from the location of the interlopers, have helped us distinguish between the clouds interacting with the interlopers, and those which are unrelated but happen to lie along the line of sight. These data will now enable us to carry out high-resolution mm-wave interferometric observations of the interlopers in the future. This research was performed at JPL under the Minority Education Initiatives program. RS and MM were funded by a Long Term Space Astrophysics award from NASA for this work. The National Radio Astronomy Observatory is a facility of the National Science Foundation operated under cooperative agreement by Associated Universities, Inc. Special thanks goes to John Bieging and Bill Peters of the Arizona Radio Observatory.
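The quoted 13CO optical depths (0.18-0.37) are consistent with the standard line-ratio estimate: if the 12CO line is optically thick, the 13CO/12CO brightness-temperature ratio gives tau(13CO) = -ln(1 - T13/T12). The sketch below uses this textbook relation with illustrative temperatures, not the survey's actual measurements:

```python
import math

def tau_13co(t13, t12):
    """13CO optical depth from the 13CO/12CO brightness ratio,
    assuming the 12CO line is optically thick (standard approximation)."""
    return -math.log(1.0 - t13 / t12)

# illustrative brightness temperatures: ratio 0.25 gives tau = -ln(0.75) ~ 0.29,
# squarely within the 0.18-0.37 range reported above
tau = tau_13co(t13=2.0, t12=8.0)
```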
Evolving the Land Information System into a Cloud Computing Service
DOE Office of Scientific and Technical Information (OSTI.GOV)
Houser, Paul R.
The Land Information System (LIS) was developed to use advanced flexible land surface modeling and data assimilation frameworks to integrate extremely large satellite- and ground-based observations with advanced land surface models to produce continuous high-resolution fields of land surface states and fluxes. The resulting fields are extremely useful for drought and flood assessment, agricultural planning, disaster management, weather and climate forecasting, water resources assessment, and the like. We envisioned transforming the LIS modeling system into a scientific cloud computing-aware web and data service that clients could easily set up and configure for use in addressing large water management issues. The focus of this Phase 1 project was to determine the scientific, technical, and commercial merit and feasibility of the proposed LIS-cloud innovations that address current barriers to broad LIS applicability. We (a) quantified the barriers to broad LIS utility and commercialization (high-performance computing, big data, user interface, and licensing issues); (b) designed the proposed LIS-cloud web service, model-data interface, database services, and user interfaces; (c) constructed a prototype LIS user interface including abstractions for simulation control, visualization, and data interaction; (d) used the prototype to conduct a market analysis and survey to determine potential market size and competition; (e) identified LIS software licensing and copyright limitations and developed solutions; and (f) developed a business plan for development and marketing of the LIS-cloud innovation. While some significant feasibility issues were found in the LIS licensing, overall a high degree of LIS-cloud technical feasibility was found.
Airborne Polarimeter Intercomparison for the NASA Aerosols-Clouds-Ecosystems (ACE) Mission
NASA Technical Reports Server (NTRS)
Knobelspiesse, Kirk; Redemann, Jens
2014-01-01
The Aerosols-Clouds-Ecosystems (ACE) mission, recommended by the National Research Council's Decadal Survey, calls for a multi-angle, multi-spectral polarimeter devoted to observations of atmospheric aerosols and clouds. In preparation for ACE, NASA funds the deployment of airborne polarimeters, including the Airborne Multi-angle SpectroPolarimeter Imager (AirMSPI), the Passive Aerosol and Cloud Suite (PACS) and the Research Scanning Polarimeter (RSP). These instruments have been operated together on NASA's ER-2 high altitude aircraft as part of field campaigns such as the POlarimeter DEfinition EXperiment (PODEX) (California, early 2013) and Studies of Emissions and Atmospheric Composition, Clouds and Climate Coupling by Regional Surveys (SEAC4RS, California and Texas, summer 2013). Our role in these efforts has been to serve as an assessment team performing level 1 (calibrated radiance, polarization) and level 2 (retrieved geophysical parameter) instrument intercomparisons, and to promote unified and generalized calibration, uncertainty assessment and retrieval techniques. We will present our progress in this endeavor thus far and describe upcoming research in 2015.
Lidar Penetration Depth Observations for Constraining Cloud Longwave Feedbacks
NASA Astrophysics Data System (ADS)
Vaillant de Guelis, T.; Chepfer, H.; Noel, V.; Guzman, R.; Winker, D. M.; Kay, J. E.; Bonazzola, M.
2017-12-01
The satellite-borne active remote sensors Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations [CALIPSO; Winker et al., 2010] and CloudSat [Stephens et al., 2002] provide direct measurements of the cloud vertical distribution with very high vertical resolution. The penetration depth of the lidar's laser, Z_Opaque, is directly linked to the longwave (LW) Cloud Radiative Effect (CRE) at the top of atmosphere (TOA) [Vaillant de Guélis et al., in review]. In addition, this measurement is extremely stable in time, making it an excellent observational candidate for verifying and constraining the cloud LW feedback mechanism [Chepfer et al., 2014]. In this work, we present a method to decompose the variations of the LW CRE at TOA using cloud properties observed by lidar [GOCCP v3.0; Guzman et al., 2017]. We decompose these variations into contributions due to changes in five cloud properties: opaque cloud cover, opaque cloud altitude, thin cloud cover, thin cloud altitude, and thin cloud emissivity [Vaillant de Guélis et al., in review]. We apply this method to the observed CRE variations of the CALIPSO 2008-2015 record and, in climate models, to LMDZ6 and CESM simulations of the CRE variations over the 2008-2015 period and of the CRE difference between a warm climate and the current climate. In the climate model simulations, the same cloud properties as those observed by CALIOP are extracted from the lidar simulator [Chepfer et al., 2008] of the CFMIP Observation Simulator Package (COSP) [Bodas-Salcedo et al., 2011], which mimics the observations that would be performed by the lidar on board the CALIPSO satellite. Applied to multi-model simulations of the current and future climate, this method could establish the lidar-observed altitude of the cloud opacity level as a strong constraint on the cloud LW feedback, since the altitude feedback mechanism is physically explainable and the altitude of cloud opacity is accurately observed by lidar.
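The five-term decomposition above amounts to a first-order expansion: the total LW CRE change is approximated as the sum, over the five cloud properties, of a sensitivity (partial derivative) times that property's change. The sketch below shows the bookkeeping only; all sensitivities and deltas are invented numbers, not values from the study:

```python
def delta_cre_lw(sensitivities, deltas):
    """First-order decomposition: sum of (dCRE/dx_i) * (dx_i) over cloud properties."""
    assert sensitivities.keys() == deltas.keys()
    return sum(sensitivities[k] * deltas[k] for k in sensitivities)

props = ["opaque_cover", "opaque_altitude", "thin_cover",
         "thin_altitude", "thin_emissivity"]
sens  = dict(zip(props, [0.8, 2.0, 0.3, 0.9, 25.0]))     # W m-2 per unit change
delta = dict(zip(props, [0.01, 0.5, -0.02, 0.1, 0.004])) # property changes

d_cre = delta_cre_lw(sens, delta)
# 0.8*0.01 + 2.0*0.5 + 0.3*(-0.02) + 0.9*0.1 + 25.0*0.004 = 1.192 W m-2
```

Comparing the relative sizes of the five terms is what attributes a CRE change to, say, opaque-cloud altitude rather than cover.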
A Framework and Improvements of the Korea Cloud Services Certification System.
Jeon, Hangoo; Seo, Kwang-Kyu
2015-01-01
Cloud computing service is an evolving paradigm that affects a large part of the ICT industry and provides new opportunities for ICT service providers, such as the deployment of new business models and the realization of economies of scale through increased efficiency of resource utilization. However, despite the benefits of cloud services, there are obstacles to adoption, such as the lack of means for assessing and comparing the service quality of cloud services with regard to availability, security, and reliability. To support the successful adoption and uptake of cloud services, it is necessary to establish a cloud service certification system that ensures their service quality and performance. This paper proposes a framework and improvements for the Korea cloud service certification system. To develop it, the critical issues related to service quality, performance, and certification of cloud services are identified, and a systematic framework for the certification system of cloud services and service provider domains is developed. Improvements to the developed Korea certification system of cloud services are also proposed. PMID:26125049
NASA Astrophysics Data System (ADS)
Petschko, Helene; Goetz, Jason; Schmidt, Sven
2017-04-01
Sinkholes are a serious threat to life, personal property, and infrastructure in large parts of Thuringia. Over 9000 sinkholes have been documented by the Geological Survey of Thuringia; they are caused by collapsing hollows that formed through solution processes within the local bedrock. However, little is known about surface processes and their dynamics at the flanks of a sinkhole once it has formed. These processes are of high interest, as they might lead to dangerous situations at or within the vicinity of the sinkhole. Our objective was the analysis of these deformations over time in 3D by applying terrestrial photogrammetry with a simple DSLR camera. Within this study, we performed an analysis of deformations within a sinkhole close to Bad Frankenhausen (Thuringia) using terrestrial photogrammetry and multi-view stereo 3D reconstruction to obtain a 3D point cloud describing the morphology of the sinkhole. This was performed for multiple data collection campaigns over a 6-month period. The photos of the sinkhole were taken with a Nikon D3000 SLR camera. For the comparison of the point clouds, the Multiscale Model to Model Comparison (M3C2) plugin of the software CloudCompare was used. It allows applying advanced methods of point cloud difference calculation that consider the co-registration error between two point clouds when assessing the significance of the calculated difference (given in meters). Three Styrofoam cuboids of known dimensions (16 cm wide / 29 cm high / 11.5 cm deep) were placed within the sinkhole to test the accuracy of the point cloud difference calculation. The multi-view stereo 3D reconstruction was performed with Agisoft Photoscan. Preliminary analysis indicates that about 26% of the sinkhole showed changes exceeding the co-registration error of the point clouds. The areas of change can mainly be detected on the flanks of the sinkhole and on an earth pillar that formed in its center.
These changes describe toppling (positive change of a few centimeters at the earth pillar) and a few erosion processes along the flanks (negative change of a few centimeters) compared to the first date of data acquisition. Additionally, the Styrofoam cuboids have successfully been detected with an observed depth change of 10 cm. However, the limitations of this approach related to the co-registration of the point clouds and data acquisition (windy conditions) have to be analyzed in more detail.
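The significance test described above can be sketched in simplified form: compute per-point displacements between two epochs and flag only those exceeding the co-registration error. This correspondence-based stand-in illustrates the idea but is not the actual M3C2 algorithm, which projects differences along locally estimated surface normals; the synthetic 10 cm offset mimics the Styrofoam-block check:

```python
import numpy as np

def point_differences(cloud1, cloud2, reg_error):
    """Per-point displacement for corresponding points, flagged as significant
    only where it exceeds the co-registration error (cf. the M3C2 test)."""
    d = np.linalg.norm(cloud2 - cloud1, axis=1)
    return d, d > reg_error

rng = np.random.default_rng(2)
cloud1 = rng.random((100, 3))                 # epoch-1 point cloud
cloud2 = cloud1.copy()                        # epoch-2: identical except...
cloud2[:10] += np.array([0.0, 0.0, 0.10])     # ...a 10 cm offset on 10 points

dist, flags = point_differences(cloud1, cloud2, reg_error=0.02)
```

Only the 10 displaced points exceed the 2 cm threshold; everything else is within the co-registration error and reported as no significant change.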
Cloud layer thicknesses from a combination of surface and upper-air observations
NASA Technical Reports Server (NTRS)
Poore, Kirk D.; Wang, Junhong; Rossow, William B.
1995-01-01
Cloud layer thicknesses are derived from base and top altitudes by combining 14 years (1975-1988) of surface and upper-air observations at 63 sites in the Northern Hemisphere. Rawinsonde observations are employed to determine the locations of cloud-layer top and base by testing for dewpoint temperature depressions below some threshold value. Surface observations serve as quality checks on the rawinsonde-determined cloud properties and provide cloud amount and cloud-type information. The dataset provides layer-cloud amount, cloud type, high, middle, or low height classes, cloud-top heights, base heights and layer thicknesses, covering a range of latitudes from 0 deg to 80 deg N. All data comes from land sites: 34 are located in continental interiors, 14 are near coasts, and 15 are on islands. The uncertainties in the derived cloud properties are discussed. For clouds classified by low-, mid-, and high-top altitudes, there are strong latitudinal and seasonal variations in the layer thickness only for high clouds. High-cloud layer thickness increases with latitude and exhibits different seasonal variations in different latitude zones: in summer, high-cloud layer thickness is a maximum in the Tropics but a minimum at high latitudes. For clouds classified into three types by base altitude or into six standard morphological types, latitudinal and seasonal variations in layer thickness are very small. The thickness of the clear surface layer decreases with latitude and reaches a summer minimum in the Tropics and summer maximum at higher latitudes over land, but does not vary much over the ocean. Tropical clouds occur in three base-altitude groups and the layer thickness of each group increases linearly with top altitude. Extratropical clouds exhibit two groups, one with layer thickness proportional to their cloud-top altitude and one with small (less than or equal to 1000 m) layer thickness independent of cloud-top altitude.
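The rawinsonde test described above reduces to a threshold on the dewpoint depression: levels where T - Td falls below the threshold are treated as in-cloud, and contiguous runs of such levels give layer base and top. The sketch below uses an illustrative 2 K threshold and an invented sounding, not the study's actual criteria or data:

```python
def cloud_layers(altitude_m, temp_c, dewpoint_c, threshold=2.0):
    """Return (base, top) altitude pairs where dewpoint depression < threshold."""
    in_cloud = [(t - td) < threshold for t, td in zip(temp_c, dewpoint_c)]
    layers, base = [], None
    prev_z = altitude_m[0]
    for z, flag in zip(altitude_m, in_cloud):
        if flag and base is None:
            base = z                      # entering a moist (cloudy) layer
        elif not flag and base is not None:
            layers.append((base, prev_z)) # left the layer at the previous level
            base = None
        prev_z = z
    if base is not None:                  # layer extends to the sounding top
        layers.append((base, altitude_m[-1]))
    return layers

z  = [0, 500, 1000, 1500, 2000, 2500, 3000]        # metres
t  = [15, 12, 9, 6, 3, 0, -3]                      # temperature, deg C
td = [5, 4, 8.5, 5.5, -10, -12, -14]               # dewpoint: moist at 1000-1500 m
layers = cloud_layers(z, t, td)                    # one layer, 500 m thick
```

Layer thickness is then just top minus base for each detected pair.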
Short-term solar irradiance forecasting via satellite/model coupling
Miller, Steven D.; Rogers, Matthew A.; Haynes, John M.; ...
2017-12-01
The short-term (0-3 h) prediction of solar insolation for renewable energy production is a problem well-suited to satellite-based techniques. The spatial, spectral, temporal and radiometric resolution of instrumentation hosted on the geostationary platform allows these satellites to describe the current cloud spatial distribution and optical properties. These properties relate directly to the transient properties of the downwelling solar irradiance at the surface, which come in the form of 'ramps' that pose a central challenge to energy load balancing in a spatially distributed network of solar farms. The short-term evolution of the cloud field may be approximated to first order simplymore » as translational, but care must be taken in how the advection is handled and where the impacts are assigned. In this research, we describe how geostationary satellite observations are used with operational cloud masking and retrieval algorithms, wind field data from Numerical Weather Prediction (NWP), and radiative transfer calculations to produce short-term forecasts of solar insolation for applications in solar power generation. The scheme utilizes retrieved cloud properties to group pixels into contiguous cloud objects whose future positions are predicted using four-dimensional (space + time) model wind fields, selecting steering levels corresponding to the cloud height properties of each cloud group. The shadows associated with these clouds are adjusted for sensor viewing parallax displacement and combined with solar geometry and terrain height to determine the actual location of cloud shadows. For mid/high-level clouds at mid-latitudes and high solar zenith angles, the combined displacements from these geometric considerations are non-negligible. The cloud information is used to initialize a radiative transfer model that computes the direct and diffuse-sky solar insolation at both shadow locations and intervening clear-sky regions. 
Here, we describe the formulation of the algorithm and validate its performance against Surface Radiation (SURFRAD; Augustine et al., 2000, 2005) network observations. Typical errors range from 8.5% to 17.2% depending on the complexity of cloud regimes, and an operational demonstration outperformed persistence-based forecasting of Global Horizontal Irradiance (GHI) under all conditions by ~10 W/m2.
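The parallax and shadow-displacement geometry described above can be sketched numerically. This is an illustrative simplification over flat terrain; the azimuth convention (degrees clockwise from north) and the function name are our assumptions, and the operational scheme additionally folds in terrain height:

```python
import numpy as np

def apparent_to_shadow_offset(cloud_height_m, view_zenith_deg, view_azimuth_deg,
                              sun_zenith_deg, sun_azimuth_deg):
    """Offset (east, north, in metres) from a cloud's apparent image position
    to its surface shadow, over flat terrain (an assumption of this sketch).

    Parallax: the sensor displaces a cloud of height h by h*tan(view zenith)
    along the view azimuth, so we first shift back to the true position; the
    shadow then falls h*tan(solar zenith) down-sun from that true position.
    """
    h = float(cloud_height_m)
    par = h * np.tan(np.radians(view_zenith_deg))   # undo apparent parallax shift
    dx = -par * np.sin(np.radians(view_azimuth_deg))
    dy = -par * np.cos(np.radians(view_azimuth_deg))
    sh = h * np.tan(np.radians(sun_zenith_deg))     # project shadow down-sun
    dx -= sh * np.sin(np.radians(sun_azimuth_deg))
    dy -= sh * np.cos(np.radians(sun_azimuth_deg))
    return dx, dy

# A 10 km cloud top viewed at 45 deg zenith with the sun at 60 deg zenith:
# the combined displacement is tens of kilometres, hence "non-negligible".
dx, dy = apparent_to_shadow_offset(10_000, 45.0, 90.0, 60.0, 180.0)
```

For high clouds at high solar zenith angles the two terms add to tens of kilometres, which is why skipping either correction misplaces the shadow by many pixels.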
Short-term solar irradiance forecasting via satellite/model coupling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Steven D.; Rogers, Matthew A.; Haynes, John M.
NASA Technical Reports Server (NTRS)
Leroux, C.; Bertin, F.; Mounir, H.
1991-01-01
Theoretical studies and experimental results obtained at Coulommiers airport showed the capability of the Proust radar to detect wind shear, in clear-air conditions as well as in the presence of clouds or rain. Several examples are presented: in a blocking-high situation, an atmospheric wave system at the Brunt-Vaisala frequency can be clearly distinguished; in a situation of clouds without rain, the boundary between clear air and clouds is easily seen; and a wind shear associated with a gust front in rainy conditions is shown. A comparison of the 30 cm clear-air Proust radar and the 5 cm Ronsard Doppler weather radar will allow selection of the best candidate for wind shear detection, taking into account the low sensitivity of the Ronsard radar to ground clutter.
A new airborne sampler for interstitial particles in ice and liquid clouds
NASA Astrophysics Data System (ADS)
Moharreri, A.; Craig, L.; Rogers, D. C.; Brown, M.; Dhaniyala, S.
2011-12-01
In-situ measurements of cloud droplets and aerosols from aircraft platforms are required for understanding aerosol-cloud processes and aiding development of improved aerosol-cloud models. A variety of clouds with different temperature ranges and cloud particle sizes/phases must be studied for comprehensive knowledge about the role of aerosols in the formation and evolution of cloud systems under different atmospheric conditions. While representative aerosol measurements are regularly made from aircraft under clear-air conditions, aerosol measurements in clouds are often contaminated by the generation of secondary particles from the high-speed impaction of ice particles and liquid droplets on the surfaces of the aircraft probes/inlets. A new interstitial particle sampler, called the blunt-body aerosol sampler (BASE), has been designed and used for aerosol sampling during two recent airborne campaigns on the NCAR/NSF C-130 aircraft: PLOWS (2009-2010) and ICE-T (2011). Central to the design of the new interstitial inlet is an upstream blunt-body housing that acts to shield/deflect large cloud droplets and ice particles from an aft sampling region. The blunt-body design also ensures that small shatter particles created by the impaction of cloud droplets on the blunt body are not present in the aft region where the interstitial inlet is located. Computational fluid dynamics (CFD) simulations, along with particle transport modeling and wind tunnel studies, were utilized at different stages of the design and development of this inlet. The initial flight tests during the PLOWS campaign showed that the inlet performed satisfactorily only in warm clouds and when large precipitation droplets were absent. In the presence of large droplets and ice, the inlet samples were contaminated with significant shatter artifacts.
These initial results were reanalyzed in conjunction with a computational droplet shatter model and the numerical results were used to arrive at an improved sampler design. Analysis of the data from the recent ICE-T campaign with the improved sampler design shows that the modified version of BASE can provide shatter-artifact free sampling of aerosol particles in the presence of ice particles and significantly reduced shatter artifacts in warm clouds. Detailed design and modeling aspects of the sampler will be discussed and the sampler performance in warm and cold clouds will be presented and compared with measurements made using other aerosol inlets flown on the NCAR/NSF C-130 aircraft.
Airborne LIDAR point cloud tower inclination judgment
NASA Astrophysics Data System (ADS)
liang, Chen; zhengjun, Liu; jianguo, Qian
2016-11-01
Inclined transmission-line towers pose a serious threat to the safe operation of a power line, so effective, fast, and accurate judgment of tower inclination plays a key role in a power supply company's safety and security of supply. In recent years, with the development of unmanned aerial vehicles, UAVs equipped with a laser scanner, GPS, and inertial navigation have become increasingly common in the electricity sector as high-precision 3D remote sensing systems. Point clouds from airborne laser scanning can visually render the full three-dimensional spatial information of a power-line corridor, including line facilities and equipment, terrain, and trees. To date, LIDAR point cloud research has not produced an established algorithm for determining tower inclination. In this paper, tower bases are first extracted from existing power-line-corridor data; then, based on an analysis of each tower's shape characteristics, two different methods, vertical stratification and a convex-hull algorithm, are applied to judge inclination for dense and sparse tower point clouds, respectively. The results show high reliability.
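The vertical-stratification idea for dense tower point clouds can be sketched as follows. The layer count, the SVD line fit through layer centroids, and the synthetic example are our illustrative choices, not the paper's exact procedure (which pairs stratification with a convex-hull variant for sparse clouds):

```python
import numpy as np

def tower_inclination_deg(points):
    """Estimate a tower's lean angle from its LiDAR point cloud.

    Slice the cloud into horizontal layers, take each layer's centroid,
    fit a 3-D line through the centroids (principal axis via SVD), and
    measure that line's angle from the vertical.
    """
    points = np.asarray(points, dtype=float)
    z = points[:, 2]
    edges = np.linspace(z.min(), z.max(), 11)         # 10 height layers
    cents = []
    for lo, hi in zip(edges[:-1], edges[1:]):
        layer = points[(z >= lo) & (z <= hi)]
        if len(layer):
            cents.append(layer.mean(axis=0))
    cents = np.array(cents)
    _, _, vt = np.linalg.svd(cents - cents.mean(axis=0))
    axis = vt[0]                                      # unit principal direction
    return float(np.degrees(np.arccos(abs(axis[2]))))

# Synthetic tower 30 m tall, leaning 5 degrees toward +x, with scan noise.
rng = np.random.default_rng(0)
zs = np.linspace(0.0, 30.0, 300)
pts = np.c_[zs * np.tan(np.radians(5.0)) + 0.01 * rng.standard_normal(300),
            np.zeros(300), zs]
```

Averaging each layer before the line fit suppresses the lateral spread of cross-arms and lattice members, which is the point of stratifying rather than fitting all raw points.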
Computational biology in the cloud: methods and new insights from computing at scale.
Kasson, Peter M
2013-01-01
The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, and experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and provide easy reproducibility by making the datasets and computational methods easily available.
A 94 GHz RF Electronics Subsystem for the CloudSat Cloud Profiling Radar
NASA Technical Reports Server (NTRS)
LaBelle, Remi C.; Girard, Ralph; Arbery, Graham
2003-01-01
The CloudSat spacecraft, scheduled for launch in 2004, will carry the 94 GHz Cloud Profiling Radar (CPR) instrument. The design, assembly, and test of the flight Radio Frequency Electronics Subsystem (RFES) for this instrument have been completed and are presented here. The RFES consists of an Upconverter (which includes an Exciter and two Drive Amplifiers (DAs)), a Receiver, and a Transmitter Calibrator assembly. Some key performance parameters of the RFES are as follows: dual 100 mW pulse-modulated drive outputs at 94 GHz, overall Receiver noise figure < 5.0 dB, a highly stable W-band noise source to provide knowledge accuracy of Receiver gain of < 0.4 dB over the 2-year mission life, and a W-band peak power detector to monitor the transmitter output power to within 0.5 dB over life. Recent monolithic microwave integrated circuit (MMIC) designs were utilized, implementing the DAs in 0.1 micron GaAs high electron-mobility transistor (HEMT) technology and the Receiver low-noise amplifier (LNA) in 0.1 micron InP HEMT technology.
NASA Astrophysics Data System (ADS)
Ramachandran, R.; Murphy, K. J.; Baynes, K.; Lynnes, C.
2016-12-01
With the volume of Earth observation data expanding rapidly, cloud computing is quickly changing the way Earth observation data is processed, analyzed, and visualized. The cloud infrastructure provides the flexibility to scale up to large volumes of data and handle high-velocity data streams efficiently. Having freely available Earth observation data collocated on a cloud infrastructure creates opportunities for innovation and value-added data re-use in ways unforeseen by the original data provider. These innovations spur new industries and applications and spawn new scientific pathways that were previously limited by data volume and computational infrastructure. NASA, in collaboration with Amazon, Google, and Microsoft, has developed a set of recommendations to enable efficient transfer of Earth observation data from existing data systems to a cloud computing infrastructure. These recommendations provide guidelines against which data providers can evaluate existing data systems and address any issues uncovered, enabling efficient search, access, and use of large volumes of data. Additionally, the guidelines ensure that all cloud providers use a common methodology for bulk-downloading data from data providers, sparing the data providers from building custom capabilities to meet the needs of individual cloud providers. The intent is to share these recommendations with other Federal agencies and organizations that serve Earth observation data. Adoption of these recommendations will also benefit data users interested in moving large volumes of data from data systems to any other location, including the cloud providers themselves, cloud users such as scientists, and other users working in high-performance computing environments.
A cloud-resolving model study of aerosol-cloud correlation in a pristine maritime environment
NASA Astrophysics Data System (ADS)
Nishant, Nidhi; Sherwood, Steven C.
2017-06-01
In convective clouds, satellite-observed deepening or increased amount of clouds with increasing aerosol concentration has been reported and is sometimes interpreted as aerosol-induced invigoration of the clouds. However, such correlations can be affected by meteorological factors that affect both aerosol and clouds, as well as observational issues. In this study, we examine the behavior in a 660 × 660 km2 region of the South Pacific during June 2007, previously found by Koren et al. (2014) to show strong correlation between cloud fraction, cloud top pressure, and aerosols, using a cloud-resolving model with meteorological boundary conditions specified from a reanalysis. The model assumes constant aerosol loading, yet reproduces vigorous clouds at times of high real-world aerosol concentrations. Days with high- and low-aerosol loading exhibit deep-convective and shallow clouds, respectively, in both observations and the simulation. Synoptic analysis shows that vigorous clouds occur at times of strong surface troughs, which are associated with high winds and advection of boundary layer air from the Southern Ocean where sea-salt aerosol is abundant, thus accounting for the high correlation. Our model results show that aerosol-cloud relationships can be explained by coexisting but independent wind-aerosol and wind-cloud relationships and that no cloud condensation nuclei effect is required.
Cloud archiving and data mining of High-Resolution Rapid Refresh forecast model output
NASA Astrophysics Data System (ADS)
Blaylock, Brian K.; Horel, John D.; Liston, Samuel T.
2017-12-01
Weather-related research often requires synthesizing vast amounts of data that need archival solutions that are both economical and viable during and past the lifetime of the project. Public cloud computing services (e.g., from Amazon, Microsoft, or Google) or private clouds managed by research institutions are providing object data storage systems potentially appropriate for long-term archives of such large geophysical data sets. We illustrate the use of a private cloud object store developed by the Center for High Performance Computing (CHPC) at the University of Utah. Since early 2015, we have been archiving thousands of two-dimensional gridded fields (each one containing over 1.9 million values over the contiguous United States) from the High-Resolution Rapid Refresh (HRRR) data assimilation and forecast modeling system. The archive is being used for retrospective analyses of meteorological conditions during high-impact weather events, assessing the accuracy of the HRRR forecasts, and providing initial and boundary conditions for research simulations. The archive is accessible interactively and through automated download procedures for researchers at other institutions that can be tailored by the user to extract individual two-dimensional grids from within the highly compressed files. Characteristics of the CHPC object storage system are summarized relative to network file system storage or tape storage solutions. The CHPC storage system is proving to be a scalable, reliable, extensible, affordable, and usable archive solution for our research.
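Pulling an individual two-dimensional grid out of a remote archive of compressed GRIB2 files is commonly done with HTTP byte-range requests against a wgrib2-style `.idx` sidecar file that lists each record's starting byte. The sketch below illustrates that general pattern; the index format and URL layout are our assumptions following the wgrib2 convention, not the CHPC archive's documented interface:

```python
from urllib.request import Request, urlopen

def parse_idx(idx_text, field_substring):
    """Return (start, end) byte offsets of the first GRIB2 record whose
    index line contains field_substring; end is None for the last record.
    Assumes wgrib2-style lines: 'recnum:start_byte:date:VAR:level:...'."""
    lines = [ln for ln in idx_text.splitlines() if ln]
    starts = [int(ln.split(":")[1]) for ln in lines]
    for i, ln in enumerate(lines):
        if field_substring in ln:
            end = starts[i + 1] - 1 if i + 1 < len(lines) else None
            return starts[i], end
    raise KeyError(field_substring)

def fetch_record(grib_url, start, end):
    """Download just one record with an HTTP Range request."""
    rng = f"bytes={start}-" if end is None else f"bytes={start}-{end}"
    with urlopen(Request(grib_url, headers={"Range": rng}), timeout=60) as r:
        return r.read()

# Hypothetical usage: fetch the .idx text for a GRIB2 URL, locate the 2-m
# temperature record, then range-request only those bytes:
#   start, end = parse_idx(idx_text, "TMP:2 m")
#   data = fetch_record(grib_url, start, end)
```

Because each record is independently compressed, the byte range decodes on its own, so a user extracts one field of ~1.9 million values without downloading the whole file.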
NASA Astrophysics Data System (ADS)
Behrangi, A.; Kubar, T. L.; Lambrigtsen, B.
2011-12-01
Different cloud types have substantially different characteristics in terms of radiative forcing and microphysical properties, both important components of Earth's climate system. Relationships between tropical cloud type characteristics and sea surface temperature (SST), using two years of A-train data, are investigated in this presentation. Stratocumulus clouds are the dominant cloud type over SSTs less than 301 K, and their fraction is in fact strongly inversely related to SST. This is physically logical, as both static stability and large-scale subsidence scale well with decreasing SST. At SSTs greater than 301 K, high clouds are the most abundant cloud type. All cloud types (except nimbostratus and stratocumulus) become sharply more abundant for SSTs greater than a window between 299 K and 300.5 K, depending on cloud type. The fraction of high, deep convective, altostratus, and altocumulus clouds peaks at an SST close to 303 K, while cumulus clouds have a broad cloud fraction peak centered near 301 K. Deep convective and other high cloud types decrease sharply above SSTs of 303 K. While overall early-morning clouds are 10% (4%) more frequent than afternoon clouds as indicated by CloudSat (lidar-radar), certain cloud types, such as high clouds, occur more frequently in the early afternoon. We also show that a large amount of warm precipitation, mainly from stratocumulus clouds, is missed or significantly underestimated by the current suite of satellite-based global precipitation measuring sensors. However, the operational sensitivity of the CloudSat cloud profiling radar permits capture of a significant fraction of light drizzle and warm rain.
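The compositing behind these SST relationships amounts to binning pixels by collocated SST and computing per-bin cloud-type frequencies. A minimal sketch, with illustrative inputs and bin edges of our own choosing:

```python
import numpy as np

def cloud_fraction_by_sst(sst, is_cloud_type, bin_edges):
    """Per-SST-bin frequency of one cloud type.

    Toy compositing: digitize pixels into SST bins and average a boolean
    cloud-type flag within each bin; bins with no pixels return NaN.
    """
    sst = np.asarray(sst, float)
    flag = np.asarray(is_cloud_type, bool)
    which = np.digitize(sst, bin_edges)            # bin index per pixel
    frac = np.full(len(bin_edges) - 1, np.nan)
    for b in range(1, len(bin_edges)):
        sel = which == b
        if sel.any():
            frac[b - 1] = flag[sel].mean()
    return frac

# Illustrative inputs: three pixels across two 1 K bins spanning 295-297 K.
f = cloud_fraction_by_sst([295.5, 295.6, 296.5], [True, False, True],
                          [295.0, 296.0, 297.0])
```

Repeating this for each cloud-type flag yields the fraction-versus-SST curves whose peaks and cutoffs are reported above.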
1979-07-06
Range: 3.2 million km. This image returned by Voyager 2 shows one of the long dark clouds observed in the North Equatorial Belt of Jupiter. A high, white cloud is seen moving over the darker cloud, providing an indication of the structure of the cloud layers. Thin white clouds are also seen within the dark cloud. At right, blue areas, free of high clouds, are seen.
Mapping with Small UAS: A Point Cloud Accuracy Assessment
NASA Astrophysics Data System (ADS)
Toth, Charles; Jozkow, Grzegorz; Grejner-Brzezinska, Dorota
2015-12-01
Interest in using inexpensive Unmanned Aerial System (UAS) technology for topographic mapping has recently increased significantly. Small UAS platforms equipped with consumer-grade cameras can easily acquire high-resolution aerial imagery, allowing for dense point cloud generation, followed by surface model creation and orthophoto production. In contrast to conventional airborne mapping systems, UAS has limited ground coverage due to low flying height and limited flying time, yet it offers an attractive alternative to high-performance airborne systems, as the costs of the sensors, platform, and flight logistics are relatively low. In addition, UAS is better suited for small-area data acquisitions and for acquiring data in difficult-to-access areas, such as urban canyons or densely built-up environments. The main question with respect to the use of UAS is whether the inexpensive consumer sensors installed on UAS platforms can provide geospatial data quality comparable to that provided by conventional systems. This study evaluates the current practice of UAS-based topographic mapping by reviewing the practical aspects of sensor configuration, georeferencing, and point cloud generation, including comparisons between sensor types and processing tools. The main objective is to provide accuracy characterization and practical information for selecting and using UAS solutions in general mapping applications. The analysis is based on statistical evaluation as well as visual examination of experimental data acquired by a Bergen octocopter with three different image sensor configurations: a GoPro HERO3+ Black Edition, a Nikon D800 DSLR, and a Velodyne HDL-32. In addition, georeferencing data of varying quality were acquired and evaluated. The optical imagery was processed using three commercial point cloud generation tools.
Comparing point clouds created by active and passive sensors by using different quality sensors, and finally, by different commercial software tools, provides essential information for the performance validation of UAS technology.
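A simple instance of the statistical evaluation described above is a nearest-neighbour comparison of a generated point cloud against reference data. This brute-force sketch is our illustration only; real assessments typically separate horizontal and vertical error and check against higher-accuracy survey points:

```python
import numpy as np

def point_cloud_rmse(test_pts, reference_pts):
    """Nearest-neighbour RMSE of a test point cloud against a reference.
    Brute-force O(N*M) distance search, fine for small illustrative clouds."""
    t = np.asarray(test_pts, float)[:, None, :]
    r = np.asarray(reference_pts, float)[None, :, :]
    nn = np.sqrt(((t - r) ** 2).sum(-1)).min(axis=1)  # distance to closest ref point
    return float(np.sqrt((nn ** 2).mean()))

# A cloud biased 0.1 m upward relative to the reference scores RMSE = 0.1 m.
ref = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]], float)
rmse = point_cloud_rmse(ref + np.array([0.0, 0.0, 0.1]), ref)
```

For clouds of realistic size one would swap the brute-force search for a k-d tree, but the accuracy statistic itself is unchanged.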
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-04-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still form the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. While building upon the physical principles that served well in the original APOLLO, APOLLO_NG introduces several additional variables. Cloud detection is not performed as a binary yes/no decision based on these physical principles but is expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned from clear-confident to cloud-confident depending on the purpose. The probabilistic approach allows retrieval not only of the cloud properties (optical depth, effective radius, cloud top temperature and cloud water path) but also of their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-realtime use and for application to large amounts of historical satellite data. Thus, the radiative transfer solution is approximated by the same two-stream approach used in the original APOLLO. This keeps the algorithm robust enough to be applied to a wide range of sensors without sensor-specific tuning. Moreover, it allows online calculation of the radiative transfer (i.e. within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy on which it is based. Furthermore, example results from NOAA-18 are presented.
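The shift from binary tests to a tunable per-pixel cloud probability can be illustrated with a toy combination rule. The plain averaging of independent evidence scores below is our simplification, not APOLLO_NG's actual probability model:

```python
import numpy as np

def cloud_probability(scores):
    """Per-pixel cloud probability from several [0, 1] evidence scores
    (e.g. from reflectance and brightness-temperature tests). Averaging
    independent scores is an illustrative stand-in for the real scheme."""
    s = np.clip(np.asarray(scores, float), 0.0, 1.0)
    return s.mean(axis=0)

def classify(prob, threshold):
    """Tunable decision: a low threshold gives a cloud-conservative mask,
    a high threshold a clear-conservative one."""
    return prob >= threshold

# Two tests over two pixels: the first pixel looks cloudy, the second clear.
p = cloud_probability([[0.9, 0.1], [0.7, 0.3]])
```

The key property carried over from the abstract is that the same probability field serves both purposes: lowering the threshold favors cloud detection, raising it favors confidently clear pixels.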
NASA Astrophysics Data System (ADS)
Sinclair, K.; van Diedenhoven, B.; Cairns, B.; Alexandrov, M. D.; Ziemba, L. D.; Moore, R.; Crosbie, E.; Hostetler, C. A.
2016-12-01
Cloud droplet number concentration (CDNC) is a key parameter of liquid clouds and is essential for the understanding of aerosol-cloud interaction. It couples surface aerosol composition and chemistry on the one hand and cloud reflectivity on the other. It impacts radiative forcing, cloud evolution, precipitation, and global climate and, through observation, can be used to monitor the cloud albedo effect, or first indirect effect. The North Atlantic Aerosols and Marine Ecosystems Study (NAAMES), a NASA-led ship and aircraft campaign off the east coast of Newfoundland, has observed many low cloud decks and aerosols over a marine environment. The campaign has completed two of four deployments and provides an excellent opportunity for the Research Scanning Polarimeter (RSP) to cross-validate its approach to sensing CDNC against the Langley Aerosol Research Group Experiment's (LARGE's) Cloud Droplet Probe (CDP). The RSP is an airborne scanning sensor that provides high-precision measurements of polarized and full-intensity radiances at multiple angles over a wide spectral range. Each of the four NAAMES deployments is aligned to a specific annual event in the plankton cycle, along with other variations in environmental conditions. The fall 2015 and spring 2016 deployments allow us to demonstrate and characterize the RSP's performance over a range of CDNCs and cloud types. We also assess correlations between the RSP CDNC measurements and atmospheric aerosol load. Using the LARGE Condensation Particle Counter (CPC) and Aerosol Mass Spectrometer (AMS), links between the size and type of aerosols and the RSP CDNC retrievals are explored.
Cloud feedback mechanisms and their representation in global climate models
Ceppi, Paulo; Brient, Florent; Zelinka, Mark D.; ...
2017-05-11
Cloud feedback—the change in top-of-atmosphere radiative flux resulting from the cloud response to warming—constitutes by far the largest source of uncertainty in the climate response to CO2 forcing simulated by global climate models (GCMs). In this paper, we review the main mechanisms for cloud feedbacks, and discuss their representation in climate models and the sources of intermodel spread. Global-mean cloud feedback in GCMs results from three main effects: (1) rising free-tropospheric clouds (a positive longwave effect); (2) decreasing tropical low cloud amount (a positive shortwave [SW] effect); (3) increasing high-latitude low cloud optical depth (a negative SW effect). These cloud responses simulated by GCMs are qualitatively supported by theory, high-resolution modeling, and observations. Rising high clouds are consistent with the fixed anvil temperature (FAT) hypothesis, whereby enhanced upper-tropospheric radiative cooling causes anvil cloud tops to remain at a nearly fixed temperature as the atmosphere warms. Tropical low cloud amount decreases are driven by a delicate balance between the effects of vertical turbulent fluxes, radiative cooling, large-scale subsidence, and lower-tropospheric stability on the boundary-layer moisture budget. High-latitude low cloud optical depth increases are dominated by phase changes in mixed-phase clouds. Finally, the causes of intermodel spread in cloud feedback are discussed, focusing particularly on the role of unresolved parameterized processes such as cloud microphysics, turbulence, and convection.
Study of pre-storm environment by using rawinsonde and satellite observations
NASA Technical Reports Server (NTRS)
Hung, R. J.; Tsao, Y. D.
1987-01-01
Rawinsonde and satellite remote sensing data were utilized to examine the pre-storm environment and mechanisms for the initiation of four groups of severe storms. The storms in Altus, Oklahoma; Pampas, Texas; Bennett, Colorado; and Red River Valley, Oklahoma are described. The geographical distributions of the areas of high moisture concentration and the variations of tropopause heights for the storm groups are analyzed. It is found that in areas of high low-level moisture concentration, the local tropopause height is lowest at the time of storm cloud formation and development, and that the potential energy storage per unit area for the overshooting clouds penetrating above the tropopause is related to the intensity of the storms produced. Numerical cloud modeling was performed for the storms. The model data are compared with the satellite and rawinsonde observations, and it is noted that the data correlate well.
2016-11-01
[Front matter: Contents, List of Figures; sections include Introduction, Background, and Yahoo! Cloud Serving Benchmark (YCSB), with subsection 3.1 Data Loading and Performance Testing Framework.] When originally setting out to perform performance testing of a transactional system, the authors identified the Yahoo! Cloud Serving Benchmark (YCSB), a data loading and performance testing framework. This framework is freely available.
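YCSB's two-phase workflow (a load phase that inserts records, then a run phase that issues a timed operation mix) can be mimicked in miniature against any dict-like store. This toy sketch illustrates only the workflow; the real YCSB is a Java command-line tool that drives actual databases with configurable workloads:

```python
import random
import time

def benchmark(store, n_records=1000, n_ops=1000, read_fraction=0.95):
    """YCSB-style micro-benchmark against a dict-like store: load records,
    then run a read-heavy operation mix and collect per-operation latencies.
    The record/value format and the 95/5 read/update mix are illustrative."""
    for i in range(n_records):                       # load phase
        store[f"user{i}"] = f"value{i}"
    latencies = []
    for _ in range(n_ops):                           # run phase
        key = f"user{random.randrange(n_records)}"
        t0 = time.perf_counter()
        if random.random() < read_fraction:
            _ = store[key]                           # read op
        else:
            store[key] = "updated"                   # update op
        latencies.append(time.perf_counter() - t0)
    latencies.sort()
    return {"p50": latencies[len(latencies) // 2],
            "p99": latencies[int(len(latencies) * 0.99)]}
```

Reporting latency percentiles rather than a single mean mirrors how YCSB summarizes workload results.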
NASA Astrophysics Data System (ADS)
Borsdorff, Tobias; Andrasec, Josip; aan de Brugh, Joost; Hu, Haili; Aben, Ilse; Landgraf, Jochen
2018-05-01
In anticipation of the upcoming TROPOMI Sentinel-5 Precursor carbon monoxide data product, we discuss the benefit of using CO total column retrievals from cloud-contaminated SCIAMACHY 2.3 µm shortwave infrared spectra to detect atmospheric CO enhancements on regional and urban scales due to emissions from cities and wildfires. The study uses the operational Sentinel-5 Precursor algorithm SICOR, which infers the vertically integrated CO column together with effective cloud parameters. We investigate its capability to detect localized CO enhancements, distinguishing between clear-sky observations and observations with low (< 1.5 km) and medium-high clouds (1.5-5 km). As examples, we analyse CO enhancements over the cities Paris, Los Angeles and Tehran as well as the wildfire events in Mexico-Guatemala 2005 and Alaska-Canada 2004. The CO average of the SCIAMACHY full-mission data set of clear-sky observations can detect weak CO enhancements of less than 10 ppb due to air pollution in these cities. For low-cloud conditions, the CO data product performs similarly well. For medium-high clouds, the observations show a reduced CO signal both over Tehran and Los Angeles, while for Paris no significant CO enhancement can be detected. This indicates that information about the vertical distribution of CO can be obtained from the SCIAMACHY measurements. Moreover, for the Mexico-Guatemala fires, the low-cloud CO data capture a strong outflow of CO over the Gulf of Mexico and the Pacific Ocean, providing complementary information to clear-sky retrievals, which can only be obtained over land. For both burning events, enhanced CO values are even detectable with medium-high-cloud retrievals, confirming a distinct vertical extension of the pollution. The larger number of additional measurements, and hence the better spatial coverage, significantly improve the detection of wildfire pollution using both the clear-sky and cloudy CO retrievals.
Due to the improved instrument performance of the TROPOMI instrument with respect to its precursor SCIAMACHY, the upcoming Sentinel-5 Precursor CO data product will allow improved detection of CO emissions and their vertical extension over cities and fires, making new research applications possible.
NASA Astrophysics Data System (ADS)
Alvarez, César I.; Teodoro, Ana; Tierra, Alfonso
2017-10-01
Thin clouds are frequent in optical remote sensing data and in most cases prevent retrieval of the pure surface signal needed to calculate indices such as the Normalized Difference Vegetation Index (NDVI). This paper evaluates the Automatic Cloud Removal Method (ACRM) algorithm over a high-elevation city, Quito (Ecuador), which lies at 2800 meters above sea level and experiences cloud cover throughout the year. The ACRM establishes a linear regression between each Landsat 8 OLI band and the cirrus band and uses the resulting slope to remove clouds, without any reference image or mask. Applied over Quito, the original ACRM did not perform well, so the algorithm was improved using different slope values (Improved ACRM). The NDVI computed from the corrected imagery was then compared with reference NDVI MODIS data (MOD13Q1). The Improved ACRM succeeded where the original ACRM did not. In future work, the Improved ACRM needs to be tested in different regions of the world under different conditions to evaluate whether the algorithm works successfully in all conditions.
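The regression-and-subtract idea behind ACRM, and the NDVI it feeds, can be sketched as follows. The single global per-band regression is our reading of the method as summarized above, and the improved variant's alternative slope values are not modeled here:

```python
import numpy as np

def cirrus_correct(band, cirrus):
    """Subtract the thin-cloud signal from an OLI band using the 1.38 um
    cirrus band: fit band = a + slope * cirrus over all pixels, then remove
    slope * cirrus. One global regression per band is a simplification."""
    slope = np.polyfit(np.ravel(cirrus).astype(float),
                       np.ravel(band).astype(float), 1)[0]
    return np.asarray(band, float) - slope * np.asarray(cirrus, float)

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, float)
    red = np.asarray(red, float)
    return (nir - red) / (nir + red)
```

Applying `cirrus_correct` to the red and near-infrared bands before computing `ndvi` is the pipeline the evaluation above compares against MOD13Q1.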
Ice clouds optical properties in the Far Infrared from the ECOWAR-COBRA Experiment
NASA Astrophysics Data System (ADS)
Rizzi, Rolando; Tosi, Ennio
The ECOWAR-COBRA (Earth COoling by WAter vapouR emission - Campagna di Osservazioni della Banda Rotazionale del vapor d'Acqua) field campaign took place in Italy from 3 to 17 March 2007 with the main goal of studying the scarcely sensed atmospheric emission occurring beyond 17 microns. Instrumentation involved in the campaign included two different Fourier Transform Spectrometers (FTS): REFIR-PAD (at Testa Grigia Station, 3500 m a.s.l.) and FTIR-ABB (at Cervinia Station, 1990 m a.s.l.). In this work cloudy-sky data have been analyzed. A cloud properties retrieval methodology (RT-RET), based on high spectral resolution measurements in the atmospheric window (800-1000 cm-1), is applied to both FTS sensors. Cloud properties determined from the infrared retrievals are compared with those obtained from Raman lidar measurements taken by the BASIL lidar system that was operating at Cervinia Station. Cloud microphysical and optical properties retrieved by RT-RET are used to perform forward simulations over the entire spectral interval of the FTS measurements. Results are compared to FTS data to test the ability of single-scattering ice crystal models to reproduce cloudy-sky radiances in the Far Infrared (FIR) part of the spectrum. New methods to retrieve cloud optical and microphysical properties exploiting high spectral resolution FIR measurements are also investigated.
Epiphyte response to drought and experimental warming in an Andean cloud forest
Rapp, Joshua M.; Silman, Miles R.
2014-01-01
The high diversity and abundance of vascular epiphytes in tropical montane cloud forest is associated with frequent cloud immersion, which is thought to protect plants from drought stress. Increasing temperature and rising cloud bases associated with climate change may increase epiphyte drought stress, leading to species and biomass loss. We tested the hypothesis that warmer and drier conditions associated with a lifting cloud base will lead to increased mortality and/or decreased recruitment of epiphyte ramets, altering species composition in epiphyte mats. By using a reciprocal transplant design, where epiphyte mats were transplanted across an altitudinal gradient of increasing cloud immersion, we differentiated between the effects of warmer and drier conditions from the more general prediction of niche theory that transplanting epiphytes in any direction away from their home elevation should result in reduced performance. Effects differed among species, but effects were generally stronger and more negative for epiphytes in mats transplanted down slope from the highest elevation, into warmer and drier conditions, than for epiphyte mats transplanted from other elevations. In contrast, epiphytes from lower elevations showed greater resistance to drought in all treatments. Epiphyte community composition changed with elevation, but over the timescale of the experiment there were no consistent changes in species composition. Our results suggest some epiphytes may show resistance to climate change depending on the environmental and evolutionary context. In particular, sites where high rainfall makes cloud immersion less important for epiphyte water-balance, or where occasional drought has previously selected for drought-resistant taxa, may be less adversely affected by predicted climate changes. PMID:25165534
SHOCK-CLOUD INTERACTION AND PARTICLE ACCELERATION IN THE SOUTHWESTERN LIMB OF SN 1006
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miceli, M.; Orlando, S.; Bocchino, F.
2014-02-20
The supernova remnant SN 1006 is a powerful source of high-energy particles and evolves in a relatively tenuous and uniform environment, despite interacting with an atomic cloud in its northwestern limb. The X-ray image of SN 1006 reveals an indentation in the southwestern part of the shock front, and the H I maps show an isolated (southwestern) cloud, having the same velocity as the northwestern cloud, whose morphology fits perfectly in the indentation. We performed spatially resolved spectral analysis of a set of small regions in the southwestern nonthermal limb and studied the deep X-ray spectra obtained within the XMM-Newton SN 1006 Large Program. We also analyzed archive H I data, obtained by combining single-dish and interferometric observations. We found that the best-fit value of N_H derived from the X-ray spectra significantly increases in regions corresponding to the southwestern cloud, while the cutoff energy of the synchrotron emission decreases. The N_H variation corresponds perfectly with the H I column density of the southwestern cloud, as measured from the radio data. The decrease in the cutoff energy at the indentation clearly reveals that the back side of the cloud is actually interacting with the remnant. The southwestern limb therefore presents a unique combination of efficient particle acceleration and high ambient density, thus being the most promising region for γ-ray hadronic emission in SN 1006. We estimate that such emission will be detectable with the Fermi telescope within a few years.
NASA Astrophysics Data System (ADS)
Sedano, Fernando; Kempeneers, Pieter; Strobl, Peter; Kucera, Jan; Vogt, Peter; Seebach, Lucia; San-Miguel-Ayanz, Jesús
2011-09-01
This study presents a novel cloud masking approach for high resolution remote sensing images in the context of land cover mapping. As an advantage over traditional methods, the approach does not rely on thermal bands and is applicable to images from most high resolution earth observation remote sensing sensors. The methodology couples pixel-based seed identification and object-based region growing. The seed identification stage relies on pixel value comparison between the high resolution images and cloud-free composites at lower spatial resolution from almost simultaneously acquired dates. The methodology was tested taking SPOT4-HRVIR, SPOT5-HRG and IRS-LISS III as high resolution images and cloud-free MODIS composites as reference images. The selected scenes included a wide range of cloud types and surface features. The resulting cloud masks were evaluated through visual comparison. They were also compared with ad hoc independently generated cloud masks and with the automatic cloud cover assessment algorithm (ACCA). In general the results showed an agreement in detected clouds higher than 95% for clouds larger than 50 ha. The approach produced consistent results, identifying and mapping clouds of different type and size over various land surfaces including natural vegetation, agricultural land, built-up areas, water bodies and snow.
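The two-stage seed-and-grow idea can be illustrated with a minimal sketch. The thresholds, the simple brightness-difference test against the resampled composite, and the 4-connected breadth-first growth are assumptions for illustration; the published method uses object-based region growing with its own criteria.

```python
import numpy as np
from collections import deque

def cloud_mask(hr, ref, seed_thresh=0.25, grow_thresh=0.10):
    """Hypothetical two-stage cloud mask: pixels much brighter than the
    cloud-free reference composite become seeds; seeds grow into adjacent
    pixels that are moderately brighter than the reference."""
    diff = hr - ref                      # clouds are brighter than the composite
    seeds = diff > seed_thresh           # confident cloud seeds
    grow = diff > grow_thresh            # candidate cloud pixels
    mask = np.zeros(hr.shape, dtype=bool)
    mask[seeds] = True
    q = deque(zip(*np.nonzero(seeds)))
    while q:                             # breadth-first 4-connected growing
        i, j = q.popleft()
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if (0 <= ni < hr.shape[0] and 0 <= nj < hr.shape[1]
                    and grow[ni, nj] and not mask[ni, nj]):
                mask[ni, nj] = True
                q.append((ni, nj))
    return mask
```

The two-threshold (hysteresis) structure is what lets the mask capture thin cloud edges without flagging every moderately bright pixel as cloud.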
NASA Astrophysics Data System (ADS)
Niggemann, F.; Appel, F.; Bach, H.; de la Mar, J.; Schirpke, B.; Dutting, K.; Rucker, G.; Leimbach, D.
2015-04-01
To address the challenges of effective data handling faced by Small and Medium Sized Enterprises (SMEs), a cloud-based infrastructure for accessing and processing of Earth Observation (EO) data has been developed within the project APPS4GMES (www.apps4gmes.de). To gain homogeneous multi-mission data access, an Input Data Portal (IDP) has been implemented on this infrastructure. The IDP consists of an Open Geospatial Consortium (OGC) conformant catalogue, a consolidation module for format conversion and an OGC-conformant ordering framework. Metadata from various EO sources, delivered in different standards, are harvested, transferred to the OGC-conformant Earth Observation Product standard and inserted into the catalogue by a Metadata Harvester. The IDP can be accessed for search and ordering of the harvested datasets by the services implemented on the cloud infrastructure. Different land-surface services have been realised by the project partners using the implemented IDP and cloud infrastructure. Results of these are customer-ready products, as well as pre-products (e.g. atmospherically corrected EO data) serving as a basis for other services. Within the IDP, automated access to ESA's Sentinel-1 Scientific Data Hub has been implemented, so that searching and downloading of the SAR data can be performed in an automated way. With the Sentinel-1 Toolbox and in-house software, processing of the datasets for further use, for example for Vista's snow monitoring that delivers input to the flood forecast services, can also be performed in an automated way. For performance tests of the cloud environment, a sophisticated model-based atmospheric correction and pre-classification service has been implemented. The tests comprised automated, synchronised processing of one entire Landsat 8 (LS-8) coverage of Germany and performance comparisons with standard desktop systems.
Results of these tests, showing a performance improvement by a factor of six, proved the high flexibility and computing power of the cloud environment. To make full use of the cloud capabilities, automated upscaling of the hardware resources has been implemented. Together with the IDP infrastructure, fast and automated processing of various satellite sources into market-ready products can be realised, so that growing customer needs and numbers can be satisfied without loss of accuracy and quality.
The Modification of Orographic Snow Growth Processes by Cloud Nucleating Aerosols
NASA Astrophysics Data System (ADS)
Cotton, W. R.; Saleeby, S.
2011-12-01
Cloud nucleating aerosols have been found to modify the amount and spatial distribution of snowfall in mountainous areas where riming growth of snow crystals is known to contribute substantially to the total snow water equivalent precipitation. In the Park Range of Colorado, a 2km deep supercooled liquid water orographic cloud frequently enshrouds the mountaintop during snowfall events. This leads to a seeder-feeder growth regime in which snow falls through the orographic cloud and collects cloud water prior to surface deposition. The addition of higher concentrations of cloud condensation nuclei (CCN) modifies the cloud droplet spectrum toward smaller size droplets and suppresses riming growth. Without rime growth, the density of snow crystals remains low and horizontal trajectories carry them further downwind due to slower vertical fall speeds. This leads to a downwind shift in snowfall accumulation at high CCN concentrations. Cloud resolving model simulations were performed (at 600m horizontal grid spacing) for six snowfall events over the Park Range. The chosen events were well simulated and occurred during intensive observations periods as part of two winter field campaigns in 2007 and 2010 based at Storm Peak Laboratory in Steamboat Springs, CO. For each event, sensitivity simulations were run with various initial CCN concentration vertical profiles that represent clean to polluted aerosol environments. Microphysical budget analyses were performed for these simulations in order to determine the relative importance of the various cloud properties and growth processes that contribute to precipitation production. Observations and modeling results indicate that initial vapor depositional growth of snow tends to be maximized within about 1km of mountaintop above the windward slope while the majority of riming growth occurs within 500m of mountaintop. This suggests that precipitation production is predominantly driven by locally enhanced orography. 
The large scale synoptic flow simply provides the background dynamics and moisture that impinge upon the steep terrain. The addition of cloud nucleating aerosols to this scenario tends to reduce the amount of riming and leads to greater snow vapor growth. Increased vapor growth leads to larger snow crystals but does not necessarily increase their density or fall speed. There is frequently a zone on the periphery of the orographic cloud where water saturation is low and ice saturation remains high. Here the Bergeron process allows for snow to continue growing at the expense of the cloud water. Furthermore, since less cloud water is removed by riming, and droplets are smaller in polluted conditions, there is an increase in cloud water evaporation along the lee slope. This enhanced droplet evaporation in polluted conditions allows for more saturated air to persist to the lee of the ridge. Higher saturation reduces the amount of snow crystal sublimation prior to surface deposition. In very moist winter events, the lee slope evaporation relative to the primary mountain barrier can saturate the air relative to a downstream ridge and aid in further orographic cloud development. The combination of reduced riming, the Bergeron process, and reduced lee-side sublimation leads to the snowfall spillover effect under polluted conditions.
NASA Astrophysics Data System (ADS)
Li, J.; Menzel, W.; Sun, F.; Schmit, T.
2003-12-01
The Moderate-Resolution Imaging Spectroradiometer (MODIS) and Atmospheric Infrared Sounder (AIRS) measurements from the Earth Observing System's (EOS) Aqua satellite will enable global monitoring of the distribution of clouds. MODIS is able to provide at high spatial resolution (1-5 km) the cloud mask, surface and cloud types, cloud phase, cloud-top pressure (CTP), effective cloud amount (ECA), cloud particle size (CPS), and cloud water path (CWP). AIRS is able to provide CTP, ECA, CPS, and CWP within the AIRS footprint with much better accuracy using its greatly enhanced hyperspectral remote sensing capability. The combined MODIS/AIRS system offers the opportunity for cloud products improved over those possible from either system alone. The algorithm developed was applied to process the AIRS longwave cloudy radiance measurements; results are compared with MODIS cloud products, as well as with the Geostationary Operational Environmental Satellite (GOES) sounder cloud products, to demonstrate the advantage of synergistic use of high spatial resolution MODIS cloud products and high spectral resolution AIRS sounder radiance measurements for optimal cloud retrieval. Data from ground-based instrumentation at the Atmospheric Radiation Measurement (ARM) Program Cloud and Radiation Test Bed (CART) site in Oklahoma were used for validation; results show that AIRS improves the MODIS cloud products in certain cases such as low-level clouds.
NASA Technical Reports Server (NTRS)
Sud, Y. C.; Lee, D.; Oreopoulos, L.; Barahona, D.; Nenes, A.; Suarez, M. J.
2012-01-01
A revised version of the Microphysics of clouds with Relaxed Arakawa-Schubert and Aerosol-Cloud interaction (McRAS-AC) scheme, including, among others, the Barahona and Nenes ice nucleation parameterization, is implemented in the GEOS-5 AGCM. Various fields from a 10-year integration of the AGCM with McRAS-AC were compared with their counterparts from an integration of the baseline GEOS-5 AGCM, and with satellite data as observations. Overall, McRAS-AC reduced biases in cloud fields, and cloud radiative effects are much better simulated over most regions of the Earth. Two weaknesses are identified in the McRAS-AC runs, namely too few cloud particles around 40S-60S and too high a cloud water path during northern hemisphere summer over the Gulf Stream and North Pacific. Sensitivity analyses showed that these biases potentially originate from biases in the aerosol input. The first bias is largely eliminated in a sensitivity test using 50% smaller aerosol particles, while the second bias is much reduced when interactive aerosol chemistry is turned on. The main drawback of McRAS-AC is a dearth of low-level marine stratus clouds, probably due to the lack of dry convection, not yet implemented in the cloud scheme. Despite these biases, McRAS-AC simulates realistic clouds and optical properties that can improve further with better aerosol input; because its prediction of cloud particle number concentration and effective particle size for both convective and stratiform clouds is quite realistic, the scheme has the potential to be a valuable tool for climate modeling research through its ability to simulate aerosol indirect effects.
NASA Astrophysics Data System (ADS)
Wang, Fang; Yang, Song
2018-02-01
Using principal component (PC) analysis, three leading modes of cloud vertical structure (CVS) are revealed by the GCM-Oriented CALIPSO Cloud Product (GOCCP): the tropical high, subtropical anticyclonic and extratropical cyclonic cloud modes (THCM, SACM and ECCM, respectively). THCM mainly reflects the contrast between tropical high clouds and clouds in middle/high latitudes. SACM is closely associated with middle-high clouds in tropical convective cores, few-cloud regimes in subtropical anticyclonic regions and stratocumulus over subtropical eastern oceans. ECCM mainly corresponds to clouds along extratropical cyclonic regions. Models of phase 2 of the Cloud Feedback Model Intercomparison Project (CFMIP2) reproduce the THCM well, but SACM and ECCM are generally poorly simulated compared to GOCCP. Standardized PCs corresponding to CVS modes are generally captured, whereas original PCs (OPCs) are consistently underestimated (overestimated) for THCM (SACM and ECCM) by CFMIP2 models. The effects of CVS modes on relative cloud radiative forcing (RSCRF/RLCRF) (RSCRF being calculated at the surface and RLCRF at the top of the atmosphere) are studied using the principal component regression method. Results show that CFMIP2 models tend to overestimate (underestimate or simulate with the opposite sign) the RSCRF/RLCRF radiative effects (REs) of ECCM (THCM and SACM) per unit global mean OPC compared to observations. These RE biases may be attributed to two factors: underestimation (overestimation) of low/middle clouds (high clouds) in simulated global mean cloud profiles, equivalent to stronger (weaker) REs per unit low/middle (high) cloud, and eigenvector biases in the CVS modes (especially for SACM and ECCM). It is suggested that much more attention should be paid to the improvement of CVS, especially cloud parameterizations associated with particular physical processes (e.g. downwelling regimes within the Hadley circulation, extratropical storm tracks and others), which may be crucial to reducing the CRF biases in current climate models.
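The principal component regression approach used above can be sketched under simplifying assumptions: the CVS modes are taken from an SVD of anomaly profiles, and the forcing series is regressed on standardized PCs. Function and variable names are hypothetical.

```python
import numpy as np

def pc_regression(X, y, n_modes=3):
    """Illustrative principal component regression: extract leading
    vertical-structure modes from the profile matrix X (samples x levels)
    via SVD of anomalies, then regress the radiative forcing series y on
    the standardized PCs to estimate the radiative effect per unit PC."""
    Xa = X - X.mean(axis=0)                         # anomaly profiles
    _, _, Vt = np.linalg.svd(Xa, full_matrices=False)
    eofs = Vt[:n_modes]                             # leading modes (eigenvectors)
    pcs = Xa @ eofs.T                               # mode amplitudes per sample
    pcs_std = pcs / pcs.std(axis=0)                 # standardized PCs
    coef, *_ = np.linalg.lstsq(pcs_std, y - y.mean(), rcond=None)
    return eofs, pcs_std, coef                      # coef: RE per unit standardized PC
```

Regressing on standardized rather than original PCs is what separates biases in mode amplitude from biases in the mode shapes (eigenvectors), the two factors discussed above.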
NASA Astrophysics Data System (ADS)
Zou, Xiaoliang; Zhao, Guihua; Li, Jonathan; Yang, Yuanxi; Fang, Yong
2016-06-01
With the rapid development of sensor technology, high spatial resolution imagery and airborne Lidar point clouds can nowadays be captured, which makes classification, extraction, evaluation and analysis of a broad range of object features possible. High resolution imagery, Lidar datasets and parcel maps can be widely used as information carriers for classification, so a refinement of object classification is made possible for urban land cover. The paper presents an approach to object-based image analysis (OBIA) combining high spatial resolution imagery and airborne Lidar point clouds. The workflow for urban land cover is designed with four components. Firstly, a colour-infrared TrueOrtho photo and laser point clouds were pre-processed to derive the parcel map of water bodies and the nDSM, respectively. Secondly, image objects are created via multi-resolution image segmentation, integrating the scale parameter and the colour and shape properties with a compactness criterion, so that the image can be subdivided into separate object regions. Thirdly, image object classification is performed on the basis of the segmentation and a rule set in the form of a knowledge decision tree. The image objects are classified into six classes: water bodies, low vegetation/grass, trees, low buildings, high buildings and roads. Finally, in order to assess the validity of the classification results for the six classes, an accuracy assessment is performed by comparing randomly distributed reference points of the TrueOrtho imagery with the classification results, forming the confusion matrix and calculating the overall accuracy and Kappa coefficient. The study area is the test site Vaihingen/Enz, and a patch of test datasets comes from the benchmark of the ISPRS WG III/4 test project. The classification results show high overall accuracy for most types of urban land cover: overall accuracy is 89.5% and the Kappa coefficient equals 0.865.
The OBIA approach provides an effective and convenient way to combine high resolution imagery and Lidar ancillary data for classification of urban land cover.
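The accuracy measures named above (overall accuracy and the Kappa coefficient) follow directly from the confusion matrix; a minimal sketch, with a hypothetical function name:

```python
import numpy as np

def accuracy_and_kappa(conf):
    """Compute overall accuracy and Cohen's Kappa from a confusion matrix
    (rows: reference classes, columns: mapped classes)."""
    conf = np.asarray(conf, dtype=float)
    n = conf.sum()
    po = np.trace(conf) / n                                   # observed agreement
    pe = (conf.sum(axis=0) * conf.sum(axis=1)).sum() / n**2   # chance agreement
    return po, (po - pe) / (1.0 - pe)                         # (accuracy, kappa)
```

Kappa discounts the agreement expected by chance, which is why it is reported alongside overall accuracy when class frequencies are unbalanced.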
e-Collaboration for Earth observation (E-CEO): the Cloud4SAR interferometry data challenge
NASA Astrophysics Data System (ADS)
Casu, Francesco; Manunta, Michele; Boissier, Enguerran; Brito, Fabrice; Aas, Christina; Lavender, Samantha; Ribeiro, Rita; Farres, Jordi
2014-05-01
The e-Collaboration for Earth Observation (E-CEO) project addresses the technologies and architectures needed to provide a collaborative research Platform for automating data mining, processing, and information extraction experiments. The Platform serves for the implementation of Data Challenge Contests focusing on Information Extraction for Earth Observation (EO) applications. The possibility of implementing multiple processors within a Common Software Environment facilitates the validation, evaluation and transparent peer comparison of different methodologies, which is one of the main requirements raised by scientists who develop algorithms in the EO field. In this scenario, we set up a Data Challenge, referred to as Cloud4SAR (http://wiki.services.eoportal.org/tiki-index.php?page=ECEO), to foster the deployment of Interferometric SAR (InSAR) processing chains on a Cloud Computing platform. While a large variety of InSAR processing software tools are available, they require a high level of expertise and complex user interaction to be run effectively. Computing a co-seismic interferogram or a 20-year deformation time series over a volcanic area is not an easy task to perform in a fully unsupervised way and/or in a very short time (hours or less). Benefiting from ESA's E-CEO platform, participants can optimise algorithms in a Virtual Sandbox environment without being expert programmers, and compute results on high-performing Cloud platforms. Cloud4SAR requires solving a relatively easy InSAR problem while trying to maximize the exploitation of the processing capabilities provided by a Cloud Computing infrastructure. The proposed challenge offers two different frameworks, each dedicated to participants with different skills, identified as Beginners and Experts.
For both, the contest mainly concerns the degree of automation of the deployed algorithms, no matter which one is used, as well as the capability of taking effective benefit from a parallel computing environment.
A Cloud-Based Simulation Architecture for Pandemic Influenza Simulation
Eriksson, Henrik; Raciti, Massimiliano; Basile, Maurizio; Cunsolo, Alessandro; Fröberg, Anders; Leifler, Ola; Ekberg, Joakim; Timpka, Toomas
2011-01-01
High-fidelity simulations of pandemic outbreaks are resource consuming. Cluster-based solutions have been suggested for executing such complex computations. We present a cloud-based simulation architecture that utilizes computing resources both locally available and dynamically rented online. The approach uses the Condor framework for job distribution and management of Amazon Elastic Compute Cloud (EC2) resources as well as local resources. The architecture has a web-based user interface that allows users to monitor and control simulation execution. In a benchmark test, the best cost-adjusted performance was recorded for the EC2 High-CPU Medium instance, while a field trial showed that the job configuration had significant influence on the execution time and that the network capacity of the master node could become a bottleneck. We conclude that it is possible to develop a scalable simulation environment that uses cloud-based solutions, while providing an easy-to-use graphical user interface. PMID:22195089
A curvature-based weighted fuzzy c-means algorithm for point clouds de-noising
NASA Astrophysics Data System (ADS)
Cui, Xin; Li, Shipeng; Yan, Xiutian; He, Xinhua
2018-04-01
In order to remove noise from three-dimensional scattered point clouds and smooth the data without damaging sharp geometric features, a novel algorithm is proposed in this paper. A feature-preserving weight is added to the fuzzy c-means algorithm, yielding a curvature-weighted fuzzy c-means clustering algorithm. Firstly, large-scale outliers are removed using statistics of the neighboring points within a radius r. Then, the algorithm estimates the curvature of the point cloud data using a conicoid (paraboloid) fitting method and calculates the curvature feature value. Finally, the proposed clustering algorithm is applied to calculate the weighted cluster centers, which are taken as the new (de-noised) points. The experimental results show that this approach is efficient for different scales and intensities of noise in point clouds, achieves high precision, and preserves features at the same time. It is also robust to different noise models.
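A minimal sketch of the weighted fuzzy c-means step, with a farthest-point initialization added for determinism. The curvature estimator and all parameter values from the paper are not reproduced here, and the names are hypothetical; only the weighted clustering update reflects the idea described above.

```python
import numpy as np

def weighted_fcm(points, weights, n_clusters=2, m=2.0, n_iter=50, eps=1e-9):
    """Weighted fuzzy c-means sketch: a per-point weight (e.g. derived from
    curvature) biases the cluster centers toward feature-rich regions.
    Cluster centers can then serve as the de-noised points."""
    points = np.asarray(points, dtype=float)
    weights = np.asarray(weights, dtype=float)
    # deterministic farthest-point initialization of the centers
    centers = [points[0]]
    for _ in range(1, n_clusters):
        d = np.min([np.linalg.norm(points - c, axis=1) for c in centers], axis=0)
        centers.append(points[np.argmax(d)])
    centers = np.array(centers)
    for _ in range(n_iter):
        # membership update: closer points get higher fuzzy membership
        d = np.linalg.norm(points[:, None] - centers[None], axis=2) + eps
        u = 1.0 / d ** (2.0 / (m - 1.0))
        u /= u.sum(axis=1, keepdims=True)
        # center update, weighted by the feature weight and fuzzified membership
        um = (u ** m) * weights[:, None]
        centers = um.T @ points / um.sum(axis=0)[:, None]
    return centers, u
```

With uniform weights this reduces to standard fuzzy c-means; a curvature-derived weight makes high-curvature (feature) points pull the centers more strongly.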
Scientific Services on the Cloud
NASA Astrophysics Data System (ADS)
Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong
Scientific computing was one of the first-ever applications of parallel and distributed computation. To this day, scientific applications remain some of the most compute-intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals: the hardware is provided, maintained, and administered by a third party; software abstraction and virtualization provide reliability and fault tolerance; and graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and by far the easiest high performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.
NASA Astrophysics Data System (ADS)
Gruber, Simon; Unterstrasser, Simon; Bechtold, Jan; Vogel, Heike; Jung, Martin; Pak, Henry; Vogel, Bernhard
2018-05-01
A high-resolution regional-scale numerical model was extended by a parameterization that allows for both the generation and the life cycle of contrails and contrail cirrus to be calculated. The life cycle of contrails and contrail cirrus is described by a two-moment cloud microphysical scheme that was extended by a separate contrail ice class for a better representation of the high concentration of small ice crystals that occur in contrails. The basic input data set contains the spatially and temporally highly resolved flight trajectories over Central Europe derived from real-time data. The parameterization provides aircraft-dependent source terms for contrail ice mass and number. A case study was performed to investigate the influence of contrails and contrail cirrus on the shortwave radiative fluxes at the earth's surface. Accounting for contrails produced by aircraft enabled the model to simulate high clouds that were otherwise missing on this day. The effect of these extra clouds was to reduce the incoming shortwave radiation at the surface as well as the production of photovoltaic power by up to 10 %.
Repetitive Elements May Comprise Over Two-Thirds of the Human Genome
de Koning, A. P. Jason; Gu, Wanjun; Castoe, Todd A.; Batzer, Mark A.; Pollock, David D.
2011-01-01
Transposable elements (TEs) are conventionally identified in eukaryotic genomes by alignment to consensus element sequences. Using this approach, about half of the human genome has been previously identified as TEs and low-complexity repeats. We recently developed a highly sensitive alternative de novo strategy, P-clouds, that instead searches for clusters of high-abundance oligonucleotides that are related in sequence space (oligo “clouds”). We show here that P-clouds predicts >840 Mbp of additional repetitive sequences in the human genome, thus suggesting that 66%–69% of the human genome is repetitive or repeat-derived. To investigate this remarkable difference, we conducted detailed analyses of the ability of both P-clouds and a commonly used conventional approach, RepeatMasker (RM), to detect different sized fragments of the highly abundant human Alu and MIR SINEs. RM can have surprisingly low sensitivity for even moderately long fragments, in contrast to P-clouds, which has good sensitivity down to small fragment sizes (∼25 bp). Although short fragments have a high intrinsic probability of being false positives, we performed a probabilistic annotation that reflects this fact. We further developed “element-specific” P-clouds (ESPs) to identify novel Alu and MIR SINE elements, and using it we identified ∼100 Mb of previously unannotated human elements. ESP estimates of new MIR sequences are in good agreement with RM-based predictions of the amount that RM missed. These results highlight the need for combined, probabilistic genome annotation approaches and suggest that the human genome consists of substantially more repetitive sequence than previously believed. PMID:22144907
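The cloud-building idea can be illustrated on a toy scale: count oligo (k-mer) abundances, seed clouds with high-abundance oligos, and expand each cloud through sequence space to related oligos above a lower threshold. The thresholds, the value of k, and the Hamming-distance-1 expansion rule are illustrative assumptions, not the published P-clouds parameters.

```python
from collections import Counter, deque

def p_clouds(seq, k=4, core_thresh=10, member_thresh=3):
    """Toy sketch of oligo-cloud construction: high-abundance k-mers seed
    clouds, which grow to single-substitution neighbors whose counts exceed
    a lower membership threshold."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    clouds, assigned = [], set()
    for seed, c in counts.most_common():
        if c < core_thresh:              # remaining oligos cannot seed a cloud
            break
        if seed in assigned:
            continue
        cloud, q = {seed}, deque([seed])
        assigned.add(seed)
        while q:                         # grow through sequence space
            oligo = q.popleft()
            for i in range(k):
                for b in "ACGT":
                    nb = oligo[:i] + b + oligo[i + 1:]
                    if nb not in assigned and counts.get(nb, 0) >= member_thresh:
                        assigned.add(nb)
                        cloud.add(nb)
                        q.append(nb)
        clouds.append(cloud)
    return clouds
```

Because membership requires only moderate abundance near a high-abundance core, diverged copies of a repeat can join the cloud even when they no longer match a consensus sequence, which is the key contrast with alignment-based annotation.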
Improving Mixed-phase Cloud Parameterization in Climate Model with the ACRF Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Zhien
Mixed-phase cloud microphysical and dynamical processes are still poorly understood, and their representation in GCMs is a major source of uncertainty in the overall cloud feedback in GCMs. Improving mixed-phase cloud parameterizations in climate models is therefore critical to reducing climate forecast uncertainties. This study aims at providing improved knowledge of mixed-phase cloud properties from the long-term ACRF observations and improving mixed-phase cloud simulations in the NCAR Community Atmosphere Model version 5 (CAM5). The key accomplishments are: 1) An improved retrieval algorithm was developed to provide liquid droplet concentration for drizzling or mixed-phase stratiform clouds. 2) A new ice concentration retrieval algorithm for stratiform mixed-phase clouds was developed. 3) A strong seasonal aerosol impact on ice generation in Arctic mixed-phase clouds was identified, which is mainly attributed to the high dust occurrence during the spring season. 4) A suite of multi-sensor algorithms was applied to long-term ARM observations at the Barrow site to provide a complete dataset (LWC and effective radius profiles for the liquid phase, and IWC and Dge profiles and ice concentration for the ice phase) to characterize Arctic stratiform mixed-phase clouds. This multi-year stratiform mixed-phase cloud dataset provides the information necessary to study related processes, evaluate model stratiform mixed-phase cloud simulations, and improve model stratiform mixed-phase cloud parameterization. 5) A new in situ data analysis method was developed to quantify the liquid mass partition in convective mixed-phase clouds. For the first time, we reliably compared liquid mass partitions in stratiform and convective mixed-phase clouds. Owing to the different dynamics in stratiform and convective mixed-phase clouds, the temperature dependencies of their liquid mass partitions differ significantly, with much higher ice concentrations in convective mixed-phase clouds.
6) Systematic evaluations of mixed-phase cloud simulations by CAM5 were performed. Measurement results indicate that ice concentrations control stratiform mixed-phase cloud properties. The improvement of the ice concentration parameterization in CAM5 was done in close collaboration with Dr. Xiaohong Liu, PNNL (now at the University of Wyoming).
A High Resolution Hydrometer Phase Classifier Based on Analysis of Cloud Radar Doppler Spectra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luke,E.; Kollias, P.
2007-08-06
The lifecycle and radiative properties of clouds are highly sensitive to the phase of their hydrometeors (i.e., liquid or ice). Knowledge of cloud phase is essential for specifying the optical properties of clouds; otherwise, large errors can be introduced in the calculation of cloud radiative fluxes. Current temperature-based parameterizations of the partitioning of cloud water into liquid and ice are characterized by large uncertainty (Curry et al., 1996; Hobbs and Rangno, 1998; Intrieri et al., 2002). This is particularly important at high latitudes and in temperature ranges where both liquid droplets and ice crystals can exist (mixed-phase cloud). The mixture of phases has a large effect on cloud radiative properties, and the parameterization of mixed-phase clouds has a large impact on climate simulations (e.g., Gregory and Morris, 1996). Furthermore, the presence of both ice and liquid affects the macroscopic properties of clouds, including their propensity to precipitate. Despite their importance, mixed-phase clouds are severely understudied compared to the arguably simpler single-phase clouds. In-situ measurements in mixed-phase clouds are hindered by aircraft icing, difficulties distinguishing hydrometeor phase, and discrepancies in methods for deriving physical quantities (Wendisch et al. 1996, Lawson et al. 2001). Satellite-based retrievals of cloud phase at high latitudes are often hindered by the highly reflective ice-covered ground and persistent temperature inversions. From the ground, the retrieval of mixed-phase cloud properties has been the subject of extensive research over the past 20 years using polarization lidars (e.g., Sassen et al. 1990), dual radar wavelengths (e.g., Gosset and Sauvageot 1992; Sekelsky and McIntosh, 1996), and, recently, radar Doppler spectra (Shupe et al. 2004).
Millimeter-wavelength radars have substantially improved our ability to observe non-precipitating clouds (Kollias et al., 2007), owing to their excellent sensitivity, which enables the detection of thin cloud layers, and their ability to penetrate several non-precipitating cloud layers. However, in mixed-phase cloud conditions the observed Doppler moments are dominated by the highly reflective ice crystals and thus cannot be used to identify the cloud phase. This limits our ability to map the spatial distribution of cloud phase and to identify the conditions under which mixed-phase clouds form.
APOLLO_NG - a probabilistic interpretation of the APOLLO legacy for AVHRR heritage channels
NASA Astrophysics Data System (ADS)
Klüser, L.; Killius, N.; Gesell, G.
2015-10-01
The cloud processing scheme APOLLO (AVHRR Processing scheme Over cLouds, Land and Ocean) has been in use for cloud detection and cloud property retrieval since the late 1980s. The physics of the APOLLO scheme still form the backbone of a range of cloud detection algorithms for AVHRR (Advanced Very High Resolution Radiometer) heritage instruments. The APOLLO_NG (APOLLO_NextGeneration) cloud processing scheme is a probabilistic interpretation of the original APOLLO method. It builds upon the physical principles that have served well in the original APOLLO scheme, although a couple of additional variables have been introduced. Cloud detection is no longer performed as a binary yes/no decision based on these physical principles; rather, it is expressed as a cloud probability for each satellite pixel. Consequently, the outcome of the algorithm can be tuned, depending on the purpose, from reliably identifying clear pixels to reliably identifying definitely cloudy pixels. The probabilistic approach allows retrieving not only the cloud properties (optical depth, effective radius, cloud top temperature, and cloud water path) but also their uncertainties. APOLLO_NG is designed as a standalone cloud retrieval method robust enough for operational near-real-time use and for application to large amounts of historical satellite data. The radiative transfer solution is approximated by the same two-stream approach used for the original APOLLO. This allows the algorithm to be applied to a wide range of sensors without sensor-specific tuning, and it allows online calculation of the radiative transfer (i.e., within the retrieval algorithm), giving rise to a detailed probabilistic treatment of cloud variables. This study presents the algorithm for cloud detection and cloud property retrieval together with the physical principles from the APOLLO legacy on which it is based.
Furthermore, a few example results from NOAA-18 are presented.
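The core idea above, replacing a binary threshold test with a tunable per-pixel cloud probability, can be illustrated with a toy sketch. This is not the actual APOLLO_NG formulation; the logistic ramp, threshold, and reflectance values are all made up for illustration.

```python
# Illustrative sketch (NOT the APOLLO_NG formulation): soften a binary
# reflectance threshold into a per-pixel cloud probability, so the decision
# point can be tuned toward "reliably clear" or "reliably cloudy".
import math

def cloud_probability(reflectance, threshold=0.4, softness=0.05):
    """Logistic ramp around the old binary threshold: ~0 well below it,
    ~1 well above it, smooth in between."""
    return 1.0 / (1.0 + math.exp(-(reflectance - threshold) / softness))

pixels = [0.10, 0.38, 0.42, 0.80]          # assumed toy reflectances
probs = [round(cloud_probability(r), 2) for r in pixels]
print(probs)  # near-threshold pixels get intermediate probabilities

# Tunable decision: a high cutoff keeps only confidently cloudy pixels,
# a low cutoff would keep only confidently clear ones.
confident_cloudy = [r for r, p in zip(pixels, probs) if p > 0.9]
print(confident_cloudy)
```

The two pixels near the 0.4 threshold land near probability 0.5 rather than flipping between clear and cloudy, which is the practical benefit the abstract describes.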
Preparatory studies of zero-g cloud drop coalescence experiment
NASA Technical Reports Server (NTRS)
Telford, J. W.; Keck, T. S.
1979-01-01
Experiments to be performed in a weightless environment in order to study collision and coalescence processes of cloud droplets are described. Rain formation in warm clouds, formation of larger cloud drops, ice and water collision processes, and precipitation in supercooled clouds are among the topics covered.
NASA Astrophysics Data System (ADS)
Chern, J. D.; Tao, W. K.; Lang, S. E.; Matsui, T.; Mohr, K. I.
2014-12-01
Four six-month (March-August 2014) experiments with the Goddard Multi-scale Modeling Framework (MMF) were performed to study the impacts of different Goddard one-moment bulk microphysical schemes and large-scale forcings on the performance of the MMF. Recently, a new Goddard one-moment bulk microphysics scheme with four ice classes (cloud ice, snow, graupel, and frozen drops/hail) was developed based on cloud-resolving model simulations with large-scale forcings from field campaign observations. The new scheme has been successfully implemented in the MMF, and two MMF experiments were carried out with this new scheme and the old three-ice-class (cloud ice, snow, graupel) scheme. The MMF has global coverage and can rigorously evaluate microphysics performance across different cloud regimes. The results show that the MMF with the new scheme outperforms the old one. The MMF simulations are also strongly affected by the interaction between large-scale and cloud-scale processes. Two MMF sensitivity experiments, with and without nudging large-scale forcings to those of the ERA-Interim reanalysis, were carried out to study the impacts of large-scale forcings. The simulated mean and variability of surface precipitation, cloud types, and cloud properties (such as cloud amount, hydrometeor vertical profiles, and cloud water contents) in different geographic locations and climate regimes are evaluated against GPM, TRMM, and CloudSat/CALIPSO satellite observations. The Goddard MMF has also been coupled with the Goddard Satellite Data Simulation Unit (G-SDSU), a system with multi-satellite, multi-sensor, and multi-spectrum satellite simulators. The statistics of MMF-simulated radiances and backscattering can be directly compared with satellite observations to assess the strengths and/or deficiencies of the MMF simulations and provide guidance on how to improve the MMF and its microphysics.
Improving the Accuracy of Cloud Detection Using Machine Learning
NASA Astrophysics Data System (ADS)
Craddock, M. E.; Alliss, R. J.; Mason, M.
2017-12-01
Cloud detection from geostationary satellite imagery has long been accomplished through multi-spectral channel differencing in comparison to the Earth's surface, with the clear/cloud distinction determined by comparing these differences to empirical thresholds. Using this methodology, the probability of detecting clouds exceeds 90%, but performance varies seasonally, regionally, and temporally. The Cloud Mask Generator (CMG) database developed under this effort consists of 20 years of 4 km, 15-minute clear/cloud images based on GOES data over CONUS and Hawaii. The algorithms that determine cloudy pixels in the imagery are based on well-known multi-spectral techniques and defined thresholds; these thresholds were produced through thousands of man-hours of manually studying thousands of images to determine where the algorithms succeed or fail and to fine-tune them. This study investigates the potential of improving cloud detection with Random Forest (RF) ensemble classification. RF is an ideal methodology for cloud detection, as it runs efficiently on large datasets, is robust to outliers and noise, and can handle highly correlated predictors such as multi-spectral satellite imagery. The RF code was developed in Python in about 4 weeks. The region of focus is Hawaii, with visible and infrared imagery, topography, and multi-spectral image products used as predictors. The development of the cloud detection technique proceeds in three steps. First, the RF models are tuned to identify the optimal number of trees and number of predictors for both day and night scenes. Second, the RF models are trained using the optimal number of trees and a select number of random predictors identified during the tuning phase. Lastly, the model is used to predict clouds for a time period independent of training, and the predictions are compared to truth, the CMG cloud mask.
Initial results show 97% accuracy during the daytime, 94% accuracy at night, and 95% accuracy overall. The total time to train, tune, and test was approximately one week. The improved performance and reduced time to produce results are a testament to improved computer technology and to machine learning as a more efficient and accurate methodology for cloud detection.
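The tune-train-test workflow described above can be sketched with scikit-learn. This is a minimal illustration, not the authors' code: the "multi-spectral" predictors, the cloudy/clear labeling rule, and the parameter grid are all synthetic stand-ins.

```python
# Hedged sketch of the three-step RF workflow: tune the number of trees and
# predictors-per-split, train with the best settings, score on held-out data.
# All data here is synthetic; real predictors would be GOES channels etc.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(0)
n = 2000
# Toy predictors: visible reflectance, IR brightness temperature, elevation.
X = rng.random((n, 3))
# Toy truth: "cloudy" when reflectance is high and IR temperature is low.
y = ((X[:, 0] > 0.5) & (X[:, 1] < 0.6)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Step 1: tune number of trees and number of predictors tried per split.
search = GridSearchCV(
    RandomForestClassifier(random_state=0),
    {"n_estimators": [50, 100], "max_features": [1, 2]},
    cv=3,
)
search.fit(X_tr, y_tr)

# Steps 2-3: the best estimator is already refit on the training set;
# evaluate it on the independent hold-out period.
accuracy = search.best_estimator_.score(X_te, y_te)
print(round(accuracy, 2))
```

With a cleanly separable toy rule the forest scores far above the ~95% reported for real imagery; the point is only the structure of the tuning/training/testing pipeline.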
Efficient Retrieval of Massive Ocean Remote Sensing Images via a Cloud-Based Mean-Shift Algorithm.
Yang, Mengzhao; Song, Wei; Mei, Haibin
2017-07-23
The rapid development of remote sensing (RS) technology has resulted in the proliferation of high-resolution images. There are challenges involved not only in storing large volumes of RS images but also in rapidly retrieving them for ocean disaster analysis, such as for storm surges and typhoon warnings. In this paper, we present an efficient retrieval method for massive ocean RS images via a Cloud-based mean-shift algorithm. A distributed construction method via the pyramid model, based on the maximum hierarchical layer algorithm, is proposed to realize an efficient storage structure for RS images on the Cloud platform. We achieve high-performance processing of massive RS images in the Hadoop system. Based on the pyramid Hadoop distributed file system (HDFS) storage method, an improved mean-shift algorithm for RS image retrieval is presented by fusing it with the canopy algorithm via Hadoop MapReduce programming. The results show that the new method achieves better data storage performance than HDFS alone or WebGIS-based HDFS. Speedup and scaleup remain very close to linear as the number of RS images increases, which demonstrates that image retrieval using our method is efficient.
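The canopy/mean-shift fusion mentioned above can be illustrated on a single machine (the paper distributes this via Hadoop MapReduce; the distribution layer is omitted here). The idea is that a cheap canopy pass produces a small set of seed points, so mean-shift only refines the seeds instead of iterating from every data point. The thresholds, bandwidth, and data are assumed toy values.

```python
# Single-machine sketch of canopy pre-clustering feeding mean-shift seeds.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(1)
# Toy image feature vectors: two well-separated groups of "images".
data = np.vstack([rng.normal(0, 0.3, (100, 2)),
                  rng.normal(5, 0.3, (100, 2))])

def canopy_seeds(points, t1=2.5):
    """Cheap canopy pass: greedily pick centers; points within t1 of an
    existing center never become new centers."""
    seeds = []
    for p in points:
        if all(np.linalg.norm(p - s) > t1 for s in seeds):
            seeds.append(p)
    return np.array(seeds)

seeds = canopy_seeds(data)
# Mean-shift refines only the canopy seeds instead of every point.
ms = MeanShift(bandwidth=1.0, seeds=seeds).fit(data)
print(len(ms.cluster_centers_))
```

Each seed converges to the nearest density mode, so the expensive iterative step runs over a handful of seeds rather than all points, which is what makes the fusion attractive at scale.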
An incremental anomaly detection model for virtual machines.
Zhang, Hancui; Chen, Shuyu; Liu, Jun; Zhou, Zhen; Wu, Tianshu
2017-01-01
The Self-Organizing Map (SOM) algorithm, an unsupervised learning method, has been applied in anomaly detection due to its capabilities of self-organization and automatic anomaly prediction. However, because the algorithm is initialized randomly, training a detection model takes a long time. Moreover, cloud platforms with large numbers of virtual machines are prone to performance anomalies due to their highly dynamic, resource-sharing nature, which leaves the algorithm with low accuracy and low scalability. To address these problems, an Improved Incremental Self-Organizing Map (IISOM) model is proposed for anomaly detection of virtual machines. In this model, a heuristic-based initialization algorithm and a Weighted Euclidean Distance (WED) algorithm are introduced into SOM to speed up the training process and improve model quality. Meanwhile, a neighborhood-based searching algorithm is presented to accelerate detection by taking into account the large scale and high dynamics of virtual machines on the cloud platform. To demonstrate the effectiveness, experiments have been performed on the common KDD Cup benchmark dataset and on a real dataset. Results suggest that IISOM has advantages in accuracy and convergence speed for anomaly detection of virtual machines on the cloud platform.
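One ingredient of the model above, the weighted Euclidean distance in the SOM's best-matching-unit search, can be sketched in a few lines. This is a minimal illustration, not the IISOM implementation: the heuristic initialization, neighborhood function, and incremental updates are omitted, and the feature weights are assumed values.

```python
# Minimal sketch of a SOM best-matching-unit (BMU) search under a Weighted
# Euclidean Distance, plus a single online update step. Neighborhood updates
# and the paper's heuristic initialization are deliberately left out.
import numpy as np

rng = np.random.default_rng(2)
n_units, n_features = 16, 4
weights = rng.random((n_units, n_features))   # SOM codebook vectors
feature_w = np.array([2.0, 1.0, 1.0, 0.5])    # assumed per-feature importance

def wed_bmu(x):
    """Best matching unit under weighted Euclidean distance."""
    d = np.sqrt(((weights - x) ** 2 * feature_w).sum(axis=1))
    return int(np.argmin(d)), float(d.min())

def train_step(x, lr=0.5):
    """Single online SOM update pulling the BMU toward the sample."""
    bmu, _ = wed_bmu(x)
    weights[bmu] += lr * (x - weights[bmu])
    return bmu

x = rng.random(n_features)
before = wed_bmu(x)[1]
train_step(x)
after = wed_bmu(x)[1]
print(after < before)
```

The weighting simply lets more informative metrics (e.g., a CPU-load feature) dominate the match, which is how WED improves model quality over plain Euclidean distance.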
Experimental study of detonation of large-scale powder-droplet-vapor mixtures
NASA Astrophysics Data System (ADS)
Bai, C.-H.; Wang, Y.; Xue, K.; Wang, L.-F.
2018-05-01
Large-scale experiments were carried out to investigate the detonation performance of a 1600-m3 ternary cloud consisting of aluminum powder, fuel droplets, and vapor, dispersed by a central explosive in a cylindrically stratified configuration. High-frame-rate video cameras and pressure gauges were used to analyze the large-scale explosive dispersal of the mixture and the ensuing blast wave generated by the detonation of the cloud. Special attention was focused on the effect of the descending motion of the charge on the detonation performance of the dispersed ternary cloud. The charge was parachuted by an ensemble of apparatus from the designated height in order to achieve the required terminal velocity when the central explosive was detonated. A descending charge with a terminal velocity of 32 m/s produced a cloud with discernibly increased concentration compared with one dispersed from a stationary charge; the detonation of this denser cloud hence generates a significantly enhanced blast wave beyond the scaled distance of 6 m/kg^{1/3}. The results also show the influence of the descending motion of the charge on the jetting phenomenon and the distorted shock front.
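The "scaled distance" quoted above follows the standard Hopkinson-Cranz cube-root scaling, Z = R / W^(1/3): blasts from different charge masses produce similar overpressure at equal Z. A short sketch, with the charge mass an assumed example value (the abstract does not give one):

```python
# Hopkinson-Cranz cube-root blast scaling: Z = R / W**(1/3),
# with R the range in meters and W the charge mass in kg TNT-equivalent.
def scaled_distance(r_m, w_kg):
    """Scaled distance in m/kg^(1/3)."""
    return r_m / w_kg ** (1.0 / 3.0)

# At what range does a hypothetical 1000 kg charge reach Z = 6 m/kg^(1/3)?
w = 1000.0
r = 6.0 * w ** (1.0 / 3.0)
print(round(r, 6), round(scaled_distance(r, w), 6))
```

So the "beyond 6 m/kg^(1/3)" claim refers to progressively larger physical ranges as charge mass grows, which is why results are reported in scaled rather than absolute distance.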
Cao, Ya-nan; Wei, He-li; Dai, Cong-ming; Zhang, Xue-hai
2015-05-01
A study was carried out to retrieve the optical thickness and cloud top height of cirrus clouds from Atmospheric Infrared Sounder (AIRS) high-spectral-resolution data in the 1070~1135 cm-1 IR band, using the Combined Atmospheric Radiative Transfer model (CART) and the brightness temperature difference between model simulation and AIRS observation. The research is based on AIRS L1B high-spectral-resolution infrared observation data combined with Moderate Resolution Imaging Spectroradiometer (MODIS) cloud product data. Brightness temperature spectra based on the retrieved cirrus optical thickness and cloud top height were simulated and compared with AIRS brightness temperature spectra in the 650~1150 cm-1 band. The retrieved cirrus optical thickness and cloud top height were compared with the AIRS brightness temperature for channel 760 (900.56 cm-1, 11.1 µm) and the cirrus reflectance of the MODIS cloud product, and the retrieved cloud top height was compared with the cloud top height from MODIS. Results show that the simulated brightness temperature spectra were basically consistent with AIRS observations under the retrieval conditions in the 650~1150 cm-1 band, meaning that CART can be used to simulate AIRS brightness temperature spectra. The retrieved cirrus parameters are consistent with the AIRS brightness temperature at 11.1 µm, with low brightness temperature corresponding to large cirrus optical thickness and high cloud top height, and with the cirrus reflectance of the MODIS cloud product, with high cirrus reflectance corresponding to large cirrus optical thickness and high cloud top height. The correlation between the retrieved cloud top height and the MODIS cloud top height was relatively high; both are mostly located in the range of 8.5~11.5 km, and their probability distributions have approximately identical trends. The CART model is thus feasible for retrieving cirrus properties, and the retrieval is reliable.
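The retrieval idea above, matching simulated to observed brightness temperature over a grid of cirrus parameters, can be illustrated with a toy lookup table. The forward model below is NOT CART; it is a made-up stand-in with the qualitatively right behavior (thicker and higher cirrus give colder brightness temperature), and all numbers are assumed.

```python
# Toy lookup-table retrieval: simulate brightness temperature (BT) over a
# grid of (optical thickness, cloud-top height), then pick the pair that
# minimizes the difference to an "observed" BT. The forward model is a
# crude stand-in for a real radiative transfer code such as CART.
import numpy as np

def toy_bt(tau, z_km):
    """Stand-in forward model: thicker/higher cirrus -> colder BT (K)."""
    surface_bt = 290.0
    cloud_bt = 290.0 - 7.0 * z_km          # colder cloud top when higher
    emissivity = 1.0 - np.exp(-tau)        # thicker cloud -> closer to cloud BT
    return (1 - emissivity) * surface_bt + emissivity * cloud_bt

taus = np.linspace(0.1, 4.0, 40)
heights = np.linspace(8.0, 12.0, 41)
grid_tau, grid_z = np.meshgrid(taus, heights)
sim = toy_bt(grid_tau, grid_z)

bt_obs = toy_bt(1.0, 10.0)                 # pretend AIRS observation
i = np.unravel_index(np.argmin(np.abs(sim - bt_obs)), sim.shape)
print(round(float(grid_tau[i]), 2), round(float(grid_z[i]), 2))
```

A single channel leaves the (tau, height) pair ambiguous in general; the real retrieval uses a whole high-spectral-resolution band, which is what breaks that degeneracy.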
NASA Astrophysics Data System (ADS)
Diehl, K.; Simmel, M.; Wurzler, S.
There is some evidence that the initiation of warm rain is suppressed in clouds over regions with vegetation fires. Thus, the ice phase becomes important as another pathway to initiate precipitation. Numerical simulations were performed to investigate heterogeneous drop freezing for a biomass-burning situation. An air parcel model with a sectional two-dimensional description of cloud microphysics was employed, with parameterizations for immersion and contact freezing which consider the different ice-nucleating efficiencies of various ice nuclei. Three scenarios were simulated, resulting in mixed-phase or completely glaciated clouds. Owing to the high insoluble fraction of the biomass-burning particles, drop freezing via the immersion and contact modes was very efficient. The preferential freezing of large drops, followed by riming (i.e., the deposition of liquid drops on ice particles) and the evaporation of liquid drops (Bergeron-Findeisen process), caused a further decrease of the liquid drops' effective radius at higher altitudes. In turn, ice particle sizes increased so that they could serve as germs for graupel or hailstone formation. The effects of ice initiation on the vertical cloud dynamics were fairly significant, leading the cloud to develop to much higher altitudes than a warm cloud without ice formation.
Flight of frigatebirds inside clouds - energy gain, stability and control.
Sachs, Gottfried; Weimerskirch, Henri
2018-07-07
Investigating the unique ability of frigatebirds to fly inside clouds, it is shown that they achieve a large energy gain by ascending to high altitudes in the strong updrafts of trade cumulus clouds. Frigatebirds often perform this kind of flight, in daytime as well as at night. This suggests that they are capable of flying inside clouds in a controlled and stabilized manner. The control requirements for ascents in a circling flight in updrafts of trade cumulus clouds are analyzed, and the necessary aerodynamic control moments are determined. Based on a stability investigation, it is shown that there are restoring effects which act against disturbances causing possible deviations from the circling flight condition, and the aerodynamic moments which effectuate that stabilization are identified. Furthermore, the problem of neutral azimuth stability, which generally exists in bird flight and causes continually increasing deviations from the course, is dealt with. It is shown for the circling flight mode of frigatebirds inside clouds that deviations here are small and remain constant, suggesting that corrective control action is not required. This is particularly important for circling flight in conditions without a visual reference, as inside clouds. Copyright © 2018 Elsevier Ltd. All rights reserved.
DAΦNE operation with electron-cloud-clearing electrodes.
Alesini, D; Drago, A; Gallo, A; Guiducci, S; Milardi, C; Stella, A; Zobov, M; De Santis, S; Demma, T; Raimondi, P
2013-03-22
The effects of an electron cloud (e-cloud) on beam dynamics are one of the major factors limiting the performance of high-intensity positron, proton, and ion storage rings. In the electron-positron collider DAΦNE, a horizontal beam instability due to the electron-cloud effect has been identified as one of the main limitations on the maximum stored positron beam current and as a source of beam quality deterioration. During the last machine shutdown, special electrodes were inserted in all dipole and wiggler magnets of the positron ring in order to mitigate this instability. This is the first installation of its kind worldwide: long metallic electrodes have been installed in all arcs of the collider positron ring and are currently used during machine operation in collision. This has allowed a number of unprecedented measurements (e-cloud instability growth rates, transverse beam size variation, tune shifts along the bunch train) in which the e-cloud contribution is clearly evidenced by turning the electrodes on and off. In this Letter we briefly describe a novel design of the electrodes, while the main focus is on experimental measurements. We report results that clearly indicate the effectiveness of the electrodes for e-cloud suppression.
Synergistic Measurement of Ice Cloud Microphysics using C- and Ka-Band Radars
NASA Astrophysics Data System (ADS)
Ewald, F.; Gross, S.; Hagen, M.; Li, Q.; Zinner, T.
2017-12-01
Ice clouds play an essential role in the climate system since they have a large effect on the Earth's radiation budget. Uncertainties associated with their spatial and temporal distribution, as well as their optical and microphysical properties, still account for large uncertainties in climate change predictions. Substantial improvement of our understanding of ice clouds was achieved with the advent of cloud radars in the field of ice cloud remote sensing. Here, highly variable ice crystal size distributions are one of the key issues remaining to be resolved. With radar reflectivity scaling with the sixth moment of the particle size, the assumed ice crystal size distribution has a large impact on the results of microphysical retrievals. Different ice crystal size distributions can, however, be distinguished when cloud radars of different wavelengths are used simultaneously. For this study, synchronous RHI scans were performed over a common measurement range of about 30 km between two radar instruments using different wavelengths: the dual-polarization C-band radar POLDIRAD operated at DLR and the Mira-36 Ka-band cloud radar operated at the University of Munich. Over a measurement period of several months, the overlapping region for ice clouds turned out to be quite large. This gives evidence of the presence of moderate-sized ice crystals whose backscatter is sufficiently high to be visible in the C-band as well. In the range between -10 and +10 dBZ, reflectivity measurements from both radars agreed quite well, indicating the absence of large ice crystals. For reflectivities above +10 dBZ, we observed differences, with smaller values at the Ka-band due to Mie scattering effects from larger ice crystals. In this presentation, we will show how this differential reflectivity can be used to gain insight into ice cloud microphysics on the basis of electromagnetic scattering calculations.
We will further explore ice cloud microphysics using the full polarization agility of the C-band radar and compare the results to simultaneous linear depolarization measurements with the Ka-band radar. In summary, we will explore if the scientific understanding of ice cloud microphysics can be advanced by the combination of C- and Ka-band radars.
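The sixth-moment dependence noted above can be made concrete with a toy calculation: in the Rayleigh regime, reflectivity goes as the sum of N * D^6 over the size distribution, so a single large crystal per unit volume can dominate Z. The size bins and concentrations below are arbitrary illustration values, not measurement data.

```python
# Toy illustration of Rayleigh-regime reflectivity Z ~ sum(N * D^6):
# adding one large crystal per unit volume raises Z by tens of dB.
import numpy as np

diameters = np.array([0.1, 0.5, 2.0])        # mm, assumed size bins
conc_small = np.array([1000.0, 10.0, 0.0])   # mostly small crystals (m^-3)
conc_large = np.array([1000.0, 10.0, 1.0])   # same, plus one large crystal

def reflectivity(n, d):
    """Rayleigh-regime reflectivity factor, mm^6 m^-3."""
    return float((n * d ** 6).sum())

z1 = reflectivity(conc_small, diameters)
z2 = reflectivity(conc_large, diameters)
dbz1, dbz2 = 10 * np.log10(z1), 10 * np.log10(z2)
print(round(dbz2 - dbz1, 1))  # ~26 dB jump from one crystal per m^3
```

This is why the assumed size distribution matters so much in single-wavelength retrievals, and why the Ka-band's Mie departure at large sizes gives the two-wavelength combination its leverage.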
AIRS Subpixel Cloud Characterization Using MODIS Cloud Products.
NASA Astrophysics Data System (ADS)
Li, Jun; Menzel, W. Paul; Sun, Fengying; Schmit, Timothy J.; Gurka, James
2004-08-01
The Moderate Resolution Imaging Spectroradiometer (MODIS) and the Atmospheric Infrared Sounder (AIRS) measurements from the Earth Observing System's (EOS's) Aqua satellite enable improved global monitoring of the distribution of clouds. MODIS is able to provide, at high spatial resolution (1-5 km), a cloud mask, surface and cloud types, cloud phase, cloud-top pressure (CTP), effective cloud amount (ECA), cloud particle size (CPS), and cloud optical thickness (COT). AIRS is able to provide CTP, ECA, CPS, and COT at coarser spatial resolution (13.5 km at nadir) but with much better accuracy, using its high-spectral-resolution measurements. The combined MODIS-AIRS system offers the opportunity for improved cloud products over those possible from either system alone. The key steps for synergistic use of imager and sounder radiance measurements are 1) collocation in space and time and 2) imager cloud amount, type, and phase determination within the sounder pixel. The MODIS and AIRS measurements from the EOS Aqua satellite provide the opportunity to study the synergistic use of advanced imager and sounder measurements. As a first step, the MODIS classification procedure is applied to identify various surface and cloud types within an AIRS footprint. Cloud-layer information (low, midlevel, or high clouds) and phase information (water, ice, or mixed-phase clouds) within the AIRS footprint are sorted and characterized using MODIS 1-km-spatial-resolution data. The combined MODIS and AIRS data for various scenes are analyzed to study the utility of the synergistic use of high-spatial-resolution imager products and high-spectral-resolution sounder radiance measurements. There is relevance to the optimal use of data from the Advanced Baseline Imager (ABI) and Hyperspectral Environmental Suite (HES) systems, which are to fly on the Geostationary Operational Environmental Satellite (GOES)-R.
Improving Scene Classifications with Combined Active/Passive Measurements
NASA Astrophysics Data System (ADS)
Hu, Y.; Rodier, S.; Vaughan, M.; McGill, M.
The uncertainties in cloud and aerosol physical properties derived from passive instruments such as MODIS are not insignificant, and the uncertainty increases as the optical depth decreases. Lidar observations do much better for thin clouds and aerosols. Unfortunately, space-based lidar measurements, such as the one onboard the CALIPSO satellite, are limited to nadir view only and thus have limited spatial coverage. To produce climatologically meaningful thin cloud and aerosol data products, it is necessary to combine the spatial coverage of MODIS with the highly sensitive CALIPSO lidar measurements. Can we improve the quality of cloud and aerosol remote sensing data products by extending the knowledge about thin clouds and aerosols learned from CALIPSO-type lidar measurements to a larger portion of the off-nadir, MODIS-like multi-spectral pixels? To answer this question, we studied collocated Cloud Physics Lidar (CPL) and MODIS Airborne Simulator (MAS) observations and established an effective data fusion technique that will be applied in the combined CALIPSO-MODIS cloud-aerosol product algorithms. This technique performs k-means and Kohonen self-organizing map cluster analysis on the entire swath of MAS data as well as on the combined CPL-MAS data along the nadir track. Interestingly, the clusters generated from the two approaches are almost identical, indicating that the MAS multi-spectral data may have already captured most of the cloud and aerosol scene types, such as cloud ice/water phase, multi-layer information, and aerosols.
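The comparison described above, clustering the same scenes with spectral features alone and again with an added lidar-like feature, then checking whether the partitions agree, can be sketched with k-means and the adjusted Rand index. All data is synthetic; the "lidar" column is simply a noisy copy of one spectral channel, an assumption made to mimic correlated measurements of the same scene.

```python
# Toy sketch: cluster scenes with "multi-spectral" features alone and with
# an added lidar-like feature, then quantify agreement of the partitions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import adjusted_rand_score

rng = np.random.default_rng(3)
# Two well-separated synthetic scene types in 3-channel feature space.
spectral = np.vstack([rng.normal(0, 0.5, (80, 3)),
                      rng.normal(4, 0.5, (80, 3))])
# A correlated "lidar" feature (e.g., backscatter) for the same pixels.
lidar = spectral[:, :1] + rng.normal(0, 0.2, (160, 1))

k1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(spectral)
k2 = KMeans(n_clusters=2, n_init=10,
            random_state=0).fit_predict(np.hstack([spectral, lidar]))
ari = adjusted_rand_score(k1, k2)
print(ari)  # 1.0 means the two clusterings partition the scenes identically
```

An agreement score near 1 mirrors the abstract's finding that the spectral-only clusters already match the lidar-augmented ones; a low score would mean the lidar adds genuinely new scene distinctions.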
A cloud-based multimodality case file for mobile devices.
Balkman, Jason D; Loehfelm, Thomas W
2014-01-01
Recent improvements in Web and mobile technology, along with the widespread use of handheld devices in radiology education, provide unique opportunities for creating scalable, universally accessible, portable image-rich radiology case files. A cloud database and a Web-based application for radiologic images were developed to create a mobile case file with reasonable usability, download performance, and image quality for teaching purposes. A total of 75 radiology cases related to breast, thoracic, gastrointestinal, musculoskeletal, and neuroimaging subspecialties were included in the database. Breast imaging cases are the focus of this article, as they best demonstrate handheld display capabilities across a wide variety of modalities. This case subset also illustrates methods for adapting radiologic content to cloud platforms and mobile devices. Readers will gain practical knowledge about storage and retrieval of cloud-based imaging data, an awareness of techniques used to adapt scrollable and high-resolution imaging content for the Web, and an appreciation for optimizing images for handheld devices. The evaluation of this software demonstrates the feasibility of adapting images from most imaging modalities to mobile devices, even in cases of full-field digital mammograms, where high resolution is required to represent subtle pathologic features. The cloud platform allows cases to be added and modified in real time by using only a standard Web browser with no application-specific software. Challenges remain in developing efficient ways to generate, modify, and upload radiologic and supplementary teaching content to this cloud-based platform. Online supplemental material is available for this article. ©RSNA, 2014.
NASA Technical Reports Server (NTRS)
Davis, Richard E.; Maddalon, Dal V.; Wagner, Richard D.; Fisher, David F.; Young, Ronald
1989-01-01
Summary evaluations of the performance of laminar-flow control (LFC) leading edge test articles on a NASA JetStar aircraft are presented. Statistics, presented for the test articles' performance in haze and cloud situations as well as in clear air, show a significant effect of cloud particle concentrations on the extent of laminar flow. The cloud particle environment was monitored by two instruments, a cloud particle spectrometer (Knollenberg probe) and a charging patch. Both instruments are evaluated as diagnostic aids for avoiding particle concentrations detrimental to laminar flow in future LFC aircraft operations. The database covers 19 flights in the simulated airline service phase of the NASA Leading-Edge Flight-Test (LEFT) Program.
NASA Astrophysics Data System (ADS)
Kodama, C.; Noda, A. T.; Satoh, M.
2012-06-01
This study presents an assessment of the three-dimensional structures of hydrometeors simulated by NICAM, a global nonhydrostatic atmospheric model without cumulus parameterization, using multiple satellite data sets. A satellite simulator package (COSP: the CFMIP Observation Simulator Package) is employed to consistently compare model output with ISCCP, CALIPSO, and CloudSat satellite observations. Special focus is placed on high thin clouds, which are not observable in the conventional ISCCP data set but can be detected by the CALIPSO observations. For the control run, the NICAM simulation qualitatively captures the geographical distributions of the high, middle, and low clouds, even though the horizontal mesh spacing is as coarse as 14 km. The simulated low cloud is very close to the CALIPSO low cloud. Both the CloudSat observations and the NICAM simulation show a boomerang-type pattern in the radar reflectivity-height histogram, suggesting that NICAM realistically simulates the deep cloud development process. A striking difference is found in the comparisons of high thin cirrus, which show overestimated cloud and higher cloud tops in the model simulation. Several model sensitivity experiments are conducted with different cloud microphysical parameters to reduce the model-observation discrepancies in high thin cirrus. In addition, relationships among clouds, Hadley circulation, outgoing longwave radiation, and precipitation are discussed through the sensitivity experiments.
Model Intercomparison of CCN-Limited Arctic Clouds During ASCOS
NASA Astrophysics Data System (ADS)
Stevens, Robin; Dearden, Chris; Dimetrelos, Antonios; Eirund, Gesa; Possner, Anna; Raatikainen, Tomi; Loewe, Katharina; Hill, Adrian; Shipway, Ben; Connolly, Paul; Ekman, Annica; Hoose, Corinna; Laaksonen, Ari; de Leeuw, Gerrit; Kolmonen, Pekka; Saponaro, Giulia; Field, Paul; Carslaw, Ken
2017-04-01
Future decreases in Arctic sea ice are expected to increase fluxes of aerosol and precursor gases from the open ocean surface within the Arctic. The resulting increase in cloud condensation nuclei (CCN) concentrations would be expected to result in increased cloud albedo (Struthers et al, 2011), leading to potentially large changes in radiative forcings. However, Browse et al. (2014) have shown that these increases in condensable material could also result in the growth of existing particles to sizes where they are more efficiently removed by wet deposition in drizzling stratocumulus clouds, ultimately decreasing CCN concentrations in the high Arctic. Their study was limited in that it did not simulate alterations of dynamics or cloud properties due to either changes in heat and moisture fluxes following sea-ice loss or changing aerosol concentrations. Taken together, these results show that significant uncertainties remain in trying to quantify aerosol-cloud processes in the Arctic system. The current representation of these processes in global climate models is most likely insufficient to realistically simulate long-term changes. In order to better understand the microphysical processes currently governing Arctic clouds, we perform a model intercomparison of summertime high Arctic (>80N) clouds observed during the 2008 ASCOS campaign. The intercomparison includes results from three large eddy simulation models (UCLALES-SALSA, COSMO-LES, and MIMICA) and three numerical weather prediction models (COSMO-NWP, WRF, and UM-CASIM). The results of these experiments will be used as a basis for sensitivity studies on the impact of sea-ice loss on Arctic clouds through changes in aerosol and precursor emissions as well as changes in latent and sensible heat fluxes. Browse, J., et al., Atmos. Chem. Phys., 14(14), 7543-7557, doi:10.5194/acp-14-7543-2014, 2014. Struthers, H., et al., Atmos. Chem. Phys., 11(7), 3459-3477, doi:10.5194/acp-11-3459-2011, 2011.
NASA Astrophysics Data System (ADS)
Jakub, Fabian; Mayer, Bernhard
2017-11-01
The formation of shallow cumulus cloud streets was historically attributed primarily to dynamics. Here, we focus on the interaction between radiatively induced surface heterogeneities and the resulting patterns in the flow. Our results suggest that solar radiative heating has the potential to organize clouds perpendicular to the sun's incidence angle. To quantify the extent of organization, we performed a high-resolution large-eddy simulation (LES) parameter study. We varied the horizontal wind speed, the surface heat capacity, the solar zenith and azimuth angles, and radiative transfer parameterizations (1-D and 3-D). As a quantitative measure we introduce a simple algorithm that provides a scalar quantity for the degree of organization and the alignment. We find that, even in the absence of a horizontal wind, 3-D radiative transfer produces cloud streets perpendicular to the sun's incident direction, whereas the 1-D approximation or constant surface fluxes produce randomly positioned circular clouds. Our reasoning for the enhancement or reduction of organization is the geometric position of the cloud's shadow and its corresponding surface fluxes. Furthermore, when increasing horizontal wind speeds to 5 or 10 m s-1, we observe the development of dynamically induced cloud streets. If, in addition, solar radiation illuminates the surface beneath the cloud, i.e., when the sun is positioned orthogonally to the mean wind field and the solar zenith angle is larger than 20°, the cloud-radiative feedback has the potential to significantly enhance the tendency to organize in cloud streets. In contrast, in the case of the 1-D approximation (or overhead sun), the tendency to organize is weaker or even prohibited because the shadow is cast directly beneath the cloud. In a land-surface-type situation, we find the organization of convection happening on a timescale of half an hour. 
The radiative feedback, which creates surface heterogeneities, is generally diminished for large surface heat capacities. We therefore expect radiative feedbacks to be strongest over land surfaces and weaker over the ocean. Given the results of this study we expect that simulations including shallow cumulus convection will have difficulties producing cloud streets if they employ 1-D radiative transfer solvers or may need unrealistically high wind speeds to excite cloud street organization.
Duro, Francisco Rodrigo; Blas, Javier Garcia; Isaila, Florin; ...
2016-10-06
The increasing volume of scientific data and the limited scalability and performance of storage systems currently present a significant limitation for the productivity of scientific workflows running on both high-performance computing (HPC) and cloud platforms. Better integration of storage systems and workflow engines is clearly needed to address this problem. This paper presents and evaluates a novel solution that leverages codesign principles for integrating Hercules, an in-memory data store, with a workflow management system. We consider four main aspects: workflow representation, task scheduling, task placement, and task termination. As a result, the experimental evaluation on both cloud and HPC systems demonstrates significant performance and scalability improvements over existing state-of-the-art approaches.
The Optical Depth Sensor (ODS) for Mars atmosphere
NASA Astrophysics Data System (ADS)
Toledo, D.; Rannou, P.; Pommereau, J.-P.; Sarkissian, A.; Foujols, T.
2015-10-01
A small and sophisticated optical depth sensor (ODS) has been designed to work in both Martian and terrestrial environments. The principal goal of ODS is to retrieve the opacity due to Martian dust and to characterize high-altitude clouds at twilight, both crucial parameters for understanding Martian meteorology. The instrument was initially designed for the failed MARS96 Russian mission and was also included in the payload of several other missions [1]. Until recently, it was selected (NASA/ESA AO) for the payload of the atmospheric package DREAMS onboard the MARS 2016 mission, but following a decision by CNES it is no longer included in the payload. In order to study the performance of ODS under a wide range of conditions, as well as its capability to provide daily measurements of both dust optical thickness and high-altitude cloud properties, the instrument has participated in different terrestrial campaigns. Good performance of the ODS prototype (Figure 1) in cirrus cloud detection and dust opacity estimation was previously achieved in Africa during 2004-2005 and in Brazil from 2012 to the present. Moreover, a campaign in the Arctic is expected before 2016, in which fifteen ODSs will be part of an integrated observing system over the Arctic Ocean, allowing the performance of ODS to be tested in extreme conditions. In this presentation we describe the main principle of the retrieval, the instrumental concept, the results of the tests performed, and the principal objectives of ODS at Mars.
The Optical Depth Sensor (ODS) for Mars atmosphere
NASA Astrophysics Data System (ADS)
Toledo, D.; Rannou, P.; Pommereau, J.-P.; Sarkissian, A.; Foujols, T.
2013-09-01
A small and sophisticated optical depth sensor (ODS) has been designed to work in the Martian atmosphere. The principal goal of ODS is to retrieve the opacity due to Martian dust and to characterize high-altitude clouds at twilight, both crucial parameters for understanding Martian meteorology. The instrument was initially designed for the failed MARS96 Russian mission and was also included in the payload of several other missions [1]. Until recently, it was selected (NASA/ESA AO) for the payload of the atmospheric package DREAMS onboard the MARS 2016 mission, but following a decision by CNES it is no longer included in the payload. In order to study the performance of ODS under a wide range of conditions, as well as its capability to provide daily measurements of both dust optical thickness and high-altitude clouds, the instrument has participated in different terrestrial campaigns. Good performance of the ODS prototype (Figure 1) in cirrus cloud detection and dust opacity estimation was previously achieved in Africa during 2004-2005 and in Brazil from 2012 to the present. Moreover, a campaign in the Arctic is expected before 2016, in which fifteen ODSs will be part of an integrated observing system over the Arctic Ocean, allowing the performance of ODS to be tested in extreme conditions. In this presentation we describe the main principle of the retrieval, the instrumental concept, the results of the tests performed, and the principal objectives of ODS at Mars.
NASA Astrophysics Data System (ADS)
Gross, S.; Gutleben, M.; Wirth, M.; Ewald, F.
2017-12-01
Aerosols and clouds are still main contributors to uncertainties in estimates and interpretation of the Earth's changing energy budget. Their interaction with the Earth's radiation budget has a direct component, through scattering and absorption of solar and terrestrial radiation, and an indirect component, e.g., as aerosols modify the properties and thus the lifetime of clouds or change the atmosphere's stability. Up to now, no sufficient understanding of aerosol-cloud interaction and climate feedback has been achieved, so studies of clouds, aerosols, their interaction, and their influence on the radiation budget are in high demand. In August 2016 the NARVAL-II (Next-generation airborne remote sensing for validation studies) mission took place. Measurements with a combined active (high spectral resolution and water vapor differential absorption lidar and cloud radar) and passive (microwave radiometer, hyperspectral imager, radiation measurements) remote sensing payload were performed with the German high-altitude, long-range research aircraft HALO over the subtropical North Atlantic Ocean to study shallow marine convection during the wet and dusty season. NARVAL-II is thus a follow-up to the NARVAL-I mission, which took place during the dry and dust-free season in December 2013. During NARVAL-II the measurement flights were designed to sample both dust-influenced and dust-free areas in the trades. One main objective was to investigate the optical and macrophysical properties of the dust layer and the differences in cloud occurrence between dusty and non-dusty areas, and to study the influence of aerosols on cloud properties and formation. This allows comparisons of the cloud and aerosol distribution, as well as their environment, between the dry and the wet season, and of cloud properties and distribution with and without the influence of dust transported long-range across the Atlantic Ocean. 
In our presentation we will give an overview of the NARVAL-I and NARVAL-II missions and of the general measurement situation. The analysis focuses on the lidar measurements from both campaigns. We will show comparisons of the cloud distribution between the two measurement seasons and first results on how the aerosol distribution and properties change in the presence of long-range transported dust.
Feature extraction and classification of clouds in high resolution panchromatic satellite imagery
NASA Astrophysics Data System (ADS)
Sharghi, Elan
The development of sophisticated remote sensing sensors is rapidly increasing, and the vast amount of satellite imagery collected is too large to be analyzed manually by a human image analyst. A tool is therefore needed to automate part of the image analyst's job by intelligently detecting and classifying objects of interest through computer vision algorithms. Existing software called the Rapid Image Exploitation Resource (RAPIER®), designed by engineers at Space and Naval Warfare Systems Center Pacific (SSC PAC), performs exactly this function: it automatically searches for anomalies in the ocean and reports detections as possible ship objects. However, if an image contains a high percentage of cloud coverage, the clouds trigger a large number of false positives. The focus of this thesis is to explore various feature extraction and classification methods to accurately distinguish clouds from ship objects. A texture analysis method, line detection using the Hough transform, and edge detection using wavelets are examined as possible feature extraction methods. The features are then supplied to a K-Nearest Neighbors (KNN) or Support Vector Machine (SVM) classifier. Parameter options for these classifiers are explored and the optimal parameters are determined.
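The KNN stage of such a pipeline is simple to illustrate. The sketch below is a minimal stdlib-only nearest-neighbor classifier over hypothetical 2-D feature vectors (e.g., a texture statistic and an edge statistic); it is not the thesis's implementation, and all feature values are invented for illustration.

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training samples."""
    neighbors = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Invented, well-separated feature vectors for two classes
train = [
    ((0.0, 0.0), "cloud"), ((0.0, 1.0), "cloud"), ((1.0, 0.0), "cloud"),
    ((5.0, 5.0), "ship"), ((5.0, 6.0), "ship"), ((6.0, 5.0), "ship"),
]
label = knn_predict(train, (0.5, 0.5))  # nearest neighbors are all "cloud"
```

In practice the feature vectors would come from the texture, Hough-transform, and wavelet methods the thesis evaluates, and k would be tuned as one of the classifier parameters.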
Feng, Zhe; Hagos, Samson; Rowe, Angela K.; ...
2015-04-03
This paper investigates the mechanisms of convective cloud organization by precipitation-driven cold pools over the warm tropical Indian Ocean during the 2011 Atmospheric Radiation Measurement (ARM) Madden-Julian Oscillation (MJO) Investigation Experiment / Dynamics of the MJO (AMIE/DYNAMO) field campaign. A high-resolution regional model simulation is performed using the Weather Research and Forecasting model during the transition from suppressed to active phases of the November 2011 MJO. The simulated cold pool lifetimes, spatial extent and thermodynamic properties agree well with the radar and ship-borne observations from the field campaign. The thermodynamic and dynamic structures of the outflow boundaries of isolated and intersecting cold pools in the simulation and the associated secondary cloud populations are examined. Intersecting cold pools last more than twice as long, are twice as large, 41% more intense (measured by buoyancy), and 62% deeper than isolated cold pools. Consequently, intersecting cold pools trigger 73% more convective clouds than isolated ones. This is possibly due to stronger outflows that enhance secondary updraft velocities by up to 45%. However, cold pool-triggered convective clouds grow into deep convection not because of the stronger secondary updrafts at cloud base, but rather due to closer spacing (aggregation) between clouds and larger cloud clusters that form along the cold pool boundaries when they intersect. The close spacing of large clouds moistens the local environment and reduces entrainment drying, allowing the clouds to further develop into deep convection. Implications for the design of future convective parameterizations with cold pool-modulated entrainment rates are discussed.
NASA Astrophysics Data System (ADS)
Wareing, C. J.; Pittard, J. M.; Falle, S. A. E. G.
2017-09-01
We have used the AMR hydrodynamic code, mg, to perform 3D hydrodynamic simulations with self-gravity of stellar feedback in a spherical clumpy molecular cloud formed through the action of thermal instability. We simulate the interaction of the mechanical energy input from 15, 40, 60 and 120 M⊙ stars into a 100 pc diameter 16 500 M⊙ cloud with a roughly spherical morphology with randomly distributed high-density condensations. The stellar winds are introduced using appropriate non-rotating Geneva stellar evolution models. In the 15 M⊙ star case, the wind has very little effect, spreading around a few neighbouring clumps before becoming overwhelmed by the cloud collapse. In contrast, in the 40, 60 and 120 M⊙ star cases, the more powerful stellar winds create large cavities and carve channels through the cloud, breaking out into the surrounding tenuous medium during the wind phase and considerably altering the cloud structure. After 4.97, 3.97 and 3.01 Myr, respectively, the massive stars explode as supernovae (SNe). The wind-sculpted surroundings considerably affect the evolution of these SN events as they both escape the cloud along wind-carved channels and sweep up remaining clumps of cloud/wind material. The 'cloud' as a coherent structure does not survive the SN from any of these stars, but only in the 120 M⊙ case is the cold molecular material completely destabilized and returned to the unstable thermal phase. In the 40 and 60 M⊙ cases, coherent clumps of cold material are ejected from the cloud by the SN, potentially capable of further star formation.
On the Global Character of Overlap Between Low and High Clouds
NASA Technical Reports Server (NTRS)
Yuan, Tianle; Oreopoulos, Lazaros
2013-01-01
The global character of overlap between low and high clouds is examined using active satellite sensors. Low-cloud fraction has a strong land-ocean contrast with oceanic values double those over land. Major low-cloud regimes include not only the eastern ocean boundary stratocumulus and shallow cumulus but also those associated with cold air outbreaks downwind of wintertime continents and land stratus over particular geographic areas. Globally, about 30% of low clouds are overlapped by high clouds. The overlap rate exhibits strong spatial variability ranging from higher than 90% in the tropics to less than 5% in subsidence areas and is anticorrelated with subsidence rate and low-cloud fraction. The zonal mean of vertical separation between cloud layers is never smaller than 5 km and its zonal variation closely follows that of tropopause height, implying a tight connection with tropopause dynamics. Possible impacts of cloud overlap on low clouds are discussed.
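The overlap statistic described above reduces, per grid column, to a conditional fraction: of the columns containing low cloud, how many also contain high cloud. A minimal sketch with synthetic per-column masks (invented values, not CALIPSO/CloudSat retrievals):

```python
def overlap_rate(low_mask, high_mask):
    """Fraction of low-cloud columns that also contain high cloud above them."""
    low_columns = sum(low_mask)
    overlapped = sum(1 for lo, hi in zip(low_mask, high_mask) if lo and hi)
    return overlapped / low_columns if low_columns else 0.0

# True = cloud detected in that column (synthetic example)
low  = [True, True, True, False, True, False]
high = [True, False, True, True, False, False]
rate = overlap_rate(low, high)  # 2 of the 4 low-cloud columns are overlapped
```

Aggregating this quantity over regions is what yields the ~30% global figure and its strong spatial variability reported in the abstract.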
NASA Astrophysics Data System (ADS)
Rizki, Permata Nur Miftahur; Lee, Heezin; Lee, Minsu; Oh, Sangyoon
2017-01-01
With the rapid advance of remote sensing technology, the amount of three-dimensional point-cloud data has increased extraordinarily, requiring faster processing in the construction of digital elevation models. There have been several attempts to accelerate the computation using parallel methods; however, little attention has been given to investigating different approaches for selecting the parallel programming model best suited to a given computing environment. We present our findings and insights identified by implementing three popular high-performance parallel approaches (message passing interface, MapReduce, and GPGPU) on time-demanding but accurate kriging interpolation. The performances of the approaches are compared by varying the size of the grid and input data. In our empirical experiment, we demonstrate the significant acceleration achieved by all three approaches compared to a C-implemented sequential-processing method. In addition, we discuss the pros and cons of each method in terms of usability, complexity, infrastructure, and platform limitations to give readers a better understanding of utilizing those parallel approaches for gridding purposes.
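Kriging is time demanding because every interpolated grid node requires solving a dense linear system in the observation covariances, which is what makes it attractive to parallelize across grid nodes. The stdlib-only sketch below shows simple kriging with an assumed Gaussian covariance model; the model and its parameters are illustrative, not taken from the paper.

```python
import math

def gaussian_cov(h, sill=1.0, rng=10.0):
    # Gaussian covariance model (assumed sill and range parameters)
    return sill * math.exp(-(h / rng) ** 2)

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def simple_krige(pts, vals, q, mean=0.0):
    """Estimate the field at query point q from observations (pts, vals)."""
    n = len(pts)
    d = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    A = [[gaussian_cov(d(pts[i], pts[j])) for j in range(n)] for i in range(n)]
    b = [gaussian_cov(d(pts[i], q)) for i in range(n)]
    w = solve(A, b)  # kriging weights
    return mean + sum(w[i] * (vals[i] - mean) for i in range(n))

pts = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
vals = [1.0, 2.0, 3.0]
est = simple_krige(pts, vals, (2.0, 2.0), mean=2.0)
```

With no nugget effect, the estimator reproduces the observed value exactly at each data point; a gridded DEM would call `simple_krige` once per node, which is the loop the MPI, MapReduce, and GPGPU variants distribute.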
The Impact of Microphysics on Intensity and Structure of Hurricanes
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo; Shi, Jainn; Lang, Steve; Peters-Lidard, Christa
2006-01-01
During the past decade, both research and operational numerical weather prediction models, e.g. Weather Research and Forecast (WRF) model, have started using more complex microphysical schemes originally developed for high-resolution cloud resolving models (CRMs) with a 1-2 km or less horizontal resolutions. WFW is a next-generation mesoscale forecast model and assimilation system that has incorporated modern software framework, advanced dynamics, numeric and data assimilation techniques, a multiple moveable nesting capability, and improved physical packages. WFW model can be used for a wide range of applications, from idealized research to operational forecasting, with an emphasis on horizontal grid sizes in the range of 1-10 km. The current WRF includes several different microphysics options such as Lin et al. (1983), WSM 6-class and Thompson microphysics schemes. We have recently implemented three sophisticated cloud microphysics schemes into WRF. The cloud microphysics schemes have been extensively tested and applied for different mesoscale systems in different geographical locations. The performances of these schemes have been compared to those from other WRF microphysics options. We are performing sensitivity tests in using WW to examine the impact of six different cloud microphysical schemes on hurricane track, intensity and rainfall forecast. We are also performing the inline tracer calculation to comprehend the physical processes @e., boundary layer and each quadrant in the boundary layer) related to the development and structure of hurricanes.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yoginath, Srikanth B; Perumalla, Kalyan S
2013-01-01
Virtual machine (VM) technologies, especially those offered via Cloud platforms, present new dimensions with respect to performance and cost in executing parallel discrete event simulation (PDES) applications. Due to the introduction of overall cost as a metric, the choice of the highest-end computing configuration is no longer the most economical one. Moreover, runtime dynamics unique to VM platforms introduce new performance characteristics, and the variety of possible VM configurations gives rise to a range of choices for hosting a PDES run. Here, an empirical study of these issues is undertaken to guide an understanding of the dynamics, trends and trade-offs in executing PDES on VM/Cloud platforms. Performance results and cost measures are obtained from actual execution of a range of scenarios in two PDES benchmark applications on the Amazon Cloud offerings and on a high-end VM host machine. The data reveals interesting insights into the new VM-PDES dynamics that come into play and also leads to counter-intuitive guidelines with respect to choosing the best and second-best configurations when overall cost of execution is considered. In particular, it is found that choosing the highest-end VM configuration guarantees neither the best runtime nor the least cost. Interestingly, choosing a (suitably scaled) low-end VM configuration provides the least overall cost without adversely affecting the total runtime.
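The cost trade-off the authors describe can be captured in a toy model: total cost is hourly price times runtime, so the fastest configuration is not automatically the cheapest. The prices and runtimes below are invented for illustration, not measurements from the paper.

```python
# Hypothetical VM configurations: (name, hourly price in USD, measured runtime in hours)
configs = [
    ("high-end", 2.40, 1.0),
    ("mid-range", 0.90, 1.6),
    ("low-end", 0.30, 3.5),
]

def total_cost(price_per_hour, runtime_hours):
    # Assume exact hourly proration; real clouds often bill per started hour
    return price_per_hour * runtime_hours

ranked = sorted(configs, key=lambda c: total_cost(c[1], c[2]))
cheapest = ranked[0][0]  # the slowest configuration wins on cost in this example
```

Here the low-end VM costs 0.30 × 3.5 = 1.05 versus 2.40 for the high-end one, mirroring the paper's counter-intuitive finding that a suitably scaled low-end configuration can minimize overall cost.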
Comparison between SAGE II and ISCCP high-level clouds. 2: Locating cloud tops
NASA Technical Reports Server (NTRS)
Liao, Xiaohan; Rossow, William B.; Rind, David
1995-01-01
A comparison is made of the vertical distribution of high-level cloud tops derived from the Stratospheric Aerosol and Gas Experiment II (SAGE II) occultation measurements and from the International Satellite Cloud Climatology Project (ISCCP) for all Julys and Januarys from 1985 to 1990. The results suggest that ISCCP overestimates the pressure of high-level clouds by up to 50-150 mbar, particularly at low latitudes. This is caused by the frequent presence of clouds with diffuse tops (more than 50% of the time when cloudy events are observed). The average vertical extent of the diffuse top is about 1.5 km. At midlatitudes, where the SAGE II and ISCCP cloud top pressures agree best, clouds with distinct tops reach a maximum relative proportion of the total high-level cloud amount (about 30-40%), and diffuse-topped clouds are reduced to their minimum (30-40%). The ISCCP-defined cloud top pressure should be regarded not as the material physical height of the clouds but as the level which emits the same infrared radiance as observed. SAGE II and ISCCP cloud top pressures agree for clouds with distinct tops. There is also an indication that the cloud top pressures of optically thin clouds not overlying thicker clouds are poorly estimated by ISCCP at middle latitudes. The average vertical extent of these thin clouds is about 2.5 km.
Cloud radiative properties and aerosol - cloud interaction
NASA Astrophysics Data System (ADS)
Viviana Vladutescu, Daniela; Gross, Barry; Li, Clement; Han, Zaw
2015-04-01
This research discusses different techniques for improving the measurement and analysis of cloud properties. The need for these measurements and analyses arises from the high errors observed in methods currently used to retrieve cloud properties and, implicitly, cloud radiative forcing. The properties investigated are cloud fraction (cf) and cloud optical thickness (COT), measured with a suite of collocated remote sensing instruments. The novel approach makes use of a ground-based "poor man's camera" to detect cloud and sky radiation in red, green, and blue with a high spatial resolution of 30 mm at 1 km. The surface-based high-resolution photography provides a new and interesting view of clouds. Because cloud fraction cannot be uniquely defined or measured, it depends on threshold and resolution: as resolution decreases, cloud fraction tends to increase if the threshold is below the mean, and vice versa. Cloud fractal dimension also depends on threshold. These findings therefore raise concerns over the ability to characterize clouds by cloud fraction or fractal dimension. Our analysis indicates that principal component analysis may lead to a robust means of quantifying the cloud contribution to radiance. The cloud images are analyzed in conjunction with a collocated CIMEL sky radiometer, microwave radiometer, and lidar to determine homogeneity and heterogeneity. Additionally, MFRSR measurements are used to determine the cloud radiative properties, serving as a validation tool for the results obtained from the other instruments and methods. The cloud properties to be studied further are aerosol-cloud interaction, cloud particle radii, and vertical homogeneity.
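The resolution dependence of cloud fraction noted above is easy to demonstrate: block-averaging a radiance field before thresholding mixes clear and cloudy pixels, so with a threshold below the mean the coarse-resolution fraction tends to grow. A synthetic stdlib-only sketch (the radiance values and threshold are invented):

```python
def cloud_fraction(grid, threshold):
    """Fraction of pixels whose radiance exceeds the threshold."""
    cells = [v for row in grid for v in row]
    return sum(v > threshold for v in cells) / len(cells)

def coarsen2x2(grid):
    """Average non-overlapping 2x2 blocks, halving the resolution."""
    out = []
    for i in range(0, len(grid), 2):
        row = []
        for j in range(0, len(grid[0]), 2):
            row.append((grid[i][j] + grid[i][j + 1]
                        + grid[i + 1][j] + grid[i + 1][j + 1]) / 4.0)
        out.append(row)
    return out

# Synthetic radiance field: isolated bright (cloudy) pixels in a dark (clear) scene
grid = [
    [10, 0, 10, 0],
    [0, 0, 0, 0],
    [10, 0, 10, 0],
    [0, 0, 0, 0],
]
fine = cloud_fraction(grid, 2)               # threshold 2 is below the mean (2.5)
coarse = cloud_fraction(coarsen2x2(grid), 2)  # fraction grows after coarsening
```

With the threshold below the scene mean, coarsening drives every averaged block above the threshold, so the apparent cloud fraction jumps from 0.25 to 1.0, exactly the sensitivity that makes threshold- and resolution-dependent cloud fraction problematic as a cloud descriptor.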
NASA Astrophysics Data System (ADS)
Tomljenovic, Ivan; Tiede, Dirk; Blaschke, Thomas
2016-10-01
In the past two decades, Object-Based Image Analysis (OBIA) has established itself as an efficient approach for the classification and extraction of information from remote sensing imagery and, increasingly, from non-image sources such as Airborne Laser Scanner (ALS) point clouds. ALS data are represented in the form of a point cloud with recorded multiple returns and intensities. In our work, we combined OBIA with ALS point cloud data in order to identify and extract buildings as 2D polygons representing roof outlines in a top-down mapping approach. We rasterized the ALS data into a height raster to generate a Digital Surface Model (DSM) and a derived Digital Elevation Model (DEM). Further objects were generated in conjunction with point statistics from the linked point cloud. Using class modelling methods, we generated the final target class of objects representing buildings. The approach was developed for a test area in Biberach an der Riß (Germany). To demonstrate the possibility of adaptation-free transferability to another data set, the algorithm was then applied "as is" to the ISPRS benchmark data set of Toronto (Canada). The obtained results show high accuracies for the initial study area (thematic accuracies of around 98%, geometric accuracy above 80%). The very high performance on the ISPRS benchmark without any modification of the algorithm or adaptation of parameters is particularly noteworthy.
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D T; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. 
Interested users may collaborate to improve the source code of CloudDOE to further incorporate more MapReduce bioinformatics tools into CloudDOE and support next-generation big data open source tools, e.g., Hadoop BigTop and Spark. CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/.
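The MapReduce model that CloudDOE deploys can be illustrated without Hadoop: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Below is a toy k-mer-counting example in plain Python in the spirit of read-processing jobs like CloudBurst; the reads are invented, and a real job would run distributed on the Hadoop cluster rather than in one process.

```python
from collections import defaultdict
from itertools import chain

def map_phase(read, k=3):
    # Emit one (k-mer, 1) pair per position in a sequencing read
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def shuffle(pairs):
    # Group emitted values by key (Hadoop does this between map and reduce)
    groups = defaultdict(list)
    for key, val in pairs:
        groups[key].append(val)
    return groups

def reduce_phase(key, vals):
    # Aggregate each key's values; here, a simple count
    return key, sum(vals)

reads = ["GATTACA", "TACAGAT"]
mapped = chain.from_iterable(map_phase(r) for r in reads)
counts = dict(reduce_phase(k, v) for k, v in shuffle(mapped).items())
```

Because the map calls are independent per read and the reduce calls are independent per key, the framework can distribute both phases across cluster nodes, which is what makes the model attractive for ultra-large sequencing data sets.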
Chung, Wei-Chun; Chen, Chien-Chih; Ho, Jan-Ming; Lin, Chung-Yen; Hsu, Wen-Lian; Wang, Yu-Chun; Lee, D. T.; Lai, Feipei; Huang, Chih-Wei; Chang, Yu-Jung
2014-01-01
Background Explosive growth of next-generation sequencing data has resulted in ultra-large-scale data sets and ensuing computational problems. Cloud computing provides an on-demand and scalable environment for large-scale data analysis. Using a MapReduce framework, data and workload can be distributed via a network to computers in the cloud to substantially reduce computational latency. Hadoop/MapReduce has been successfully adopted in bioinformatics for genome assembly, mapping reads to genomes, and finding single nucleotide polymorphisms. Major cloud providers offer Hadoop cloud services to their users. However, it remains technically challenging to deploy a Hadoop cloud for those who prefer to run MapReduce programs in a cluster without built-in Hadoop/MapReduce. Results We present CloudDOE, a platform-independent software package implemented in Java. CloudDOE encapsulates technical details behind a user-friendly graphical interface, thus liberating scientists from having to perform complicated operational procedures. Users are guided through the user interface to deploy a Hadoop cloud within in-house computing environments and to run applications specifically targeted for bioinformatics, including CloudBurst, CloudBrush, and CloudRS. One may also use CloudDOE on top of a public cloud. CloudDOE consists of three wizards, i.e., Deploy, Operate, and Extend wizards. Deploy wizard is designed to aid the system administrator to deploy a Hadoop cloud. It installs Java runtime environment version 1.6 and Hadoop version 0.20.203, and initiates the service automatically. Operate wizard allows the user to run a MapReduce application on the dashboard list. To extend the dashboard list, the administrator may install a new MapReduce application using Extend wizard. Conclusions CloudDOE is a user-friendly tool for deploying a Hadoop cloud. Its smart wizards substantially reduce the complexity and costs of deployment, execution, enhancement, and management. 
Interested users may collaborate to improve the CloudDOE source code, further incorporating MapReduce bioinformatics tools and adding support for next-generation big data open-source tools, e.g., Hadoop BigTop and Spark. Availability: CloudDOE is distributed under Apache License 2.0 and is freely available at http://clouddoe.iis.sinica.edu.tw/. PMID:24897343
Cloud prediction of protein structure and function with PredictProtein for Debian.
Kaján, László; Yachdav, Guy; Vicedo, Esmeralda; Steinegger, Martin; Mirdita, Milot; Angermüller, Christof; Böhm, Ariane; Domke, Simon; Ertl, Julia; Mertes, Christian; Reisinger, Eva; Staniewski, Cedric; Rost, Burkhard
2013-01-01
We report the release of PredictProtein for the Debian operating system and derivatives, such as Ubuntu, Bio-Linux, and Cloud BioLinux. The PredictProtein suite is available as a standard set of open source Debian packages. The release covers the most popular prediction methods from the Rost Lab, including methods for the prediction of secondary structure and solvent accessibility (profphd), nuclear localization signals (predictnls), and intrinsically disordered regions (norsnet). We also present two case studies that successfully utilize PredictProtein packages for high performance computing in the cloud: the first analyzes protein disorder for whole organisms, and the second analyzes the effect of all possible single sequence variants in protein coding regions of the human genome.
NASA Technical Reports Server (NTRS)
Huang, Hung-Lung; Diak, George R.
1992-01-01
The rms retrieval errors in cloud top pressure for fully overcast conditions over both land and water surfaces are shown for AMSU-A oxygen channel pair 3 and 5 and MHS water vapor channel pair 4 and 5. For both pairs, the decrease of retrieval skill from high cloud is evident for almost all liquid water contents. For high cloud and medium cloud, the water vapor pair outperforms the oxygen pair. Retrieval accuracy is the best for high and middle clouds and degrades as the cloud top is lower in the atmosphere.
A Fast Infrared Radiative Transfer Model for Overlapping Clouds
NASA Technical Reports Server (NTRS)
Niu, Jianguo; Yang, Ping; Huang, Huang-Lung; Davies, James E.; Li, Jun; Baum, Bryan A.; Hu, Yong X.
2006-01-01
A fast infrared radiative transfer model (FIRTM2) appropriate for application to both single-layered and overlapping cloud situations is developed for simulating the outgoing infrared spectral radiance at the top of the atmosphere (TOA). In FIRTM2 a pre-computed library of cloud reflectance and transmittance values is employed to account for one or two cloud layers, whereas the background atmospheric optical thickness due to gaseous absorption can be computed from a clear-sky radiative transfer model. FIRTM2 is applicable to three atmospheric conditions: 1) clear-sky, 2) single-layered ice or water cloud, and 3) two simultaneous cloud layers in a column (e.g., ice cloud overlying water cloud). Moreover, FIRTM2 outputs the derivatives (i.e., Jacobians) of the TOA brightness temperature with respect to cloud optical thickness and effective particle size. Sensitivity analyses have been carried out to assess the performance of FIRTM2 for two spectral regions, namely the longwave (LW) band (587.3–1179.5 cm⁻¹) and the short-to-medium wave (SMW) band (1180.1–2228.9 cm⁻¹). The assessment is carried out in terms of brightness temperature differences (BTD) between FIRTM2 and the well-known discrete ordinates radiative transfer model (DISORT), henceforth referred to as BTD(F-D). The BTD(F-D) values for single-layered clouds are generally less than 0.8 K. For the case of two cloud layers (specifically ice cloud over water cloud), the BTD(F-D) values are also generally less than 0.8 K except for the SMW band for the case of a very high altitude (>15 km) cloud comprised of small ice particles. Note that for clear-sky atmospheres, FIRTM2 reduces to the clear-sky radiative transfer model that is incorporated into FIRTM2, and the errors in this case are essentially those of the clear-sky radiative transfer model.
Partial storage optimization and load control strategy of cloud data centers.
Al Nuaimi, Klaithem; Mohamed, Nader; Al Nuaimi, Mariam; Al-Jaroodi, Jameela
2015-01-01
We present a novel approach to solve the cloud storage issues and provide a fast load balancing algorithm. Our approach is based on partitioning and concurrent dual-direction download of the files from multiple cloud nodes. Partitions of the files, rather than the full files, are saved on the cloud, which substantially optimizes cloud storage usage. Only partial replication is used in this algorithm to ensure the reliability and availability of the data. Our focus is to improve performance and optimize storage usage by providing Data as a Service (DaaS) on the cloud. This algorithm solves the problem of having to fully replicate large data sets, which consumes a large amount of space on the cloud nodes. Reducing the space needed in turn reduces the cost of providing that space. Moreover, performance is also increased, since multiple cloud servers collaborate to provide the data to cloud clients more quickly.
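The partitioning and dual-direction retrieval idea above can be sketched as follows (our illustration, not the authors' implementation; the node contents and helper names are invented, and byte slicing stands in for ranged HTTP requests):

```python
# Illustrative sketch: reassemble a file whose partitions are stored on
# different cloud nodes, fetching each partition from both ends at once
# ("dual direction") so the two range requests meet in the middle.
from concurrent.futures import ThreadPoolExecutor

def fetch_range(node_data: bytes, start: int, end: int) -> bytes:
    """Stand-in for a ranged GET against one cloud node."""
    return node_data[start:end]

def dual_direction_fetch(node_data: bytes) -> bytes:
    """Download one partition from both ends concurrently."""
    n = len(node_data)
    mid = n // 2
    with ThreadPoolExecutor(max_workers=2) as pool:
        head = pool.submit(fetch_range, node_data, 0, mid)
        tail = pool.submit(fetch_range, node_data, mid, n)
        return head.result() + tail.result()

def download_file(partitions):
    """Partitions live on different nodes; fetch all and concatenate."""
    with ThreadPoolExecutor(max_workers=len(partitions)) as pool:
        return b"".join(pool.map(dual_direction_fetch, partitions))

parts = [b"hello ", b"cloud ", b"storage"]
assert download_file(parts) == b"hello cloud storage"
```

In a real deployment each partition would also exist on a replica node, so either end of a range could be served by a different server, which is where the load-balancing benefit comes from.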
Hubble Provides Infrared View of Jupiter's Moon, Ring, and Clouds
NASA Technical Reports Server (NTRS)
1997-01-01
Probing Jupiter's atmosphere for the first time, the Hubble Space Telescope's new Near Infrared Camera and Multi-Object Spectrometer (NICMOS) provides a sharp glimpse of the planet's ring, moon, and high-altitude clouds.
The presence of methane in Jupiter's hydrogen- and helium-rich atmosphere has allowed NICMOS to plumb Jupiter's atmosphere, revealing bands of high-altitude clouds. Visible-light observations cannot provide a clear view of these high clouds because the underlying clouds reflect so much visible light that the higher-level clouds are indistinguishable from the lower layer. The methane gas between the main cloud deck and the high clouds absorbs the reflected infrared light, allowing those clouds that are above most of the atmosphere to appear bright. Scientists will use NICMOS to study both the high-altitude portion of Jupiter's atmosphere and the clouds at lower levels. They will then analyze those images along with visible-light information to compile a clearer picture of the planet's weather. Clouds at different levels tell unique stories. On Earth, for example, ice crystal (cirrus) clouds are found at high altitudes while water (cumulus) clouds are at lower levels. Besides showing details of the planet's high-altitude clouds, NICMOS also provides a clear view of the ring and the moon Metis. Jupiter's ring plane, seen nearly edge-on, is visible as a faint line on the upper right portion of the NICMOS image. Metis can be seen in the ring plane (the bright circle on the ring's outer edge). The moon is 25 miles wide and about 80,000 miles from Jupiter. Because of the near-infrared camera's narrow field of view, this image is a mosaic constructed from three individual images taken Sept. 17, 1997. The color intensity was adjusted to accentuate the high-altitude clouds. The dark circle on the disk of Jupiter (center of image) is an artifact of the imaging system. This image and other images and data received from the Hubble Space Telescope are posted on the World Wide Web on the Space Telescope Science Institute home page at URL http://oposite.stsci.edu/pubinfo/
NASA Astrophysics Data System (ADS)
Buteau, Sylvie; Simard, Jean-Robert; Roy, Gilles; Lahaie, Pierre; Nadeau, Denis; Mathieu, Pierre
2013-10-01
A standoff sensor called BioSense was developed to demonstrate the capacity to map, track, and classify bioaerosol clouds from a distant range and over a wide area. The concept of the system is based on a two-step dynamic surveillance: 1) cloud detection using an infrared (IR) scanning cloud mapper and 2) cloud classification based on a staring ultraviolet (UV) Laser-Induced Fluorescence (LIF) interrogation. The system can be operated either in an automatic surveillance mode or with manual intervention. The automatic surveillance operation includes several steps: mission planning, sensor deployment, background monitoring, surveillance, cloud detection, classification, and finally alarm generation based on the classification result. One of the main challenges is the classification step, which relies on a spectrally resolved UV LIF signature library. The construction of this library currently relies on in-chamber releases of various materials that are simultaneously characterized with the standoff sensor and referenced with point sensors such as an Aerodynamic Particle Sizer® (APS). The system was tested at three different locations in order to evaluate its capacity to operate in diverse types of surroundings and various environmental conditions. The system showed generally good performance even though troubleshooting was not completed before the Test and Evaluation (T&E) process began. The standoff system's performance appeared to be highly dependent on the type of challenge, the climatic conditions, and the period of day. The real-time results, combined with the experience acquired during the 2012 T&E, made it possible to identify future improvements and avenues of investigation.
Cloud chamber experiments on the origin of ice crystal complexity in cirrus clouds
NASA Astrophysics Data System (ADS)
Schnaiter, Martin; Järvinen, Emma; Vochezer, Paul; Abdelmonem, Ahmed; Wagner, Robert; Jourdan, Olivier; Mioche, Guillaume; Shcherbakov, Valery N.; Schmitt, Carl G.; Tricoli, Ugo; Ulanowski, Zbigniew; Heymsfield, Andrew J.
2016-04-01
This study reports on the origin of small-scale ice crystal complexity and its influence on the angular light scattering properties of cirrus clouds. Cloud simulation experiments were conducted at the AIDA (Aerosol Interactions and Dynamics in the Atmosphere) cloud chamber of the Karlsruhe Institute of Technology (KIT). A new experimental procedure was applied to grow and sublimate ice particles at defined super- and subsaturated ice conditions and for temperatures in the -40 to -60 °C range. The experiments were performed for ice clouds generated via homogeneous and heterogeneous initial nucleation. Small-scale ice crystal complexity was deduced from measurements of spatially resolved single particle light scattering patterns by the latest version of the Small Ice Detector (SID-3). It was found that a high crystal complexity dominates the microphysics of the simulated clouds and the degree of this complexity is dependent on the available water vapor during the crystal growth. Indications were found that the small-scale crystal complexity is influenced by unfrozen H2SO4 / H2O residuals in the case of homogeneous initial ice nucleation. Angular light scattering functions of the simulated ice clouds were measured by the two currently available airborne polar nephelometers: the polar nephelometer (PN) probe of Laboratoire de Métérologie et Physique (LaMP) and the Particle Habit Imaging and Polar Scattering (PHIPS-HALO) probe of KIT. The measured scattering functions are featureless and flat in the side and backward scattering directions. It was found that these functions have a rather low sensitivity to the small-scale crystal complexity for ice clouds that were grown under typical atmospheric conditions. These results have implications for the microphysical properties of cirrus clouds and for the radiative transfer through these clouds.
NASA Astrophysics Data System (ADS)
Dietlicher, Remo; Neubauer, David; Lohmann, Ulrike
2018-04-01
A new scheme for stratiform cloud microphysics has been implemented in the ECHAM6-HAM2 general circulation model. It features a widely used description of cloud water with two categories for cloud droplets and raindrops. The unique aspect of the new scheme is the break with the traditional approach to describe cloud ice analogously. Here we parameterize cloud ice by a single category that predicts bulk particle properties (P3). This method has already been applied in a regional model and most recently also in the Community Atmosphere Model 5 (CAM5). A single cloud ice category does not rely on heuristic conversion rates from one category to another. Therefore, it is conceptually easier and closer to first principles. This work shows that a single category is a viable approach to describe cloud ice in climate models. Prognostic representation of sedimentation is achieved by a nested approach for sub-stepping the cloud microphysics scheme. This yields good results in terms of accuracy and performance as compared to simulations with high temporal resolution. Furthermore, the new scheme allows for a competition between various cloud processes and is thus able to unbiasedly represent the ice formation pathway from nucleation to growth by vapor deposition and collisions to sedimentation. Specific aspects of the P3 method are evaluated. We could not produce a purely stratiform cloud where rime growth dominates growth by vapor deposition and conclude that the lack of appropriate conditions renders the prognostic parameters associated with the rime properties unnecessary. Limitations inherent in a single category are examined.
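The sub-stepping approach described above can be illustrated with a toy one-dimensional sedimentation scheme (our sketch, not the ECHAM6-HAM2 code; the grid spacing, fall speed, and time step are invented). The model time step is split into enough sub-steps that a falling-particle CFL condition holds in each one:

```python
# Toy upwind sedimentation with sub-stepping: split the model time step so
# that no sub-step lets ice fall farther than one grid cell (CFL <= 1).
import math

def sediment(q, v_fall, dz, dt_model):
    """Advect mass mixing ratio q (ordered top..bottom) downward."""
    n_sub = max(1, math.ceil(v_fall * dt_model / dz))  # enforce CFL <= 1
    dt = dt_model / n_sub
    for _ in range(n_sub):
        c = v_fall * dt / dz                 # Courant number, <= 1
        # Each cell keeps (1 - c) of its mass and receives c from above;
        # mass leaving the bottom cell exits the column (precipitation).
        q = [q[0] * (1 - c)] + [q[k] * (1 - c) + q[k - 1] * c
                                for k in range(1, len(q))]
    return q, n_sub

# One unit of ice at the top of a 4-cell column falls out in one model step:
q, n_sub = sediment([1.0, 0.0, 0.0, 0.0], v_fall=1.0, dz=0.5, dt_model=2.0)
assert n_sub == 4          # CFL forces 4 sub-steps of dt = 0.5
assert sum(q) == 0.0       # all mass has sedimented out of the column
```

The point of the nesting is that only the fast process (sedimentation) pays for the short sub-steps; the rest of the microphysics can stay on the long model step.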
NASA Technical Reports Server (NTRS)
Wang, Chenxi; Platnick, Steven; Zhang, Zhibo; Meyer, Kerry; Wind, Galina; Yang, Ping
2016-01-01
An infrared-based optimal estimation (OE-IR) algorithm for retrieving ice cloud properties is evaluated. Specifically, the implementation of the algorithm with MODerate resolution Imaging Spectroradiometer (MODIS) observations is assessed in comparison with the operational retrieval products from MODIS on the Aqua satellite (MYD06), Cloud-Aerosol Lidar with Orthogonal Polarization (CALIOP), and the Imaging Infrared Radiometer (IIR); the latter two instruments fly on the Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) satellite in the Afternoon Constellation (A-Train) with Aqua. The results show that OE-IR cloud optical thickness (tau) and effective radius (r_eff) retrievals perform best for ice clouds having 0.5 < tau < 7 and r_eff < 50 µm. For global ice clouds, the averaged retrieval uncertainties of tau and r_eff are 19% and 33%, respectively. For optically thick ice clouds with tau larger than 10, however, the tau and r_eff retrieval uncertainties can exceed 30% and 50%, respectively. For ice cloud top height (h), the averaged global uncertainty is 0.48 km. Relatively large h uncertainty (e.g., > 1 km) occurs for tau < 0.5. Analysis of 1 month of the OE-IR retrievals shows large tau and r_eff uncertainties in storm track regions and the southern oceans where convective clouds are frequently observed, as well as in high-latitude regions where temperature differences between the surface and cloud top are more ambiguous. Generally, comparisons between the OE-IR and the operational products show consistent tau and h retrievals. However, obvious differences between the OE-IR and the MODIS Collection 6 r_eff are found.
NASA Astrophysics Data System (ADS)
Yamasoe, M. A.; do Rosário, N. M. E.; Barros, K. M.
2017-01-01
We analyzed the variability of downward solar irradiance reaching the surface at São Paulo city, Brazil, and estimated the climatological aerosol and cloud radiative effects. Eleven years of irradiance measurements were analyzed, from 2005 to 2015. To distinguish the aerosol from the cloud effect, the radiative transfer code LibRadtran was used to calculate downward solar irradiance. Two runs were performed: one considering only ozone and water vapor daily variability, with aerosol optical depth (AOD) set to zero, and a second allowing all three variables to change according to mean climatological values. The difference of the 24 h mean irradiance calculated with and without aerosol yields the shortwave aerosol direct radiative effect, while the difference between the measured irradiance and the calculation including aerosol represents the cloud effect. Results showed that, climatologically, clouds can be 4 times more effective than aerosols. The cloud shortwave radiative effect presented a maximum reduction of about -170 W m-2 in January and a minimum in July, of -37 W m-2. The aerosol direct radiative effect was maximum in spring, when the transport of smoke from the Amazon and central parts of South America toward São Paulo is frequent. Around mid-September, the 24 h radiative effect due to aerosol only was estimated to be -50 W m-2. Throughout the rest of the year, the mean aerosol effect was around -20 W m-2 and was attributed to local urban sources. The effect of the cloud fraction on the cloud modification factor, defined as the ratio of all-sky irradiation to cloudless-sky irradiation, showed dependence on the cloud height. Low clouds had the highest impact, while high clouds alone had almost no effect on solar transmittance, even in overcast conditions.
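The decomposition used in this study — the aerosol effect as the difference between model runs with and without aerosol, and the cloud effect as the difference between measurement and the aerosol-laden model — can be written as a minimal sketch (the irradiance values below are illustrative only, not the paper's data):

```python
# Separate aerosol and cloud shortwave radiative effects from three
# 24 h mean downward irradiances (W m^-2):
#   f_measured         : observed all-sky irradiance
#   f_model_aerosol    : modeled clear-sky irradiance WITH aerosol
#   f_model_no_aerosol : modeled clear-sky irradiance, AOD = 0
def radiative_effects(f_measured, f_model_aerosol, f_model_no_aerosol):
    aerosol_effect = f_model_aerosol - f_model_no_aerosol  # aerosol only
    cloud_effect = f_measured - f_model_aerosol            # clouds only
    return aerosol_effect, cloud_effect

# Illustrative numbers: clouds remove more energy than aerosols.
aer, cld = radiative_effects(230.0, 280.0, 300.0)
assert aer == -20.0 and cld == -50.0
```

The cloud modification factor mentioned above is simply `f_measured / f_model_aerosol` in the same notation.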
Monte Carlo verification of radiotherapy treatments with CloudMC.
Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José
2018-06-27
A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy, and economical way. A description of the architecture of the application and the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures, and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machines from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models, and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default type for the Worker Roles and the Reducer Role, respectively. Calculation times of up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements were relaxed to 4%. Advantages like high computational power, scalability, easy access, and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step toward solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
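The map/reduce distribution of Monte Carlo histories that platforms like CloudMC rely on can be sketched in miniature (a toy tally, not the CloudMC code; batch sizes, seeds, and the "dose" model are arbitrary):

```python
# Map/reduce split for Monte Carlo: each worker runs an independent batch
# of histories with its own seed; the reducer merges the per-batch tallies.
import random

def worker(seed, histories, bins=4):
    """One Worker Role: simulate a batch and return its toy dose tally."""
    rng = random.Random(seed)          # independent stream per worker
    tally = [0.0] * bins
    for _ in range(histories):
        tally[rng.randrange(bins)] += 1.0   # toy energy-deposition event
    return tally

def reducer(tallies):
    """The Reducer Role: sum the partial tallies bin by bin."""
    total = [0.0] * len(tallies[0])
    for t in tallies:
        for i, v in enumerate(t):
            total[i] += v
    return total

batches = [worker(seed, 1000) for seed in range(10)]  # "map" phase
dose = reducer(batches)                               # "reduce" phase
assert sum(dose) == 10 * 1000    # every simulated history is accounted for
```

Because the batches are statistically independent, halving the target uncertainty simply means running about four times as many worker histories, which is what makes the cluster size (and cost) tunable.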
A service based adaptive U-learning system using UX.
Jeong, Hwa-Young; Yi, Gangman
2014-01-01
In recent years, traditional development techniques for e-learning systems have been changing to become more convenient and efficient. One new technology in the development of application systems includes both cloud and ubiquitous computing. Cloud computing can support learning system processes by using services, while ubiquitous computing can provide system operation and management via a high-performance technical process and network. In the cloud computing environment, a learning service application can provide a business module or process to the user via the internet. This research focuses on providing the learning material and processes of courses by learning units, using services in a ubiquitous computing environment. We also investigate functions that support users' tailored materials according to their learning style. That is, we analyzed the users' data and their characteristics in accordance with their user experience. We subsequently adapted the learning process to fit their learning performance and preferences. Finally, we demonstrate that the proposed system provides better learning effects for learners than existing techniques.
STAR FORMATION IN TURBULENT MOLECULAR CLOUDS WITH COLLIDING FLOW
DOE Office of Scientific and Technical Information (OSTI.GOV)
Matsumoto, Tomoaki; Dobashi, Kazuhito; Shimoikura, Tomomi, E-mail: matsu@hosei.ac.jp
2015-03-10
Using self-gravitational hydrodynamical numerical simulations, we investigated the evolution of high-density turbulent molecular clouds swept by a colliding flow. The interaction of shock waves due to turbulence produces networks of thin filamentary clouds with a sub-parsec width. The colliding flow accumulates the filamentary clouds into a sheet cloud and promotes active star formation for initially high-density clouds. Clouds with a colliding flow exhibit a finer filamentary network than clouds without a colliding flow. The probability distribution functions (PDFs) for the density and column density can be fitted by lognormal functions for clouds without colliding flow. When the initial turbulence is weak, the column density PDF has a power-law wing at high column densities. The colliding flow considerably deforms the PDF, such that the PDF exhibits a double peak. The stellar mass distributions reproduced here are consistent with the classical initial mass function with a power-law index of –1.35 when the initial clouds have a high density. The distribution of stellar velocities agrees with the gas velocity distribution, which can be fitted by Gaussian functions for clouds without colliding flow. For clouds with colliding flow, the velocity dispersion of gas tends to be larger than the stellar velocity dispersion. The signatures of colliding flows and turbulence appear in channel maps reconstructed from the simulation data. Clouds without colliding flow exhibit a cloud-scale velocity shear due to the turbulence. In contrast, clouds with colliding flow show a prominent anti-correlated distribution of thin filaments between the different velocity channels, suggesting collisions between the filamentary clouds.
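The lognormal fit to the (column) density PDF mentioned above amounts to fitting a Gaussian in log density. A minimal sketch with synthetic data (our illustration, not the simulation output):

```python
# Fit a lognormal density PDF by estimating the mean and width of
# s = ln(rho) from a sample of cell densities.
import math
import random

def lognormal_fit(densities):
    """Return (mu, sigma) of the Gaussian fitted to ln(density)."""
    s = [math.log(d) for d in densities]
    mu = sum(s) / len(s)
    var = sum((x - mu) ** 2 for x in s) / (len(s) - 1)  # sample variance
    return mu, math.sqrt(var)

# Synthetic "cloud": densities drawn from a lognormal with sigma_s = 0.5.
rng = random.Random(0)
rho = [math.exp(rng.gauss(0.0, 0.5)) for _ in range(20000)]
mu, sigma = lognormal_fit(rho)
assert abs(mu) < 0.02 and abs(sigma - 0.5) < 0.02  # parameters recovered
```

A power-law wing or a double peak, as described for the colliding-flow runs, would show up as a systematic residual between this fitted lognormal and the empirical histogram at high densities.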
THE INFLUENCE OF NONUNIFORM CLOUD COVER ON TRANSIT TRANSMISSION SPECTRA
DOE Office of Scientific and Technical Information (OSTI.GOV)
Line, Michael R.; Parmentier, Vivien, E-mail: mrline@ucsc.edu
2016-03-20
We model the impact of nonuniform cloud cover on transit transmission spectra. Patchy clouds exist in nearly every solar system atmosphere, brown dwarfs, and transiting exoplanets. Our major findings suggest that fractional cloud coverage can exactly mimic high mean molecular weight atmospheres and vice versa over certain wavelength regions, in particular, over the Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) bandpass (1.1–1.7 μm). We also find that patchy cloud coverage exhibits a signature that is different from uniform global clouds. Furthermore, we explain analytically why the “patchy cloud-high mean molecular weight” degeneracy exists. We also explore the degeneracy of nonuniform cloud coverage in atmospheric retrievals on both synthetic and real planets. We find from retrievals on a synthetic solar composition hot Jupiter with patchy clouds and a cloud-free high mean molecular weight warm Neptune that both cloud-free high mean molecular weight atmospheres and partially cloudy atmospheres can explain the data equally well. Another key finding is that the HST WFC3 transit transmission spectra of two well-observed objects, the hot Jupiter HD 189733b and the warm Neptune HAT-P-11b, can be explained well by solar composition atmospheres with patchy clouds without the need to invoke high mean molecular weight or global clouds. The degeneracy between high molecular weight and solar composition partially cloudy atmospheres can be broken by observing the molecular Rayleigh scattering differences between the two. Furthermore, the signature of partially cloudy limbs also appears as a ∼100 ppm residual in the ingress and egress of the transit light curves, provided that the transit timing is known to seconds.
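The degeneracy arises because the observed spectrum is a linear mix of the cloudy and clear limb spectra, weighted by the cloud fraction; muting every feature by the same factor looks just like the smaller scale height of a heavier atmosphere. A toy sketch (the transit-depth numbers are illustrative, not a radiative-transfer model):

```python
# Observed transit depth under fractional cloud cover f:
#   D_obs(lambda) = f * D_cloudy(lambda) + (1 - f) * D_clear(lambda)
def patchy_depth(clear, cloudy, f):
    """Mix clear and cloudy limb spectra for cloud fraction f per bin."""
    return [f * c + (1.0 - f) * k for c, k in zip(cloudy, clear)]

clear  = [1.00, 1.40, 1.00]   # toy depths: molecular feature in bin 2
cloudy = [1.20, 1.20, 1.20]   # grey cloud deck flattens all features
obs = patchy_depth(clear, cloudy, f=0.5)

# The feature amplitude is scaled by (1 - f): halved here, which is the
# same muting a higher mean molecular weight would produce.
assert abs((obs[1] - obs[0]) - 0.5 * (clear[1] - clear[0])) < 1e-12
```

Breaking the degeneracy, as the abstract notes, requires a wavelength range where the two explanations diverge, such as the Rayleigh scattering slope.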
Detecting Abnormal Machine Characteristics in Cloud Infrastructures
NASA Technical Reports Server (NTRS)
Bhaduri, Kanishka; Das, Kamalika; Matthews, Bryan L.
2011-01-01
In the cloud computing environment, resources are accessed as services rather than as a product. Monitoring this system for performance is crucial because of the typical pay-per-use packages bought by users for their jobs. With the huge number of machines currently in the cloud system, it is often extremely difficult for system administrators to keep track of all machines using distributed monitoring programs such as Ganglia, which lack system health assessment and summarization capabilities. To overcome this problem, we propose a technique for automated anomaly detection using machine performance data in the cloud. Our algorithm is entirely distributed and runs locally on each computing machine in the cloud in order to rank the machines by their anomalous behavior for given jobs. There is no need to centralize any of the performance data for the analysis, and at the end of the analysis our algorithm generates error reports, thereby allowing the system administrators to take corrective actions. Experiments performed on real data sets collected for different jobs validate the fact that our algorithm has a low overhead for tracking anomalous machines in a cloud infrastructure.
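A minimal sketch of the distributed idea — each machine computes its own anomaly score locally and only the scalar scores are compared — assuming a simple z-score detector (the authors' actual algorithm is not reproduced here; the fleet data are made up):

```python
# Each node scores its own performance samples against its own baseline,
# so no raw performance data ever leaves the machine; only the scores
# need to be gathered to rank machines by anomalous behavior.
from statistics import mean, stdev

def local_anomaly_score(samples):
    """Max absolute z-score of the samples vs. this machine's baseline."""
    mu, sigma = mean(samples), stdev(samples)
    if sigma == 0:
        return 0.0
    return max(abs(x - mu) / sigma for x in samples)

fleet = {
    "node-a": [0.30, 0.32, 0.31, 0.29],   # steady CPU load
    "node-b": [0.28, 0.31, 0.95, 0.30],   # one large spike
}
ranked = sorted(fleet, key=lambda m: local_anomaly_score(fleet[m]),
                reverse=True)
assert ranked[0] == "node-b"   # the spiky machine ranks most anomalous
```

Only the `sorted` step needs any cross-machine communication, which is what keeps the monitoring overhead low.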
Symmetrical compression distance for arrhythmia discrimination in cloud-based big-data services.
Lillo-Castellano, J M; Mora-Jiménez, I; Santiago-Mozos, R; Chavarría-Asso, F; Cano-González, A; García-Alberola, A; Rojo-Álvarez, J L
2015-07-01
The current development of cloud computing is completely changing the paradigm of data knowledge extraction in huge databases. An example of this technology in the cardiac arrhythmia field is the SCOOP platform, a national-level scientific cloud-based big data service for implantable cardioverter defibrillators. In this scenario, we propose a new methodology for automatic classification of intracardiac electrograms (EGMs) in a cloud computing system, designed for minimal signal preprocessing. A new compression-based similarity measure (CSM) is created for low computational burden, the so-called weighted fast compression distance, which provides better performance when compared with other CSMs in the literature. Using simple machine learning techniques, a set of 6848 EGMs extracted from the SCOOP platform were classified into seven cardiac arrhythmia classes and one noise class, reaching nearly 90% accuracy when previous patient arrhythmia information was available and 63% otherwise, in all cases exceeding the accuracy of majority-class classification. Results show that this methodology can be used as a high-quality cloud computing service, providing support to physicians for improving knowledge of patient diagnosis.
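A compression-based similarity measure in the same spirit can be sketched with the classic normalized compression distance (the paper's weighted fast compression distance itself is not reproduced; the signals below are synthetic stand-ins for EGMs):

```python
# Normalized compression distance (NCD): two signals are similar when
# compressing their concatenation costs little more than compressing
# the larger one alone. No feature extraction or preprocessing needed.
import zlib

def c(data: bytes) -> int:
    """Compressed size, used as an approximation of Kolmogorov complexity."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"ababababababababababab" * 4      # periodic "signal"
b_ = b"ababababababababababab" * 4     # same morphology
z = bytes(range(88))                   # structurally unrelated bytes
assert ncd(a, b_) < ncd(a, z)          # similar signals score closer
```

A nearest-neighbour classifier over such distances is one simple way to turn the measure into the kind of arrhythmia classification service described above.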
Monte Carlo Calculations of Polarized Microwave Radiation Emerging from Cloud Structures
NASA Technical Reports Server (NTRS)
Kummerow, Christian; Roberti, Laura
1998-01-01
The last decade has seen tremendous growth in cloud dynamical and microphysical models that are able to simulate storms and storm systems with very high spatial resolution, typically of the order of a few kilometers. The fairly realistic distributions of cloud and hydrometeor properties that these models generate have in turn led to a renewed interest in the three-dimensional microwave radiative transfer modeling needed to understand the effect of cloud and rainfall inhomogeneities upon microwave observations. Monte Carlo methods, and particularly backwards Monte Carlo methods, have proven very desirable due to the quick convergence of their solutions. Unfortunately, backwards Monte Carlo methods are not well suited to treating polarized radiation. This study reviews the existing Monte Carlo methods and presents a new polarized Monte Carlo radiative transfer code. The code is based on a forward scheme but uses aliasing techniques to keep the computational requirements equivalent to those of the backwards solution. Radiative transfer computations have been performed using a microphysical-dynamical cloud model, and the results are presented together with a description of the algorithm.
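For intuition about the forward Monte Carlo approach, a deliberately minimal scalar sketch (not the polarized, aliasing-based scheme described above): each photon's free path through a homogeneous layer is drawn from an exponential distribution, and the fraction that crosses without interacting can be checked against the analytic Beer-Lambert result.

```python
import math
import random

def mc_direct_transmission(optical_depth, n_photons, seed=0):
    """Forward Monte Carlo: sample each photon's free path from an
    exponential distribution and count photons that cross the layer
    without interacting."""
    rng = random.Random(seed)
    exited = sum(
        1 for _ in range(n_photons)
        if -math.log(1.0 - rng.random()) > optical_depth
    )
    return exited / n_photons

tau = 1.5                                   # illustrative optical depth
estimate = mc_direct_transmission(tau, 200_000)
analytic = math.exp(-tau)                   # Beer-Lambert result for the same layer
```

The statistical error shrinks as the photon count grows, which is why convergence speed (and tricks like the aliasing used in the paper) matters so much for full 3D polarized calculations.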
Discrimination of Biomass Burning Smoke and Clouds in MAIAC Algorithm
NASA Technical Reports Server (NTRS)
Lyapustin, A.; Korkin, S.; Wang, Y.; Quayle, B.; Laszlo, I.
2012-01-01
The multi-angle implementation of atmospheric correction (MAIAC) algorithm makes aerosol retrievals from MODIS data at 1 km resolution, providing information about fine-scale aerosol variability. This information is required in different applications such as urban air quality analysis, aerosol source identification, etc. The quality of high-resolution aerosol data is directly linked to the quality of the cloud mask, in particular the detection of small (sub-pixel) and low clouds. This work continues research in this direction, describing a technique to detect small clouds and introducing a smoke test to discriminate biomass burning smoke from clouds. The smoke test relies on a relative increase of aerosol absorption at the MODIS wavelength of 0.412 micrometers as compared to 0.47-0.67 micrometers, due to multiple scattering and enhanced absorption by organic carbon released during combustion. This general principle has been successfully used in the OMI detection of absorbing aerosols based on UV measurements. This paper provides the algorithm details and illustrates its performance on two examples of wildfires in the US Pacific Northwest and in Georgia/Florida in 2007.
Wang, Minghuai; Larson, Vincent E.; Ghan, Steven; ...
2015-04-18
In this study, a higher-order turbulence closure scheme, called Cloud Layers Unified by Binormals (CLUBB), is implemented into a Multi-scale Modeling Framework (MMF) model to improve low cloud simulations. The performance of CLUBB in MMF simulations with two different microphysics configurations (one-moment cloud microphysics without aerosol treatment and two-moment cloud microphysics coupled with aerosol treatment) is evaluated against observations and further compared with results from the Community Atmosphere Model, Version 5 (CAM5) with conventional cloud parameterizations. CLUBB is found to improve low cloud simulations in the MMF, and the improvement is particularly evident in the stratocumulus-to-cumulus transition regions. Compared to the single-moment cloud microphysics, CLUBB with two-moment microphysics produces clouds that are closer to the coast and agree better with observations. In the stratocumulus-to-cumulus transition regions, CLUBB with two-moment cloud microphysics produces shortwave cloud forcing in better agreement with observations, while CLUBB with single-moment cloud microphysics overestimates shortwave cloud forcing. CLUBB is further found to produce quantitatively similar improvements in the MMF and CAM5, with slightly better performance in the MMF simulations (e.g., MMF with CLUBB generally produces low clouds that are closer to the coast than CAM5 with CLUBB). As a result, improved low cloud simulations in the MMF make it an even more attractive tool for studying aerosol-cloud-precipitation interactions.
NASA Engine Icing Research Overview: Aeronautics Evaluation and Test Capabilities (AETC) Project
NASA Technical Reports Server (NTRS)
Veres, Joseph P.
2015-01-01
The occurrence of ice accretion within commercial high-bypass aircraft turbine engines has been reported by airlines under certain atmospheric conditions. Engine anomalies have taken place at high altitudes that have been attributed to ice crystal ingestion by the engine. The ice crystals can result in degraded engine performance, loss of thrust control, compressor surge or stall, and flameout of the combustor. The Aviation Safety Program at NASA has taken on the technical challenge of turbofan engine icing caused by ice crystals, which can exist in high-altitude convective clouds. The NASA engine icing project consists of an integrated approach with four concurrent and ongoing research elements, each of which feeds critical information to the next element. The project objective is to gain understanding of high-altitude ice crystals by developing knowledge bases and test facilities for testing full engines and engine components. The first element is to utilize a highly instrumented aircraft to characterize the high-altitude convective cloud environment. The second element is the enhancement of the Propulsion Systems Laboratory altitude test facility for gas turbine engines to include the addition of an ice crystal cloud. The third element is basic research into the fundamental physics associated with ice crystal accretion. The fourth and final element is the development of computational tools with the goal of simulating the effects of ice crystal ingestion on compressor and gas turbine engine performance. The NASA goal is to provide knowledge to the engine and aircraft manufacturing communities to help mitigate or eliminate turbofan engine interruptions, engine damage, and failures due to ice crystal ingestion.
Comparison between SAGE II and ISCCP high-level clouds. 1: Global and zonal mean cloud amounts
NASA Technical Reports Server (NTRS)
Liao, Xiaohan; Rossow, William B.; Rind, David
1995-01-01
Global high-level clouds identified in Stratospheric Aerosol and Gas Experiment II (SAGE II) occultation measurements for January and July in the period 1985 to 1990 are compared with near-nadir-looking observations from the International Satellite Cloud Climatology Project (ISCCP). Global and zonal mean high-level cloud amounts from the two data sets agree very well if clouds with layer extinction coefficients of less than 0.008/km at 1.02 micrometers wavelength are removed from the SAGE II results and all detected clouds are interpreted to have an average horizontal size of about 75 km along the 200 km transmission path length of the SAGE II observations. The SAGE II results are much more sensitive to variations of assumed cloud size than to variations of detection threshold. The geographical distribution of cloud fractions shows good agreement, but systematic regional differences also indicate that the average cloud size varies somewhat among different climate regimes. The more sensitive SAGE II results show that about one third of all high-level clouds are missed by ISCCP, but these clouds have very low optical thicknesses (less than 0.1 at 0.6 micrometers wavelength). SAGE II sampling error in monthly zonal cloud fraction is shown to produce no bias, to be less than the intraseasonal natural variability, but to be comparable with the natural variability at longer time scales.
GATE Monte Carlo simulation in a cloud computing environment
NASA Astrophysics Data System (ADS)
Rowedder, Blake Austin
The GEANT4-based GATE is a unique and powerful Monte Carlo (MC) platform, which provides a single code library allowing the simulation of specific medical physics applications, e.g. PET, SPECT, CT, radiotherapy, and hadron therapy. However, this rigorous yet flexible platform is used only sparingly in the clinic due to its lengthy calculation time. By accessing the powerful computational resources of a cloud computing environment, GATE's runtime can be significantly reduced to clinically feasible levels without the sizable investment of a local high-performance cluster. This study investigated reliable and efficient execution of GATE MC simulations using a commercial cloud computing service. Amazon's Elastic Compute Cloud was used to launch several nodes equipped with GATE. Job data were initially broken up on the local computer, then uploaded to the worker nodes on the cloud. The results were automatically downloaded and aggregated on the local computer for display and analysis. Five simulations were repeated for every cluster size between 1 and 20 nodes. Ultimately, increasing cluster size resulted in a decrease in calculation time that could be expressed with an inverse power model. Comparing the benchmark results to the published values and error margins indicated that the simulation results were not affected by the cluster size, and thus that the integrity of a calculation is preserved in a cloud computing environment. The runtime of a 53-minute simulation was decreased to 3.11 minutes when run on a 20-node cluster. The ability to improve the speed of simulation suggests that fast MC simulations are viable for imaging and radiotherapy applications. With high-performance computing continuing to fall in price and grow in accessibility, implementing Monte Carlo techniques with cloud computing for clinical applications will continue to become more attractive.
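The inverse power model mentioned above, T(n) = a * n^(-b), can be fit by ordinary least squares in log-log space. A minimal sketch: the 1-node (53 min) and 20-node (3.11 min) runtimes are taken from the abstract, but the intermediate points are invented, near-ideal-scaling placeholders for illustration only.

```python
import math

def fit_inverse_power(nodes, runtimes):
    """Least-squares fit of T(n) = a * n**(-b) via linear regression
    in log-log space; returns (a, b)."""
    xs = [math.log(n) for n in nodes]
    ys = [math.log(t) for t in runtimes]
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope

# 1- and 20-node runtimes from the abstract; middle points are assumed.
nodes = [1, 2, 5, 10, 20]
runtimes_min = [53.0, 27.5, 11.6, 6.1, 3.11]
a, b = fit_inverse_power(nodes, runtimes_min)
```

An exponent b near 1 corresponds to near-ideal strong scaling; deviations below 1 quantify the parallel overhead of job splitting, upload, and aggregation.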
Incorporation of a Cumulus Fraction Scheme in the GRAPES_Meso and Evaluation of Its Performance
NASA Astrophysics Data System (ADS)
Zheng, X.
2016-12-01
Accurate simulation of cloud cover fraction is a key and difficult issue in numerical modeling studies. Preliminary evaluations have indicated that cloud fraction is generally underestimated in GRAPES_Meso simulations, while the cloud fraction scheme (CFS) of ECMWF can provide more realistic results. Therefore, the ECMWF cumulus fraction scheme is introduced into GRAPES_Meso to replace the original CFS, and the model performance with the new CFS is evaluated based on simulated three-dimensional cloud fractions and surface temperature. Results indicate that the simulated cloud fractions increase and become more accurate with the new CFS, the simulated vertical cloud structure is improved, and errors in surface temperature simulation are reduced. The above analysis suggests that the new CFS has a positive impact on cloud fraction and surface temperature simulation.
Assessing the Performance of a Machine Learning Algorithm in Identifying Bubbles in Dust Emission
NASA Astrophysics Data System (ADS)
Xu, Duo; Offner, Stella S. R.
2017-12-01
Stellar feedback created by radiation and winds from massive stars plays a significant role in both physical and chemical evolution of molecular clouds. This energy and momentum leaves an identifiable signature (“bubbles”) that affects the dynamics and structure of the cloud. Most bubble searches are performed “by eye,” which is usually time-consuming, subjective, and difficult to calibrate. Automatic classifications based on machine learning make it possible to perform systematic, quantifiable, and repeatable searches for bubbles. We employ a previously developed machine learning algorithm, Brut, and quantitatively evaluate its performance in identifying bubbles using synthetic dust observations. We adopt magnetohydrodynamics simulations, which model stellar winds launching within turbulent molecular clouds, as an input to generate synthetic images. We use a publicly available three-dimensional dust continuum Monte Carlo radiative transfer code, HYPERION, to generate synthetic images of bubbles in three Spitzer bands (4.5, 8, and 24 μm). We designate half of our synthetic bubbles as a training set, which we use to train Brut along with citizen-science data from the Milky Way Project (MWP). We then assess Brut’s accuracy using the remaining synthetic observations. We find that Brut’s performance after retraining increases significantly, and it is able to identify yellow bubbles, which are likely associated with B-type stars. Brut continues to perform well on previously identified high-score bubbles, and over 10% of the MWP bubbles are reclassified as high-confidence bubbles, which were previously marginal or ambiguous detections in the MWP data. We also investigate the influence of the size of the training set, dust model, evolutionary stage, and background noise on bubble identification.
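The retrain-then-assess workflow above (train on half the synthetic bubbles, score on the held-out half) can be illustrated with a deliberately simple stand-in classifier; Brut's actual random-forest machinery and image features are not reproduced here, and the two-feature Gaussian data are purely synthetic.

```python
import math
import random

def fit_centroids(X, y):
    """One centroid per class: a minimal stand-in classifier
    (Brut's random-forest scheme is not reproduced here)."""
    centroids = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = tuple(sum(c) / len(pts) for c in zip(*pts))
    return centroids

def predict(centroids, x):
    # Assign the label whose centroid is nearest in feature space.
    return min(centroids, key=lambda lab: math.dist(centroids[lab], x))

# Synthetic two-feature "bubble" vs "background" samples; half train,
# half held out, mirroring the paper's assessment protocol.
rng = random.Random(1)
data = ([((rng.gauss(1, 0.3), rng.gauss(1, 0.3)), "bubble") for _ in range(40)]
        + [((rng.gauss(-1, 0.3), rng.gauss(-1, 0.3)), "background") for _ in range(40)])
rng.shuffle(data)
train, held_out = data[:40], data[40:]
model = fit_centroids([x for x, _ in train], [lab for _, lab in train])
accuracy = sum(predict(model, x) == lab for x, lab in held_out) / len(held_out)
```

The key point carried over from the paper is methodological: accuracy must be measured on data the classifier never saw during (re)training.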
Fielding, M. D.; Chiu, J. C.; Hogan, R. J.; ...
2015-02-16
Active remote sensing of marine boundary-layer clouds is challenging as drizzle drops often dominate the observed radar reflectivity. We present a new method to simultaneously retrieve cloud and drizzle vertical profiles in drizzling boundary-layer cloud using surface-based observations of radar reflectivity, lidar attenuated backscatter, and zenith radiances. Specifically, the vertical structure of droplet size and water content of both cloud and drizzle is characterised throughout the cloud. An ensemble optimal estimation approach provides full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from large-eddy simulation snapshots of cumulus under stratocumulus, where cloud water path is retrieved with an error of 31 g m−2. The method also performs well in non-drizzling clouds where no assumption of the cloud profile is required. We then apply the method to observations of marine stratocumulus obtained during the Atmospheric Radiation Measurement MAGIC deployment in the northeast Pacific. Here, retrieved cloud water path agrees well with independent 3-channel microwave radiometer retrievals, with a root mean square difference of 10–20 g m−2.
Formation of highly porous aerosol particles by atmospheric freeze-drying in ice clouds
Adler, Gabriela; Koop, Thomas; Haspel, Carynelisa; Taraniuk, Ilya; Moise, Tamar; Koren, Ilan; Heiblum, Reuven H.; Rudich, Yinon
2013-01-01
The cycling of atmospheric aerosols through clouds can change their chemical and physical properties and thus modify how aerosols affect cloud microphysics and, subsequently, precipitation and climate. Current knowledge about aerosol processing by clouds is largely limited to chemical reactions within water droplets in warm low-altitude clouds. However, in cold high-altitude cirrus clouds and anvils of high convective clouds in the tropics and midlatitudes, humidified aerosols freeze to form ice, which upon exposure to subsaturation conditions with respect to ice can sublimate, leaving behind residual modified aerosols. This freeze-drying process can occur in various types of clouds. Here we simulate an atmospheric freeze-drying cycle of aerosols in laboratory experiments using proxies for atmospheric aerosols. We find that aerosols that contain organic material and undergo such a process can form highly porous aerosol particles with a larger diameter and a lower density than the initial homogeneous aerosol. We attribute this morphology change to phase separation upon freezing followed by a glass transition of the organic material that can preserve a porous structure after ice sublimation. A porous structure may explain the previously observed enhancement in ice nucleation efficiency of glassy organic particles. We find that highly porous aerosol particles scatter solar light less efficiently than nonporous aerosol particles. Using a combination of satellite and radiosonde data, we show that highly porous aerosol formation can readily occur in highly convective clouds, which are widespread in the tropics and midlatitudes. These observations may have implications for subsequent cloud formation cycles and aerosol albedo near cloud edges. PMID:24297908
NASA Astrophysics Data System (ADS)
Kovalskyy, V.; Roy, D. P.
2014-12-01
The successful February 2013 launch of the Landsat 8 satellite is continuing the 40+ year legacy of the Landsat mission. The payload includes the Operational Land Imager (OLI), which has a new 1370 nm band designed to monitor cirrus clouds, and the Thermal Infrared Sensor (TIRS); together they provide 30 m low, medium, and high confidence cloud detections and 30 m low and high confidence cirrus cloud detections. A year of Landsat 8 data over the Conterminous United States (CONUS), composed of 11,296 acquisitions, was analyzed by comparing the spatial and temporal incidence of these cloud and cirrus states. This revealed that (i) 36.5% of observations were detected as high confidence cloud, with spatio-temporal patterns similar to those observed by previous Landsat 7 cloud analyses, (ii) 29.2% were high confidence cirrus, (iii) 20.9% were both high confidence cloud and high confidence cirrus, and (iv) 8.3% were detected as high confidence cirrus but not as high confidence cloud. The results illustrate the value of the cirrus band for improved Landsat 8 terrestrial monitoring but imply that the historical CONUS Landsat archive has a similar 8% of undetected cirrus-contaminated pixels. The implications for long-term Landsat time series records, including the global Web Enabled Landsat Data (WELD) product record, are discussed.
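The four percentages above are a cross-tabulation of two boolean detection masks (high-confidence cloud and high-confidence cirrus). A minimal sketch of that tabulation with tiny made-up masks, not real Landsat QA data:

```python
def cross_tabulate(cloud_mask, cirrus_mask):
    """Percentages of observations that are high-confidence cloud, cirrus,
    both, and cirrus-only, mirroring the four statistics in the text."""
    n = len(cloud_mask)
    both = sum(c and s for c, s in zip(cloud_mask, cirrus_mask))
    cloud = sum(cloud_mask)
    cirrus = sum(cirrus_mask)
    return {
        "cloud": 100.0 * cloud / n,
        "cirrus": 100.0 * cirrus / n,
        "both": 100.0 * both / n,
        "cirrus_only": 100.0 * (cirrus - both) / n,
    }

# Tiny illustrative masks (True = high-confidence detection):
cloud  = [True, True, False, False, True, False, False, False, True, False]
cirrus = [True, False, True, False, True, False, False, True, False, False]
stats = cross_tabulate(cloud, cirrus)
```

The "cirrus_only" entry corresponds to statistic (iv) in the abstract: cirrus detections that the cloud mask alone would miss.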
NASA Technical Reports Server (NTRS)
Welch, R. M.; Sengupta, S. K.; Chen, D. W.
1988-01-01
Stratocumulus, cumulus, and cirrus clouds were identified on the basis of cloud textural features which were derived from a single high-resolution Landsat MSS NIR channel using a stepwise linear discriminant analysis. It is shown that, using this method, it is possible to distinguish high cirrus clouds from low clouds with high accuracy on the basis of spatial brightness patterns. The largest probability of misclassification is associated with confusion between the stratocumulus breakup regions and the fair-weather cumulus.
Overview of the CERES Edition-4 Multilayer Cloud Property Datasets
NASA Astrophysics Data System (ADS)
Chang, F. L.; Minnis, P.; Sun-Mack, S.; Chen, Y.; Smith, R. A.; Brown, R. R.
2014-12-01
Knowledge of the cloud vertical distribution is important for understanding the role of clouds in Earth's radiation budget and climate change. Since high-level cirrus clouds with low emission temperatures and small optical depths can provide a positive feedback to the climate system, and low-level stratus clouds with high emission temperatures and large optical depths can provide a negative feedback, the retrieval of multilayer cloud properties using satellite observations, like Terra and Aqua MODIS, is critically important for a variety of cloud and climate applications. For the objective of the Clouds and the Earth's Radiant Energy System (CERES), new algorithms have been developed using Terra and Aqua MODIS data to allow separate retrievals of cirrus and stratus cloud properties when the two dominant cloud types are simultaneously present in a multilayer system. In this paper, we will present an overview of the new CERES Edition-4 multilayer cloud property datasets derived from Terra as well as Aqua. Assessment of the new CERES multilayer cloud datasets will include high-level cirrus and low-level stratus cloud heights, pressures, and temperatures, as well as their optical depths, emissivities, and microphysical properties.
Observed Cloud Properties Above the Northern Indian Ocean During CARDEX 2012
NASA Astrophysics Data System (ADS)
Gao, L.; Wilcox, E. M.
2016-12-01
An analysis of cloud microphysical, macrophysical, and radiative properties during the dry winter monsoon season above the northern Indian Ocean is presented. The Cloud Aerosol Radiative Forcing Experiment (CARDEX), conducted from 16 February to 30 March 2012 at the Maldives Climate Observatory on Hanimaadhoo (MCOH), used autonomous unmanned aerial vehicles (UAVs) to measure aerosol profiles, water vapor flux, and cloud properties, concurrent with continuous ground measurements of surface aerosol and meteorological variables as well as the total-column precipitable water vapor (PWV) and the cloud liquid water path (LWP). Here we present cloud properties only for cases with lower atmospheric water vapor, using the criterion that the PWV is less than 40 kg/m2. According to previous studies, this criterion acts to filter the data to control for the natural meteorological variability in the region. The highly polluted case is found to correlate with warmer temperatures, higher relative humidity in the boundary layer, and a lower lifted condensation level (LCL). The Micro Pulse Lidar (MPL) retrieved cloud base height coincides with the calculated LCL height, which is lower for the highly polluted case. Meanwhile, the satellite-retrieved cloud top height did not show obvious variation, indicating cloud deepening, which is consistent with the greater cloud LWP observed in the highly polluted case. These highly polluted clouds are associated with more cloud droplets and a smaller effective radius; they generally become narrower due to the stronger cloud-side evaporation-entrainment effect and deeper due to more moist static energy. Clouds in highly polluted conditions become brighter, with a higher albedo that can cause a net shortwave forcing of over -40 W/m2 in this region.
Secure data sharing in public cloud
NASA Astrophysics Data System (ADS)
Venkataramana, Kanaparti; Naveen Kumar, R.; Tatekalva, Sandhya; Padmavathamma, M.
2012-04-01
Secure multi-party computation (SMC) protocols have been proposed for entities (organizations or individuals) that do not fully trust each other but need to share sensitive information. Many types of entities need to collect, analyze, and disseminate data rapidly and accurately without exposing sensitive information to unauthorized or untrusted parties. Solutions based on secure multi-party computation guarantee privacy and correctness, but at an extra communication and computation cost that is often too high in communication to be practical. This high overhead motivates us to extend SMC to the cloud environment, which provides large computation and communication capacity and allows SMC to be used between multiple clouds, whether private, public, or hybrid. A cloud may encompass many high-capacity servers that act as hosts participating in the computation (IaaS and PaaS) of the final result, controlled by a Cloud Trusted Authority (CTA) for secret sharing within the cloud. Communication between two clouds is controlled by a High Level Trusted Authority (HLTA), one of the hosts in a cloud that provides MgaaS (Management as a Service). Because of the high security risk in clouds, the HLTA generates and distributes public and private keys using the Carmichael-R-Prime-RSA algorithm for the exchange of private data in SMC between itself and the clouds. Within a cloud, the CTA creates a group key for secure communication between hosts, based on keys sent by the HLTA, for the exchange of intermediate values and shares used to compute the final result. Since this scheme exploits the high availability and scalability of clouds to increase computational power, it makes SMC practical for privacy-preserving data mining at low cost for clients.
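The Carmichael-R-Prime-RSA key scheme is not detailed in the abstract; as a generic illustration of the secret-sharing building block that SMC hosts use to compute a result without seeing each other's inputs, here is a minimal additive-sharing sketch. The modulus, host count, and input values are illustrative assumptions, not the paper's protocol.

```python
import random

P = 2**61 - 1  # public prime modulus (illustrative choice)

def share(secret, n_hosts, rng):
    """Split a secret into n additive shares mod P; any n-1 shares
    alone reveal nothing about the secret."""
    parts = [rng.randrange(P) for _ in range(n_hosts - 1)]
    parts.append((secret - sum(parts)) % P)
    return parts

def reconstruct(shares):
    return sum(shares) % P

rng = random.Random(7)
shares_a = share(123456, 3, rng)   # party A's private input, split over 3 hosts
shares_b = share(654321, 3, rng)   # party B's private input
# Each host adds its two shares locally; reconstructing the local sums
# yields a + b without any host having seen either input in the clear.
sum_shares = [(sa + sb) % P for sa, sb in zip(shares_a, shares_b)]
total = reconstruct(sum_shares)
```

The appeal of the cloud setting in the abstract is exactly this division of labor: many hosts each do cheap local work on shares, and only the aggregated result leaves the cloud.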
Distributed MRI reconstruction using Gadgetron-based cloud computing.
Xue, Hui; Inati, Souheil; Sørensen, Thomas Sangild; Kellman, Peter; Hansen, Michael S
2015-03-01
To expand the open source Gadgetron reconstruction framework to support distributed computing, and to demonstrate that a multinode version of the Gadgetron can be used to provide nonlinear reconstruction with clinically acceptable latency. The Gadgetron framework was extended with new software components that enable an arbitrary number of Gadgetron instances to collaborate on a reconstruction task. This cloud-enabled version of the Gadgetron was deployed on three different distributed computing platforms, ranging from a heterogeneous collection of commodity computers to the commercial Amazon Elastic Compute Cloud. The Gadgetron cloud was used to provide nonlinear, compressed-sensing reconstruction on a clinical scanner with low reconstruction latency (e.g., cardiac and neuroimaging applications). The proposed setup was able to handle acquisition and ℓ1-SPIRiT reconstruction of nine high temporal resolution real-time cardiac short-axis cine acquisitions, covering the ventricles for functional evaluation, in under 1 min. A three-dimensional high-resolution brain acquisition with 1 mm³ isotropic pixel size was acquired and reconstructed with nonlinear reconstruction in less than 5 min. A distributed-computing-enabled Gadgetron provides a scalable way to improve reconstruction performance using commodity cluster computing. Nonlinear, compressed-sensing reconstruction can be deployed clinically with low image reconstruction latency. © 2014 Wiley Periodicals, Inc.
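The scatter/gather pattern behind such a multinode reconstruction (split the data, process chunks in parallel on workers, aggregate the results) can be sketched generically. This is a schematic stand-in using Python's standard thread pool, not the Gadgetron API; the per-chunk "work" is a placeholder computation.

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct_chunk(chunk):
    """Stand-in per-node work (a sum of squares); a real deployment would
    run a reconstruction chain on each node instead."""
    return sum(v * v for v in chunk)

def distributed_reconstruct(samples, n_nodes=4):
    """Scatter the data across workers, process chunks in parallel,
    then gather and combine the partial results."""
    size = -(-len(samples) // n_nodes)  # ceiling division
    chunks = [samples[i:i + size] for i in range(0, len(samples), size)]
    with ThreadPoolExecutor(max_workers=n_nodes) as pool:
        partials = list(pool.map(reconstruct_chunk, chunks))
    return sum(partials)

result = distributed_reconstruct(list(range(100)))
```

Because the partial results combine associatively, the answer is independent of how many workers are used, which is what lets a cluster shrink latency without changing the output.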
Grids, virtualization, and clouds at Fermilab
Timm, S.; Chadwick, K.; Garzoglio, G.; ...
2014-06-11
Fermilab supports a scientific program that includes experiments and scientists located across the globe. To better serve this community, in 2004, the (then) Computing Division undertook the strategy of placing all of the High Throughput Computing (HTC) resources in a Campus Grid known as FermiGrid, supported by common shared services. In 2007, the FermiGrid Services group deployed a service infrastructure that utilized Xen virtualization, LVS network routing and MySQL circular replication to deliver highly available services that offered significant performance, reliability and serviceability improvements. This deployment was further enhanced through the deployment of a distributed redundant network core architecture and the physical distribution of the systems that host the virtual machines across multiple buildings on the Fermilab Campus. In 2010, building on the experience pioneered by FermiGrid in delivering production services in a virtual infrastructure, the Computing Sector commissioned the FermiCloud, General Physics Computing Facility and Virtual Services projects to serve as platforms for support of scientific computing (FermiCloud & GPCF) and core computing (Virtual Services). Lastly, this work will present the evolution of the Fermilab Campus Grid, Virtualization and Cloud Computing infrastructure together with plans for the future.
Facilitating NASA Earth Science Data Processing Using Nebula Cloud Computing
NASA Astrophysics Data System (ADS)
Chen, A.; Pham, L.; Kempler, S.; Theobald, M.; Esfandiari, A.; Campino, J.; Vollmer, B.; Lynnes, C.
2011-12-01
Cloud Computing technology has been used to offer high-performance and low-cost computing and storage resources for both scientific problems and business services. Several cloud computing services have been implemented in the commercial arena, e.g. Amazon's EC2 & S3, Microsoft's Azure, and Google App Engine. There are also research and application programs being launched in academia and government to utilize Cloud Computing. NASA launched the Nebula Cloud Computing platform in 2008, an Infrastructure as a Service (IaaS) offering that delivers on-demand distributed virtual computers. Nebula users can receive required computing resources as a fully outsourced service. The NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) migrated several of its applications to the Nebula as a proof of concept, including: a) the Simple, Scalable, Script-based Science Processor for Measurements (S4PM) for processing scientific data; b) the Atmospheric Infrared Sounder (AIRS) data process workflow for processing AIRS raw data; and c) the GES-DISC Interactive Online Visualization ANd aNalysis Infrastructure (GIOVANNI) for online access to, analysis, and visualization of Earth science data. This work aims to evaluate the practicability and adaptability of the Nebula. The initial work focused on the AIRS data process workflow, which consists of a series of algorithms used to process raw AIRS level 0 data and output AIRS level 2 geophysical retrievals. Migrating the entire workflow to the Nebula platform is challenging, but practicable. After installing several supporting libraries and the processing code itself, the workflow is able to process AIRS data in a similar fashion to its current (non-cloud) configuration. We compared the performance of processing 2 days of AIRS level 0 data through level 2 using a Nebula virtual computer and a local Linux computer.
The results show that Nebula performed significantly better than the local machine. Much of the difference was due to the Nebula hardware being newer than the legacy computer, which suggests a potential economic advantage beyond elastic capacity: access to up-to-date hardware rather than legacy hardware that must be maintained past its prime to amortize its cost. In addition to a trade study of the advantages and challenges of porting complex processing to the cloud, a tutorial was developed to enable further progress in utilizing Nebula for Earth science applications and to better understand the potential of cloud computing for data- and computing-intensive Earth science research. In particular, highly bursty computing such as that experienced in the user-demand-driven Giovanni system may become more tractable in a cloud environment. Our future work will continue to focus on migrating more of GES DISC's applications and instances, e.g. Giovanni instances, to the Nebula platform and on bringing mature migrated applications into operation on Nebula.
NASA Astrophysics Data System (ADS)
Bianco, A.; Chaumerliac, N.; Vaitilingom, M.; Deguillaume, L.; Bridoux, M. C.
2017-12-01
The chemical composition of organic matter in cloud water is highly complex. The organic species result from dissolution from the gas phase or from the soluble fraction of the particle phase; they are also produced by aqueous-phase reactivity. Several low molecular weight organic species have been quantified, such as aldehydes and carboxylic acids. Recently, amino acids were also detected in cloud water, and their presence is related to the presence of microorganisms. Compounds presenting similarities with the high molecular weight organic substances, or HULIS, found in aerosols were also observed in clouds. Overall, these studies mainly focused on individual compounds or functional groups rather than on the complex mixture at the molecular level. This study presents a non-targeted approach to characterize the organic matter in clouds. Samples were collected at the puy de Dôme Mountain (France). Two cloud water samples (June & July 2016) were analyzed using high resolution mass spectrometry (ESI-FT-ICR-MS 9.4T). A reversed-phase solid phase extraction (SPE) procedure was performed to concentrate dissolved organic matter components. Composer (v.1.5.3) software was used to filter the mass spectral data, externally recalibrate the dataset, and calculate all possible formulas for detected anions. The first cloud sample (June) resulted from an air mass coming from the North (North Sea), while the second one (July) resulted from an air mass coming from the West (Atlantic Ocean). Thus, both cloud events derived from marine air masses but were characterized by different hydrogen peroxide concentrations and dissolved organic carbon contents, and were sampled at different times of day. Elemental compositions of 6487 and 3284 unique molecular species were identified in the two samples, respectively. Nitrogen-containing compounds (CHNO compounds), sulfur-containing compounds (CHOS & CHNOS compounds) and other oxygen-containing compounds (CHO compounds) with molecular weights up to 800 Da were detected.
CHNO is the main class (53% in both samples), while sulfur-containing compounds account for 21% and 14%, and CHO compounds for 25% and 32%, of the total assigned molecular formulas in the two samples, respectively. Only 2490 molecular formulas were common to the two samples.
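The compound-class bookkeeping above (CHO vs. CHNO vs. sulfur-containing) amounts to grouping the assigned formulas by which elements they contain. A minimal sketch, assuming each assigned formula is represented as an element-count dict (the representation and function names are illustrative, not from the study):

```python
from collections import Counter

def formula_class(elements):
    """Compound class from an elemental composition given as a dict,
    e.g. {"C": 10, "H": 15, "N": 1, "O": 4} -> "CHNO". Every assigned
    formula is assumed to contain C, H and O, as in the study."""
    has_n = elements.get("N", 0) > 0
    has_s = elements.get("S", 0) > 0
    if has_n and has_s:
        return "CHNOS"
    if has_s:
        return "CHOS"
    if has_n:
        return "CHNO"
    return "CHO"

def class_percentages(formulas):
    """Percentage of assigned formulas falling in each compound class."""
    counts = Counter(formula_class(f) for f in formulas)
    total = sum(counts.values())
    return {cls: 100.0 * n / total for cls, n in counts.items()}
```

In the study, CHOS and CHNOS are reported together as "sulfur-containing"; merging those two keys of the returned dict reproduces that grouping.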
Retrievals of Cloud Droplet Size from the RSP Data: Validation Using in Situ Measurements
NASA Technical Reports Server (NTRS)
Alexandrov, Mikhail D.; Cairns, Brian; Sinclair, Kenneth; Wasilewski, Andrzej P.; Ziemba, Luke; Crosbie, Ewan; Hair, John; Hu, Yongxiang; Hostetler, Chris; Stamnes, Snorre
2016-01-01
We present comparisons of cloud droplet size distributions retrieved from Research Scanning Polarimeter (RSP) data with correlative in situ measurements made during the North Atlantic Aerosols and Marine Ecosystems Study (NAAMES). This field experiment was based at St. John's airport, Newfoundland, Canada, with the latest deployment in May and June 2016. RSP was onboard the NASA C-130 aircraft together with an array of in situ and other remote sensing instrumentation. The RSP is an along-track scanner measuring polarized and total reflectances in 9 spectral channels. Its unique high angular resolution allows for characterization of liquid water droplet size using the rainbow structure observed in the polarized reflectances in the scattering angle range between 135 and 165 degrees. A parametric fitting algorithm applied to the polarized reflectances provides retrievals of the droplet effective radius and variance assuming a prescribed size distribution shape (gamma distribution). In addition, we use a non-parametric method, the Rainbow Fourier Transform (RFT), which allows us to retrieve the droplet size distribution (DSD) itself. The latter is important in the case of clouds with complex structure, which results in multi-modal DSDs. During NAAMES the aircraft performed a number of flight patterns specifically designed for comparison of remote sensing retrievals and in situ measurements. These patterns consisted of two flight segments above the same straight ground track: one segment was flown above the clouds, allowing for remote sensing measurements, while the other was at cloud top, where cloud droplets were sampled. We compare the DSDs retrieved from the RSP data with in situ measurements made by the Cloud Droplet Probe (CDP). The comparisons show generally good agreement, with deviations explainable by the position of the aircraft within the cloud and by the presence of additional cloud layers in the RSP view that do not contribute to the in situ DSDs.
In the latter case, the distributions retrieved from the RSP data were consistent with the multi-layer cloud structures observed in the correlative High Spectral Resolution Lidar (HSRL) profiles. The comparison results provide a rare validation of polarimetric droplet size retrieval techniques, which can be used for analysis of satellite data on a global scale.
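The effective radius and effective variance that the parametric fit retrieves are standard moments of the droplet size distribution (the Hansen-Travis definitions, assumed here). A short sketch of those definitions on a discretized DSD, with a synthetic gamma distribution for illustration:

```python
import numpy as np

def effective_radius_variance(r, n):
    """Effective radius and effective variance of a droplet size
    distribution n(r) sampled on a radius grid r (Hansen-Travis
    moment definitions):
      r_eff = int r^3 n dr / int r^2 n dr
      v_eff = int (r - r_eff)^2 r^2 n dr / (r_eff^2 int r^2 n dr)"""
    a2 = np.trapz(n * r**2, r)          # second moment (~ cross-section)
    a3 = np.trapz(n * r**3, r)
    r_eff = a3 / a2
    v_eff = np.trapz(n * (r - r_eff)**2 * r**2, r) / (r_eff**2 * a2)
    return r_eff, v_eff
```

For the gamma DSD n(r) proportional to r**((1 - 3b)/b) * exp(-r/(a*b)), these moments recover r_eff = a and v_eff = b, which is why the parametric retrieval can be phrased directly in terms of those two numbers.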
NASA Astrophysics Data System (ADS)
Tanaka, Kunihiko; Oka, Tomoharu; Nagai, Makoto; Kamegai, Kazuhisa
2015-08-01
The central 400 pc region of the Milky Way Galaxy is the closest galactic central region to us, providing a unique opportunity to investigate in detail gas dynamics, star formation activity, and chemistry under the extreme environment of galactic centers, where the presence of a bar, intense UV and cosmic-ray fluxes, and a high degree of turbulence may significantly affect those processes. We report the results of molecular line surveys toward the Milky Way's central molecular zone (CMZ) performed with the ASTE 10 m telescope, the Mopra 22 m telescope, and the Nobeyama 45 m telescope. With observations of the 500 GHz [CI] fine-structure line of atomic carbon (C0), we have found a molecular cloud structure with remarkably bright [CI] emission in the Sgr A complex in the innermost 20 pc region. The [CI] cloud is more extended than the GMCs in the region and appears to connect the northern part of the 50 km s-1 cloud (M-0.02-0.07) and the circumnuclear disk (CND), though no corresponding structures are visible in other molecular lines. The [C0]/[CO] abundance ratio is measured to be 0.5-2, which is 2-10 times that measured toward clouds at larger Galactic radii. This high ratio is close to the values measured toward the centers of galaxies with starbursts and AGN, suggesting that the chemical state of the cloud is similar to that in those active galaxies. We have also found a large-scale gradient of the cyano radical (CN) abundance toward the Galactic center within the innermost 100 pc radius, peaking near the Sgr A complex. We suggest that the cloud with high C0 and CN abundances is a feature formed as a result of inward transfer of diffuse molecular gas by the bar potential in the inner Galaxy, in which a PDR-like chemical composition remains preserved, and that the [CI] cloud could thus be closely related to the formation of the GMCs and to star formation in the CMZ. We also discuss other possible mechanisms to enhance the C0 and CN abundances, including an enhanced cosmic-ray dissociation rate.
Photogrammetric Characterization of a Brownout Cloud
NASA Technical Reports Server (NTRS)
Tanner, Philip E.
2011-01-01
Brownout is a dangerous problem for rotorcraft operating in arid and dusty environments such as the current operating theaters in Iraq and Afghanistan. Although interest in brownout has increased in the past decade, the fundamental physics that govern the shape and size of the cloud are not yet well understood. Many computational and scaled experimental studies have been performed in an attempt to further this understanding and to simulate and predict brownout cloud formation. However, the phenomenon significantly lacks experimental data, particularly at full scale, which is needed to help validate the brownout simulations being performed. In an effort to expand the data set needed for this validation, tests were performed at the US Army Yuma Proving Ground using photogrammetry to obtain brownout cloud data for an EH-60L Black Hawk. Particle testing was performed on a sample of sand from the landing zone to gain more understanding of the nature of the soil. The photogrammetry technique applied to obtaining data on the formation and evolution of a brownout cloud was verified in an earlier study. The data for a landing approach were examined in greater detail, enabling the velocity components of points on the cloud, as well as the dimensions of structures within the cloud, to be determined.
NASA Astrophysics Data System (ADS)
Schulz, Christiane; Schneider, Johannes; Mertes, Stephan; Kästner, Udo; Weinzierl, Bernadett; Sauer, Daniel; Fütterer, Daniel; Walser, Adrian; Borrmann, Stephan
2015-04-01
Airborne measurements of submicron aerosol and cloud particles were conducted in the region of Manaus (Amazonas, Brazil) during the ACRIDICON-CHUVA campaign in September 2014. ACRIDICON-CHUVA aimed at the investigation of convective cloud systems in order to gain a better understanding and quantification of aerosol-cloud interactions and the radiative effects of convective clouds. For that purpose, data from airborne measurements within convective cloud systems are combined with satellite and ground-based data. We used a C-ToF-AMS (Compact Time-of-Flight Aerosol Mass Spectrometer) to obtain information on aerosol composition and vertical profiles of different aerosol species, such as organics, sulphate, nitrate, ammonium and chloride. The instrument was operated behind two different inlets: the HASI (HALO Aerosol Submicrometer Inlet) samples aerosol particles, whereas the CVI (Counterflow Virtual Impactor) samples cloud droplets and ice particles during in-cloud measurements, so that cloud residual particles can be analyzed. Differences in aerosol composition inside and outside of clouds, and cloud properties over forested versus deforested regions, were investigated. Additionally, the in- and outflow of convective clouds was sampled on dedicated cloud missions in order to study the evolution of the clouds and the processing of aerosol particles. First results show high organic aerosol mass concentrations (typically 15 μg/m3 and, during one flight, up to 25 μg/m3). Although high amounts of organic aerosol were expected in tropical air over rainforest regions, such high mass concentrations were not anticipated. In addition, high sulphate aerosol mass concentrations (about 4 μg/m3) were measured at low altitudes (up to 5 km). During some flights, organic and nitrate aerosol was observed with higher mass concentrations at high altitudes (10-12 km) than at lower altitudes, indicating redistribution of boundary layer particles by convection.
The cloud residuals measured during in-cloud sampling through the CVI contained mainly organic material and, to a lesser extent, nitrate.
Using a cloud to replenish parched groundwater modeling efforts.
Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B
2010-01-01
Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.
Interannual variability of high ice cloud properties over the tropics
NASA Astrophysics Data System (ADS)
Tamura, S.; Iwabuchi, H.
2015-12-01
The El Niño/Southern Oscillation (ENSO) affects atmospheric conditions and cloud physical properties such as cloud fraction (CF) and cloud top height (CTH). However, the impact of ENSO on the physical properties of high ice clouds is not well known; this study therefore attempts to reveal the relationship between the variability of ice cloud physical properties and ENSO. Ice clouds are inferred with a multiband IR method. They are categorized in terms of cloud optical thickness (COT) as thin (0.1 < COT < 0.3), opaque (0.3 < COT < 3.6), thick (3.6 < COT < 11), and deep convective (DC) (11 < COT) clouds, and the relationship between ENSO and the interannual variability of cloud physical properties is investigated for each category over the period from January 2003 to December 2014. The deseasonalized anomalies of CF and CTH in all categories correlate well with the Niño3.4 index, with positive anomalies over the eastern Pacific and negative anomalies over the western Pacific during El Niño conditions. However, the global distribution of these correlation coefficients differs by cloud category. For example, the CF of DC clouds correlates well with the Niño3.4 index over the convergence zone, whereas that of thin clouds shows high correlation extending from the convergence zone to high latitudes, suggesting a connection with cloud formation. The global distributions of the average rate of change also differ by cloud category, because the categories associate differently with ENSO and a gradual trend toward La Niña conditions occurred over the analysis period. Detailed results on the relationship between the variability of cloud physical properties and atmospheric conditions will be shown at the conference.
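The four-way categorization by cloud optical thickness is a simple binning; a sketch of it as a lookup function (the treatment of values below 0.1 and of the exact boundaries is an assumption, since the abstract uses strict inequalities):

```python
def cot_category(cot):
    """Classify an ice cloud by cloud optical thickness (COT) using the
    study's thresholds. Values below the thin-cloud lower bound return
    None; boundary values are assigned to the thicker class here, which
    is an assumption."""
    if cot < 0.1:
        return None
    if cot < 0.3:
        return "thin"
    if cot < 3.6:
        return "opaque"
    if cot < 11:
        return "thick"
    return "deep convective"
```

Applying this per pixel and averaging the resulting indicator fields over each month is what yields the per-category cloud fraction anomalies discussed above.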
NASA Astrophysics Data System (ADS)
Wiacek, A.; Peter, T.; Lohmann, U.
2010-09-01
This modelling study explores the availability of mineral dust particles as ice nuclei for interactions with ice, mixed-phase and liquid water clouds, also tracking the particles' history of cloud-processing. We performed 61,320 one-week forward trajectory calculations originating near the surface of major dust emitting regions in Africa and Asia, using high-resolution meteorological analysis fields for the year 2007. Dust-bearing trajectories were assumed to be those coinciding with known dust emission seasons, without explicitly modelling dust emission and deposition processes. We found that dust emissions from Asian deserts lead to a higher potential for interactions with high ice clouds, despite being the climatologically much smaller dust emission source. This is because the Asian regions experience significantly more ascent than the African regions, with the strongest ascent in the Asian Taklimakan desert, where ~25%, ~40% and ~10% of trajectories ascend to 300 hPa in spring, summer and fall, respectively. The specific humidity at each trajectory's starting point was transported in a Lagrangian manner, and relative humidities with respect to water and ice were calculated in 6-h steps downstream, allowing us to estimate the formation of liquid, mixed-phase and ice clouds. Downstream of the investigated dust sources, practically none of the simulated air parcels reached conditions of homogeneous ice nucleation (T≲-40 °C) along trajectories that had not experienced water saturation first. By far the largest fraction of cloud forming trajectories entered conditions of mixed-phase clouds, where mineral dust will potentially exert the biggest influence. The majority of trajectories also passed through atmospheric regions supersaturated with respect to ice but subsaturated with respect to water, where so-called "warm ice clouds" (T≳-40 °C) theoretically may form prior to supercooled water or mixed-phase clouds.
The importance of "warm ice clouds" and the general influence of dust in the mixed-phase cloud region are highly uncertain due to both a considerable scatter in recent laboratory data from ice nucleation experiments, which we briefly review in this work, and due to uncertainties in sub-grid scale vertical transport processes unresolved by the present trajectory analysis. For "classical" cirrus-forming temperatures (T≲-40 °C), our results show that only mineral dust ice nuclei that underwent mixed-phase cloud-processing, most likely acquiring coatings of organic or inorganic material, are likely to be relevant. While the potential paucity of deposition ice nuclei shown in this work diminishes the possibility of deposition nucleation, the absence of liquid water droplets at T≲-40 °C makes the less explored contact freezing mechanism (involving droplet collisions with bare ice nuclei) highly inefficient. These factors together indicate the necessity of further systematic studies of immersion mode ice nucleation on mineral dust suspended in atmospherically relevant coatings.
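The step of converting the Lagrangian-transported specific humidity into relative humidities over water and over ice can be sketched as follows. This is a generic reconstruction, not the study's code: it assumes the Magnus-type saturation vapor pressure formulas and the standard conversion of specific humidity to vapor pressure.

```python
import numpy as np

def sat_vapor_pressure_water(t_c):
    """Saturation vapor pressure over liquid water (hPa), Magnus form,
    temperature in deg C."""
    return 6.112 * np.exp(17.62 * t_c / (243.12 + t_c))

def sat_vapor_pressure_ice(t_c):
    """Saturation vapor pressure over ice (hPa), Magnus form."""
    return 6.112 * np.exp(22.46 * t_c / (272.62 + t_c))

def relative_humidities(q, p_hpa, t_c):
    """RH with respect to water and to ice (as fractions) from specific
    humidity q (kg/kg), pressure (hPa) and temperature (deg C)."""
    # actual vapor pressure from specific humidity (epsilon = 0.622)
    e = q * p_hpa / (0.622 + 0.378 * q)
    return e / sat_vapor_pressure_water(t_c), e / sat_vapor_pressure_ice(t_c)
```

Because the saturation pressure over ice is lower than over water below 0 °C, a parcel can be ice-supersaturated (RH_ice > 1) while still water-subsaturated, which is exactly the "warm ice cloud" regime discussed above.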
NASA Technical Reports Server (NTRS)
Matsui, Toshihisa; Masunaga, Hirohiko; Kreidenweis, Sonia M.; Pielke, Roger A., Sr.; Tao, Wei-Kuo; Chin, Mian; Kaufman, Yoram J.
2006-01-01
This study examines variability in marine low cloud properties derived from semi-global observations by the Tropical Rainfall Measuring Mission (TRMM) satellite, as linked to the aerosol index (AI) and lower-tropospheric stability (LTS). AI is derived from the Moderate Resolution Imaging Spectroradiometer (Terra MODIS) sensor and the Goddard Chemistry Aerosol Radiation and Transport (GOCART) model, and is used to represent column-integrated aerosol concentrations. LTS is derived from the NCEP/NCAR reanalysis, and represents the background thermodynamic environment in which the clouds form. Global statistics reveal that cloud droplet size tends to be smallest in polluted (high-AI) and strong inversion (high-LTS) environments. Statistical quantification shows that cloud droplet size is better correlated with AI than it is with LTS. Simultaneously, the cloud liquid water path (CLWP) tends to decrease as AI increases. This correlation does not support the hypothesis or assumption that constant or increased CLWP is associated with high aerosol concentrations. Global variability in corrected cloud albedo (CCA), the product of cloud optical depth and cloud fraction, is very well explained by LTS, while both AI and LTS are needed to explain local variability in CCA. Most of the local correlations between AI and cloud properties are similar to the results from the global statistics, while weak anomalous aerosol-cloud correlations appear locally in the regions where simultaneous high (low) AI and low (high) LTS compensate each other. Daytime diurnal cycles explain additional variability in cloud properties. CCA has the largest diurnal cycle in high-LTS regions. Cloud droplet size and CLWP have weak diurnal cycles that differ between clean and polluted environments.
The combined results suggest that investigations of marine low cloud radiative forcing and its relationship to hypothesized aerosol indirect effects must consider the combined effects of aerosols, thermodynamics, and the diurnal cycle.
NASA Astrophysics Data System (ADS)
Turtle, E. P.; Barnes, J. W.; Perry, J.; Barbara, J.; Hayes, A.; Corlies, P.; Kelland, J.; West, R. A.; Del Genio, A. D.; Soderblom, J. M.; McEwen, A. S.; Sotin, C.
2016-12-01
As northern summer approaches, atmospheric circulation models predict storm activity will pick up at Titan's high northern latitudes, as was observed at high southern latitudes upon Cassini's arrival during late southern summer in 2004. Cassini's Imaging Science Subsystem (ISS) and Visual and Infrared Mapping Spectrometer (VIMS) teams have been targeting Titan to document changes in weather patterns over the course of the mission, and there is particular interest in following the onset of clouds in the north polar region where Titan's lakes and seas are concentrated. The T120 and T121 flybys of Titan, on 7 June and 25 July 2016, respectively, provided views of high northern latitudes, and each instrument performed a series of observations over more than 24 hours during both flybys. Intriguingly, at first look the ISS and VIMS observations appear strikingly different from each other: in the ISS observations made during each flyby, surface features are apparent and only a few isolated clouds are detected; however, the VIMS observations suggest widespread cloud cover at high northern latitudes during both flybys. Although the instruments achieve different resolutions, that alone cannot explain the differences. The observations were made over the same time periods, so differences in illumination geometry or changes in the clouds themselves are also unlikely to be the cause for the apparent discrepancy; VIMS shows persistent atmospheric features over the entire observation period and ISS consistently detects surface features with just a few localized clouds. Clouds with low optical depth (lower than the optical depth of Titan's atmospheric haze at the same wavelength) might be more easily apparent at the longer wavelengths of the VIMS observations, which extend out to 5 µm (haze optical depth 0.2), compared to the ISS observations at 938 nm (haze optical depth 2). 
However, the lack of any apparent change in the visibility of lakes and seas in the ISS images compared to previous flybys where no clouds were observed is still difficult to explain. We will present our analyses of the sequences of observations made by ISS and VIMS during T120 and T121, as well as an ongoing ground-based observing campaign (including data from 8 June and 23 July), and the implications for the behavior of Titan's atmosphere leading up to northern summer.
NASA Astrophysics Data System (ADS)
Yu, Xiaoyuan; Yuan, Jian; Chen, Shi
2013-03-01
Cloud computing is one of the most popular topics in the IT industry and has recently been adopted by many companies. It has four deployment models: public cloud, community cloud, hybrid cloud and private cloud. Among these, a private cloud can be implemented within a private network and delivers some of the benefits of cloud computing without its pitfalls. This paper compares typical open source platforms with which a private cloud can be implemented. After this comparison, we choose Eucalyptus and Wavemaker for a case study on the private cloud. We also carry out performance estimation of cloud platform services and develop prototype software delivered as cloud services.
Aerosol partitioning in mixed-phase clouds at the Jungfraujoch (3580 m asl)
NASA Astrophysics Data System (ADS)
Henning, S.; Bojinski, S.; Diehl, K.; Ghan, S.; Nyeki, S.; Weingartner, E.; Wurzler, S.; Baltensperger, U.
2003-04-01
Field measurements of the partitioning between the interstitial and the liquid/ice phase in natural clouds were performed at the high-alpine research station Jungfraujoch (3580 m asl, Switzerland) during a summer and a winter campaign. The size distributions of the total and the interstitial aerosol were determined by means of a scanning mobility particle sizer (SMPS), and size-resolved scavenging ratios were calculated from them. Simultaneously, cloud water content (CWC) and cloud particle size distributions, along with meteorological data, were obtained. In cold mixed-phase clouds (consisting of liquid droplets and ice crystals), strong differences were found in comparison to the warm summer clouds. In the warm cloud types, all particles above a certain diameter were activated, so the scavenging ratio (the number of activated particles divided by the total number concentration) approached 1 above a certain threshold diameter. In the winter clouds, the scavenging ratio never reached 1 and could be as low as 0. These observations are explained by the Bergeron-Findeisen process: particles are also activated to droplets in a first step, but after the formation of the ice phase the droplets evaporate while the ice crystals grow, due to the difference in saturation vapor pressure over water and over ice. This release of aerosol particles to the interstitial aerosol has significant implications for climate forcing: it can be expected that the number of CCN is of less importance as soon as ice crystals are formed.
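The size-resolved scavenging ratio follows directly from the two SMPS size distributions: in each size bin it is the activated fraction, i.e. one minus the interstitial-to-total number ratio. A minimal sketch (bin handling and the NaN convention for empty bins are assumptions, not from the paper):

```python
import numpy as np

def scavenging_ratio(n_total, n_interstitial):
    """Size-resolved scavenging ratio from total and interstitial number
    size distributions on the same size bins. Returns the fraction of
    particles per bin incorporated into cloud droplets/ice crystals;
    bins with no particles yield NaN."""
    n_tot = np.asarray(n_total, dtype=float)
    n_int = np.asarray(n_interstitial, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        ratio = np.where(n_tot > 0, 1.0 - n_int / n_tot, np.nan)
    # measurement noise can push the raw value slightly outside [0, 1]
    return np.clip(ratio, 0.0, 1.0)
```

In a warm cloud the ratio rises to 1 above the activation diameter; the Bergeron-Findeisen release of particles back to the interstitial phase shows up as ratios well below 1 at all sizes in the winter clouds.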
NAFFS: network attached flash file system for cloud storage on portable consumer electronics
NASA Astrophysics Data System (ADS)
Han, Lin; Huang, Hao; Xie, Changsheng
Cloud storage technology has become a research hotspot in recent years, but existing cloud storage services are mainly designed for data storage needs with a stable, high-speed Internet connection. Mobile Internet connections are often unstable and relatively slow, and these native features of the mobile Internet limit the use of cloud storage in portable consumer electronics. The Network Attached Flash File System (NAFFS) presents the idea of using a portable device's built-in NAND flash memory as the front-end cache of a virtualized cloud storage device. Modern portable devices with Internet connections have more than 1 GB of built-in NAND flash, which is quite enough for daily data storage. The data transfer rate of a NAND flash device is much higher than that of mobile Internet connections [1], and its non-volatile nature makes it very suitable as a cache device for Internet cloud storage on portable devices, which often have an unstable power supply and intermittent Internet connections. In the present work, NAFFS is evaluated with several benchmarks, and its performance is compared with traditional network attached file systems, such as NFS. Our evaluation results indicate that NAFFS achieves an average access speed of 3.38 MB/s, which is about 3 times faster than directly accessing cloud storage over a mobile Internet connection, and offers a more stable interface than directly using a cloud storage API. Unstable Internet connections and sudden power-off conditions are tolerable, and no data in the cache are lost in such situations.
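The general scheme of a non-volatile front-end cache for cloud storage can be sketched as a write-back cache: writes land in local flash first and are uploaded when a connection is available, while reads are served locally whenever possible. This is an illustrative sketch of that scheme only; the class, its methods, and the dict-backed "cloud" are hypothetical and do not reflect NAFFS's actual on-flash layout or API.

```python
import collections

class WriteBackCache:
    """Illustrative write-back cache in front of a slow cloud store.
    On a real device, `flash` and `dirty` would live in non-volatile
    NAND, so pending uploads survive a sudden power-off."""

    def __init__(self, cloud):
        self.cloud = cloud                       # dict-like cloud store (assumed)
        self.flash = {}                          # local NAND cache
        self.dirty = collections.OrderedDict()   # blocks awaiting upload

    def write(self, key, data):
        self.flash[key] = data
        self.dirty[key] = True                   # upload deferred

    def read(self, key):
        if key in self.flash:
            return self.flash[key]               # fast local hit
        data = self.cloud[key]                   # slow network fetch
        self.flash[key] = data                   # populate cache
        return data

    def flush(self):
        """Upload pending blocks, oldest first, when the link is up."""
        while self.dirty:
            key, _ = self.dirty.popitem(last=False)
            self.cloud[key] = self.flash[key]
```

The stable interface the abstract mentions falls out of this structure: the application always talks to local flash at flash speed, and the intermittent network is hidden behind the deferred flush.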
Electron temperatures within magnetic clouds between 2 and 4 AU: Voyager 2 observations
NASA Astrophysics Data System (ADS)
Sittler, E. C.; Burlaga, L. F.
1998-08-01
We have performed an analysis of Voyager 2 plasma electron observations within magnetic clouds between 2 and 4 AU identified by Burlaga and Behannon [1982]. The analysis has been confined to three of the magnetic clouds identified by Burlaga and Behannon that had high-quality data. The general properties of the plasma electrons within a magnetic cloud are that (1) the moment electron temperature anticorrelates with the electron density within the cloud, (2) the ratio Te/Tp tends to be >1, and (3) on average, Te/Tp ~ 7.0. All three results are consistent with previous electron observations within magnetic clouds. Detailed analyses of the core and halo populations within the magnetic clouds show no evidence of either an anticorrelation between the core temperature TC and the electron density Ne or an anticorrelation between the halo temperature TH and the electron density. Within the magnetic clouds the halo component can contribute more than 50% of the electron pressure. The anticorrelation of Te relative to Ne can be traced to the density of the halo component relative to the density of the core component. The core electrons dominate the electron density. When the density goes up, the halo electrons contribute less to the electron pressure, so we get a lower Te. When the electron density goes down, the halo electrons contribute more to the electron pressure, and Te goes up. We find a relation between the electron pressure and density of the form Pe = α Ne^γ with γ ~ 0.5.
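A power-law relation of the form Pe = α Ne^γ is typically estimated by a least-squares fit in log-log space, where it becomes linear: log Pe = log α + γ log Ne. A minimal sketch of that fit (generic technique, not the authors' code):

```python
import numpy as np

def fit_power_law(ne, pe):
    """Fit Pe = alpha * Ne**gamma by linear least squares in
    log-log space; returns (alpha, gamma)."""
    gamma, log_alpha = np.polyfit(np.log(ne), np.log(pe), 1)
    return np.exp(log_alpha), gamma
```

A fitted gamma near 0.5 corresponds to a polytrope Pe proportional to Ne^0.5, i.e. softer than either isothermal (gamma = 1) or adiabatic (gamma = 5/3) behavior.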
Liu, Li; Chen, Weiping; Nie, Min; Zhang, Fengjuan; Wang, Yu; He, Ailing; Wang, Xiaonan; Yan, Gen
2016-11-01
To handle the emergence of the regional healthcare ecosystem, physicians and surgeons in various departments and healthcare institutions must process medical images securely, conveniently, and efficiently, and must integrate them with electronic medical records (EMRs). In this manuscript, we propose a software as a service (SaaS) cloud called the iMAGE cloud. A three-layer hybrid cloud was created to provide medical image processing services in the smart city of Wuxi, China, in April 2015. In the first step, medical images and EMR data were received and integrated via the hybrid regional healthcare network. Then, traditional and advanced image processing functions were proposed and computed in a unified manner in the high-performance cloud units. Finally, the image processing results were delivered to regional users using virtual desktop infrastructure (VDI) technology. Security infrastructure was also taken into consideration. Integrated information query and many advanced medical image processing functions, such as coronary extraction, pulmonary reconstruction, vascular extraction, intelligent detection of pulmonary nodules, image fusion, and 3D printing, were available to local physicians and surgeons in various departments and healthcare institutions. Implementation results indicate that the iMAGE cloud can provide convenient, efficient, compatible, and secure medical image processing services in regional healthcare networks. The iMAGE cloud has been proven to be valuable in applications in the regional healthcare system, and it could have a promising future in the healthcare system worldwide.
NASA Astrophysics Data System (ADS)
Taylor, R.; Wünsch, R.; Palouš, J.
2018-05-01
Most detected neutral atomic hydrogen (HI) at low redshift is associated with optically bright galaxies. However, a handful of HI clouds are known which appear to be optically dark and have no nearby potential progenitor galaxies, making tidal debris an unlikely explanation. In particular, 6 clouds identified by the Arecibo Galaxy Environment Survey are interesting due to the combination of their small size, isolation, and especially their broad line widths atypical of other such clouds. A recent suggestion is that these clouds exist in pressure equilibrium with the intracluster medium, with the line width arising from turbulent internal motions. Here we explore that possibility by using the FLASH code to perform a series of 3D hydro simulations. Our clouds are modelled using spherical Gaussian density profiles, embedded in a hot, low-density gas representing the intracluster medium. The simulations account for heating and cooling of the gas, and we vary the structure and strength of their internal motions. We create synthetic HI spectra, and find that none of our simulations reproduce the observed cloud parameters for longer than ~100 Myr: the clouds either collapse, disperse, or experience rapid heating which would cause ionisation and render them undetectable to HI surveys. While the turbulent motions required to explain the high line widths generate structures which appear to be inherently unstable, making this an unlikely explanation for the observed clouds, these simulations demonstrate the importance of including the intracluster medium in any model seeking to explain the existence of these objects.
Point Cloud Management Through the Realization of the Intelligent Cloud Viewer Software
NASA Astrophysics Data System (ADS)
Costantino, D.; Angelini, M. G.; Settembrini, F.
2017-05-01
The paper presents software dedicated to the processing of point clouds, called Intelligent Cloud Viewer (ICV), made in-house by AESEI software (a spin-off of Politecnico di Bari), which allows viewing point clouds of several tens of millions of points, even on systems without very high performance. Operations are carried out on the whole point cloud but, to speed up rendering, only part of it is displayed. ICV is designed for 64-bit Windows, is fully written in C++, and integrates specialized modules for computer graphics (Open Inventor by SGI, Silicon Graphics Inc.), mathematics (BLAS, EIGEN), computational geometry (CGAL, Computational Geometry Algorithms Library), registration and advanced point cloud algorithms (PCL, Point Cloud Library), advanced data structures (BOOST, Basic Object Oriented Supporting Tools), and more. ICV incorporates a number of features such as cropping, transformation and georeferencing, matching, registration, decimation, sections, and distance calculation between clouds. It has been tested on photographic and TLS (Terrestrial Laser Scanner) data, with satisfactory results. The potential of the software was tested by carrying out a photogrammetric survey of Castel del Monte, for which a previous ground-based laser scanner survey by the same authors was already available. The aerophotogrammetric survey was flown at approximately 1000 ft AGL (Above Ground Level); overall, more than 800 photos were acquired in just over 15 minutes, with an overlap of not less than 80%, at a planned speed of about 90 knots.
Sukič, Primož; Štumberger, Gorazd
2017-05-13
Clouds moving at high speed in front of the Sun can cause step changes in the output power of photovoltaic (PV) power plants, which can lead to voltage fluctuations and stability problems in the connected electricity networks. These effects can be reduced effectively by proper short-term cloud-passing forecasting and suitable PV power plant output power control. This paper proposes a low-cost Internet of Things (IoT)-based solution for intra-minute cloud-passing forecasting. The hardware consists of a Raspberry Pi 3 Model B with a WiFi connection and an OmniVision OV5647 sensor with a mounted wide-angle lens, a circular polarizing (CPL) filter, and a neutral density (ND) filter. The completely new algorithm for cloud-passing forecasting uses the green and blue colors in the photo to determine the position of the Sun, to recognize the clouds, and to predict their movement. The image processing is performed in several stages, selectively considering only the small part of the photo relevant to the movement of the clouds in the vicinity of the Sun in the next minute. The proposed algorithm is compact, fast, and suitable for implementation on low-cost processors with low computational power. The speed of the cloud parts closest to the Sun is used to predict when the clouds will cover the Sun. WiFi communication is used to transmit this data to the PV power plant control system in order to decrease the output power slowly and smoothly.
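The color-ratio idea behind the cloud recognition step can be illustrated with a minimal sketch. The threshold values, the saturation test used to locate the Sun, and the synthetic scene below are illustrative assumptions, not the paper's actual algorithm:

```python
import numpy as np

def cloud_mask(img, ratio_thresh=0.85):
    """Flag pixels whose green/blue ratio exceeds a threshold.

    Clear sky scatters strongly in blue, so G/B is low; white or gray
    clouds have nearly equal channels, so G/B approaches 1. The
    threshold here is an illustrative value, not from the paper.
    """
    g = img[..., 1].astype(float)
    b = img[..., 2].astype(float) + 1e-6  # avoid division by zero
    return (g / b) > ratio_thresh

def sun_position(img):
    """Locate the Sun as the centroid of saturated pixels (an assumption)."""
    bright = img.min(axis=-1) > 250       # near-white, saturated region
    ys, xs = np.nonzero(bright)
    if xs.size == 0:
        return None
    return ys.mean(), xs.mean()

# Synthetic demo: blue sky with a gray cloud patch and a saturated "Sun"
sky = np.zeros((100, 100, 3), dtype=np.uint8)
sky[..., 2] = 200                         # blue channel dominant: clear sky
sky[..., 1] = 90
sky[30:40, 30:40] = 220                   # gray cloud: channels roughly equal
sky[70:75, 70:75] = 255                   # saturated Sun disk
mask = cloud_mask(sky)
```

A real implementation would then track the masked cloud pixels between consecutive frames to estimate their velocity toward the Sun's position.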
Cloud cover analysis associated to cut-off low-pressure systems over Europe using Meteosat Imagery
NASA Astrophysics Data System (ADS)
Delgado, G.; Redaño, A.; Lorente, J.; Nieto, R.; Gimeno, L.; Ribera, P.; Barriopedro, D.; García-Herrera, R.; Serrano, A.
2007-04-01
This paper reports a cloud cover analysis of cut-off low-pressure systems (COLs) using a pattern recognition method applied to IR and VIS bispectral histograms. 35 COL occurrences were studied over five years (1994-1998). Five cloud types were identified in COLs, of which high clouds (HCC) and deep convective clouds (DCC) were found to be the most relevant for characterizing COL systems, though not the most numerous. Cloud cover in a COL is highly dependent on its stage of development, but a higher percentage of cloud cover is always present in the frontal zone, attributable to higher amounts of high and deep convective clouds. These general characteristics are most marked during the first stage (when the amplitude of the geopotential wave increases) and the second stage (characterized by the development of a cold upper-level low), while the closed cyclonic circulation minimizes the differences between the rearward and frontal zones during the third stage. The probability of heavy rains during this stage decreases considerably. The centres of mass of the high and deep convective clouds move towards the COL-axis centre during COL evolution.
Modeling the partitioning of organic chemical species in cloud phases with CLEPS (1.1)
NASA Astrophysics Data System (ADS)
Rose, Clémence; Chaumerliac, Nadine; Deguillaume, Laurent; Perroux, Hélène; Mouchel-Vallon, Camille; Leriche, Maud; Patryl, Luc; Armand, Patrick
2018-02-01
The new detailed aqueous-phase mechanism Cloud Explicit Physico-chemical Scheme (CLEPS 1.0), which describes the oxidation of isoprene-derived water-soluble organic compounds, is coupled with a warm microphysical module simulating the activation of aerosol particles into cloud droplets. CLEPS 1.0 was then extended to CLEPS 1.1 to include the chemistry of the newly added dicarboxylic acids dissolved from the particulate phase. The resulting coupled model allows the prediction of the aqueous-phase concentrations of chemical compounds originating from particle scavenging, mass transfer from the gas phase, and in-cloud aqueous chemical reactivity. The aim of the present study was, more specifically, to investigate the effect of particle scavenging on cloud chemistry. Several simulations were performed to assess the influence of various parameters on model predictions and to interpret long-term measurements conducted at the top of Puy de Dôme (PUY, France) in marine air masses. Specific attention was paid to carboxylic acids, whose predicted concentrations are on average in the lower range of the observations, with the exception of formic acid, which is rather overestimated by the model. The different sensitivity runs highlight the fact that formic and acetic acids mainly originate from the gas phase and have highly variable aqueous-phase reactivity depending on the cloud acidity, whereas C3-C4 carboxylic acids mainly originate from the particulate phase and are supersaturated in the cloud.
Study on the application of mobile internet cloud computing platform
NASA Astrophysics Data System (ADS)
Gong, Songchun; Fu, Songyin; Chen, Zheng
2012-04-01
The innovative development of computer technology has promoted the application of the cloud computing platform, which is in essence a resource-service model that meets users' needs for different resources after adjustments along multiple dimensions. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in operation. The popularization of computer technology has driven the creation of digital library models, whose core idea is to strengthen the management of library resource information through computers and to construct a high-performance query and search platform, allowing users to access the necessary information resources at any time. Cloud computing, in turn, distributes computations across a large number of networked computers and thereby implements a connected service over multiple machines. Digital libraries, as a typical representative of cloud computing applications, can thus be used to analyze the key technologies of cloud computing.
A Deep Machine Learning Algorithm to Optimize the Forecast of Atmospherics
NASA Astrophysics Data System (ADS)
Russell, A. M.; Alliss, R. J.; Felton, B. D.
Space-based applications from imaging to optical communications are significantly impacted by the atmosphere. Specifically, the occurrence of clouds and optical turbulence can determine whether a mission is a success or a failure. In the case of space-based imaging applications, clouds produce atmospheric transmission losses that can make it impossible for an electro-optical platform to image its target. Hence, accurate predictions of negative atmospheric effects are a high priority in order to facilitate the efficient scheduling of resources. This study seeks to revolutionize our understanding of and our ability to predict such atmospheric events through the mining of data from a high-resolution Numerical Weather Prediction (NWP) model. Specifically, output from the Weather Research and Forecasting (WRF) model is mined using a Random Forest (RF) ensemble classification and regression approach in order to improve the prediction of low cloud cover over the Haleakala summit of the Hawaiian island of Maui. RF techniques have a number of advantages including the ability to capture non-linear associations between the predictors (in this case physical variables from WRF such as temperature, relative humidity, wind speed and pressure) and the predictand (clouds), which becomes critical when dealing with the complex non-linear occurrence of clouds. In addition, RF techniques are capable of representing complex spatial-temporal dynamics to some extent. Input predictors to the WRF-based RF model are strategically selected based on expert knowledge and a series of sensitivity tests. Ultimately, three types of WRF predictors are chosen: local surface predictors, regional 3D moisture predictors and regional inversion predictors. A suite of RF experiments is performed using these predictors in order to evaluate the performance of the hybrid RF-WRF technique. The RF model is trained and tuned on approximately half of the input dataset and evaluated on the other half. 
The RF approach is validated using in-situ observations of clouds. All of the hybrid RF-WRF experiments demonstrated here significantly outperform the base WRF local low cloud cover forecasts in terms of the probability of detection and the overall bias. In particular, RF experiments that use only regional three-dimensional moisture predictors from the WRF model produce the highest accuracy when compared to RF experiments that use local surface predictors only or regional inversion predictors only. Furthermore, adding multiple types of WRF predictors and additional WRF predictors to the RF algorithm does not necessarily add more value in the resulting forecasts, indicating that it is better to have a small set of meaningful predictors than to have a vast set of indiscriminately-chosen predictors. This work also reveals that the WRF-based RF approach is highly sensitive to the time period over which the algorithm is trained and evaluated. Future work will focus on developing a similar WRF-based RF model for high cloud prediction and expanding the algorithm to two-dimensions horizontally.
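The hybrid RF-WRF setup, training a random forest on model-derived predictors and evaluating it on a held-out half of the data, can be sketched as follows. The three predictors, the toy cloud-occurrence rule, and all numbers are synthetic stand-ins for the WRF fields, not the study's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Synthetic stand-ins for WRF predictors (the study uses local surface
# fields, regional 3D moisture, and inversion diagnostics; these three
# variables are illustrative only).
n = 2000
rh = rng.uniform(0, 100, n)        # relative humidity [%]
temp = rng.uniform(270, 300, n)    # 2-m temperature [K]
wind = rng.uniform(0, 20, n)       # wind speed [m/s]
# Toy "truth": low cloud occurs when the air is moist, plus noise
cloud = (rh + rng.normal(0.0, 5.0, n)) > 70.0

X = np.column_stack([rh, temp, wind])
# Train on one half of the dataset and evaluate on the other half
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[:1000], cloud[:1000])
acc = clf.score(X[1000:], cloud[1000:])
```

As in the study's finding that a few meaningful predictors beat many indiscriminate ones, the forest's feature importances here concentrate on the single informative variable (humidity).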
NASA Astrophysics Data System (ADS)
Zhong, Bo; Chen, Wuhan; Wu, Shanlong; Liu, Qinhuo
2016-10-01
Cloud detection in satellite imagery is very important for quantitative remote sensing research and remote sensing applications. However, many satellite sensors do not have enough bands for quick, accurate, and simple detection of clouds. In particular, the newly launched moderate-to-high spatial resolution satellite sensors of China, such as the charge-coupled device on board the Chinese Huan Jing 1 (HJ-1/CCD) and the wide field of view (WFV) sensor on board the Gao Fen 1 (GF-1), have only four available bands (blue, green, red, and near-infrared), far fewer than most cloud detection methods require. To solve this problem, an improved and automated cloud detection method for Chinese satellite sensors, called OCM (Object-oriented Cloud and cloud-shadow Matching), is presented in this paper. It first modifies the Automatic Cloud Cover Assessment (ACCA) method, which was developed for Landsat-7 data, to get an initial cloud map. The modified ACCA method is mainly threshold-based, and different threshold settings produce different cloud maps: a strict threshold produces a cloud map with high confidence but a large amount of omitted cloud, while a loose threshold produces a cloud map with low confidence and a large amount of commission. Second, a corresponding cloud-shadow map is produced by thresholding the near-infrared band. Third, the cloud maps and the cloud-shadow map are converted to cloud objects and cloud-shadow objects. Because cloud and cloud-shadow usually occur in pairs, the final cloud and cloud-shadow maps are made based on the relationship between cloud and cloud-shadow objects. The OCM method was tested on almost 200 HJ-1/CCD images across China, and the overall accuracy of cloud detection is close to 90%.
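The strict/loose dual-threshold idea can be sketched as seeded object confirmation: loose-threshold objects are kept only if they contain at least one strict-threshold pixel. The thresholds and the toy scene are illustrative; the full OCM method additionally matches cloud objects to cloud-shadow objects, which is omitted here:

```python
import numpy as np
from scipy import ndimage

def dual_threshold_cloud_mask(brightness, strict=0.8, loose=0.5):
    """Confirm loose (low-confidence) cloud objects with strict seeds.

    `strict` yields high-confidence cloud pixels with heavy omission;
    `loose` yields candidate extents with heavy commission. An object
    in the loose mask survives only if it overlaps a strict seed.
    Threshold values here are illustrative, not the paper's.
    """
    seeds = brightness > strict
    candidates = brightness > loose
    labels, n = ndimage.label(candidates)       # connected components
    keep = np.zeros_like(candidates)
    for i in range(1, n + 1):
        obj = labels == i
        if seeds[obj].any():                    # confirmed by a seed pixel
            keep |= obj
    return keep

# Toy scene: one cloud with a very bright core, and one moderately
# bright surface patch with no high-confidence core (rejected)
scene = np.zeros((20, 20))
scene[2:6, 2:6] = 0.6        # cloud body
scene[3, 3] = 0.9            # bright cloud core (seed)
scene[12:15, 12:15] = 0.6    # bright surface, no seed
mask = dual_threshold_cloud_mask(scene)
```

The same object-level reasoning extends naturally to cloud/shadow pairing: label both masks and keep cloud objects that have a shadow object at a plausible offset.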
Isotopic modeling of the sub-cloud evaporation effect in precipitation.
Salamalikis, V; Argiriou, A A; Dotsika, E
2016-02-15
In dry and warm environments, sub-cloud evaporation influences falling raindrops, modifying their final stable isotopic content. During their descent from the cloud base towards the ground surface, through the unsaturated atmosphere, hydrometeors are subjected to evaporation, and the kinetic fractionation results in less depleted, or enriched, isotopic signatures compared to the initial isotopic composition of the raindrops at cloud base. Nowadays, the development of Global Climate Models (GCMs) that include isotopic content calculation modules is of great interest for isotopic tracing of the global hydrological cycle. Therefore, an accurate description of the underlying processes affecting stable isotopic content can improve the performance of iso-GCMs. The aim of this study is to model the sub-cloud evaporation effect using a) mixing and b) numerical isotope evaporation models. The isotope-mixing evaporation model simulates the isotopic enrichment (the difference between the ground and cloud-base isotopic composition of raindrops) in terms of raindrop size, ambient temperature, and relative humidity (RH) at ground level. The isotopic enrichment (Δδ) varies linearly with the evaporated mass fraction of the raindrop, resulting in higher values in drier atmospheres and for smaller raindrops. The relationship between Δδ and RH is described by a 'heat capacity' model providing high correlation coefficients for both isotopes (R(2)>80%), indicating that RH is an ideal indicator of the sub-cloud evaporation effect. The vertical distribution of stable isotopes in falling raindrops is also investigated using a numerical isotope-evaporation model. The temperature and humidity dependence of the vertical isotopic variation is clearly described by the numerical isotopic model, which shows an increase in the isotopic values with increasing temperature and decreasing RH. 
In an almost saturated atmosphere (RH=95%), sub-cloud evaporation is negligible and the isotopic composition hardly changes even at high temperatures, while under drier and warmer conditions the enrichment of (18)O reaches up to 20‰, depending on the raindrop size and the initial meteorological conditions. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Ohno, Kazumasa; Okuzumi, Satoshi
2018-05-01
The ubiquity of clouds in the atmospheres of exoplanets, especially of super-Earths, is one of the outstanding issues for the transmission spectra survey. Understanding the formation process of clouds in super-Earths is necessary to interpret the observed spectra correctly. In this study, we investigate the vertical distributions of particle size and mass density of mineral clouds in super-Earths using a microphysical model that takes into account the vertical transport and growth of cloud particles in a self-consistent manner. We demonstrate that the vertical profiles of mineral clouds significantly vary with the concentration of cloud condensation nuclei and atmospheric metallicity. We find that the height of the cloud top increases with increasing metallicity as long as the metallicity is below a threshold value. Above this threshold, the cloud-top height no longer increases appreciably with metallicity because coalescence yields larger particles with higher settling velocities. We apply our cloud model to GJ1214 b and GJ436 b, for which recent transmission observations suggest the presence of high-altitude opaque clouds. For GJ436 b, we show that KCl particles can ascend high enough to explain the observation. For GJ1214 b, by contrast, the height of KCl clouds predicted by our model is too low to explain its flat transmission spectrum. Clouds made of highly porous KCl particles could explain the observations if the atmosphere is highly metal-rich, and hence the particle microstructure might be a key to interpreting the flat spectrum of GJ1214 b.
Virtual Sensors: Using Data Mining to Efficiently Estimate Spectra
NASA Technical Reports Server (NTRS)
Srivastava, Ashok; Oza, Nikunj; Stroeve, Julienne
2004-01-01
Detecting clouds within a satellite image is essential for retrieving surface geophysical parameters, such as albedo and temperature, from optical and thermal imagery, because the retrieval methods tend to be valid for clear skies only. Thus, routine satellite data processing requires reliable automated cloud detection algorithms that are applicable to many surface types. Unfortunately, cloud detection over snow and ice is difficult due to the lack of spectral contrast between clouds and snow. Snow and clouds are both highly reflective in the visible wavelengths and often show little contrast in the thermal infrared. However, at 1.6 microns, the spectral signatures of snow and clouds differ enough to allow improved snow/ice/cloud discrimination. The recent Terra and Aqua Moderate Resolution Imaging Spectro-Radiometer (MODIS) sensors have a channel (channel 6) at 1.6 microns. Presently, the most comprehensive long-term information on surface albedo and temperature over snow- and ice-covered surfaces comes from the Advanced Very High Resolution Radiometer (AVHRR) sensor, which has been providing imagery since July 1981. The earlier AVHRR sensors (e.g., AVHRR/2) did not, however, have a channel designed for discriminating clouds from snow, such as the 1.6-micron channel available on the more recent AVHRR/3 or the MODIS sensors. In the absence of the 1.6-micron channel, the AVHRR Polar Pathfinder (APP) product performs cloud detection using a combination of time-series analysis and multispectral threshold tests based on the satellite's measuring channels to produce a cloud mask. The method has been found to work reasonably well over sea ice, but not so well over the ice sheets. Thus, improving the cloud mask in the APP dataset would be extremely helpful toward increasing the accuracy of the albedo and temperature retrievals, as well as extending the time-series of albedo and temperature retrievals from the more recent sensors to the historical ones. 
In this work, we use data mining methods to construct a model of MODIS channel 6 as a function of other channels that are common to both MODIS and AVHRR. The idea is to use the model to generate the equivalent of MODIS channel 6 for AVHRR as a function of the AVHRR equivalents to the MODIS channels. We call this a Virtual Sensor because it predicts unmeasured spectra. The goal is to use this virtual channel 6 to yield a cloud mask superior to what is currently used in APP. Our results show that several data mining methods such as multilayer perceptrons (MLPs), ensemble methods (e.g., bagging), and kernel methods (e.g., support vector machines) generate channel 6 for unseen MODIS images with high accuracy. Because the true channel 6 is not available for AVHRR images, we qualitatively assess the virtual channel 6 for several AVHRR images.
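The virtual-sensor idea, fitting channel 6 from the shared channels on one half of the data and evaluating on the other, can be sketched with plain least squares. This is a deliberately simple stand-in for the MLPs, bagging, and kernel methods used in the work, and all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for the four channels common to MODIS and AVHRR;
# the "true" channel 6 is a noisy linear combination of them (a toy
# relationship chosen for illustration, not a physical model).
n = 5000
shared = rng.uniform(0.0, 1.0, (n, 4))
true_w = np.array([0.5, -0.2, 0.8, 0.1])
ch6 = shared @ true_w + 0.05 * rng.normal(size=n)

# Fit the "virtual channel 6" on half the pixels, evaluate on the rest
X_train, X_test = shared[:2500], shared[2500:]
y_train, y_test = ch6[:2500], ch6[2500:]
w, *_ = np.linalg.lstsq(np.c_[X_train, np.ones(2500)], y_train, rcond=None)
pred = np.c_[X_test, np.ones(2500)] @ w
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
```

The held-out RMSE approaches the injected noise floor; a nonlinear learner would be needed only where the inter-channel relationship is itself nonlinear, which is the rationale for the MLP and kernel methods in the study.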
High-mass star formation possibly triggered by cloud-cloud collision in the H II region RCW 34
NASA Astrophysics Data System (ADS)
Hayashi, Katsuhiro; Sano, Hidetoshi; Enokiya, Rei; Torii, Kazufumi; Hattori, Yusuke; Kohno, Mikito; Fujita, Shinji; Nishimura, Atsushi; Ohama, Akio; Yamamoto, Hiroaki; Tachihara, Kengo; Hasegawa, Yutaka; Kimura, Kimihiro; Ogawa, Hideo; Fukui, Yasuo
2018-05-01
We report on the possibility that the high-mass star located in the H II region RCW 34 was formed by a triggering induced by a collision of molecular clouds. Molecular gas distributions of the 12CO and 13CO J = 2-1 and 12CO J = 3-2 lines in the direction of RCW 34 were measured using the NANTEN2 and ASTE telescopes. We found two clouds with velocity ranges of 0-10 km s-1 and 10-14 km s-1. Whereas the former cloud is as massive as ˜1.4 × 104 M⊙ and has a morphology similar to the ring-like structure observed in the infrared wavelengths, the latter cloud, with a mass of ˜600 M⊙, which has not been recognized by previous observations, is distributed to just cover the bubble enclosed by the other cloud. The high-mass star with a spectral type of O8.5V is located near the boundary of the two clouds. The line intensity ratio of 12CO J = 3-2/J = 2-1 yields high values (≳1.0), suggesting that these clouds are associated with the massive star. We also confirm that the obtained position-velocity diagram shows a similar distribution to that derived by a numerical simulation of the supersonic collision of two clouds. Using the relative velocity between the two clouds (˜5 km s-1), the collisional time scale is estimated to be ˜0.2 Myr with the assumption of a distance of 2.5 kpc. These results suggest that the high-mass star in RCW 34 was formed rapidly within a time scale of ˜0.2 Myr via a triggering of a cloud-cloud collision.
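The quoted timescale can be checked with simple arithmetic: at the measured relative velocity, ˜0.2 Myr corresponds to a crossing length of about 1 pc. The implied length scale is an inference from the quoted numbers, not a figure stated in the abstract:

```python
# Consistency check of the cloud-cloud collision timescale:
# distance = velocity x time, converted to parsecs.
KM_PER_PC = 3.086e13   # kilometres per parsec
S_PER_MYR = 3.156e13   # seconds per megayear

v_rel = 5.0                      # relative cloud velocity [km/s]
t = 0.2 * S_PER_MYR              # collisional timescale [s]
d_pc = v_rel * t / KM_PER_PC     # distance covered [pc]
```

So the ˜0.2 Myr estimate corresponds to the clouds interpenetrating over roughly a parsec, a plausible scale for the cloud interface at the assumed 2.5 kpc distance.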
Digital Photograph Security: What Plastic Surgeons Need to Know.
Thomas, Virginia A; Rugeley, Patricia B; Lau, Frank H
2015-11-01
Sharing and storing digital patient photographs occur daily in plastic surgery. Two major risks associated with the practice, data theft and Health Insurance Portability and Accountability Act (HIPAA) violations, have been dramatically amplified by high-speed data connections and digital camera ubiquity. The authors review what plastic surgeons need to know to mitigate those risks and provide recommendations for implementing an ideal, HIPAA-compliant solution for plastic surgeons' digital photography needs: smartphones and cloud storage. Through informal discussions with plastic surgeons, the authors identified the most common photograph sharing and storage methods. For each method, a literature search was performed to identify the risks of data theft and HIPAA violations. HIPAA violation risks were confirmed by the second author (P.B.R.), a compliance liaison and privacy officer. A comprehensive review of HIPAA-compliant cloud storage services was performed. When possible, informal interviews with cloud storage services representatives were conducted. The most common sharing and storage methods are not HIPAA compliant, and several are prone to data theft. The authors' review of cloud storage services identified six HIPAA-compliant vendors that have strong to excellent security protocols and policies. These options are reasonably priced. Digital photography and technological advances offer major benefits to plastic surgeons but are not without risks. A proper understanding of data security and HIPAA regulations needs to be applied to these technologies to safely capture their benefits. Cloud storage services offer efficient photograph sharing and storage with layers of security to ensure HIPAA compliance and mitigate data theft risk.
The role of dedicated data computing centers in the age of cloud computing
NASA Astrophysics Data System (ADS)
Caramarcu, Costin; Hollowell, Christopher; Strecker-Kellogg, William; Wong, Antonio; Zaytsev, Alexandr
2017-10-01
Brookhaven National Laboratory (BNL) anticipates significant growth in scientific programs with large computing and data storage needs in the near future and has recently reorganized support for scientific computing to meet these needs. A key component is the enhanced role of the RHIC-ATLAS Computing Facility (RACF) in support of high-throughput and high-performance computing (HTC and HPC) at BNL. This presentation discusses the evolving role of the RACF at BNL, in light of its growing portfolio of responsibilities and its increasing integration with cloud (academic and for-profit) computing activities. We also discuss BNL’s plan to build a new computing center to support the new responsibilities of the RACF and present a summary of the cost benefit analysis done, including the types of computing activities that benefit most from a local data center vs. cloud computing. This analysis is partly based on an updated cost comparison of Amazon EC2 computing services and the RACF, which was originally conducted in 2012.
Validation of On-board Cloud Cover Assessment Using EO-1
NASA Technical Reports Server (NTRS)
Mandl, Dan; Miller, Jerry; Griffin, Michael; Burke, Hsiao-hua
2003-01-01
The purpose of this NASA Earth Science Technology Office funded effort was to flight-validate an on-board cloud detection algorithm and to determine the performance that can be achieved with a Mongoose V flight computer. The validation was performed on the operational EO-1 satellite by uploading new flight code to perform the cloud detection. The algorithm was developed by MIT/Lincoln Laboratory and is based on the Hyperion hyperspectral instrument, using selected spectral bands from 0.4 to 2.5 microns. The Technology Readiness Level (TRL) of this technology was 5 at the beginning of the task and 6 upon completion. In the final validation, an 8-second (0.75-Gbyte) Hyperion image was processed on-board and assessed for percentage cloud cover within 30 minutes; the processing had been expected to take many hours, perhaps a day, given that the Mongoose V is only a 6-8 MIPS machine. To accomplish this test, the image had to undergo level 0 and level 1 processing on-board before the cloud algorithm was applied. For almost all of the ground test cases and all of the flight cases, the cloud assessment was within 5% of the correct value, and in most cases within 1-2%.
The Cloud Detection and UV Monitoring Experiment (CLUE)
NASA Technical Reports Server (NTRS)
Barbier, L.; Loh, E.; Sokolsky, P.; Streitmatter, R.
2004-01-01
We propose a large-area, low-power instrument to perform CLoud detection and Ultraviolet monitoring, CLUE. CLUE will combine the UV detection capabilities of the NIGHTGLOW payload with an array of infrared sensors to perform cloud-slicing measurements. Missions such as EUSO and OWL, which seek to measure UHE cosmic rays at 10^20 eV, use the atmosphere as a fluorescence detector. CLUE will provide several important correlated measurements for these missions, including: monitoring the atmospheric UV emissions from 330-400 nm, determining the ambient cloud cover during those UV measurements (with active LIDAR), measuring the optical depth of the clouds (with an array of narrow band-pass IR sensors), and correlating the LIDAR and IR cloud cover measurements. This talk will describe the instrument as we envision it.
Modeling the Cloud to Enhance Capabilities for Crises and Catastrophe Management
2016-11-16
...in order for cloud computing infrastructures to be successfully deployed in real-world scenarios as tools for crisis and catastrophe management, where... Statement of the Problem Studied: As cloud computing becomes the dominant computational infrastructure [1] and cloud technologies make a transition to hosting... 1. Formulate rigorous mathematical models representing technological capabilities and resources in cloud computing for performance modeling and...
Observations and Modeling of the Green Ocean Amazon 2014/15. CHUVA Field Campaign Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Machado, L. A. T.
2016-03-01
The physical processes inside clouds are one of the most unknown components of weather and climate systems. A description of cloud processes through the use of standard meteorological parameters in numerical models has to be strongly improved to accurately describe the characteristics of hydrometeors, latent heating profiles, radiative balance, air entrainment, and cloud updrafts and downdrafts. Numerical models have been improved to run at higher spatial resolutions where it is necessary to explicitly describe these cloud processes. For instance, to analyze the effects of global warming in a given region it is necessary to perform simulations taking into account all of the cloud processes described above. Another important application that requires this knowledge is satellite precipitation estimation. The analysis will be performed focusing on the microphysical evolution and cloud life cycle, different precipitation estimation algorithms, the development of thunderstorms and lightning formation, processes in the boundary layer, and cloud microphysical modeling. This project intends to extend the knowledge of these cloud processes to reduce the uncertainties in precipitation estimation, mainly from warm clouds, and, consequently, improve knowledge of the water and energy budget and cloud microphysics.
Estimating cirrus cloud properties from MIPAS data
NASA Astrophysics Data System (ADS)
Mendrok, J.; Schreier, F.; Höpfner, M.
2007-04-01
High resolution mid-infrared limb emission spectra observed by the spaceborne Michelson Interferometer for Passive Atmospheric Sounding (MIPAS) showing evidence of cloud interference are analyzed. Using the new line-by-line multiple scattering [Approximate] Spherical Atmospheric Radiative Transfer code (SARTre), a sensitivity study with respect to cirrus cloud parameters, e.g., optical thickness and particle size distribution, is performed. Cirrus properties are estimated by fitting spectra in three distinct microwindows between 8 and 12 μm. For a cirrus with extremely low ice water path (IWP = 0.1 g/m2) and small effective particle size (D e = 10 μm) simulated spectra are in close agreement with observations in broadband signal and fine structures. We show that a multi-microwindow technique enhances reliability of MIPAS cirrus retrievals compared to single microwindow methods.
The EPOS Vision for the Open Science Cloud
NASA Astrophysics Data System (ADS)
Jeffery, Keith; Harrison, Matt; Cocco, Massimo
2016-04-01
Cloud computing offers dynamic elastic scalability for data processing on demand. For much research activity, demand for computing is uneven over time, so cloud computing offers both cost-effectiveness and capacity advantages. However, as reported repeatedly by the EC Cloud Expert Group, there are barriers to the uptake of cloud computing: (1) security and privacy; (2) interoperability (avoidance of lock-in); (3) lack of appropriate systems development environments for application programmers to characterise their applications so that cloud middleware can optimise their deployment and execution. From CERN, the Helix-Nebula group has proposed an architecture for the European Open Science Cloud. They are in discussion with other e-Infrastructure groups such as EGI (GRIDs), EUDAT (data curation), AARC (network authentication and authorisation), and also with the EIROFORUM group of 'international treaty' RIs (Research Infrastructures) and the ESFRI (European Strategic Forum for Research Infrastructures) RIs, including EPOS. Many of these RIs are either e-RIs (electronic RIs) or have an e-RI interface for access and use. The EPOS architecture is centred on a portal: the ICS (Integrated Core Services). The architectural design already allows for access to e-RIs (which may include any or all of data, software, users, and resources such as computers or instruments). Those within any one domain (subject area) of EPOS are considered part of the TCS (Thematic Core Services). Those outside, or available across multiple domains of EPOS, are ICS-d (Integrated Core Services-Distributed), since the intention is that they will be used by any or all of the TCS via the ICS. Another such service type is CES (Computational Earth Science): effectively an ICS-d specialising in high-performance computation, analytics, simulation, or visualisation offered by a TCS for others to use.
Discussions are already underway between EPOS and EGI, EUDAT, AARC, and Helix-Nebula for those offerings to be considered as ICS-d by EPOS. Provision of access to ICS-d from ICS-C concerns several aspects: (a) technical: it may be more or less difficult to connect and pass the 'package' (probably a virtual machine) of data and software from ICS-C to the ICS-d/CES; (b) security/privacy: including passing personal information, e.g. related to AAAI (Authentication, Authorisation, Accounting Infrastructure); (c) financial and legal: such as payment and licence conditions. Appropriate interfaces from ICS-C to ICS-d are being designed to accommodate these aspects. The Open Science Cloud is timely because it provides a framework to discuss governance and sustainability for computational resource provision, as well as an effective interpretation of a federated approach to HPC (High Performance Computing) and HTC (High Throughput Computing). It will be a unique opportunity to share and adopt procurement policies to provide access to computational resources for RIs. The current state of discussions and the expected roadmap for the EPOS-Open Science Cloud relationship are presented.
Cloud-based hospital information system as a service for grassroots healthcare institutions.
Yao, Qin; Han, Xiong; Ma, Xi-Kun; Xue, Yi-Feng; Chen, Yi-Jun; Li, Jing-Song
2014-09-01
Grassroots healthcare institutions (GHIs) are the smallest administrative levels of medical institutions, where most patients access health services. The latest report from the National Bureau of Statistics of China showed that 96.04% of the 950,297 medical institutions in China were at the grassroots level in 2012, including county-level hospitals, township central hospitals, community health service centers, and rural clinics. In developing countries, these institutions face challenges involving a shortage of funds and talent, inconsistent medical standards, inefficient information sharing, and difficulties in management during the adoption of health information technologies (HIT). Given the necessity and urgency of these issues for GHIs, our aim is to provide hospital information services for GHIs using Cloud computing technologies and service modes. In this medical scenario, the computing resources are pooled by means of a Cloud-based Virtual Desktop Infrastructure (VDI) to serve multiple GHIs, with different hospital information systems dynamically assigned and reassigned according to demand. This paper is concerned with establishing a Cloud-based Hospital Information Service Center to provide hospital information software as a service (HI-SaaS), with the aim of providing GHIs with an attractive and high-performance medical information service. Compared with individually establishing all hospital information systems, this approach is more cost-effective and affordable for GHIs and does not compromise HIT performance.
Physical conditions in CaFe interstellar clouds
NASA Astrophysics Data System (ADS)
Gnaciński, P.; Krogulec, M.
2008-01-01
Interstellar clouds that exhibit strong Ca I and Fe I lines are called CaFe clouds. Ionisation equilibrium equations were used to model the column densities of Ca II, Ca I, K I, Na I, Fe I and Ti II in CaFe clouds. We find that the chemical composition of CaFe clouds is solar and that there is no depletion onto dust grains. CaFe clouds have high electron densities, n_e ≈ 1 cm⁻³, which lead to high column densities of neutral Ca and Fe.
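The ionisation equilibrium underlying these column-density models can be written, in its simplest textbook form, as a balance between photoionisation and radiative recombination (a sketch only; the actual model may also include charge exchange and grain-assisted recombination):

```latex
% Balance for an element X (e.g., Ca): photoionisation of the neutral
% stage equals radiative recombination of the singly ionised stage.
n(\mathrm{X\,I})\,\Gamma = n(\mathrm{X\,II})\,n_e\,\alpha(T)
\qquad\Longrightarrow\qquad
\frac{n(\mathrm{X\,I})}{n(\mathrm{X\,II})} = \frac{n_e\,\alpha(T)}{\Gamma}
```

In this form a high electron density n_e directly raises the neutral fraction, consistent with the high neutral Ca and Fe column densities reported above.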
Formation of massive, dense cores by cloud-cloud collisions
NASA Astrophysics Data System (ADS)
Takahira, Ken; Shima, Kazuhiro; Habe, Asao; Tasker, Elizabeth J.
2018-03-01
We performed sub-parsec (˜0.014 pc) scale simulations of cloud-cloud collisions of two idealized turbulent molecular clouds (MCs) with masses in the range (0.76-2.67) × 10⁴ M_⊙ and collision speeds of 5-30 km s⁻¹. These parameters are larger than in Takahira, Tasker, and Habe (2014, ApJ, 792, 63), where the colliding system showed a partial gaseous arc morphology, supporting the NANTEN observations of objects identified as colliding MCs. Gas clumps with density greater than 10⁻²⁰ g cm⁻³ were identified as pre-stellar cores and tracked through the simulation to investigate the effects of the mass of the colliding clouds and the collision speed on the resulting core population. Our results demonstrate that the properties of the smaller cloud are more important for the outcome of cloud-cloud collisions. The mass function of the formed cores can be approximated by a power law with index γ = -1.6 in slower cloud-cloud collisions (v ≈ 5 km s⁻¹), in good agreement with observations of MCs. A faster relative speed increases the number of cores formed in the early stage of the collision and shortens the gas accretion phase of cores in the shocked region, suppressing core growth. A bending point appears in the high-mass part of the core mass function, and the bending-point mass decreases with increasing collision speed for the same combination of colliding clouds. Above the bending point, the core mass function can be approximated by a power law with γ = -2 to -3, similar to the power index of the massive part of the observed stellar initial mass function. We discuss the implications of our results for massive-star formation in our Galaxy.
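A core mass function dN/dM ∝ M^γ with γ = -1.6 corresponds to a probability density p(M) ∝ M^(-1.6) above some minimum mass. As an illustration of how such an index can be recovered from a sample of simulated core masses, here is the standard maximum-likelihood estimator for a pure power law (an illustrative assumption, not necessarily the fitting method used in the paper):

```python
import numpy as np

def sample_power_law(alpha, m_min, n, seed=0):
    """Draw n masses from dN/dM ∝ M^(-alpha), M >= m_min (inverse transform)."""
    u = np.random.default_rng(seed).random(n)
    return m_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

def mle_power_index(m, m_min):
    """Maximum-likelihood estimate of alpha for a pure power law above m_min."""
    return 1.0 + m.size / np.log(m / m_min).sum()
```

The MLE avoids the binning choices that can bias log-log histogram fits, which matters when locating features such as the bending point in the high-mass tail.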
NASA Astrophysics Data System (ADS)
O, K. T.; Wood, R.; Bretherton, C. S.; Eastman, R. M.; Tseng, H. H.
2016-12-01
During the 2015 Cloud System Evolution in the Trades (CSET) field program (Jul-Aug 2015, subtropical NE Pacific), the NSF/NCAR G-V aircraft frequently encountered ultra-clean layers (hereafter UCLs) with extremely low accumulation-mode aerosol concentration (hereafter N_a; i.e., diameter d_a > 100 nm), and low-albedo (≈0.2) warm clouds (termed "gray clouds" in our study) with low droplet concentration (hereafter N_d). The analysis of CSET aircraft data shows that (1) UCLs and gray clouds are most commonly found at a height of 1.5-2 km, typically close to the top of the MBL, (2) UCL and gray cloud coverage is as high as 40-60% between 135W and 155W (i.e., the Sc-Cu transition region) but occurs very infrequently east of 130W (i.e., the shallow, near-coastal stratocumulus region), and (3) UCLs and gray clouds exhibit remarkably low turbulence compared with non-UCL clear sky and clouds. It should be noted that most previous aircraft sampling of low clouds occurred close to the Californian coast, so the prevalence of UCLs and gray clouds has not been previously noted. Based on the analysis of aircraft data, we hypothesize that gray clouds result from detrainment of cloud close to the top of precipitating trade cumuli, and that UCLs are the remnants of these layers when the gray clouds evaporate. The simulations in our study are performed using a 2-D bin spectral cloud parcel model and version 6.9 of the System for Atmospheric Modeling (SAM). Our idealized simulations suggest that collision-coalescence plays a crucial role in reducing N_d, such that gray clouds can easily form via collision-coalescence in layers detrained from the cloud top in the trade cumulus regime, but cannot form in the stratocumulus regime. Upon evaporation of gray clouds, only a few accumulation-mode aerosols are returned to the clear sky, leaving horizontally extensive UCLs (i.e., clean clear sky).
Analysis of CSET flight data and idealized model simulations both suggest that cloud-top/PBL height may play an important role in the formation of UCLs and gray clouds. In our satellite observation study, the comparison between PBL height (from COSMIC and MODIS) and the fraction of low-optical-depth cloud (from MODIS and CALIPSO) in the NEP trade cumulus regime (20-35N, 140-155W) also suggests a strong positive correlation.
Fielding, M. D.; Chiu, J. C.; Hogan, R. J.; ...
2015-07-02
Active remote sensing of marine boundary-layer clouds is challenging as drizzle drops often dominate the observed radar reflectivity. We present a new method to simultaneously retrieve cloud and drizzle vertical profiles in drizzling boundary-layer clouds using surface-based observations of radar reflectivity, lidar attenuated backscatter, and zenith radiances under conditions when precipitation does not reach the surface. Specifically, the vertical structure of droplet size and water content of both cloud and drizzle is characterised throughout the cloud. An ensemble optimal estimation approach provides full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from large-eddy simulation snapshots of cumulus under stratocumulus, where cloud water path is retrieved with an error of 31 g m⁻². The method also performs well in non-drizzling clouds, where no assumption about the cloud profile is required. We then apply the method to observations of marine stratocumulus obtained during the Atmospheric Radiation Measurement MAGIC deployment in the Northeast Pacific. Here, retrieved cloud water path agrees well with independent three-channel microwave radiometer retrievals, with a root mean square difference of 10-20 g m⁻².
NASA Astrophysics Data System (ADS)
Enokiya, Rei; Sano, Hidetoshi; Hayashi, Katsuhiro; Tachihara, Kengo; Torii, Kazufumi; Yamamoto, Hiroaki; Hattori, Yusuke; Hasegawa, Yutaka; Ohama, Akio; Kimura, Kimihiro; Ogawa, Hideo; Fukui, Yasuo
2018-05-01
We performed CO (J = 1-0, 2-1, and 3-2) observations toward the H II region RCW 32 in the Vela Molecular Ridge. The CO gas distribution associated with the H II region was revealed for the first time at a high resolution of 22″. The results revealed three distinct velocity components which show correspondence with the optical dark lanes and/or the Hα distribution. Two of the components show complementary spatial distributions, suggesting collisional interaction between them at a relative velocity of ≈4 km s⁻¹. Based on these results, we present a hypothesis that a cloud-cloud collision determined the cloud distribution and triggered the formation of the exciting star ionizing RCW 32. The collision timescale is estimated from the cloud size and the velocity separation to be ˜2 Myr, and the collision terminated ˜1 Myr ago, which is consistent with the age of the exciting star and the associated cluster. By combining with previous works on the H II regions in the Vela Molecular Ridge, we argue that the majority (at least four) of the H II regions in the Ridge were formed by cloud-cloud collision triggering.
Warming ancient Mars with water clouds
NASA Astrophysics Data System (ADS)
Hartwick, V.; Toon, B.
2017-12-01
High clouds in the present-day Mars atmosphere nucleate on interplanetary dust particles (IDPs) that burn up on entry into the Mars atmosphere. Clouds form when supersaturated water vapor condenses on suspended aerosols. Radiatively active water ice clouds may play a crucial role in warming the early Mars climate. Urata and Toon (2011) simulate a stable warm paleoclimate for Mars if clouds form high in the atmosphere and if particles are sufficiently large (r > 10 μm). The annual fluence of micrometeoroids at Mars was larger early in the evolution of our solar system. Additionally, the water vapor budget throughout the middle and high atmosphere was likely elevated. Both factors should contribute to enhanced nucleation and growth of water ice cloud particles at high altitudes. Here, we use the MarsCAM-CARMA general circulation model (GCM) to examine the radiative impact of high-altitude water ice clouds on the early Mars climate and as a possible solution to the faint young Sun problem for Mars.
Tseng, Yu-Ting; Chang, Elizabeth H; Kuo, Li-Na; Shen, Wan-Chen; Bai, Kuan-Jen; Wang, Chih-Chi; Chen, Hsiang-Yin
2017-10-01
The PharmaCloud system, a cloud-based medication system, was launched by the Taiwan National Health Insurance Administration (NHIA) in 2013 to integrate patients' medication lists among different medical institutions. The aim of this preliminary study was to evaluate satisfaction with the system among physicians and pharmacists at the early stage of implementation. A questionnaire was developed through a review of the literature and discussion in 6 focus groups to understand the level of satisfaction, attitudes, and intentions of physicians and pharmacists using the PharmaCloud system. It was then administered nationally in Taiwan from July to September 2015. Descriptive statistics and multiple regression were performed to identify variables influencing satisfaction and intention to use the system. In total, 895 pharmacist and 105 physician questionnaires were valid for analysis. The results showed that satisfaction with system quality warranted improvement. Positive attitudes toward medication reconciliation among physicians and pharmacists were significant predictors of the intention to use the system (β = 0.223, p < 0.001). Most physicians and pharmacists agreed that obtaining signed patient consent was needed but preferred that it be conducted by the NHIA rather than by individual medical institutions (4.02 ± 1.19 vs. 3.49 ± 1.40, p < 0.01). The preliminary results indicated moderate satisfaction with the PharmaCloud system. Hospital pharmacists reported high satisfaction, but physicians and community pharmacists did not. Continuous improvement of system quality has been carried out based on the results of this preliminary survey. Policies and standardization processes, including privacy protection, still warrant further action to make the Taiwan PharmaCloud system a convenient platform for medication reconciliation. Copyright © 2017 Elsevier B.V. All rights reserved.
Molecular cloud formation in high-shear, magnetized colliding flows
NASA Astrophysics Data System (ADS)
Fogerty, E.; Frank, A.; Heitsch, F.; Carroll-Nellenback, J.; Haig, C.; Adams, M.
2016-08-01
The colliding flows (CF) model is a well-supported mechanism for generating molecular clouds. However, to date most CF simulations have focused on the formation of clouds in the normal-shock layer between head-on colliding flows. We performed simulations of magnetized colliding flows that instead meet at an oblique-shock layer. Oblique shocks generate shear in the post-shock environment, and this shear creates inhospitable environments for star formation. As the degree of shear increases (i.e., as the obliquity of the shock increases), we find that sink particles take longer to form, form in lower numbers, and tend to be less massive. With regard to magnetic fields, we find that even a weak field stalls gravitational collapse within forming clouds. Additionally, an initially oblique collision interface tends to reorient over time in the presence of a magnetic field, becoming normal to the oncoming flows. This was demonstrated by our most oblique shock interface, which became fully normal by the end of the simulation.
Point cloud registration from local feature correspondences-Evaluation on challenging datasets.
Petricek, Tomas; Svoboda, Tomas
2017-01-01
Registration of laser scans, or point clouds in general, is a crucial step of localization and mapping with mobile robots or in object modeling pipelines. A coarse alignment of the point clouds is generally needed before applying local methods such as the Iterative Closest Point (ICP) algorithm. We propose a feature-based approach to point cloud registration and evaluate the proposed method and its individual components on challenging real-world datasets. For a moderate overlap between the laser scans, the method provides superior registration accuracy compared to state-of-the-art methods including Generalized ICP, 3D Normal-Distribution Transform, Fast Point-Feature Histograms, and 4-Points Congruent Sets. Compared to surface normals, using points as the underlying features yields higher performance in both keypoint detection and establishing local reference frames. Moreover, sign disambiguation of the basis vectors proves to be an important aspect in creating repeatable local reference frames. We propose a novel sign-disambiguation method that yields highly repeatable reference frames.
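For context, the local refinement step mentioned above (point-to-point ICP) can be sketched in a few lines of NumPy. This minimal version assumes a brute-force nearest-neighbour search and the Kabsch/SVD solution for the rigid transform; it is a sketch of the generic algorithm, not the paper's feature-based coarse alignment:

```python
import numpy as np

def best_rigid_transform(P, Q):
    """Least-squares rotation R and translation t mapping P onto Q (Kabsch/SVD)."""
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)                 # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    return R, cq - R @ cp

def icp(source, target, iters=20):
    """Point-to-point ICP: refine an already-coarse alignment of source onto target."""
    src = source.copy()
    for _ in range(iters):
        # Nearest-neighbour correspondences (brute force; use a k-d tree at scale).
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
    return src
```

ICP of this kind only converges from a good initial guess, which is exactly why the coarse, feature-based alignment the paper evaluates is needed first.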
Dynamic VMs placement for energy efficiency by PSO in cloud computing
NASA Astrophysics Data System (ADS)
Dashti, Seyed Ebrahim; Rahmani, Amir Masoud
2016-03-01
Recently, cloud computing has been growing fast and helping to realise other high technologies. In this paper, we propose a hierarchical architecture to satisfy both providers' and consumers' requirements for these technologies. We design a new service in the PaaS layer for scheduling consumer tasks. From the providers' perspective, incompatibility between the specifications of physical machines and user requests in the cloud leads to problems such as the energy-performance trade-off and large power consumption, so that profits decrease. To guarantee the quality of service of users' tasks and reduce energy consumption, we propose a modified Particle Swarm Optimisation to reallocate migrated virtual machines from overloaded hosts. We also dynamically consolidate under-loaded hosts, which provides power savings. Simulation results in CloudSim demonstrate that, under conditions close to a real environment, our method saves as much as 14% more energy and significantly reduces the number of migrations and the simulation time compared with previous works.
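For reference, a minimal global-best Particle Swarm Optimisation of the kind the authors modify might look as follows. The inertia and acceleration coefficients are common textbook values, and the continuous test function stands in for a real VM-placement cost model; both are illustrative assumptions, not the paper's algorithm:

```python
import numpy as np

def pso(cost, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=0):
    """Minimal particle swarm optimiser (global-best topology)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))       # particle positions
    v = np.zeros_like(x)                              # particle velocities
    pbest = x.copy()                                  # personal bests
    pbest_val = np.apply_along_axis(cost, 1, x)
    g = pbest[pbest_val.argmin()].copy()              # global best
    w, c1, c2 = 0.7, 1.5, 1.5                         # inertia, cognitive, social
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        val = np.apply_along_axis(cost, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()
```

In the placement setting, each particle position would encode a candidate assignment of migrated VMs to hosts, and `cost` would combine a predicted energy model with quality-of-service penalties.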
Microphysics in Multi-scale Modeling System with Unified Physics
NASA Technical Reports Server (NTRS)
Tao, Wei-Kuo
2012-01-01
Recently, a multi-scale modeling system with unified physics was developed at NASA Goddard. It consists of (1) a cloud-resolving model (the Goddard Cumulus Ensemble model, GCE), (2) a regional-scale model (the NASA-unified Weather Research and Forecasting model, WRF), (3) a coupled CRM-global model (the Goddard Multi-scale Modeling Framework, MMF), and (4) a land modeling system. The same microphysical processes, long- and short-wave radiative transfer, land processes, and explicit cloud-radiation and cloud-land surface interaction processes are applied across this multi-scale modeling system. The modeling system has been coupled with a multi-satellite simulator to use NASA high-resolution satellite data to identify the strengths and weaknesses of the cloud and precipitation processes simulated by the model. In this talk, a review of developments and applications of the multi-scale modeling system will be presented. In particular, the microphysics development and its performance within the multi-scale modeling system will be presented.
NASA Astrophysics Data System (ADS)
Sánchez-Martínez, V.; Borges, G.; Borrego, C.; del Peso, J.; Delfino, M.; Gomes, J.; González de la Hoz, S.; Pacheco Pages, A.; Salt, J.; Sedov, A.; Villaplana, M.; Wolters, H.
2014-06-01
In this contribution we describe the performance of the Iberian (Spain and Portugal) ATLAS cloud during the first LHC running period (March 2010-January 2013) in the context of the GRID Computing and Data Distribution Model. The evolution of the resources for CPU, disk and tape in the Iberian Tier-1 and Tier-2s is summarized. The data distribution over all ATLAS destinations is shown, focusing on the number of files transferred and the size of the data. The status and distribution of simulation and analysis jobs within the cloud are discussed. The Distributed Analysis tools used to perform physics analysis are explained as well. Cloud performance in terms of the availability and reliability of its sites is discussed. The effect of the changes in the ATLAS Computing Model on the cloud is analyzed. Finally, the readiness of the Iberian Cloud towards the first Long Shutdown (LS1) is evaluated and an outline of the foreseen actions to take in the coming years is given. The shutdown will be a good opportunity to improve and evolve the ATLAS Distributed Computing system to prepare for the future challenges of the LHC operation.
GenomeVIP: a cloud platform for genomic variant discovery and interpretation
Mashl, R. Jay; Scott, Adam D.; Huang, Kuan-lin; Wyczalkowski, Matthew A.; Yoon, Christopher J.; Niu, Beifang; DeNardo, Erin; Yellapantula, Venkata D.; Handsaker, Robert E.; Chen, Ken; Koboldt, Daniel C.; Ye, Kai; Fenyö, David; Raphael, Benjamin J.; Wendl, Michael C.; Ding, Li
2017-01-01
Identifying genomic variants is a fundamental first step toward the understanding of the role of inherited and acquired variation in disease. The accelerating growth in the corpus of sequencing data that underpins such analysis is making the data-download bottleneck more evident, placing substantial burdens on the research community to keep pace. As a result, the search for alternative approaches to the traditional “download and analyze” paradigm on local computing resources has led to a rapidly growing demand for cloud-computing solutions for genomics analysis. Here, we introduce the Genome Variant Investigation Platform (GenomeVIP), an open-source framework for performing genomics variant discovery and annotation using cloud- or local high-performance computing infrastructure. GenomeVIP orchestrates the analysis of whole-genome and exome sequence data using a set of robust and popular task-specific tools, including VarScan, GATK, Pindel, BreakDancer, Strelka, and Genome STRiP, through a web interface. GenomeVIP has been used for genomic analysis in large-data projects such as the TCGA PanCanAtlas and in other projects, such as the ICGC Pilots, CPTAC, ICGC-TCGA DREAM Challenges, and the 1000 Genomes SV Project. Here, we demonstrate GenomeVIP's ability to provide high-confidence annotated somatic, germline, and de novo variants of potential biological significance using publicly available data sets. PMID:28522612
GRDC. A Collaborative Framework for Radiological Background and Contextual Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quiter, Brian J.; Ramakrishnan, Lavanya; Bandstra, Mark S.
The Radiation Mobile Analysis Platform (RadMAP) is unique in its capability to collect both high-quality radiological data from gamma-ray and fast-neutron detectors and a broad array of contextual data that includes positioning and stance data and high-resolution 3D contextual data from weather sensors, LiDAR, and visual and hyperspectral cameras. The datasets obtained from RadMAP are both voluminous and complex and require analyses from highly diverse communities within both the national laboratory and academic communities. Maintaining a high level of transparency will enable analysis products to further enrich the RadMAP dataset. It is in this spirit of open and collaborative data that the RadMAP team proposed to collect, calibrate, and make available online data from the RadMAP system. The Berkeley Data Cloud (BDC) is a cloud-based data management framework that enables web-based data browsing and visualization and connects curated datasets to custom workflows so that analysis products can be managed and disseminated while maintaining user access rights. BDC enables cloud-based analyses of large datasets in a manner that simulates real-time data collection, such that BDC can be used to test algorithm performance on real and source-injected datasets. Using the BDC framework, a subset of the RadMAP datasets has been disseminated via the Gamma Ray Data Cloud (GRDC), hosted at the National Energy Research Scientific Computing Center (NERSC), enabling data access for over 40 users at 10 institutions.
NASA Astrophysics Data System (ADS)
Meenu, S.; Rajeev, K.; Parameswaran, K.; Suresh Raju, C.
2006-12-01
Quantitative estimates of the spatio-temporal variations in deep convective events over the Indian subcontinent, Arabian Sea, Bay of Bengal, and tropical Indian Ocean are carried out using data obtained from the Advanced Very High Resolution Radiometer (AVHRR) onboard NOAA-14 and NOAA-16 during the period 1996-2003. Pixels having thermal IR brightness temperature (BT) less than 245 K are considered high-altitude clouds, and those having BT < 220 K are considered very high altitude clouds. Very deep convective clouds are observed over the north Bay of Bengal during the Asian summer monsoon season, when the mean cloud top temperature reaches as low as 190 K. Over the Head Bay of Bengal (HBoB) from June to September, more than 50% of the observed clouds are of the deep convective type, and more than half of these are very deep convective clouds. Histogram analysis of the cloud top temperatures during this period shows that the most prominent cloud top temperature of deep convective clouds over the HBoB is ~205 K, while that over the southeast Arabian Sea (SEAS) is ~220 K. This indicates that the cloud top altitude over the HBoB is most probably ~2 km higher than that over the SEAS during the Asian summer monsoon period. Another remarkable feature observed during the Asian summer monsoon period is the significantly reduced deep convective cloud amount over the south Bay of Bengal close to Sri Lanka, which appears as a large pool of reduced cloud amount surrounded by regions of large-scale deep convection. Over both the SEAS and the HBoB, the total, deep convective, and very deep convective cloud amounts, as well as their corresponding cloud top temperatures (or cloud top altitudes), undergo large seasonal variations, while such variations are less prominent over the eastern equatorial Indian Ocean.
Cloud properties inferred from 8-12 micron data
NASA Technical Reports Server (NTRS)
Strabala, Kathleen I.; Ackerman, Steven A.; Menzel, W. Paul
1994-01-01
A trispectral combination of observations at the 8-, 11-, and 12-micron bands is suggested for detecting cloud and cloud properties in the infrared. Atmospheric ice and water vapor absorption peak in opposite halves of the window region, so that positive 8-minus-11-micron brightness temperature differences indicate cloud, while near-zero or negative differences indicate clear regions. The absorption coefficient for water increases more between 11 and 12 microns than between 8 and 11 microns, while for ice the reverse is true. Cloud phase is determined by a scatter diagram of 8-minus-11-micron versus 11-minus-12-micron brightness temperature differences; ice cloud shows a slope greater than 1 and water cloud less than 1. The trispectral brightness temperature method was tested on high-resolution interferometer data, resulting in clear-cloud and cloud-phase delineation. Simulations using differing 8-micron bandwidths revealed no significant degradation of cloud property detection. Thus, the 8-micron bandwidth for future satellites can be selected based on the requirements of other applications, such as surface characterization studies. Application of the technique to current polar-orbiting High-Resolution Infrared Sounder (HIRS)-Advanced Very High Resolution Radiometer (AVHRR) datasets is constrained by the nonuniformity of the cloud scenes sensed within the large HIRS field of view. Analysis of MAS (MODIS Airborne Simulator) high-spatial-resolution (500 m) data with all three 8-, 11-, and 12-micron bands revealed sharp delineation of differing cloud and background scenes, from which a simple automated threshold technique was developed. Cloud phase, clear sky, and qualitative differences in cloud emissivity and cloud height were identified on a case study segment from 24 November 1991, consistent with the scene. More rigorous techniques would allow further cloud parameter clarification.
The opportunities for global cloud delineation with the Moderate-Resolution Imaging Spectrometer (MODIS) appear excellent. The spectral selection, the spatial resolution, and the global coverage are all well suited for significant advances.
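The simple automated threshold technique described above can be sketched as follows. The zero cloud threshold and the per-pixel comparison of the two brightness-temperature differences (standing in for the scatter-diagram slope test) are illustrative assumptions, not the operational values:

```python
import numpy as np

def classify_trispectral(bt8, bt11, bt12, cloud_thresh=0.0):
    """Toy trispectral cloud/phase classification from brightness temperatures (K).

    Positive BT(8)-BT(11) flags cloud; for cloudy pixels, comparing
    BT(8)-BT(11) against BT(11)-BT(12) stands in for the slope test
    (slope > 1 suggests ice, slope < 1 suggests water).
    """
    d8_11 = bt8 - bt11
    d11_12 = bt11 - bt12
    labels = np.full(bt8.shape, "clear", dtype=object)
    cloudy = d8_11 > cloud_thresh
    ice = cloudy & (d8_11 > d11_12)       # steeper-than-unity slope proxy
    water = cloudy & ~ice
    labels[ice] = "ice cloud"
    labels[water] = "water cloud"
    return labels
```

In practice the thresholds would be tuned per scene and sensor, and the slope would be estimated over a neighbourhood of pixels rather than per pixel.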
HAMP - the microwave package on the High Altitude and LOng range research aircraft (HALO)
NASA Astrophysics Data System (ADS)
Mech, M.; Orlandi, E.; Crewell, S.; Ament, F.; Hirsch, L.; Hagen, M.; Peters, G.; Stevens, B.
2014-12-01
An advanced package of microwave remote sensing instrumentation has been developed for operation on the new German High Altitude and LOng range research aircraft (HALO). The HALO Microwave Package, HAMP, consists of two nadir-looking instruments: a cloud radar at 36 GHz and a suite of passive microwave radiometers with 26 frequencies in different bands between 22.24 and 183.31 ± 12.5 GHz. We present a description of HAMP's instrumentation together with an illustration of its potential. To demonstrate this potential, synthetic measurements for the implemented passive microwave frequencies and the cloud radar were generated from cloud-resolving and radiative transfer model calculations. These illustrate the advantage of HAMP's chosen frequency coverage, which allows for improved detection of hydrometeors via both the emission and scattering of radiation. Regression retrievals compare HAMP with standard satellite instruments on polar orbiters and show its advantages, particularly for the lower atmosphere, with root-mean-square errors reduced by 5 and 15% for temperature and humidity, respectively. HAMP's main advantage is its high spatial resolution of about 1 km, which is illustrated by first measurements from test flights. Together these qualities make it an exciting tool for gaining a better understanding of cloud processes, testing retrieval algorithms, defining future satellite instrument specifications, and validating platforms after they have been placed in orbit.
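A statistical retrieval of the kind compared here can be illustrated by a linear least-squares regression from multichannel brightness temperatures to an atmospheric quantity, scored by root-mean-square error. The sketch below uses entirely synthetic data; the random channel weights, noise level, and sample counts are assumptions for illustration and do not represent HAMP's real channel sensitivities:

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_channels = 500, 26        # 26 radiometer channels, as in the abstract
true_w = rng.normal(size=n_channels)   # hypothetical forward-model weights
X = rng.normal(280.0, 5.0, size=(n_samples, n_channels))  # synthetic TBs (K)
y = X @ true_w + rng.normal(0.0, 0.5, size=n_samples)     # synthetic target + noise

# Fit the regression retrieval on a training split, score RMSE on the rest.
X_train, X_test = X[:400], X[400:]
y_train, y_test = y[:400], y[400:]
A_train = np.c_[np.ones(len(X_train)), X_train]   # add intercept column
coef, *_ = np.linalg.lstsq(A_train, y_train, rcond=None)
pred = np.c_[np.ones(len(X_test)), X_test] @ coef
rmse = float(np.sqrt(np.mean((pred - y_test) ** 2)))
```

Comparing such RMSE values between instruments (as the abstract does for temperature and humidity) is then a matter of running the same regression on each instrument's channel set.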
Experience in using commercial clouds in CMS
NASA Astrophysics Data System (ADS)
Bauerdick, L.; Bockelman, B.; Dykstra, D.; Fuess, S.; Garzoglio, G.; Girone, M.; Gutsche, O.; Holzman, B.; Hufnagel, D.; Kim, H.; Kennedy, R.; Mason, D.; Spentzouris, P.; Timm, S.; Tiradani, A.; Vaandering, E.; CMS Collaboration
2017-10-01
Historically, high-energy physics computing has been performed on large purpose-built computing systems. In the beginning there were single-site computing facilities, which evolved into the Worldwide LHC Computing Grid (WLCG) used today. The vast majority of WLCG resources are used for LHC computing, and the resources are scheduled to be continuously used throughout the year. In the last several years there has been an explosion in the capacity and capability of commercial and academic computing clouds. Cloud resources are highly virtualized and intended to be flexibly deployed for a variety of computing tasks. There is growing interest among cloud providers in demonstrating the capability to perform large-scale scientific computing. In this presentation we discuss results from the CMS experiment using the Fermilab HEPCloud Facility, which utilized both local Fermilab resources and Amazon Web Services (AWS). The goal was to work with AWS through a matching grant to demonstrate a sustained scale approximately equal to half of the worldwide processing resources available to CMS. We discuss the planning and technical challenges involved in running the most I/O-intensive CMS workflows on a large-scale set of virtualized resources provisioned by the Fermilab HEPCloud, and describe the data handling and data management challenges. We also discuss the economic issues, comparing cost and operational efficiency to our dedicated resources. Finally, we consider the changes to the working model of HEP computing given the availability of large-scale resources scheduled at peak times.
Modelled and measured effects of clouds on UV Aerosol Indices on a local, regional, and global scale
NASA Astrophysics Data System (ADS)
Penning de Vries, M.; Wagner, T.
2011-12-01
The UV Aerosol Indices (UVAI) form one of very few available tools in satellite remote sensing that provide information on aerosol absorption. The UVAI are also quite insensitive to surface type and are determined in the presence of clouds - situations where most aerosol retrieval algorithms do not work. The UVAI are most sensitive to elevated layers of absorbing aerosols, such as mineral dust and smoke, but they can also be used to study non-absorbing aerosols, such as sulphate and secondary organic aerosols. Although UVAI are determined for cloud-contaminated pixels, clouds do affect the value of UVAI in several ways: (1) they shield the underlying scene (potentially containing aerosols) from view, (2) they enhance the apparent surface albedo of an elevated aerosol layer, and (3) clouds unpolluted by aerosols also yield non-zero UVAI, here referred to as "cloudUVAI". The main purpose of this paper is to demonstrate that clouds can cause significant UVAI and that this cloudUVAI can be well modelled using simple assumptions on cloud properties. To this aim, we modelled cloudUVAI by using measured cloud optical parameters - either with low spatial resolution from SCIAMACHY, or high resolution from MERIS - as input. The modelled cloudUVAI were compared with UVAI determined from SCIAMACHY reflectances on different spatial (local, regional and global) and temporal scales (single measurement, daily means and seasonal means). The general dependencies of UVAI on cloud parameters were quite well reproduced, but several issues remain unclear: compared to the modelled cloudUVAI, measured UVAI show a bias, in particular for large cloud fractions. Also, the spread in measured UVAI is larger than in modelled cloudUVAI. In addition to the original, Lambert Equivalent Reflector (LER)-based UVAI algorithm, we have also investigated the effects of clouds on UVAI determined using the so-called Modified LER (MLER) algorithm (currently applied to TOMS and OMI data). 
For medium-sized clouds the MLER algorithm performs better (UVAI are closer to 0), but, as with the LER UVAI, MLER UVAI can reach -1.2 for small clouds and deviate significantly from zero for cloud fractions near 1. The effects of clouds should therefore also be taken into account when MLER UVAI data are used. Because the effects of clouds and aerosols on UVAI are not independent, a simple subtraction of modelled cloudUVAI from measured UVAI does not yield a UVAI representative of a cloud-free scene when aerosols are present. We here propose a first, simple approach for the correction of cloud effects on UVAI. The method is shown to work reasonably well for small to medium-sized clouds located above aerosols.
Static Memory Deduplication for Performance Optimization in Cloud Computing.
Jia, Gangyong; Han, Guangjie; Wang, Hao; Yang, Xuan
2017-04-27
In a cloud computing environment, the number of virtual machines (VMs) on a single physical server and the number of applications running on each VM are continuously growing. This has led to an enormous increase in the demand for memory capacity and a subsequent increase in energy consumption in the cloud. Insufficient memory has become a major bottleneck for the scalability and performance of virtualization interfaces in cloud computing. To address this problem, memory deduplication techniques, which reduce memory demand through page sharing, are being adopted. However, such techniques suffer from overhead in the form of the online page comparisons required for deduplication. In this paper, we propose a static memory deduplication (SMD) technique which can reduce the memory capacity requirement and provide performance optimization in cloud computing. The main innovation of SMD is that the process of page detection is performed offline, thus potentially reducing the performance cost, especially in terms of response time. In SMD, page comparisons are restricted to the code segment, which has the highest shared content. Our experimental results show that SMD efficiently reduces the memory capacity requirement and improves performance. We demonstrate that, compared to other approaches, the cost in terms of response time is negligible.
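Content-based page sharing of the kind SMD performs offline can be sketched by hashing fixed-size pages and keeping one physical copy per unique digest. The following is an illustrative model only; the names `PAGE_SIZE` and `dedup_pages` and the SHA-256 fingerprinting are assumptions for this sketch, not the paper's implementation:

```python
import hashlib

PAGE_SIZE = 4096  # typical x86 page size

def dedup_pages(memory: bytes, page_size: int = PAGE_SIZE):
    """Offline page-sharing sketch: hash each page and keep one copy
    per unique content, in the spirit of an SMD-style pass over a
    code segment. Returns (shared page store, per-page mapping)."""
    seen = {}      # digest -> index of canonical copy in the store
    mapping = []   # for each logical page, its index in the store
    store = []     # deduplicated physical pages
    for off in range(0, len(memory), page_size):
        page = memory[off:off + page_size]
        digest = hashlib.sha256(page).digest()
        if digest not in seen:
            seen[digest] = len(store)
            store.append(page)
        mapping.append(seen[digest])
    return store, mapping

# Example: three identical code pages plus one distinct page collapse to two.
segment = (b"\x90" * PAGE_SIZE) * 3 + bytes(range(256)) * 16
store, mapping = dedup_pages(segment)
```

Because identical binaries across VMs yield identical code pages, restricting the scan to the code segment (as SMD does) concentrates the comparisons where sharing is most likely.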
MIE Lidar proposed for the German Space Shuttle Mission D2
NASA Technical Reports Server (NTRS)
Renger, W.; Endemann, M.; Quenzel, H.; Werner, C.
1986-01-01
Firm plans exist for a second German Spacelab mission (the D2 mission), originally scheduled for late 1988. D2 is basically a zero-g mission but will also include Earth-observation experiments. A lidar facility on board D2 will allow a number of different measurements with the goal of obtaining performance data: cloud-top heights, height of the planetary boundary layer, optical thickness and cloud-base height of thin and medium-thick clouds, ice/water phase discrimination for clouds, tropopause height, tropospheric aerosols, and stratospheric aerosols.
Cardiovascular imaging environment: will the future be cloud-based?
Kawel-Boehm, Nadine; Bluemke, David A
2017-07-01
In cardiovascular CT and MR imaging, large datasets have to be stored, post-processed, analyzed and distributed. Besides the basic assessment of volume and function in, e.g., cardiac magnetic resonance imaging, more sophisticated quantitative analysis is requested, requiring specific software. Many institutions cannot afford the various types of software or provide the expertise to perform sophisticated analysis. Areas covered: Various cloud services exist for data storage and analysis specifically for cardiovascular CT and MR imaging. Instead of on-site data storage, cloud providers offer flexible storage services on a pay-per-use basis. To avoid the purchase and maintenance of specialized software for cardiovascular image analysis, e.g. to assess myocardial iron overload, MR 4D flow and fractional flow reserve, evaluation can be performed with cloud-based software by the consumer, or the complete analysis is performed by the cloud provider. However, challenges to widespread implementation of cloud services include regulatory issues regarding patient privacy and data security. Expert commentary: If patient privacy and data security are guaranteed, cloud imaging is a valuable option for coping with the storage of large image datasets and for offering sophisticated cardiovascular image analysis to institutions of all sizes.
Ground-based cloud classification by learning stable local binary patterns
NASA Astrophysics Data System (ADS)
Wang, Yu; Shi, Cunzhao; Wang, Chunheng; Xiao, Baihua
2018-07-01
Feature selection and extraction is the first step in implementing pattern classification, and the same is true for ground-based cloud classification. Histogram features based on local binary patterns (LBPs) are widely used to classify texture images. However, the conventional uniform LBP approach cannot capture all the dominant patterns in cloud texture images, resulting in low classification performance. In this study, a robust feature extraction method based on learning stable LBPs is proposed, using the averaged ranks of the occurrence frequencies of all rotation-invariant patterns defined in the LBPs of cloud images. The proposed method is validated on a ground-based cloud classification database comprising five cloud types. Experimental results demonstrate that the proposed method achieves significantly higher classification accuracy than the uniform LBP, local texture patterns (LTP), dominant LBP (DLBP), completed LBP (CLBP) and salient LBP (SaLBP) methods on this cloud image database and under different noise conditions. The performance of the proposed method is comparable with that of the popular deep convolutional neural network (DCNN) method, but with lower computational complexity. Furthermore, the proposed method also achieves superior performance on an independent test data set.
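As background for the LBP features discussed above, a generic rotation-invariant 8-neighbour LBP histogram can be computed as below. This is a standard baseline sketch, not the paper's stable-pattern selection, which additionally ranks patterns by their averaged occurrence frequencies:

```python
import numpy as np

def lbp_ri_histogram(img: np.ndarray) -> np.ndarray:
    """Normalised rotation-invariant 8-neighbour LBP histogram for a
    grayscale image (interior pixels only)."""
    # 8 neighbours in circular order around each interior pixel
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1), (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    c = img[1:-1, 1:-1]
    code = np.zeros_like(c, dtype=np.int32)
    for bit, (dy, dx) in enumerate(offs):
        nb = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (nb >= c).astype(np.int32) << bit   # threshold against centre
    # Rotation invariance: map each code to the minimum over its 8 rotations.
    def ri(v):
        return min(((v >> r) | (v << (8 - r))) & 0xFF for r in range(8))
    ri_map = np.array([ri(v) for v in range(256)], dtype=np.int32)
    hist = np.bincount(ri_map[code].ravel(), minlength=256).astype(float)
    return hist / hist.sum()
```

A classifier then consumes these histograms as feature vectors; the paper's contribution is in selecting which of the rotation-invariant bins are stable enough to keep.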
Speeding Clouds May Reveal Invisible Black Holes
NASA Astrophysics Data System (ADS)
Kohler, Susanna
2017-07-01
Several small, speeding clouds have been discovered at the center of our galaxy. A new study suggests that these unusual objects may reveal the lurking presence of inactive black holes. Peculiar Clouds: a) Velocity-integrated intensity map showing the location of the two high-velocity compact clouds, HCN-0.009-0.044 and HCN-0.085-0.094, in the context of larger molecular clouds. b) and c) Latitude-velocity and longitude-velocity maps for HCN-0.009-0.044 and HCN-0.085-0.094, respectively. d) and e) Spectra for the two compact clouds, respectively. [Takekawa et al. 2017] Sgr A*, the supermassive black hole marking the center of our galaxy, is surrounded by a region roughly 650 light-years across known as the Central Molecular Zone. This area at the heart of our galaxy is filled with large amounts of warm, dense molecular gas that has a complex distribution and turbulent kinematics. Several peculiar gas clouds have been discovered within the Central Molecular Zone within the past two decades. These clouds, dubbed high-velocity compact clouds, are characterized by their compact sizes and extremely broad velocity widths. What created this mysterious population of energetic clouds? The recent discovery of two new high-velocity compact clouds, reported in a paper led by Shunya Takekawa (Keio University, Japan), may help us to answer this question. Two More to the Count: Using the James Clerk Maxwell Telescope in Hawaii, Takekawa and collaborators detected the small clouds near the circumnuclear disk at the centermost part of our galaxy. These two clouds have velocity spreads of -80 to -20 km/s and -80 to 0 km/s and compact sizes of just over 1 light-year.
The clouds' similar appearances and physical properties suggest that they may both have been formed by the same process. Takekawa and collaborators explore and discard several possible origins for these clouds, such as outflows from massive protostars (no massive, luminous stars have been detected affiliated with these clouds), interaction with supernova remnants (no supernova remnants have been detected toward the clouds), and cloud-cloud collisions (such collisions leave other signs, like cavities in the parent cloud, which are not detected here). Masses and velocities of black holes that could create the two high-velocity compact clouds fall above the red and blue lines in the accompanying figure. [Takekawa et al. 2017] Revealed on the Plunge: As an alternative explanation, Takekawa and collaborators propose that these two small, speeding clouds were each created when a massive compact object plunged into a nearby molecular cloud. Since we don't see any luminous stellar counterparts to the high-velocity compact clouds, this suggests that the responsible objects were invisible black holes. As each black hole tore through a molecular cloud, it dragged some of the cloud's gas along behind it to form the high-velocity compact cloud. Does this explanation make sense statistically? The authors point out that the number of black holes predicted to silently lurk in the central 30 light-years of the Milky Way is around 10,000. This makes it entirely plausible that we could have caught sight of two of them as they revealed their presence while plunging through molecular clouds. If the authors' interpretation is correct, then high-velocity compact clouds provide an excellent opportunity: we can search for these speeding bodies to potentially discover inactive black holes that would otherwise go undetected. Citation: Shunya Takekawa et al 2017 ApJL 843 L11. doi:10.3847/2041-8213/aa79ee
Cloud vertical profiles derived from CALIPSO and CloudSat and a comparison with MODIS derived clouds
NASA Astrophysics Data System (ADS)
Kato, S.; Sun-Mack, S.; Miller, W. F.; Rose, F. G.; Minnis, P.; Wielicki, B. A.; Winker, D. M.; Stephens, G. L.; Charlock, T. P.; Collins, W. D.; Loeb, N. G.; Stackhouse, P. W.; Xu, K.
2008-05-01
CALIPSO and CloudSat, flying in the A-Train, provide detailed information on the vertical distribution of clouds and aerosols. The vertical distribution of cloud occurrence is derived from one month of CALIPSO and CloudSat data as part of the effort to merge CALIPSO, CloudSat and MODIS with CERES data. This newly derived cloud profile is compared with the distribution of cloud-top height derived from MODIS on Aqua using the cloud algorithms of the CERES project. The cloud base from MODIS is also estimated using an empirical formula based on cloud-top height and optical thickness, which is used in CERES processing. While MODIS detects mid- and low-level clouds over the Arctic in April fairly well when they are the topmost cloud layer, it underestimates high-level clouds. In addition, because the CERES-MODIS cloud algorithm is not able to detect multi-layer clouds and the empirical formula significantly underestimates the depth of high clouds, the occurrence of mid- and low-level clouds is underestimated. This comparison does not account for differences in sensitivity to thin clouds, but we will impose an optical thickness threshold on the CALIPSO-derived clouds for a further comparison. The effect of such differences in the cloud profile on flux computations will also be discussed, as will the effect of cloud cover on the top-of-atmosphere flux over the Arctic using CERES SSF and FLASHFLUX products.
Tiny, Dusty, Galactic HI Clouds: The GALFA-HI Compact Cloud Catalog
NASA Astrophysics Data System (ADS)
Saul, Destry R.; Putman, M. E.; Peek, J. G.
2013-01-01
The recently published GALFA-HI Compact Cloud Catalog contains 2000 nearby neutral hydrogen clouds under 20' in angular size, detected with a machine-vision algorithm in the Galactic Arecibo L-Band Feed Array HI survey (GALFA-HI). At a distance of 1 kpc, the compact clouds would typically be 1 solar mass and 1 pc in size. We observe that nearly all of the compact clouds classified as high velocity (> 90 km/s) are near previously identified high-velocity complexes. We separate the compact clouds into populations based on velocity, linewidth, and position. We have begun to search for evidence of dust in these clouds using IRIS and have detections in several populations.
NASA Astrophysics Data System (ADS)
Ratajczak, M.; Wężyk, P.
2015-12-01
Rapid development of terrestrial laser scanning (TLS) in recent years has resulted in its recognition and implementation in many industries, including forestry and nature conservation. The use of 3D TLS point clouds in the inventory of trees and stands, as well as in the determination of their biometric features (trunk diameter, tree height, crown base, trunk shape) and tree and timber volume, is slowly becoming practice. In addition to measurement precision, the primary added value of TLS is the ability to automate the processing of 3D point clouds toward the extraction of selected features of trees and stands. The paper presents original software (GNOM) for the automatic measurement of selected tree features, based on point clouds obtained by a FARO terrestrial laser scanner. With the developed GNOM algorithms, the locations of tree trunks on a circular research plot were determined and measurements were performed, covering the DBH (at 1.3 m), further trunk diameters at different heights, the base of the tree crown, the volume of the tree trunk (sectional measurement method), and the tree crown. Research was performed in the territory of the Niepolomice Forest in an unmixed pine stand (Pinus sylvestris L.) on a circular plot with a radius of 18 m, within which there were 16 pine trees (14 of them later cut down). The stand had a two-storey, even-aged structure (147 years old) and was devoid of undergrowth. Ground scanning was performed just before harvesting. The DBH of the 16 pine trees was determined fully automatically using the GNOM algorithm with an accuracy of +2.1% compared to the reference measurement with a DBH measurement device.
The mean absolute measurement error in the point cloud, using the semi-automatic methods "PIXEL" (between points) and "PIPE" (fitting a cylinder) in FARO Scene 5.x, was 3.5% and 5.0%, respectively. The reference height was taken as the tape measurement performed on the felled tree. The average error of automatic tree height determination by the GNOM algorithm based on the TLS point clouds amounted to 6.3% and was slightly higher than with the manual method of measurement on profiles in TerraScan (Terrasolid; error of 5.6%). The relatively high value of the error may be mainly related to the small number of TLS points in the upper parts of the crowns. The crown height measurement showed an error of +9.5%; the reference in this case was the tape measurement performed on the trunks of the felled pine trees. Processing the point clouds with the GNOM algorithms for the 16 analyzed trees took no longer than 10 min. (37 sec./tree). The paper demonstrates the innovation and high precision of TLS measurement in acquiring biometric data in forestry, as well as the continuing need to increase the degree of automation in processing 3D point clouds from terrestrial laser scanning.
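Fitting a circle to a horizontal slice of stem points is one common way to estimate DBH from TLS data. The sketch below uses the Kasa algebraic least-squares circle fit; the function name and the synthetic 17-cm-radius stem are assumptions for illustration, not GNOM's actual method:

```python
import numpy as np

def fit_dbh(points_xy: np.ndarray) -> float:
    """Kasa least-squares circle fit to a horizontal slice of stem
    points (e.g. TLS returns between 1.25 m and 1.35 m above ground).
    Returns the fitted diameter in the same units as the input."""
    x, y = points_xy[:, 0], points_xy[:, 1]
    # Linearised circle equation: x^2 + y^2 = 2*cx*x + 2*cy*y + (r^2 - cx^2 - cy^2)
    A = np.c_[2 * x, 2 * y, np.ones(len(x))]
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c + cx ** 2 + cy ** 2)
    return 2.0 * r

# Synthetic stem slice: 200 points on a circle of radius 0.17 m at (0.5, 1.2).
theta = np.linspace(0.0, 2.0 * np.pi, 200)
pts = np.c_[0.5 + 0.17 * np.cos(theta), 1.2 + 0.17 * np.sin(theta)]
dbh = fit_dbh(pts)
```

In practice the slice would first be cleaned of branch and noise returns, which is part of what makes fully automatic DBH extraction nontrivial.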
NASA Technical Reports Server (NTRS)
Dong, Xiquan; Xi, Baike; Kennedy, Aaron; Minnis, Patrick; Wood, Robert
2013-01-01
A 19-month record of total and single-layered low (0-3 km), middle (3-6 km), and high (> 6 km) cloud fractions (CFs), and of single-layered marine boundary layer (MBL) cloud macrophysical and microphysical properties, has been generated from ground-based measurements taken at the ARM Azores site between June 2009 and December 2010. It documents the most comprehensive and longest dataset on marine cloud fraction and MBL cloud properties to date. The annual means of the total CF and the single-layered low, middle, and high CFs derived from ARM radar-lidar observations are 0.702, 0.271, 0.01 and 0.106, respectively. More total and single-layered high CFs occurred during winter, while single-layered low CFs were greatest during summer. The diurnal cycles of both total and low CFs are stronger during summer than during winter. The CFs are bimodally distributed in the vertical, with a lower peak at approximately 1 km and a higher one between 8 and 11 km during all seasons except summer, when only the low peak occurs. Persistent high pressure and dry conditions produce more single-layered MBL clouds and fewer total clouds during summer, while low pressure and moist air masses during winter generate more total and multilayered clouds, and deep frontal clouds associated with midlatitude cyclones.
On the Modeling and Management of Cloud Data Analytics
NASA Astrophysics Data System (ADS)
Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni
A new era is dawning in which vast amounts of data are subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good run-time performance to data analytics workloads is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their patterns of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. Throughout, we use the Map-Reduce paradigm as an illustration of data analytics.
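The Map-Reduce paradigm used as the running illustration can be sketched in a few lines: a map phase emits key-value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. A toy word-count sketch (the function names are illustrative, not from any particular framework):

```python
from collections import defaultdict
from itertools import chain

def map_phase(record: str):
    # Map: emit (key, 1) for each word -- the classic word-count job.
    return [(word, 1) for word in record.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently (hence parallelisable).
    return {key: sum(values) for key, values in groups.items()}

records = ["cloud data analytics", "data analytics in the cloud"]
pairs = list(chain.from_iterable(map_phase(r) for r in records))
counts = reduce_phase(shuffle(pairs))
```

The performance-management questions the paper studies (data placement, resource allocation, migration) arise because the map and reduce tasks of many such jobs must be scheduled across shared, virtualized cloud nodes.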
NASA Technical Reports Server (NTRS)
Loeb, N. G.; Varnai, Tamas; Winker, David M.
1998-01-01
Recent observational studies have shown that satellite retrievals of cloud optical depth based on plane-parallel model theory suffer from systematic biases that depend on viewing geometry, even when observations are restricted to overcast marine stratus layers, arguably the closest to plane-parallel in nature. At moderate to low sun elevations, the plane-parallel model significantly overestimates the reflectance dependence on view angle in the forward-scattering direction but shows a similar dependence in the backscattering direction. Theoretical simulations show that the likely cause of this discrepancy is that the plane-parallel model assumption does not account for subpixel-scale variations in cloud-top height (i.e., "cloud bumps"). Monte Carlo simulations comparing 1D model radiances to radiances from overcast cloud fields with 1) cloud-top height variations but constant cloud volume extinction; 2) flat tops but horizontal variations in cloud volume extinction; and 3) variations in both cloud-top height and cloud extinction were performed over an approximately 4 km x 4 km domain (roughly the size of an individual GAC AVHRR pixel). The comparisons show that when cloud-top height variations are included, departures from 1D theory are remarkably similar (qualitatively) to those obtained observationally. In contrast, when clouds are assumed flat and only cloud extinction is variable, reflectance differences are much smaller and do not show any view-angle dependence. When both cloud-top height and cloud extinction variations are included, however, large increases in cloud extinction variability can enhance the reflectance differences.
The reason 3D-1D reflectance differences are more sensitive to cloud-top height variations in the forward-scattering direction (at moderate to low sun elevations) is that photons leaving the cloud field in that direction experience fewer scattering events (low-order scattering) and are restricted to the topmost portions of the cloud. While reflectance deviations from 1D theory are much larger for bumpy clouds than for flat clouds with variable cloud extinction, differences in cloud albedo are comparable for these two cases.
Research on cloud background infrared radiation simulation based on fractal and statistical data
NASA Astrophysics Data System (ADS)
Liu, Xingrun; Xu, Qingshan; Li, Xia; Wu, Kaifeng; Dong, Yanbing
2018-02-01
Cloud is an important natural phenomenon, and its radiation causes serious interference to infrared detectors. Based on fractals and statistical data, a method is proposed to simulate cloud backgrounds, with the cloud infrared radiation field assigned using satellite radiation data of clouds. A cloud infrared radiation simulation model is established in MATLAB; it can generate cloud background infrared images for different cloud types (low, middle, and high cloud) in different months, bands and sensor zenith angles.
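Fractal cloud fields of the kind used in such simulations are commonly generated by midpoint displacement. A minimal sketch using the diamond-square algorithm; the roughness parameter and the [0, 1] normalisation are illustrative assumptions, and the paper's step of assigning satellite radiance statistics to the field is not reproduced here:

```python
import numpy as np

def fractal_cloud(n: int, roughness: float = 0.7, seed: int = 0) -> np.ndarray:
    """Generate a (2**n + 1)-square fractal field by diamond-square
    midpoint displacement, normalised to [0, 1]. A radiance scale
    would be applied afterwards to turn it into a cloud image."""
    rng = np.random.default_rng(seed)
    size = 2 ** n + 1
    z = np.zeros((size, size))
    z[0, 0], z[0, -1], z[-1, 0], z[-1, -1] = rng.normal(size=4)
    step, amp = size - 1, 1.0
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square from its four corners.
        for y in range(half, size, step):
            for x in range(half, size, step):
                avg = (z[y - half, x - half] + z[y - half, x + half] +
                       z[y + half, x - half] + z[y + half, x + half]) / 4.0
                z[y, x] = avg + rng.normal() * amp
        # Square step: edge midpoints from their in-bounds diamond neighbours.
        for y in range(0, size, half):
            for x in range((y + half) % step, size, step):
                vals = [z[yy, xx]
                        for yy, xx in ((y - half, x), (y + half, x),
                                       (y, x - half), (y, x + half))
                        if 0 <= yy < size and 0 <= xx < size]
                z[y, x] = sum(vals) / len(vals) + rng.normal() * amp
        step, amp = half, amp * roughness
    return (z - z.min()) / (z.max() - z.min())

field = fractal_cloud(5)   # 33 x 33 normalised cloud-like field
```

Smaller roughness values give smoother, stratiform-looking fields; values near 1 give more broken, turbulent texture.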
Atmospheric cloud physics laboratory project study
NASA Technical Reports Server (NTRS)
Schultz, W. E.; Stephen, L. A.; Usher, L. H.
1976-01-01
Engineering studies were performed for the Zero-G Cloud Physics Experiment liquid cooling and air pressure control systems. Four concepts for the liquid cooling system were evaluated, two of which were found to closely approach the system requirements. Thermal insulation requirements, system hardware, and control sensor locations were established. The reservoir sizes and initial temperatures were defined, as were system power requirements. In the study of the pressure control system, fluid analyses by the Atmospheric Cloud Physics Laboratory were performed to determine flow characteristics of various orifice sizes, vacuum pump adequacy, and control system performance. System parameters predicted in these analyses as a function of time include the following for various orifice sizes: (1) chamber and vacuum pump mass flow rates, (2) the number of valve openings or closures, (3) the maximum cloud chamber pressure deviation from the allowable, and (4) cloud chamber and accumulator pressure.
NASA Technical Reports Server (NTRS)
Xu, Kuan-Man; Wong, Takmeng; Wielicki, Bruce a.; Parker, Lindsay; Lin, Bing; Eitzen, Zachary A.; Branson, Mark
2006-01-01
Characteristics of tropical deep convective cloud objects observed over the tropical Pacific during January-August 1998 are examined using the Tropical Rainfall Measuring Mission/Clouds and the Earth's Radiant Energy System single scanner footprint (SSF) data. These characteristics include the frequencies of occurrence and statistical distributions of cloud physical properties. Their variations with cloud-object size, sea surface temperature (SST), and satellite precessing cycle are analyzed in detail. A cloud object is defined as a contiguous patch of the Earth composed of satellite footprints within a single dominant cloud-system type. It is found that statistical distributions of cloud physical properties are significantly different among three size categories of cloud objects with equivalent diameters of 100-150 km (small), 150-300 km (medium), and >300 km (large), respectively, except for the distributions of ice particle size. The distributions for the larger-size categories of cloud objects are more skewed towards high SSTs, high cloud tops, low cloud-top temperature, large ice water path, high cloud optical depth, low outgoing longwave (LW) radiation, and high albedo than those for the smaller-size categories. As SST varied from one satellite precessing cycle to another, the changes in macrophysical properties of cloud objects over the entire tropical Pacific were small for the large-size category of cloud objects, relative to those of the small- and medium-size categories. This result suggests that the fixed anvil temperature hypothesis of Hartmann and Larson may be valid for the large-size category. Combined with the result that a higher percentage of the large-size category of cloud objects occurs during higher-SST subperiods, this implies that macrophysical properties of cloud objects would be less sensitive to further warming of the climate.
On the other hand, when cloud objects are classified according to SSTs where large-scale dynamics plays important roles, statistical characteristics of cloud microphysical properties, optical depth and albedo are not sensitive to the SST, but those of cloud macrophysical properties are strongly dependent upon the SST. Frequency distributions of vertical velocity from the European Center for Medium-range Weather Forecasts model that is matched to each cloud object are used to interpret some of the findings in this study.
NASA Astrophysics Data System (ADS)
Liu, Run; Liou, Kuo-Nan; Su, Hui; Gu, Yu; Zhao, Bin; Jiang, Jonathan H.; Liu, Shaw Chen
2017-05-01
The global mean precipitation is largely constrained by atmospheric radiative cooling rates (Qr), which are sensitive to changes in high cloud fraction. We investigate variations of high cloud fraction with surface temperature (Ts) from July 2002 to June 2015 and compute their radiative effects on Qr using the Fu-Liou-Gu plane-parallel radiation model. We find that the tropical mean (30°S-30°N) high cloud fraction decreases with increasing Ts at a rate of about -1.0 ± 0.34% K-1 from 2002 to 2015, which leads to an enhanced atmospheric cooling of around 0.86 W m-2 K-1. On the other hand, the northern midlatitude (30°N-60°N) high cloud fraction increases with surface warming at a rate of 1.85 ± 0.65% K-1, and the near-global mean (60°S-60°N) high cloud fraction shows a statistically insignificant decreasing trend with increasing Ts over the analysis period. Dividing high clouds into cirrus, cirrostratus, and deep convective clouds, we find that cirrus cloud fraction increases with surface warming at a rate of 0.32 ± 0.11% K-1 (0.01 ± 0.17% K-1) for the near-global mean (tropical mean), while cirrostratus and deep convective cloud fractions decrease with surface warming at rates of -0.02 ± 0.18% K-1 and -0.33 ± 0.18% K-1 for the near-global mean and -0.64 ± 0.23% K-1 and -0.37 ± 0.13% K-1 for the tropical mean, respectively. The high cloud fraction response to Ts accounts for approximately 1.9 ± 0.7% and 16.0 ± 6.1% of the increase in precipitation per unit surface warming over the period 2002-2015 for the near-global mean and the tropical mean, respectively.
NASA Astrophysics Data System (ADS)
Krämer, Martina
2016-04-01
Numerous airborne field campaigns have been performed in recent decades to record the microphysical properties of cirrus clouds. Besides improving the understanding of the processes of cirrus formation and evolution, an additional motivation for these studies is to provide a database for evaluating the representation of cirrus clouds in global climate models. This is important for improving the certainty of climate predictions, which are affected by the poor understanding of the microphysical processes of ice clouds (IPCC, 2013). To this end, the observations should ideally cover the complete respective parameter range and not be influenced by instrumental artifacts. However, due to the difficulties of measuring cirrus properties on fast-flying, high-altitude aircraft, some issues with respect to the measurements have arisen. In particular, the relative humidity in and around cirrus clouds and the ice crystal number concentrations were under discussion: excessively high ice supersaturations as well as ice number concentrations were often reported. These issues have made the goal of compiling a large database from a suite of different instruments used on different campaigns more challenging. In this study, we have addressed these challenges and compiled a large data set of cirrus clouds, sampled during eighteen field campaigns between 75°N and 25°S, representing measurements that fulfill the above-mentioned requirements. The most recent campaigns were performed in 2014, namely the ATTREX campaign with the research aircraft Global Hawk and the ML-CIRRUS and ACRIDICON campaigns with HALO. The observations include ice water content (IWC: 130 hours of observations), ice crystal number (N_ice: 83 hours), ice crystal mean mass size (R_ice: 83 hours), and relative humidity (RH_ice) inside and outside of cirrus clouds (78 and 140 hours).
We will present the parameters as PDFs versus temperature and derive medians and core ranges (encompassing the most frequent observations) for each parameter. The new, larger data sets confirm the earlier results presented by Schiller et al. (JGR, 2008), Krämer et al. (ACP, 2009), and Luebke et al. (ACP, 2013), which are all based on much smaller data sets. Further, we will show the geographical and altitude distributions of IWC, N_ice, R_ice, and RH_ice.
Hurricane Irma's Cloud Structure as Seen by NASA's AIRS
2017-09-08
The large-scale structure of clouds in and around Hurricane Irma is seen in this animation and still image created with data from the Atmospheric Infrared Sounder (AIRS) instrument on NASA's Aqua satellite. The clouds are typical of tropical areas both nearby and away from tropical cyclones. Observations were taken at 1 p.m. EDT (5 p.m. UTC) on Tuesday, Sept. 5, 2017, as Irma approached the Caribbean islands and was just becoming a powerful Category 5 storm. Each cylinder represents a volume of cloud detected by AIRS. The oval cylinder ends represent a region viewed by AIRS, with the oval sizes adjusted to reflect the proportion of clouds filling the area viewed. The largest ovals are about 30 miles (45 kilometers) across. The height of the cylinders indicates the cloud thickness, with thickest clouds reaching down to the surface. The vertical scale is exaggerated 15 times. Colors represent temperatures at the tops of the clouds. The perspective views the storm diagonally from above with an initial view toward the north-northwest, with the perspective rotating clockwise for a full circle. The area depicted is about 1,000 miles by 800 miles across (1,600 by 1,300 kilometers). At the start of the loop, North America is seen at the top of the image, and coastal Venezuela at the lower right. In the initial perspective, cirrus clouds (thin and blue), associated with flow outward from the top of the hurricane, overlie warmer (pink and red) shallow clouds. About five seconds into the loop, the deep clouds in the middle of Irma are easily seen. The most dangerous parts of Irma are within the region of high and cold (blue), thick clouds surrounding the central eye. The clouds are cold because they are carried to high, cold altitudes by vigorous thunderstorms within the hurricane. The eye itself is nearly cloud free, but the few clouds within it are low and warm. 
As the perspective shifts toward the south-southeast around seven seconds into the loop, another storm system well north of Irma can be seen. It contains high, thick clouds, with more cirrus carried outward over shallow clouds. At about nine seconds, more outflow from Irma is seen, with high, thin clouds over shallow clouds once again apparent. Shortly afterward, when the view is toward the southwest, yet more deep clouds and their outflowing cirrus clouds are apparent. This image depicts many of the clouds typical of the tropics even when cyclones are not present: high, cold thunderstorms pushing cirrus clouds over nearby regions containing many warm, shallow clouds. The animation also shows the structure typical of tropical cyclones around the world: very strong thunderstorms lifting clouds into cold parts of the atmosphere, with strong outflow at upper levels carrying cirrus clouds away from the storm center, and the storm organized symmetrically around a central eye. https://photojournal.jpl.nasa.gov/catalog/PIA21950
Design and Performance of McRas in SCMs and GEOS I/II GCMs
NASA Technical Reports Server (NTRS)
Sud, Yogesh C.; Einaudi, Franco (Technical Monitor)
2000-01-01
The design of a prognostic cloud scheme named McRAS (Microphysics of clouds with Relaxed Arakawa-Schubert Scheme) for general circulation models (GCMs) will be discussed. McRAS distinguishes three types of clouds: (1) convective, (2) stratiform, and (3) boundary-layer types. The convective clouds transform and merge into stratiform clouds on an hourly time scale, while the boundary-layer clouds merge into the stratiform clouds instantly. The cloud condensate converts into precipitation following the auto-conversion equations of Sundqvist, which contain a parametric adaptation for the Bergeron-Findeisen process of ice crystal growth and collection of cloud condensate by precipitation. All clouds convect, advect, as well as diffuse both horizontally and vertically with fully interactive cloud microphysics throughout the life cycle of the cloud, while the optical properties of clouds are derived from the statistical distribution of hydrometeors and idealized cloud geometry. An evaluation of McRAS in a single column model (SCM) with the GATE Phase III and ARM CART datasets has shown that, together with the rest of the model physics, McRAS can simulate the observed temperature, humidity, and precipitation without many systematic errors. The time history and time-mean in-cloud water and ice distribution, fractional cloudiness, cloud optical thickness, origin of precipitation in the convective anvil and towers, and the convective updraft and downdraft velocities and mass fluxes all show realistic behavior. Performance of McRAS in the GEOS II GCM shows several satisfactory features, but some of the remaining deficiencies suggest the need for additional research involving convective triggers and inhibitors, provision for a continuously detraining updraft, a realistic scheme for cumulus gravity wave drag, and refinements to the physical conditions for ascertaining the cloud detrainment level.
NASA Astrophysics Data System (ADS)
Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock
2017-01-01
The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e., carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment of Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent on the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50% and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup levels off.
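The scaling behavior described above (nearly linear up to 64 cores, then flattening as communication costs dominate) is qualitatively what a simple Amdahl-type model predicts when a fixed serial/communication fraction caps the parallel speedup. A minimal sketch follows; the 1% serial fraction is an illustrative assumption, not a value fitted to the CESM benchmarks.

```python
# Amdahl's-law speedup model: a fixed non-parallelizable fraction s
# limits the achievable speedup as the core count n grows.
def amdahl_speedup(n: int, s: float) -> float:
    """Speedup on n cores with serial fraction s (0 <= s < 1)."""
    return 1.0 / (s + (1.0 - s) / n)

if __name__ == "__main__":
    s = 0.01  # assumed 1% serial/communication fraction (illustrative)
    for n in (16, 32, 64, 128, 256):
        print(f"{n:4d} cores -> speedup {amdahl_speedup(n, s):6.1f}")
```

With s = 0.01 the 16-to-64-core speedup ratio exceeds 2, consistent with the reported >50% wall-clock reduction, while beyond roughly 64 cores the marginal gain per added core shrinks, mirroring the observed plateau.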
NASA Astrophysics Data System (ADS)
Weatherill, Daniel P.; Stefanov, Konstantin D.; Greig, Thomas A.; Holland, Andrew D.
2014-07-01
Pixellated monolithic silicon detectors operated in a photon-counting regime are useful in spectroscopic imaging applications. Since a high-energy incident photon may produce many excess free carriers upon absorption, both energy and spatial information can be recovered by resolving each interaction event. The performance of these devices in terms of both energy and spatial resolution is in large part determined by the amount of diffusion which occurs during the collection of the charge cloud by the pixels. Past efforts to predict the X-ray performance of imaging sensors have used either analytical solutions to the diffusion equation or simplified Monte Carlo electron transport models. These methods are computationally attractive and highly useful but may be complemented by more physically detailed models based on TCAD simulations of the devices. Here we present initial results from a model which employs a full transient numerical solution of the classical semiconductor equations to model charge collection in device pixels under stimulation from initially Gaussian photogenerated charge clouds, using commercial TCAD software. Realistic device geometries and doping are included. By mapping the pixel response to different initial interaction positions and charge cloud sizes, the charge-splitting behaviour of the model sensor under various illuminations and operating conditions is investigated. Experimental validation of the model is presented from an e2v CCD30-11 device under varying substrate bias, illuminated using an Fe-55 source.
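The simple analytical approach mentioned above can be illustrated by integrating a 2D Gaussian charge cloud over a pixel grid to obtain per-pixel charge fractions. This is a generic sketch of that idea, not the paper's TCAD model; the grid geometry and helper names are assumptions for illustration.

```python
from math import erf, sqrt

def pixel_charge_fractions(x0, y0, sigma, pitch, n=3):
    """Fraction of a circular Gaussian charge cloud centered at (x0, y0),
    with standard deviation sigma, collected by each pixel of an n x n
    neighborhood (square pixels of the given pitch; the central pixel
    spans [0, pitch] x [0, pitch]). A separable Gaussian integrates to a
    product of 1D error-function differences over each pixel."""
    def cdf(z):  # 1D Gaussian CDF with standard deviation sigma
        return 0.5 * (1.0 + erf(z / (sigma * sqrt(2.0))))
    frac = {}
    for i in range(n):
        for j in range(n):
            xl, xr = (i - n // 2) * pitch, (i - n // 2 + 1) * pitch
            yl, yr = (j - n // 2) * pitch, (j - n // 2 + 1) * pitch
            fx = cdf(xr - x0) - cdf(xl - x0)
            fy = cdf(yr - y0) - cdf(yl - y0)
            frac[(i, j)] = fx * fy
    return frac
```

A cloud centered on a pixel corner splits its charge roughly four ways, while one centered well inside a pixel is collected almost entirely by that pixel; the larger sigma is relative to the pitch, the more events appear as multi-pixel splits.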
NASA Technical Reports Server (NTRS)
Xi, B.; Minnis, P.
2006-01-01
Data collected at the Department of Energy Atmospheric Radiation Measurement (ARM) Southern Great Plains (SGP) Central Facility (SCF) are analyzed to determine the monthly and hourly variations of cloud fraction and radiative forcing between January 1997 and December 2002. Cloud fractions are estimated for total cloud cover and for single-layered low (0-3 km), middle (3-6 km), and high (more than 6 km) clouds using ARM SCF ground-based paired lidar-radar measurements. Shortwave (SW) and longwave (LW) fluxes are derived from up- and down-looking standard precision spectral pyranometers and precision infrared radiometer measurements with uncertainties of approximately 10 Wm(exp -2). The annual averages of total and single-layered low, middle, and high cloud fractions are 0.49, 0.11, 0.03, and 0.17, respectively. Both total and low cloud amounts peak during January and February and reach a minimum during July and August; high clouds occur more frequently than other types of clouds, with a peak in summer. The average annual downwelling surface SW fluxes for total and low clouds (151 and 138 Wm(exp -2), respectively) are less than those under middle and high clouds (188 and 201 Wm(exp -2), respectively), but the downwelling LW fluxes (349 and 356 Wm(exp -2)) underneath total and low clouds are greater than those from middle and high clouds (337 and 333 Wm(exp -2)). Low clouds produce the largest LW warming (55 Wm(exp -2)) and SW cooling (-91 Wm(exp -2)) effects, with maximum and minimum absolute values in spring and summer, respectively. High clouds have the smallest LW warming (17 Wm(exp -2)) and SW cooling (-37 Wm(exp -2)) effects at the surface. All-sky SW CRF decreases and LW CRF increases with increasing cloud fraction, with mean slopes of -0.984 and 0.616 Wm(exp -2)%(exp -1), respectively. Over the entire diurnal cycle, clouds deplete the amount of surface insolation more than they add to the downwelling LW flux.
The calculated CRFs do not appear to be significantly affected by uncertainties in data sampling and clear-sky screening. Traditionally, cloud radiative forcing includes not only the radiative impact of the hydrometeors, but also the changes in the environment. Taken together over the ARM SCF, changes in humidity and surface albedo between clear and cloudy conditions offset approximately 20% of the net radiative forcing caused by the cloud hydrometeors alone. Variations in water vapor, on average, account for 10% and 83% of the SW and LW CRFs, respectively, in total cloud cover conditions. The error analysis further reveals that the cloud hydrometeors dominate the SW CRF, while water vapor changes are most important for LW flux changes in cloudy skies. Similar studies over other locales are encouraged, where water and surface albedo changes from clear to cloudy conditions may be much different than observed over the ARM SCF.
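The mean slopes quoted above imply a simple linear estimate of surface cloud radiative forcing as a function of cloud fraction. A minimal sketch using only those slopes; the zero intercept is an assumption for illustration, not part of the reported fit.

```python
# Linear estimate of surface SW/LW cloud radiative forcing (CRF) from
# cloud fraction, using the mean slopes reported for the ARM SCF site.
SW_SLOPE = -0.984  # W m^-2 per % cloud fraction (all-sky SW CRF)
LW_SLOPE = 0.616   # W m^-2 per % cloud fraction (all-sky LW CRF)

def surface_crf(cloud_fraction_pct):
    """Return (SW, LW, net) CRF in W m^-2, assuming zero intercepts."""
    sw = SW_SLOPE * cloud_fraction_pct
    lw = LW_SLOPE * cloud_fraction_pct
    return sw, lw, sw + lw
```

The implied net slope is -0.368 W m^-2 per % cloud fraction, i.e., SW depletion outweighs the added downwelling LW flux, consistent with the net surface cooling by clouds stated above.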
Carbon Dioxide Clouds at High Altitude in the Tropics and in an Early Dense Martian Atmosphere
NASA Technical Reports Server (NTRS)
Colaprete, Anthony; Toon, Owen B.
2001-01-01
We use a time-dependent, microphysical cloud model to study the formation of carbon dioxide clouds in the Martian atmosphere. Laboratory studies by Glandorf et al. show that high critical supersaturations are required for cloud particle nucleation and that growth is not limited by surface kinetics. These conditions, which are similar to those for cirrus clouds on Earth, lead to the formation of carbon dioxide ice particles with radii greater than 500 micrometers and concentrations of less than 0.1 cm(exp -3) for typical atmospheric conditions. Within the current Martian atmosphere, CO2 cloud formation is possible at the poles during winter and at high altitudes in the tropics during periods of increased atmospheric dust loading. In both cases, temperature perturbations of several degrees below the CO2 saturation temperature are required to nucleate new cloud particles, suggesting that dynamical processes, rather than diabatic cooling, are the most common initiators of carbon dioxide clouds. The microphysical cloud model, coupled to a two-stream radiative transfer model, is used to reexamine the impact of CO2 clouds on the surface temperature within a dense CO2 atmosphere. The formation of carbon dioxide clouds leads to a warmer surface than would be expected for clear-sky conditions. The amount of warming is sensitive to the presence of dust and water vapor in the atmosphere, both of which act to dampen cloud effects. The radiative warming associated with cloud formation, as well as latent heating, works to dissipate the clouds when present. Thus, clouds never last for periods much longer than several days, limiting their overall effectiveness for warming the surface. The time-averaged cloud optical depth is approximately unity, leading to a 5-10 K warming, depending on the surface pressure. However, the surface temperature does not rise above the freezing point of liquid water even for pressures as high as 5 bars, at a solar luminosity of 75% of the current value.
Research on Influence of Cloud Environment on Traditional Network Security
NASA Astrophysics Data System (ADS)
Ming, Xiaobo; Guo, Jinhua
2018-02-01
Cloud computing is a hallmark of the progress of modern information networks. It provides great convenience to Internet users, but it also exposes them to considerable risk. One of the main reasons Internet users choose cloud computing is its strong network security performance, which is also the cornerstone of cloud computing applications. This paper briefly explores the impact of the cloud environment on traditional network security and puts forward corresponding solutions.
NASA Technical Reports Server (NTRS)
Sohn, Byung-Ju; Smith, Eric A.
1992-01-01
This paper focuses on the role of cloud- and surface-atmosphere forcing on the net radiation balance and their potential impact on the general circulation at climate time scales. The globally averaged cloud-forcing estimates and cloud sensitivity values taken from various recent studies are summarized. It is shown that the net radiative heating over the tropics is principally due to high clouds, while the net cooling in mid- and high latitudes is dominated by low and middle clouds.
Solar radiation measurements and their applications in climate research
NASA Astrophysics Data System (ADS)
Yin, Bangsheng
Aerosols and clouds play important roles in the climate system through their radiative effects and their vital link in the hydrological cycle. Accurate measurements of aerosol and cloud optical and microphysical properties are crucial for the study of climate and climate change. This study develops and improves retrieval algorithms for aerosol single scattering albedo (SSA) and low liquid water path (LWP) cloud optical properties, evaluates a new spectrometer, and applies long-term measurements to establish a climatology of aerosol and cloud optical properties. The following results were obtained. (1) The ratio of diffuse horizontal and direct normal fluxes measured by the Multifilter Rotating Shadowband Radiometer (MFRSR) has been used to derive the aerosol SSA. Various issues affect the accuracy of the SSA retrieval, from measurements (e.g., calibration accuracy, cosine response correction, and forward scattering correction) to input parameters and assumptions (e.g., asymmetry factor, Rayleigh scattering optical depth, and surface albedo). This study carefully analyzed these issues and extensively assessed their impacts on the retrieval accuracy. Furthermore, the retrievals of aerosol SSA from the MFRSR are compared with independent measurements from co-located instruments. (2) The Thin-Cloud Rotating Shadowband Radiometer (TCRSR) has been used to simultaneously derive the cloud optical depth (COD) and cloud drop effective radius (DER), subsequently inferring the cloud liquid water path (LWP). The evaluation of the TCRSR indicates that radiometric calibration error has limited impact on the cloud DER retrievals. However, the retrieval accuracy of cloud DER is sensitive to uncertainties in the background setting (e.g., aerosol loading and the existence of ice cloud) and the measured solar aureole shape.
(3) A new high-resolution oxygen A-band spectrometer (HABS) has been developed, which has the ability to measure both direct-beam and zenith diffuse solar radiation with polarization capability. The HABS exhibits excellent performance: stable spectral response ratio, high SNR, high spectral resolution (0.16 nm), and high out-of-band rejection (10^-5). The HABS-measured spectra and polarization spectra are broadly consistent with the corresponding simulated spectra; the main difference between them occurs at or near the strong oxygen absorption line centers. Furthermore, our study demonstrates that the degree-of-polarization versus oxygen-absorption-optical-depth (DOP-k) relationship can be effectively derived through polynomial fitting in DOP-k space. (4) The long-term MFRSR measurements at the Darwin (Australia), Nauru (Nauru), and Manus (Papua New Guinea) sites have been processed to develop a climatology of aerosols and clouds in the Tropical Warm Pool (TWP) region at interannual, seasonal, and diurnal temporal scales. Because these three sites are associated with different large-scale circulation patterns, aerosol and cloud properties exhibit distinctive characteristics. The cloud optical depth (COD) and cloud fraction (CF) exhibit apparent increasing trends from 1998 to 2007 and decreasing trends after 2007. The monthly anomaly values are, to some extent, bifurcately correlated with the SOI, depending on the phase of ENSO. At the two oceanic sites of Manus and Nauru, aerosols, clouds, and precipitation are modulated by the meteorological changes associated with MJO events. (5) The long-term measurements at the Barrow and Atqasuk sites have also been processed to develop a climatology of aerosol and cloud properties in the North Slope of Alaska (NSA) region at interannual, seasonal, and diurnal temporal scales. Due to Arctic climate warming, at these two sites the snow melting day arrives earlier and the non-snow-cover duration increases.
Aerosol optical depth (AOD) increased during the periods 2001-2003 and 2005-2009 and decreased during 2003-2005. The LWP, COD, and CF exhibit apparent decreasing trends from 2002 to 2007 and increase significantly after 2008. (Abstract shortened by UMI.)
CUVE - Cubesat UV Experiment: Unveil Venus' UV Absorber with Cubesat UV Mapping Spectrometer
NASA Astrophysics Data System (ADS)
Cottini, V.; Aslam, S.; D'Aversa, E.; Glaze, L.; Gorius, N.; Hewagama, T.; Ignatiev, N.; Piccioni, G.
2017-09-01
Our Venus mission concept, the CubeSat UV Experiment (CUVE), is one of ten proposals selected for funding by the NASA PSDS3 Program (Planetary Science Deep Space SmallSat Studies). The CUVE concept is to insert a CubeSat spacecraft into a Venusian orbit and perform remote sensing of the UV spectral region using a high-spectral-resolution point spectrometer to resolve UV molecular bands, observe nightglow, and characterize the unidentified main UV absorber. The UV spectrometer is complemented by an imaging UV camera with multiple bands in the UV absorber's main band range for contextual imaging. CUVE's science objectives are: the nature of the "unknown" UV absorber; the abundances and distributions of SO2 and SO at and above Venus's cloud tops and their correlation with the UV absorber; the atmospheric dynamics at the cloud tops, the structure of the upper clouds, and wind measurements from cloud tracking; and the nightglow emissions of NO, CO, and O2. This mission will therefore be an excellent platform to study Venus's cloud-top atmospheric properties, where UV absorption drives the planet's energy balance. CUVE would complement past, current, and future Venus missions with conventional spacecraft and address critical science questions cost-effectively.
The cloud paradigm applied to e-Health.
Vilaplana, Jordi; Solsona, Francesc; Abella; Filgueira, Rosa; Rius, Josep
2013-03-14
Cloud computing is a new paradigm that is changing how enterprises, institutions and people understand, perceive and use current software systems. With this paradigm, organizations have no need to maintain their own servers or host their own software. Instead, everything is moved to the cloud and provided on demand, saving energy, physical space and technical staff effort. Cloud-based system architectures provide many advantages in terms of scalability, maintainability and massive data processing. We present the design of an e-health cloud system, modelled by an M/M/m queue with QoS capabilities, i.e. a maximum waiting time for requests. Detailed results for the model, formed by a Jackson network of two M/M/m queues, are presented from the queueing theory perspective. These results show a significant performance improvement as the number of servers increases. Platform scalability becomes a critical issue, since we aim to provide the system with a high Quality of Service (QoS). In this paper we define an architecture capable of adapting itself to different diseases and growing numbers of patients. This platform could be applied in the medical field to greatly enhance the results of those therapies that have an important psychological component, such as addictions and chronic diseases.
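A maximum-waiting-time QoS constraint on an M/M/m queue can be checked with the standard Erlang-C formula. The following is a generic sketch of that textbook result, not the paper's Jackson-network model; the parameter values in the usage note are illustrative assumptions.

```python
from math import factorial

def erlang_c(m, lam, mu):
    """Probability that an arriving request must wait in an M/M/m queue
    with Poisson arrival rate lam, per-server service rate mu, m servers."""
    a = lam / mu  # offered load in Erlangs
    assert a < m, "queue is unstable unless lam < m * mu"
    top = (a ** m / factorial(m)) * (m / (m - a))
    bottom = sum(a ** k / factorial(k) for k in range(m)) + top
    return top / bottom

def mean_wait(m, lam, mu):
    """Mean queueing delay W_q; increase m until W_q meets the QoS bound."""
    return erlang_c(m, lam, mu) / (m * mu - lam)
```

For example, with lam = 1.5 req/s and mu = 1 req/s, going from m = 2 to m = 4 servers cuts the mean wait from about 1.29 s to about 0.03 s, illustrating the kind of performance improvement with server count noted above.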
NASA Technical Reports Server (NTRS)
Holz, Robert E.; Ackerman, Steve; Antonelli, Paolo; Nagle, Fred; McGill, Matthew; Hlavka, Dennis L.; Hart, William D.
2005-01-01
This paper presents a comparison of cloud-top altitude retrieval methods applied to S-HIS (Scanning High-Resolution Interferometer Sounder) measurements. Included in this comparison is an improvement to the traditional CO2 Slicing method: the new method, CO2 Sorting, determines optimal channel pairs to which to apply CO2 Slicing. Measurements from collocated samples of the Cloud Physics Lidar (CPL) and MODIS Airborne Simulator (MAS) instruments assist in the comparison. For optically thick clouds, good correlation between the S-HIS and lidar cloud-top retrievals is found. For tenuous ice clouds, there can be large differences between lidar (CPL) and S-HIS retrieved cloud tops. It is found that CO2 Sorting significantly reduces the cloud height biases for optically thin clouds (total optical depths less than 1.0). For geometrically thick but optically thin cirrus clouds, large differences between the S-HIS infrared cloud-top retrievals and the CPL-detected cloud top were found. For these cases, the cloud height retrieved by the S-HIS correlated closely with the level at which the CPL-integrated cloud optical depth was approximately 1.0.
A microphysical pathway analysis to investigate aerosol effects on convective clouds
NASA Astrophysics Data System (ADS)
Heikenfeld, Max; White, Bethan; Labbouz, Laurent; Stier, Philip
2017-04-01
The impact of aerosols on ice- and mixed-phase processes in convective clouds remains highly uncertain, which has strong implications for estimates of the role of aerosol-cloud interactions in the climate system. The wide range of interacting microphysical processes are still poorly understood and generally not resolved in global climate models. To understand and visualise these processes and to conduct a detailed pathway analysis, we have added diagnostic output of all individual process rates for number and mass mixing ratios to two commonly-used cloud microphysics schemes (Thompson and Morrison) in WRF. This allows us to investigate the response of individual processes to changes in aerosol conditions and the propagation of perturbations throughout the development of convective clouds. Aerosol effects on cloud microphysics could strongly depend on the representation of these interactions in the model. We use different model complexities with regard to aerosol-cloud interactions ranging from simulations with different levels of fixed cloud droplet number concentration (CDNC) as a proxy for aerosol, to prognostic CDNC with fixed modal aerosol distributions. Furthermore, we have implemented the HAM aerosol model in WRF-chem to also perform simulations with a fully interactive aerosol scheme. We employ a hierarchy of simulation types to understand the evolution of cloud microphysical perturbations in atmospheric convection. Idealised supercell simulations are chosen to present and test the analysis methods for a strongly confined and well-studied case. We then extend the analysis to large case study simulations of tropical convection over the Amazon rainforest. For both cases we apply our analyses to individually tracked convective cells. Our results show the impact of model uncertainties on the understanding of aerosol-convection interactions and have implications for improving process representation in models.
Fast Transverse Beam Instability Caused by Electron Cloud Trapped in Combined Function Magnets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Antipov, Sergey
Electron cloud instabilities affect the performance of many circular high-intensity particle accelerators. They usually have a fast growth rate and might lead to an increase of the transverse emittance and beam loss. A peculiar example of such an instability is observed in the Fermilab Recycler proton storage ring. Although this instability might pose a challenge for future intensity upgrades, its nature had not been completely understood. The phenomenon has been studied experimentally by comparing the dynamics of stable and unstable beams, numerically by simulating the build-up of the electron cloud and its interaction with the beam, and analytically by constructing a model of an electron cloud driven instability with the electrons trapped in combined function dipoles. Stabilization of the beam by a clearing bunch reveals that the instability is caused by the electron cloud trapped in beam optics magnets. Measurements of microwave propagation confirm the presence of the cloud in the combined function dipoles. Numerical simulations show that up to $10^{-2}$ of the particles can be trapped by their magnetic field. Since the process of electron cloud build-up is exponential, once trapped, this amount of electrons significantly increases the density of the cloud on the next revolution. In a combined function dipole this multi-turn accumulation allows the electron cloud to reach final intensities orders of magnitude greater than in a pure dipole. The estimated fast instability growth rate of about 30 revolutions and low mode frequency of 0.4 MHz are consistent with experimental observations and agree with the simulations. The resulting instability model allows investigation of beam stability for future intensity upgrades.
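The multi-turn accumulation argument above can be illustrated with a toy model; the per-turn gain and trapped fractions below are illustrative assumptions, not values from the study:

```python
def cloud_density_after(n_turns, gain_per_turn, trapped_fraction, seed=1.0):
    """Toy multi-turn electron cloud model (assumed form): each revolution the
    beam drives an exponential build-up by `gain_per_turn`, but only the
    `trapped_fraction` held by the magnet survives to seed the next turn."""
    density = seed
    for _ in range(n_turns):
        density *= gain_per_turn * trapped_fraction
    return density

# Illustrative numbers only: a combined function dipole traps a ~10^-2 fraction
# of the electrons, while a pure dipole retains essentially none between turns.
combined = cloud_density_after(10, gain_per_turn=1e3, trapped_fraction=1e-2)
pure = cloud_density_after(10, gain_per_turn=1e3, trapped_fraction=1e-6)
```

Even with identical single-turn gain, trapping turns a decaying seed into net growth, which is the qualitative mechanism behind the orders-of-magnitude density difference described above.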
High-performance web services for querying gene and variant annotation.
Xin, Jiwen; Mark, Adam; Afrasiabi, Cyrus; Tsueng, Ginger; Juchler, Moritz; Gopal, Nikhil; Stupp, Gregory S; Putman, Timothy E; Ainscough, Benjamin J; Griffith, Obi L; Torkamani, Ali; Whetzel, Patricia L; Mungall, Christopher J; Mooney, Sean D; Su, Andrew I; Wu, Chunlei
2016-05-06
Efficient tools for data management and integration are essential for many aspects of high-throughput biology. In particular, annotations of genes and human genetic variants are commonly used but highly fragmented across many resources. Here, we describe MyGene.info and MyVariant.info, high-performance web services for querying gene and variant annotation information. These web services are currently accessed more than three million times per month. They also demonstrate a generalizable cloud-based model for organizing and querying biological annotation information. MyGene.info and MyVariant.info are provided as high-performance web services, accessible at http://mygene.info and http://myvariant.info. Both are offered free of charge to the research community.
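As a sketch of how such annotation services are queried over HTTP (the endpoint paths below follow the services' public documentation, but the specific gene and field names are illustrative):

```python
from urllib.parse import urlencode

MYGENE_QUERY = "http://mygene.info/v3/query"
MYVARIANT_QUERY = "http://myvariant.info/v1/query"

def build_query_url(base, q, **params):
    """Build a GET URL for a query against MyGene.info or MyVariant.info."""
    params["q"] = q
    # sort for a deterministic parameter order
    return base + "?" + urlencode(sorted(params.items()))

# Look up the human CDK2 gene, requesting a couple of annotation fields
url = build_query_url(MYGENE_QUERY, "symbol:cdk2",
                      species="human", fields="entrezgene,name")
```

Fetching `url` with any HTTP client returns a JSON document of matching annotation hits.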
Using Cloud-based Storage Technologies for Earth Science Data
NASA Astrophysics Data System (ADS)
Michaelis, A.; Readey, J.; Votava, P.
2016-12-01
Cloud based infrastructure may offer several key benefits of scalability, built-in redundancy and reduced total cost of ownership as compared with a traditional data center approach. However, most of the tools and software systems developed for NASA data repositories were not developed with a cloud based infrastructure in mind and do not fully take advantage of commonly available cloud-based technologies. Object storage services are provided through all the leading public (Amazon Web Services, Microsoft Azure, Google Cloud, etc.) and private (OpenStack) clouds, and may provide a more cost-effective means of storing large data collections online. We describe a system that utilizes object storage rather than traditional file-system-based storage to vend earth science data. The system described is not only cost effective but also shows superior performance for running many different analytics tasks in the cloud. To enable compatibility with existing tools and applications, we outline client libraries that are API compatible with existing libraries for HDF5 and NetCDF4. Performance of the system is demonstrated using cloud services running on Amazon Web Services.
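A key mechanic behind such systems is that object stores support HTTP ranged GETs, so a client library can fetch individual dataset chunks instead of whole files. A minimal sketch of the chunk-addressing arithmetic (the bucket and key names are hypothetical):

```python
def chunk_byte_range(chunk_index, chunk_nbytes, data_offset=0):
    """HTTP Range header selecting one fixed-size chunk laid out contiguously
    in an object; note that Range end offsets are inclusive."""
    start = data_offset + chunk_index * chunk_nbytes
    end = start + chunk_nbytes - 1
    return f"bytes={start}-{end}"

# With a client such as boto3, a ranged GET then fetches only that chunk, e.g.:
# s3.get_object(Bucket="earth-data", Key="sst.h5",
#               Range=chunk_byte_range(3, 4096))
```

This is what lets an HDF5/NetCDF4-compatible client library read a small slice of a multi-gigabyte collection without downloading the rest.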
An Architecture for Cross-Cloud System Management
NASA Astrophysics Data System (ADS)
Dodda, Ravi Teja; Smith, Chris; van Moorsel, Aad
The emergence of the cloud computing paradigm promises flexibility and adaptability through on-demand provisioning of compute resources. As the utilization of cloud resources extends beyond a single provider, for business as well as technical reasons, the issue of effectively managing such resources comes to the fore. Different providers expose different interfaces to their compute resources, utilizing varied architectures and implementation technologies. This heterogeneity poses a significant system management problem, and can limit the extent to which the benefits of cross-cloud resource utilization can be realized. We address this problem through the definition of an architecture to facilitate the management of compute resources from different cloud providers in a homogeneous manner. This preserves the flexibility and adaptability promised by the cloud computing paradigm, whilst enabling the benefits of cross-cloud resource utilization to be realized. The practical efficacy of the architecture is demonstrated through an implementation utilizing compute resources managed through different interfaces on the Amazon Elastic Compute Cloud (EC2) service. Additionally, we provide empirical results highlighting the performance differential of these different interfaces, and discuss the impact of this performance differential on efficiency and profitability.
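The core idea, a homogeneous management layer over heterogeneous provider interfaces, can be sketched as an adapter pattern; the class and method names here are illustrative, not the paper's API:

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Uniform management interface; one adapter per cloud provider."""

    @abstractmethod
    def launch(self, image_id, count):
        """Start `count` instances of `image_id`, returning their ids."""

    @abstractmethod
    def terminate(self, instance_ids):
        """Stop the given instances."""

class InMemoryProvider(ComputeProvider):
    """Stand-in adapter for demonstration; a real EC2 adapter would wrap the
    provider's SOAP or Query interface behind the same two methods."""

    def __init__(self):
        self.running = set()
        self._next = 0

    def launch(self, image_id, count):
        ids = [f"i-{self._next + k:04d}" for k in range(count)]
        self._next += count
        self.running.update(ids)
        return ids

    def terminate(self, instance_ids):
        self.running.difference_update(instance_ids)

provider = InMemoryProvider()
ids = provider.launch("ami-1234", 2)
```

Cross-cloud management code then targets `ComputeProvider` only, so supporting another provider means adding an adapter, not changing callers.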
Using Deep Learning Model for Meteorological Satellite Cloud Image Prediction
NASA Astrophysics Data System (ADS)
Su, X.
2017-12-01
A satellite cloud image contains much weather information, such as precipitation information. Short-term cloud movement forecasting is important for precipitation forecasting and is the primary means of typhoon monitoring. Traditional methods mostly use cloud feature matching and linear extrapolation to predict cloud movement, so nonstationary processes during the movement of the cloud, such as inversion and deformation, are basically not considered. It remains a hard task to predict cloud movement in a timely and correct manner. As deep learning models perform well in learning spatiotemporal features, we can meet this challenge by regarding cloud image prediction as a spatiotemporal sequence forecasting problem and introducing a deep learning model to solve it. In this research, we use a variant of the Gated Recurrent Unit (GRU) that has convolutional structures to deal with spatiotemporal features, and build an end-to-end model to solve this forecasting problem. In this model, both the input and output are spatiotemporal sequences. Compared to the Convolutional LSTM (ConvLSTM) model, this model has fewer parameters. We apply this model to GOES satellite data, and the model performs well.
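A convolutional GRU cell replaces the dense matrix multiplications of a standard GRU with convolutions, so the gates preserve the spatial layout of the cloud imagery. A single-channel NumPy sketch (kernel sizes and array shapes are illustrative):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def conv2d_same(x, k):
    """Naive 'same'-padded 2D cross-correlation, adequate for a small demo."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.empty_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def convgru_step(x, h, k):
    """One ConvGRU update; `k` maps gate name -> (input kernel, state kernel)."""
    z = sigmoid(conv2d_same(x, k["z"][0]) + conv2d_same(h, k["z"][1]))  # update gate
    r = sigmoid(conv2d_same(x, k["r"][0]) + conv2d_same(h, k["r"][1]))  # reset gate
    h_cand = np.tanh(conv2d_same(x, k["h"][0]) + conv2d_same(r * h, k["h"][1]))
    return (1 - z) * h + z * h_cand

rng = np.random.default_rng(0)
kernels = {g: (rng.normal(size=(3, 3)), rng.normal(size=(3, 3))) for g in "zrh"}
frame, state = rng.normal(size=(8, 8)), np.zeros((8, 8))
state = convgru_step(frame, state, kernels)  # state keeps the 8x8 spatial shape
```

Stacking such cells and unrolling them over an input sequence yields the kind of end-to-end sequence-to-sequence predictor described above.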
New Icing Cloud Simulation System at the NASA Glenn Research Center Icing Research Tunnel
NASA Technical Reports Server (NTRS)
Irvine, Thomas B.; Oldenburg, John R.; Sheldon, David W.
1999-01-01
A new spray bar system was designed, fabricated, and installed in the NASA Glenn Research Center's Icing Research Tunnel (IRT). This system is key to the IRT's ability to do aircraft in-flight icing cloud simulation. The performance goals and requirements levied on the design of the new spray bar system included increased size of the uniform icing cloud in the IRT test section, faster system response time, and increased coverage of icing conditions as defined in Appendix C of the Federal Aviation Regulation (FAR), Part 25 and Part 29. Through significant changes to the mechanical and electrical designs of the previous-generation spray bar system, the performance goals and requirements were realized. Postinstallation aerodynamic and icing cloud calibrations were performed to quantify the changes and improvements made to the IRT test section flow quality and icing cloud characteristics. The new and improved capability to simulate aircraft encounters with in-flight icing clouds ensures that the IRT will continue to provide a satisfactory icing ground-test simulation method to the aeronautics community.
Performance Evaluation of sUAS Equipped with Velodyne HDL-32E LiDAR Sensor
NASA Astrophysics Data System (ADS)
Jozkow, G.; Wieczorek, P.; Karpina, M.; Walicka, A.; Borkowski, A.
2017-08-01
The Velodyne HDL-32E laser scanner is increasingly used as the main mapping sensor in small commercial UASs. However, there is still little information about the actual accuracy of point clouds collected with such UASs. This work empirically evaluates the accuracy of the point cloud collected with such a UAS. The accuracy assessment addressed four aspects: the impact of the sensors on theoretical point cloud accuracy, trajectory reconstruction quality, and internal and absolute point cloud accuracies. Theoretical point cloud accuracy was evaluated by calculating the 3D position error from the known errors of the sensors used. The quality of trajectory reconstruction was assessed by comparing position and attitude differences between the forward and reverse EKF solutions. Internal and absolute accuracies were evaluated by fitting planes to 8 point cloud samples extracted for planar surfaces. In addition, the absolute accuracy was also determined by calculating 3D distances between the LiDAR UAS and reference TLS point clouds. Test data consisted of point clouds collected in two separate flights performed over the same area. The experiments showed that in the tested UAS, the trajectory reconstruction, especially the attitude, has a significant impact on point cloud accuracy. The estimated absolute accuracy of the point clouds collected during both test flights was better than 10 cm; thus the investigated UAS fits the mapping-grade category.
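The theoretical-accuracy step can be sketched as a first-order error budget: range noise, attitude error mapped through the measurement range, and GNSS position error combined in quadrature. The numbers below are plausible placeholders, not the sensor specifications from the study:

```python
import math

def lidar_point_error(range_m, sigma_range_m, sigma_attitude_rad, sigma_pos_m):
    """First-order 1-sigma 3D point error for a scanning LiDAR on a moving
    platform: attitude error acts through a lever arm proportional to range."""
    lever_arm = range_m * sigma_attitude_rad
    return math.sqrt(sigma_range_m**2 + lever_arm**2 + sigma_pos_m**2)

# Placeholder values: 50 m range, 2 cm ranging noise,
# 0.1 deg attitude error, 3 cm GNSS position error
err = lidar_point_error(50.0, 0.02, math.radians(0.1), 0.03)
```

With these placeholder values the budget comes out just under 10 cm and is dominated by the attitude term, mirroring the finding above that trajectory (especially attitude) reconstruction drives point cloud accuracy.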
Lidar Measurements of Wind and Cloud Around Venus from an Orbiting or Floating/flying Platform
NASA Technical Reports Server (NTRS)
Singh, Upendra N.; Limaye, Sanjay; Emmitt, George D.; Refaat, Tamer F.; Kavaya, Michael J.; Yu, Jirong; Petros, Mulugeta
2015-01-01
Given the presence of clouds and haze in the upper portion of the Venus atmosphere, it is reasonable to consider a Doppler wind lidar (DWL) for making remote measurements of the 3-dimensional winds within the tops of clouds and the overlying haze layer. Assuming an orbit altitude of 250 kilometers and cloud tops at 60 kilometers (within the upper cloud layer), an initial performance assessment of an orbiting DWL was made using a numerical instrument and atmosphere model developed for both Earth and Mars. It is reasonable to expect vertical profiles of the 3-dimensional wind speed with 1 kilometer vertical resolution and horizontal spacing of 25 kilometers to several hundred kilometers, depending upon the desired integration times. These profiles would begin just below the tops of the highest clouds and extend into the overlying haze layer to some to-be-determined height. Obtaining returns from multiple cloud layers is also possible, with no negative impact on velocity measurement accuracy. The knowledge and expertise in developing coherent Doppler wind lidar technologies and techniques for Earth-related missions at NASA Langley Research Center are being leveraged to develop an appropriate system suitable for wind measurement around Venus. We are considering a fiber-laser-based lidar system of high efficiency and small size, and are advancing the technology level to meet the requirements of a DWL system for Venus operating from an orbiting or floating/flying platform. This presentation will describe the concept, simulation, and technology development plan for wind and cloud measurements at Venus.
Probing the gas density in our Galactic Centre: moving mesh simulations of G2
NASA Astrophysics Data System (ADS)
Steinberg, Elad; Sari, Re'em; Gnat, Orly; Gillessen, Stefan; Plewa, Philipp; Genzel, Reinhard; Eisenhauer, Frank; Ott, Thomas; Pfuhl, Oliver; Habibi, Maryam; Waisberg, Idel; von Fellenberg, Sebastiano; Dexter, Jason; Bauböck, Michi; Rosales, Alejandra Jimenez
2018-01-01
The G2 object has recently passed its pericentre passage in our Galactic Centre. While the Brγ emission shows clear signs of tidal interaction, the change in the observed luminosity is only of about a factor of 2, in tension with all previous predictions. We present high-resolution simulations performed with the moving mesh code RICH, together with simple analytical arguments that reproduce the observed Brγ emission. In our model, G2 is a gas cloud that undergoes tidal disruption in a dilute ambient medium. We find that during pericentre passage, the efficient cooling of the cloud results in a vertical collapse, compressing the cloud by a factor of ∼5000. Properly taking into account the ionization state of the gas, we find that the cloud is UV starved, and we are able to reproduce the observed Brγ luminosity. For densities larger than ≈500 cm-3 at pericentre, the cloud fragments due to cooling instabilities and the emitted radiation is inconsistent with observations. For lower densities, the cloud survives the pericentre passage intact and its emitted radiation matches the observed light curve. From the duration of Brγ emission that contains both redshifted and blueshifted components, we show that the cloud is not spherical but rather elongated, with a size ratio of 4 in the year 2001. The simulated cloud's elongation grows as it travels towards pericentre and, once viewing angles are taken into account, is consistent with observations. The simulation is also consistent with the cloud having a spherical shape at apocentre.
NASA Astrophysics Data System (ADS)
Ohama, Akio; Kohno, Mikito; Fujita, Shinji; Tsutsumi, Daichi; Hattori, Yusuke; Torii, Kazufumi; Nishimura, Atsushi; Sano, Hidetoshi; Yamamoto, Hiroaki; Tachihara, Kengo; Fukui, Yasuo
2018-05-01
Young H II regions are an important site for the study of O star formation based on distributions of ionized and molecular gas. We reveal that two molecular clouds at ˜48 km s-1 and ˜53 km s-1 are associated with the H II region G018.149-00.283 in RCW 166 by using the JCMT CO High-Resolution Survey (COHRS) of the 12CO(J = 3-2) emission. G018.149-00.283 comprises a bright ring at 8 μm and an extended H II region inside the ring. The ˜48 km s-1 cloud delineates the ring, and the ˜53 km s-1 cloud is located within the ring, indicating a complementary distribution between the two molecular components. We propose a hypothesis that the high-mass stars within G018.149-00.283 were formed by triggering during a cloud-cloud collision at a projected velocity separation of ˜5 km s-1. We argue that G018.149-00.283 is in an early evolutionary stage, ˜0.1 Myr after the collision according to the scheme detailed by Habe and Ohta (1992, PASJ, 44, 203), which will be followed by a bubble formation stage like RCW 120. We also suggest that the nearby H II regions N21 and N22 are candidates for bubbles possibly formed by cloud-cloud collision. Inoue and Fukui (2013, ApJ, 774, L31) showed by magnetohydrodynamical numerical simulations that the interface gas becomes highly turbulent and realizes a high mass accretion rate of 10-3 to 10-4 M⊙ yr-1, which offers an explanation of O-star formation. The fairly high frequency of cloud-cloud collisions in RCW 166 is probably due to the high cloud density in this part of the Scutum arm.
Evaluating the cloud radiative forcing over East Asia during summer simulated by CMIP5 models
NASA Astrophysics Data System (ADS)
Lin, Z.; Wang, Y.; Liu, X.
2017-12-01
A large degree of uncertainty in global climate models (GCMs) can be attributed to the representation of clouds and their radiative forcing (CRF). In this study, the simulated CRFs, total cloud fraction (CF), and cloud properties over East Asia from 20 CMIP5 AMIP models are evaluated and compared with multiple satellite observations, and the possible causes of the CRF biases in the CMIP5 models are then investigated. Based on the satellite observations, strong longwave CRF (LWCRF) and shortwave CRF (SWCRF) are found over Southwestern China, with a minimum SWCRF of less than -130 W m-2, associated with the large cloud amount in the region. By contrast, weak CRFs are found over Northwest China and the Western Pacific region because of the smaller cloud amount. In Northeastern China, strong SWCRF and weak LWCRF are found, due to the dominant low-level cloud. In Eastern China, the CRFs are moderate due to the co-existence of multi-layer clouds. The CMIP5 models can basically capture the structure of CRFs in East Asia, with spatial correlation coefficients between 0.5 and 0.9, but most models underestimate CRFs in East Asia, which is strongly associated with the underestimation of cloud amount in the region. The performance of the CMIP5 models varies in different parts of the East Asian region, with a larger deviation in Eastern China (EC). Further investigation suggests that underestimation of the cloud amount in EC leads to a weak bias in the CRFs there; however, this bias can be partly cancelled by CRF overestimation due to the excessive cloud optical depth (COD) simulated by the models. The annual cycle of the simulated CRF over Eastern China is also examined, and it is found that the CMIP5 models are unable to reproduce the northward migration of CRF in the summer monsoon season, which is closely related to the northward shift of the East Asian summer monsoon rain belt.
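For reference, CRF is diagnosed from paired all-sky and clear-sky fluxes; a minimal sketch with the usual top-of-atmosphere sign conventions (the flux values below are illustrative, chosen only to match the -130 W m-2 scale quoted above):

```python
def swcrf(sw_net_all_sky, sw_net_clear_sky):
    """Shortwave CRF at TOA: all-sky minus clear-sky net downward SW flux.
    Clouds reflect sunlight, so SWCRF is typically negative (cooling)."""
    return sw_net_all_sky - sw_net_clear_sky

def lwcrf(olr_clear_sky, olr_all_sky):
    """Longwave CRF at TOA: clear-sky minus all-sky outgoing LW radiation.
    Clouds trap outgoing LW, so LWCRF is typically positive (warming)."""
    return olr_clear_sky - olr_all_sky

# Illustrative fluxes in W m-2
sw_forcing = swcrf(200.0, 330.0)
lw_forcing = lwcrf(280.0, 230.0)
```

Model-observation CRF biases can then be decomposed, as in the study above, into contributions from cloud amount and from cloud optical depth.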
An Interface for Biomedical Big Data Processing on the Tianhe-2 Supercomputer.
Yang, Xi; Wu, Chengkun; Lu, Kai; Fang, Lin; Zhang, Yong; Li, Shengkang; Guo, Guixin; Du, YunFei
2017-12-01
Big data, cloud computing, and high-performance computing (HPC) are at the verge of convergence. Cloud computing is already playing an active part in big data processing with the help of big data frameworks like Hadoop and Spark. The recent upsurge of high-performance computing in China provides extra possibilities and capacity to address the challenges associated with big data. In this paper, we propose Orion, a big data interface on the Tianhe-2 supercomputer, to enable big data applications to run on Tianhe-2 via a single command or a shell script. Orion supports multiple users, and each user can launch multiple tasks. It minimizes the effort needed to initiate big data applications on the Tianhe-2 supercomputer via automated configuration. Orion follows the "allocate-when-needed" paradigm, and it avoids the idle occupation of computational resources. We tested the utility and performance of Orion using a big genomic dataset and achieved a satisfactory performance on Tianhe-2 with very few modifications to existing applications that were implemented in Hadoop/Spark. In summary, Orion provides a practical and economical interface for big data processing on Tianhe-2.
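The "allocate-when-needed" paradigm can be sketched as a context manager over a node pool, so resources are held only for the lifetime of a task; the class and function names here are illustrative, not Orion's actual interface:

```python
from contextlib import contextmanager

class NodePool:
    """Minimal stand-in for a supercomputer's pool of compute nodes."""
    def __init__(self, total):
        self.free = total

    def allocate(self, n):
        if n > self.free:
            raise RuntimeError("not enough free nodes")
        self.free -= n
        return n

    def release(self, n):
        self.free += n

@contextmanager
def allocate_when_needed(pool, n):
    """Claim nodes just before a task and free them immediately afterwards,
    avoiding the idle occupation that a static allocation would cause."""
    nodes = pool.allocate(n)
    try:
        yield nodes
    finally:
        pool.release(nodes)

pool = NodePool(total=100)
with allocate_when_needed(pool, 16) as n:
    pass  # launch the Hadoop/Spark task on the n allocated nodes here
```

The nodes are guaranteed to return to the pool even if the task fails, which is what makes the paradigm economical on a shared machine.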
A high-resolution oxygen A-band spectrometer (HABS) and its radiation closure
NASA Astrophysics Data System (ADS)
Min, Q.; Yin, B.; Li, S.; Berndt, J.; Harrison, L.; Joseph, E.; Duan, M.; Kiedron, P.
2014-06-01
Various studies indicate that high-resolution oxygen A-band spectra have the capability to retrieve vertical profiles of aerosol and cloud properties. To improve the understanding of oxygen A-band inversions and their utility, we developed a high-resolution oxygen A-band spectrometer (HABS) and deployed it at the Howard University Beltsville site during the NASA Discover Air-Quality Field Campaign in July 2011. Using a single telescope, the HABS instrument sequentially measures the direct solar and the zenith diffuse radiation. HABS exhibits excellent performance: a stable spectral response ratio, high signal-to-noise ratio (SNR), high spectral resolution (0.016 nm), and high out-of-band rejection (10-5). For the spectral retrievals of HABS measurements, a simulator was developed by combining a discrete ordinates radiative transfer code (DISORT) with the High Resolution Transmission (HITRAN) database HITRAN2008. The simulator uses a double-k approach to reduce the computational cost. The HABS-measured spectra are consistent with the corresponding simulated spectra. For direct-beam spectra, the discrepancies between measurements and simulations, indicated by 95% confidence intervals of the relative difference, are (-0.06, 0.05) and (-0.08, 0.09) for solar zenith angles of 27° and 72°, respectively. For zenith diffuse spectra, the corresponding discrepancies are (-0.06, 0.05) and (-0.08, 0.07) for solar zenith angles of 27° and 72°, respectively. The main discrepancies between measurements and simulations occur at or near the strong oxygen absorption line centers.
These discrepancies arise mainly from two causes: (1) measurement errors associated with the noise/spikes of the HABS-measured spectra, resulting from the combined effects of weak signal, low SNR, and errors in wavelength registration; and (2) modeling errors in the simulation, including errors in the model parameter settings (e.g., oxygen absorption line parameters, vertical profiles of temperature and pressure) and the lack of treatment of rotational Raman scattering. The high-resolution oxygen A-band measurements from HABS can constrain active radar retrievals for more accurate cloud optical properties (e.g., cloud optical depth, effective radius), particularly for multi-layer clouds and for mixed-phase clouds.
Development of a cloud-screening method for MAX-DOAS measurements
NASA Astrophysics Data System (ADS)
Gielen, Clio; Van Roozendael, Michel; Hendrik, Francois; Fayt, Caroline; Hermans, Christian; Pinardi, Gaia; Vlemmix, Tim
2013-04-01
In recent years, ground-based multi-axis differential optical absorption spectroscopy (MAX-DOAS) has been shown to be ideally suited for the retrieval of tropospheric trace gases and for deriving information on aerosol properties. These measurements are invaluable to our understanding of the physics and chemistry of the atmospheric system and their impact on the Earth's climate. Unfortunately, MAX-DOAS measurements are often performed under (partially) cloudy conditions, causing data quality degradation and higher uncertainties in the retrievals. A high aerosol load and/or a strong cloud cover can introduce additional photon absorption or multiple scattering. The first effect strongly impacts the retrieved differential slant column densities (DSCDs) of the trace gases, leading to an underestimation of the atmospheric column density. Multiple scattering, on the other hand, becomes important for low clouds with a high optical depth and causes a strong increase in the retrieved trace gas DSCDs. The presence of thin clouds can furthermore introduce a degeneracy in the retrieved aerosol optical depth, since they have a similar effect on the MAX-DOAS measurements. In this case, only information on the trace gas DSCDs can be successfully retrieved. If the cloud cover consists of broken or scattered clouds, the MAX-DOAS method becomes very unstable, since the different elevation angles probe regions of the sky with strongly deviating properties. Here we present a method to qualify the sky and cloud conditions using the colour index and O4 DSCDs, as derived from the MAX-DOAS measurements. The colour index is defined as the ratio of the intensities in the short- and long-wavelength parts of the visible spectral range, typically at 400 nm and 670 nm. For increasing optical thickness due to clouds or aerosols, the colour index values decrease and the values for different elevation angles converge.
In the case of broken clouds, the colour index shows a strong and rapid temporal variation, which is easily detectable. Additional information is derived from the O4 DSCD measurements, since they are quite sensitive to changes in the light paths due to scattering at different altitudes. For example, thick clouds at low altitude show a very strong increase in the DSCD values due to scattering, combined with a low colour index value due to the intensity screening. In general, our method shows promising results for qualifying the sky and cloud conditions of MAX-DOAS measurements, without the need for other external cloud-detection systems such as Brewer instruments or pyrheliometers.
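The two screening quantities described above lend themselves to a compact implementation; the numeric thresholds and series below are placeholders that a real deployment would calibrate per instrument and site:

```python
def colour_index(intensity_400nm, intensity_670nm):
    """Colour index: ratio of short- to long-wavelength visible intensity.
    Decreases as cloud/aerosol optical thickness increases."""
    return intensity_400nm / intensity_670nm

def broken_cloud_flag(ci_series, jump_threshold=0.1):
    """Flag the rapid temporal variation of the colour index that signals
    broken or scattered clouds (placeholder threshold)."""
    jumps = [abs(b - a) for a, b in zip(ci_series, ci_series[1:])]
    return any(j > jump_threshold for j in jumps)

clear_sky = [1.20, 1.19, 1.21, 1.20]   # stable, high colour index
broken = [1.20, 0.85, 1.15, 0.80]      # rapid jumps between measurements
```

Combined with the O4 DSCDs, such simple time-series tests are enough to classify sky conditions without an external cloud-detection instrument.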
Scalable cloud without dedicated storage
NASA Astrophysics Data System (ADS)
Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.
2015-05-01
We present a prototype of a scalable computing cloud. It is intended to be deployed on a cluster without separate dedicated storage: the dedicated storage is replaced by distributed software storage, and all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improving the fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with relatively low initial and maintenance costs. The solution is built on open source components such as OpenStack and CEPH.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Graham, Bruce
Cloud County Community College's (CCCC) Wind Energy Technology (WET) program is a leader in the renewable energy movement across Kansas and the USA. Renewable energy is a growing industry that continues to experience high demand for career opportunities. This CCCC/DOE project entailed two phases: 1) the installation of two Northwind 100 wind turbines, and 2) the continued development of the WET program curriculum, including enhancement of the CCCC Blade Repair Certificate program. This report provides a technical account of the total work performed and is a comprehensive description of the results achieved.